WorldWideScience

Sample records for pre-launch algorithm development

  1. GPM Ground Validation: Pre to Post-Launch Era

    Science.gov (United States)

    Petersen, Walt; Skofronick-Jackson, Gail; Huffman, George

    2015-04-01

    NASA GPM Ground Validation (GV) activities have transitioned from the pre- to the post-launch era. Prior to launch, direct validation networks and associated partner institutions were identified worldwide, covering a plethora of precipitation regimes. In the U.S., direct GV efforts focused on the use of new operational products such as the NOAA Multi-Radar Multi-Sensor suite (MRMS) for TRMM validation and GPM radiometer algorithm database development. In the post-launch era, MRMS products including precipitation rate, accumulation, types, and data quality are being routinely generated to facilitate statistical GV of instantaneous (e.g., Level II orbit) and merged (e.g., IMERG) GPM products. Toward assessing precipitation column impacts on product uncertainties, range-gate to pixel-level validation of both Dual-Frequency Precipitation Radar (DPR) and GPM Microwave Imager data is performed using GPM Validation Network (VN) ground radar and satellite data processing software. VN software ingests quality-controlled volumetric radar datasets and geo-matches those data to coincident DPR and radiometer Level-II data. When combined, the MRMS and VN datasets enable more comprehensive interpretation of both ground- and satellite-based estimation uncertainties. To support physical validation efforts, eight (one) field campaigns have been conducted in the pre- (post-) launch era. The campaigns span regimes from northern-latitude cold-season snow to warm tropical rain. Most recently, the Integrated Precipitation and Hydrology Experiment (IPHEx) took place in the mountains of North Carolina and involved combined airborne and ground-based measurements of orographic precipitation and hydrologic processes underneath the GPM Core satellite. One more U.S. GV field campaign (OLYMPEX) is planned for late 2015 and will address cold-season precipitation estimation, processes, and hydrology in the orographic and oceanic domains of western Washington State. Finally, continuous direct and physical validation

  2. Pre-Launch Tasks Proposed in our Contract of December 1991

    Science.gov (United States)

    1998-01-01

    We propose, during the pre-EOS phase, to: (1) develop, with other MODIS Team Members, a means of discriminating different major biome types with NDVI and other AVHRR-based data; (2) develop a simple ecosystem process model for each of these biomes, BIOME-BGC; (3) relate the seasonal trend of weekly composite NDVI to vegetation phenology and temperature limits to develop a satellite-defined growing season for vegetation; and (4) define physiologically based energy-to-mass conversion factors for carbon and water for each biome. Our final core at-launch product will be simplified, completely satellite-driven, biome-specific models for net primary production. We will build these biome-specific satellite-driven algorithms using a family of simple ecosystem process models as calibration models, collectively called BIOME-BGC, and establish coordination with an existing network of ecological study sites in order to test and validate these products. Field datasets will then be available for BIOME-BGC development and testing, for use in algorithm development by other MODIS Team Members, and ultimately as our first test point for MODIS land vegetation products upon launch. We will use field sites from the National Science Foundation Long-Term Ecological Research network, and will develop Glacier National Park as a major site for intensive validation.

  3. The GPM Ground Validation Program: Pre to Post-Launch

    Science.gov (United States)

    Petersen, W. A.

    2014-12-01

    NASA GPM Ground Validation (GV) activities have transitioned from the pre- to the post-launch era. Prior to launch, direct validation networks and associated partner institutions were identified worldwide, covering a plethora of precipitation regimes. In the U.S., direct GV efforts focused on the use of new operational products such as the NOAA Multi-Radar Multi-Sensor suite (MRMS) for TRMM validation and GPM radiometer algorithm database development. In the post-launch era, MRMS products including precipitation rate, types, and data quality are being routinely generated to facilitate statistical GV of instantaneous and merged GPM products. To assess precipitation column impacts on product uncertainties, range-gate to pixel-level validation of both Dual-Frequency Precipitation Radar (DPR) and GPM Microwave Imager data is performed using GPM Validation Network (VN) ground radar and satellite data processing software. VN software ingests quality-controlled volumetric radar datasets and geo-matches those data to coincident DPR and radiometer Level-II data. When combined, the MRMS and VN datasets enable more comprehensive interpretation of ground-satellite estimation uncertainties. To support physical validation efforts, eight (one) field campaigns have been conducted in the pre- (post-) launch era. The campaigns span regimes from northern-latitude cold-season snow to warm tropical rain. Most recently, the Integrated Precipitation and Hydrology Experiment (IPHEx) took place in the mountains of North Carolina and involved combined airborne and ground-based measurements of orographic precipitation and hydrologic processes underneath the GPM Core satellite. One more U.S. GV field campaign (OLYMPEX) is planned for late 2015 and will address cold-season precipitation estimation, processes, and hydrology in the orographic and oceanic domains of western Washington State. Finally, continuous direct and physical validation measurements are also being conducted at the NASA Wallops Flight Facility multi

  4. Modeling in the State Flow Environment to Support Launch Vehicle Verification Testing for Mission and Fault Management Algorithms in the NASA Space Launch System

    Science.gov (United States)

    Trevino, Luis; Berg, Peter; England, Dwight; Johnson, Stephen B.

    2016-01-01

    Analysis methods and testing processes are essential activities in the engineering development and verification of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS). Central to mission success is reliable verification of the Mission and Fault Management (M&FM) algorithms for the SLS launch vehicle (LV) flight software. This is particularly difficult because M&FM algorithms integrate and operate LV subsystems, which themselves consist of diverse forms of hardware and software, with equally diverse integration across the engineering disciplines of the LV subsystems. M&FM operation of SLS requires a changing mix of LV automation: during pre-launch the LV is primarily operated by the Kennedy Space Center (KSC) Ground Systems Development and Operations (GSDO) organization, with some LV automation of time-critical functions, while during ascent the LV operates far more autonomously, with crucial interactions with the Orion crew capsule, its astronauts, and mission controllers at the Johnson Space Center. M&FM algorithms must perform all nominal mission commanding via the flight computer to control LV states from pre-launch through disposal, and must also address failure conditions by initiating autonomous or commanded aborts (crew capsule escape from the failing LV), redundancy management of failing subsystems and components, and safing actions to reduce or prevent threats to ground systems and crew. To address the criticality of the verification testing of these algorithms, the NASA M&FM team has utilized the State Flow environment (SFE) with its existing Vehicle Management End-to-End Testbed (VMET) platform, which also hosts vendor-supplied physics-based LV subsystem models. The human-derived M&FM algorithms are designed and vetted in Integrated Development Teams composed of design and development disciplines such as Systems Engineering, Flight Software (FSW), Safety and Mission Assurance (S&MA), and major subsystems and vehicle elements

  5. JPSS-1 VIIRS Pre-Launch Radiometric Performance

    Science.gov (United States)

    Oudrari, Hassan; McIntire, Jeff; Xiong, Xiaoxiong; Butler, James; Efremova, Boryana; Ji, Jack; Lee, Shihyan; Schwarting, Tom

    2015-01-01

    The Visible Infrared Imaging Radiometer Suite (VIIRS) on board the first Joint Polar Satellite System (JPSS) completed its sensor-level testing in December 2014. The JPSS-1 (J1) mission is scheduled to launch in December 2016, and will be very similar to the Suomi National Polar-orbiting Partnership (SNPP) mission. The VIIRS instrument was designed to provide measurements of the globe twice daily. It is a wide-swath (3,040 km) cross-track scanning radiometer with spatial resolutions of 370 and 740 m at nadir for imaging and moderate bands, respectively. It covers the wavelength spectrum from reflective to long-wave infrared through 22 spectral bands (0.412 to 12.01 microns). VIIRS observations are used to generate 22 environmental data records (EDRs). This paper briefly describes the J1 VIIRS characterization and calibration performance and the methodologies executed during the pre-launch testing phases by the independent government team to generate the at-launch baseline radiometric performance and the metrics needed to populate the sensor data record (SDR) look-up tables (LUTs). This paper also provides an assessment of the sensor pre-launch radiometric performance, such as the sensor signal-to-noise ratios (SNRs), dynamic range, reflective and emissive band calibration performance, polarization sensitivity, band spectral performance, response-versus-scan (RVS), and near-field and stray light responses. A set of performance metrics generated during the pre-launch testing program is compared to the SNPP VIIRS pre-launch performance.
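As a rough illustration of the SNR metric assessed above: a common way to estimate a band's SNR from repeated pre-launch measurements at a fixed radiance level is the ratio of the mean dark-subtracted signal to its standard deviation. The sketch below is a generic illustration with hypothetical detector counts, not the J1 VIIRS test procedure.

```python
import statistics

def band_snr(samples, dark_offset=0.0):
    """SNR as mean dark-subtracted signal over its standard deviation."""
    signal = [s - dark_offset for s in samples]
    return statistics.fmean(signal) / statistics.stdev(signal)

# Hypothetical repeated counts from one detector at a fixed radiance level
counts = [1002, 998, 1001, 997, 1003, 999, 1000, 1000]
print(round(band_snr(counts, dark_offset=100.0)))  # → 450
```

In a real characterization campaign this computation is repeated per band, per detector, and per radiance level to map SNR against the specification across the dynamic range.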

  6. Overhead-Aware-Best-Fit (OABF) Resource Allocation Algorithm for Minimizing VM Launching Overhead

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Hao [IIT]; Garzoglio, Gabriele [Fermilab]; Ren, Shangping [IIT, Chicago]; Timm, Steven [Fermilab]; Noh, Seo Young [KISTI, Daejeon]

    2014-11-11

    FermiCloud is a private cloud developed at Fermi National Accelerator Laboratory to provide elastic, on-demand resources for different scientific research experiments. The design goal of FermiCloud is to automatically allocate resources for different scientific applications so that the QoS required by these applications is met and the operational cost of FermiCloud is minimized. Our earlier research shows that VM launching overhead has large variations. If such variations are not taken into consideration when making resource allocation decisions, they may lead to poor performance and wasted resources. In this paper, we show how a VM launching overhead reference model may be used to minimize VM launching overhead. In particular, we first present a training algorithm that automatically tunes a given reference model to accurately reflect the FermiCloud environment. Based on the tuned reference model for virtual machine launching overhead, we develop an overhead-aware-best-fit resource allocation algorithm that decides where and when to allocate resources so that the average virtual machine launching overhead is minimized. The experimental results indicate that the developed overhead-aware-best-fit resource allocation algorithm can significantly improve the VM launching time when a large number of VMs are launched simultaneously.
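A minimal sketch of the overhead-aware best-fit idea: among hosts with enough free capacity, place the VM on the host whose predicted launching overhead is smallest. The host records and the `predict_overhead` model below are hypothetical stand-ins for FermiCloud's tuned reference model, not the published algorithm.

```python
def predict_overhead(host, vm_size):
    """Toy reference model: predicted launch overhead (seconds) grows
    with host load and VM size. Constants are illustrative only."""
    load = (host["capacity"] - host["free"]) / host["capacity"]
    return host["base_overhead"] * (1.0 + 2.0 * load) * vm_size

def oabf_place(hosts, vm_size):
    """Overhead-aware best fit: among hosts that can fit the VM, pick
    the one minimizing predicted launch overhead; None if none fit."""
    candidates = [h for h in hosts if h["free"] >= vm_size]
    if not candidates:
        return None
    best = min(candidates, key=lambda h: predict_overhead(h, vm_size))
    best["free"] -= vm_size  # commit the allocation
    return best["name"]

# Hypothetical host inventory (capacities in VM slots, overheads in s)
hosts = [
    {"name": "fcl-01", "capacity": 16, "free": 4,  "base_overhead": 30.0},
    {"name": "fcl-02", "capacity": 16, "free": 12, "base_overhead": 45.0},
    {"name": "fcl-03", "capacity": 16, "free": 10, "base_overhead": 20.0},
]
choice = oabf_place(hosts, vm_size=4)
print(choice)  # → fcl-03 (lowest predicted overhead among fitting hosts)
```

Plain best fit would pick the tightest-fitting host; the overhead-aware variant instead ranks feasible hosts by the tuned overhead prediction.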

  7. Aero-Assisted Pre-Stage for Ballistic and Aero-Assisted Launch Vehicles

    Science.gov (United States)

    Ustinov, Eugene A.

    2012-01-01

    A concept of an aero-assisted pre-stage is proposed, which enables launch of both ballistic and aero-assisted launch vehicles from conventional runways. The pre-stage can be implemented as a delta wing with a suitable undercarriage, which is mated with the launch vehicle so that their flight directions are co-aligned. The ample wing area of the pre-stage, combined with the thrust of the launch vehicle, ensures prompt roll-out and take-off of the stack at airspeeds typical of a conventional jet airliner. The launch vehicle is separated from the pre-stage as soon as a safe altitude is achieved and the desired ascent trajectory is reached. Nominally, the pre-stage is non-powered. As an option, to save the propellant of the launch vehicle, the pre-stage may have its own short-burn propulsion system, in which case the propulsion system of the launch vehicle is activated at the separation point. A general non-dimensional analysis of the performance of the pre-stage from roll-out to separation is carried out, and applications to an existing ballistic launch vehicle and to hypothetical aero-assisted vehicles (spaceplanes) are considered.
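The claim that ample wing area plus launch-vehicle thrust yields airliner-typical take-off speeds can be checked with the standard lift equation, L = ½ρV²SC_L, solved for V at L = W. The stack mass and wing area below are hypothetical, not the paper's cases.

```python
import math

def takeoff_speed(mass_kg, wing_area_m2, cl_max, rho=1.225, g=9.81):
    """Speed at which lift equals weight: 0.5*rho*V^2*S*CL = m*g."""
    return math.sqrt(2.0 * mass_kg * g / (rho * wing_area_m2 * cl_max))

# Hypothetical stack: 500 t vehicle plus pre-stage on a 1000 m^2 delta wing
v = takeoff_speed(mass_kg=500_000, wing_area_m2=1000.0, cl_max=1.2)
print(f"take-off speed ≈ {v:.0f} m/s")  # ≈ 82 m/s, i.e. airliner-like
```

At roughly 82 m/s (about 160 kt), a stack of these assumed proportions would indeed rotate at speeds comparable to a loaded jet airliner, which is the point of the pre-stage's large wing area.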

  8. JPSS-1 VIIRS Pre-Launch Radiometric Performance

    Science.gov (United States)

    Oudrari, Hassan; Mcintire, Jeffrey; Xiong, Xiaoxiong; Butler, James; Ji, Qiang; Schwarting, Tom; Zeng, Jinan

    2015-01-01

    The first Joint Polar Satellite System (JPSS-1 or J1) mission is scheduled to launch in January 2017, and will be very similar to the Suomi National Polar-orbiting Partnership (SNPP) mission. The Visible Infrared Imaging Radiometer Suite (VIIRS) on board the J1 spacecraft completed its sensor-level performance testing in December 2014. The VIIRS instrument is expected to provide valuable information about the Earth environment and properties on a daily basis, using a wide-swath (3,040 km) cross-track scanning radiometer. The design covers the wavelength spectrum from reflective to long-wave infrared through 22 spectral bands, from 0.412 µm to 12.01 µm, and has spatial resolutions of 370 m and 740 m at nadir for imaging and moderate bands, respectively. This paper provides an overview of pre-launch J1 VIIRS performance testing and methodologies, describing the at-launch baseline radiometric performance as well as the metrics needed to calibrate the instrument once on orbit. Key sensor performance metrics include the sensor signal-to-noise ratios (SNRs), dynamic range, reflective and emissive band calibration performance, polarization sensitivity, band spectral performance, response-versus-scan (RVS), near-field response, and stray light rejection. A set of performance metrics generated during the pre-launch testing program is compared to the sensor requirements and to the SNPP VIIRS pre-launch performance.

  9. Flight Testing of the Space Launch System (SLS) Adaptive Augmenting Control (AAC) Algorithm on an F/A-18

    Science.gov (United States)

    Dennehy, Cornelius J.; VanZwieten, Tannen S.; Hanson, Curtis E.; Wall, John H.; Miller, Chris J.; Gilligan, Eric T.; Orr, Jeb S.

    2014-01-01

    The Marshall Space Flight Center (MSFC) Flight Mechanics and Analysis Division developed an adaptive augmenting control (AAC) algorithm for launch vehicles that improves robustness and performance on an as-needed basis by adapting a classical control algorithm to unexpected environments or variations in vehicle dynamics. This was baselined as part of the Space Launch System (SLS) flight control system. The NASA Engineering and Safety Center (NESC) was asked to partner with the SLS Program and the Space Technology Mission Directorate (STMD) Game Changing Development Program (GCDP) to flight test the AAC algorithm on a manned aircraft that can achieve a high level of dynamic similarity to a launch vehicle and raise the technology readiness of the algorithm early in the program. This document reports the outcome of the NESC assessment.
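The abstract does not give the adaptation law, but the core idea, raising the loop gain when tracking error persists and bleeding it off when high-frequency control activity suggests an instability risk, can be caricatured as a one-state update. The form and constants below are illustrative only, not the flight AAC algorithm.

```python
def aac_gain_step(k, err, hf_power, dt,
                  k_min=0.5, k_max=2.0, a=0.2, b=1.0, leak=0.05):
    """One Euler step of a simplified adaptive-gain law: grow the gain
    with squared tracking error, shrink it with high-frequency control
    power, and leak back toward the nominal gain k = 1. The structure
    and constants are a generic caricature, not the SLS flight law."""
    dk = a * err ** 2 - b * hf_power * k - leak * (k - 1.0)
    return min(max(k + dk * dt, k_min), k_max)

k = 1.0
for _ in range(100):  # sustained tracking error, no high-frequency activity
    k = aac_gain_step(k, err=0.5, hf_power=0.0, dt=0.1)
print(round(k, 2))    # gain has adapted upward from the nominal 1.0
```

The "as-needed" character in the abstract corresponds to the leak term: with small error and quiet control activity the gain settles back to the classical (nominal) design.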

  10. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen

    2015-01-01

    The engineering development of the new Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these spacecraft systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in specialized Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA), and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through Flight Software certification are an important focus of this development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. NASA formed a dedicated M&FM team to address fault management early in the development lifecycle for the SLS initiative. As part of the development of the M&FM capabilities, this team has developed a dedicated testbed that

  11. STS-105/Discovery/ISS 7A.1: Pre-Launch Activities, Launch, Orbit Activities and Landing

    Science.gov (United States)

    2001-01-01

    The crew of Space Shuttle Discovery on STS-105 is introduced at their pre-launch meal and at suit-up. The crew members include Commander Scott Horowitz, Pilot Rick Sturckow, and Mission Specialists Patrick Forrester and Daniel Barry, together with the Expedition 3 crew of the International Space Station (ISS). The Expedition 3 crew includes Commander Frank Culbertson, Soyuz Commander Vladimir Dezhurov, and Flight Engineer Mikhail Tyurin. When the astronauts depart for the launch pad in the Astrovan, their convoy is shown from above. Upon reaching the launch pad, they conduct a walk-around of the shuttle, display signs for family members while being inspected in the White Room, and are strapped into their seats onboard Discovery. The video includes footage of Discovery in the Orbiter Processing Facility, and some of the pre-launch procedures at the Launch Control Center are shown. The angles of launch replays include: TV-1, Beach Tracker, VAB, Pad A, Tower 1, UCS-15, Grandstand, OTV-70, Onboard, IGOR, and UCS-23. The moment of docking between Discovery and the ISS is shown from inside Discovery's cabin. While in orbit, the crew conducted extravehicular activities (EVAs) to attach an experiments container and install handrails on the Destiny module of the ISS. The video shows the docking and unloading of the Leonardo Multipurpose Logistics Module (MPLM) onto the ISS. The deployment of a satellite from Discovery with the coast of the Gulf of Mexico in the background is shown. Cape Canaveral is also shown from space. Landing replays include VAB, Tower 1, mid-field, South End SLF, North End SLF, Tower 2, Playalinda DOAMS, UCS-23, and Pilot Point of View (PPOV). NASA Administrator Dan Goldin meets the crew upon landing and participates in their walk-around of Discovery. The video concludes with a short speech by Commander Horowitz.

  12. Design and Flight Performance of the Orion Pre-Launch Navigation System

    Science.gov (United States)

    Zanetti, Renato

    2016-01-01

    Launched in December 2014 atop a Delta IV Heavy from the Kennedy Space Center, the Orion vehicle's Exploration Flight Test-1 (EFT-1) successfully completed the objective to test the prelaunch and entry components of the system. Orion's pre-launch absolute navigation design is presented, together with its EFT-1 performance.

  13. Design Optimization of Space Launch Vehicles Using a Genetic Algorithm

    National Research Council Canada - National Science Library

    Bayley, Douglas J

    2007-01-01

    .... A genetic algorithm (GA) was employed to optimize the design of the space launch vehicle. A cost model was incorporated into the optimization process with the goal of minimizing the overall vehicle cost...

  14. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large, complex systems engineering challenge, being addressed in part by focusing on the specific subsystems' handling of off-nominal mission and fault tolerance. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA has also formed an M&FM team to address fault management early in the development lifecycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes. 
Risk reduction is addressed by working with other organizations such as S

  15. STAR Algorithm Integration Team - Facilitating operational algorithm development

    Science.gov (United States)

    Mikles, V. J.

    2015-12-01

    The NOAA/NESDIS Center for Satellite Research and Applications (STAR) provides technical support of the Joint Polar Satellite System (JPSS) algorithm development and integration tasks. Utilizing data from the S-NPP satellite, JPSS generates over thirty Environmental Data Records (EDRs) and Intermediate Products (IPs) spanning atmospheric, ocean, cryosphere, and land weather disciplines. The Algorithm Integration Team (AIT) brings technical expertise and support to product algorithms, specifically in testing and validating science algorithms in a pre-operational environment. The AIT verifies that new and updated algorithms function in the development environment, enforces established software development standards, and ensures that delivered packages are functional and complete. AIT facilitates the development of new JPSS-1 algorithms by implementing a review approach based on the Enterprise Product Lifecycle (EPL) process. Building on relationships established during the S-NPP algorithm development process and coordinating directly with science algorithm developers, the AIT has implemented structured reviews with self-contained document suites. The process has supported algorithm improvements for products such as ozone, active fire, vegetation index, and temperature and moisture profiles.

  16. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASAs Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM

  17. Evaluation of focused ultrasound algorithms: Issues for reducing pre-focal heating and treatment time.

    Science.gov (United States)

    Yiannakou, Marinos; Trimikliniotis, Michael; Yiallouras, Christos; Damianou, Christakis

    2016-02-01

    Due to heating in the pre-focal field, the delay between successive movements in high-intensity focused ultrasound (HIFU) is sometimes as long as 60 s, resulting in treatment times on the order of 2-3 h. Because there is generally a requirement to reduce treatment time, we were motivated to explore alternative transducer motion algorithms in order to reduce pre-focal heating and treatment time. A 1 MHz single-element transducer with 4 cm diameter and 10 cm focal length was used. A simulation model was developed that estimates the temperature, thermal dose, and lesion development in the pre-focal field. The simulated temperature history, combined with the motion algorithms, produced thermal maps in the pre-focal region. A polyacrylamide gel phantom was used to evaluate the induced pre-focal heating for each motion algorithm and to assess the accuracy of the simulation model. Three of the six algorithms, having successive steps close to each other, exhibited severe heating in the pre-focal field. Minimal heating was produced by the algorithms having successive steps far apart from each other (square, square spiral, and random). The last three algorithms were improved further (with a small cost in time), thus eliminating the pre-focal heating completely and substantially reducing the treatment time compared to traditional algorithms. Of the six algorithms, three were successful in eliminating pre-focal heating completely. Because these three algorithms required no delay between successive movements (except in the last part of the motion), the treatment time was reduced by 93%. Therefore, it will be possible in the future to achieve treatment times for focused ultrasound therapies shorter than 30 min. The rate of ablated volume achieved with one of the proposed algorithms was 71 cm³/h. The intention of this pilot study was to demonstrate that the navigation algorithms play the most important role in reducing pre-focal heating. 
By evaluating in
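The distinction between "steps close together" and "steps apart" motion algorithms can be illustrated with a toy ordering of grid points: a greedy rule that always jumps to the farthest remaining point yields a much larger mean distance between successive sonications than a raster scan, spreading the pre-focal heat load. This is a hypothetical stand-in for the paper's square, square-spiral, and random algorithms, not their implementation.

```python
import math
import random

def raster(points):
    """Adjacent-step scan: visits neighbouring sites in sorted order."""
    return sorted(points)

def spread_order(points, seed=0):
    """Greedy 'steps apart' ordering: always jump to the farthest
    remaining point (illustrative stand-in for square/spiral/random)."""
    rng = random.Random(seed)
    remaining = list(points)
    current = remaining.pop(rng.randrange(len(remaining)))
    order = [current]
    while remaining:
        current = max(remaining, key=lambda p: math.dist(p, current))
        remaining.remove(current)
        order.append(current)
    return order

def mean_step(order):
    """Mean distance between successive sonication sites."""
    return sum(math.dist(a, b) for a, b in zip(order, order[1:])) / (len(order) - 1)

grid = [(x, y) for x in range(4) for y in range(4)]  # 4x4 sonication grid
print(mean_step(raster(grid)), mean_step(spread_order(grid)))
```

Larger successive steps give each pre-focal region time to cool between exposures, which is why the "steps apart" family needed little or no inter-movement delay.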

  18. Design optimization of space launch vehicles using a genetic algorithm

    Science.gov (United States)

    Bayley, Douglas James

    The United States Air Force (USAF) continues to have a need for assured access to space. In addition to flexible and responsive spacelift, a reduction in the cost per launch of space launch vehicles is also desirable. For this purpose, an investigation of the design optimization of space launch vehicles has been conducted. Using a suite of custom codes, the performance aspects of an entire space launch vehicle were analyzed. A genetic algorithm (GA) was employed to optimize the design of the space launch vehicle. A cost model was incorporated into the optimization process with the goal of minimizing the overall vehicle cost. The other goals of the design optimization included obtaining the proper altitude and velocity to achieve a low-Earth orbit. Specific mission parameters that are particular to USAF space endeavors were specified at the start of the design optimization process. Solid propellant motors, liquid fueled rockets, and air-launched systems in various configurations provided the propulsion systems for two, three and four-stage launch vehicles. Mass properties models, an aerodynamics model, and a six-degree-of-freedom (6DOF) flight dynamics simulator were all used to model the system. The results show the feasibility of this method in designing launch vehicles that meet mission requirements. Comparisons to existing real world systems provide the validation for the physical system models. However, the ability to obtain a truly minimized cost was elusive. The cost model uses an industry standard approach, however, validation of this portion of the model was challenging due to the proprietary nature of cost figures and due to the dependence of many existing systems on surplus hardware.
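A hedged sketch of the approach: a genetic algorithm minimizing vehicle cost subject to reaching orbit, here with propellant mass as a cost proxy, the ideal rocket equation standing in for the 6DOF simulator, and a delta-v penalty standing in for the full mission constraints. All constants (Isp, dry mass fraction, payload, delta-v target) are hypothetical, not the dissertation's models.

```python
import math
import random

G0, ISP = 9.81, 320.0           # m/s^2; vacuum Isp in s (hypothetical)
M_PAY, DRY_FRAC = 1000.0, 0.08  # kg payload; stage dry-mass fraction
DV_TARGET = 9300.0              # m/s, hypothetical LEO requirement

def delta_v(mp1, mp2):
    """Ideal two-stage delta-v (rocket equation) for propellant loads mp1, mp2."""
    md1, md2 = DRY_FRAC * mp1, DRY_FRAC * mp2
    m0 = M_PAY + mp1 + md1 + mp2 + md2   # liftoff mass
    m1 = m0 - mp1                        # after stage-1 burnout
    m2s = M_PAY + mp2 + md2              # after staging
    return G0 * ISP * (math.log(m0 / m1) + math.log(m2s / (m2s - mp2)))

def cost(x):
    """Propellant mass as a cost proxy, heavily penalized if orbit is missed."""
    mp1, mp2 = x
    return mp1 + mp2 + 1e3 * max(0.0, DV_TARGET - delta_v(mp1, mp2))

def ga(pop_size=60, gens=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(1e3, 2e5), rng.uniform(1e3, 2e5)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        pop = pop[: pop_size // 2]       # elitist truncation selection
        while len(pop) < pop_size:       # averaging crossover + mutation
            a, b = rng.sample(pop[: pop_size // 2], 2)
            pop.append([max(1e3, (ai + bi) / 2 * rng.uniform(0.9, 1.1))
                        for ai, bi in zip(a, b)])
    return min(pop, key=cost)

best = ga()
print(best, delta_v(*best))
```

The penalty formulation mirrors how a GA typically folds the orbit-insertion constraint into a single cost to be minimized; the real study evaluated candidates with mass-properties, aerodynamics, and 6DOF trajectory models instead of the rocket equation.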

  19. Preliminary application of a novel algorithm to monitor changes in pre-flight total peripheral resistance for prediction of post-flight orthostatic intolerance in astronauts

    Science.gov (United States)

    Arai, Tatsuya; Lee, Kichang; Stenger, Michael B.; Platts, Steven H.; Meck, Janice V.; Cohen, Richard J.

    2011-04-01

    Orthostatic intolerance (OI) is a significant challenge for astronauts after long-duration spaceflight. Depending on flight duration, 20-80% of astronauts suffer from post-flight OI, which is associated with reduced vascular resistance. This paper introduces a novel algorithm for continuously monitoring changes in total peripheral resistance (TPR) by processing the peripheral arterial blood pressure (ABP). To validate, we applied our novel mathematical algorithm to the pre-flight ABP data previously recorded from twelve astronauts ten days before launch. The TPR changes were calculated by our algorithm and compared with the TPR value estimated using cardiac output/heart rate before and after phenylephrine administration. The astronauts in the post-flight presyncopal group had lower pre-flight TPR changes (1.66 times) than those in the non-presyncopal group (2.15 times). The trend in TPR changes calculated with our algorithm agreed with the TPR trend calculated using measured cardiac output in the previous study. Further data collection and algorithm refinement are needed for pre-flight detection of OI and monitoring of continuous TPR by analysis of peripheral arterial blood pressure.
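The reference TPR estimate that the new algorithm is compared against follows from the basic hemodynamic relation TPR ≈ MAP / CO, with cardiac output CO = stroke volume × heart rate and central venous pressure neglected. A minimal sketch with hypothetical values (the 1.66× and 2.15× figures in the abstract are fold changes of this kind):

```python
def tpr(mean_abp_mmhg, cardiac_output_l_min):
    """Total peripheral resistance = mean arterial pressure / cardiac
    output, in mmHg·min/L (central venous pressure neglected)."""
    return mean_abp_mmhg / cardiac_output_l_min

def tpr_change(map_pre, co_pre, map_post, co_post):
    """Fold change in TPR across an intervention (e.g. phenylephrine)."""
    return tpr(map_post, co_post) / tpr(map_pre, co_pre)

# Hypothetical numbers: phenylephrine raises MAP while the baroreflex lowers CO
print(round(tpr_change(90.0, 6.0, 110.0, 4.5), 2))  # → 1.63
```

The paper's contribution is estimating this change continuously from the peripheral ABP waveform alone, without a cardiac output measurement; the ratio above is only the reference it is validated against.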

  20. An algorithm on simultaneous optimization of performance and mass parameters of open-cycle liquid-propellant engine of launch vehicles

    Science.gov (United States)

    Eskandari, M. A.; Mazraeshahi, H. K.; Ramesh, D.; Montazer, E.; Salami, E.; Romli, F. I.

    2017-12-01

    In this paper, a new method for the determination of optimum parameters of an open-cycle liquid-propellant engine of launch vehicles is introduced. The parameters affecting the objective function, which is the ratio of specific impulse to gross mass of the launch vehicle, are chosen to achieve maximum specific impulse as well as minimum mass for the structure of the engine, tanks, etc. The proposed algorithm uses constant integration of thrust with respect to time for a launch vehicle with specific diameter and length to calculate the optimum working condition. The results of this novel algorithm are compared to those obtained using the Genetic Algorithm method, and they are also validated against the results of an existing launch vehicle.

  1. Pre-Launch Assessment of User Needs for SWOT Mission Data Products

    Science.gov (United States)

    Srinivasan, M. M.; Peterson, C. A.; Doorn, B.

    2015-12-01

    In order to effectively address the applications requirements of future Surface Water and Ocean Topography (SWOT) mission data users, we must understand their needs with respect to latency, spatial scales, technical capabilities, and other practical considerations. We have developed the 1st SWOT User Survey for broad distribution to the SWOT applications community to provide the SWOT Project with an understanding of, and improved ability to support, users' needs. Actionable knowledge for specific applications may be realized when we can determine the margins of user requirements for data products and access. The SWOT Applications team will be launching a SWOT Early Adopters program and is interested in identifying a broad community of users who will participate in pre-launch applications activities including meetings, briefings, and workshops. The SWOT applications program is designed to connect mission scientists to end users and leverage the scientific research and data management tools with operational decision-making for different thematic users and data requirements. SWOT is scheduled to launch in 2020, so simulated hydrology and ocean data sets have been, and will continue to be, developed by science team members and the SWOT Project in order to determine how the data will represent the physical Earth systems targeted by the mission. SWOT will produce the first global survey of Earth's surface water by measuring sea surface height and the heights, slopes, and inundated areas of rivers, lakes, and wetlands. These coastal, lake, and river measurements will be used for monitoring the hydrologic cycle, flooding, and climate impacts of a changing environment. The oceanographic measurements will enhance understanding of submesoscale processes and extend the capabilities of ocean state and climate prediction models.

  2. Pre-Launch GOES-R Risk Reduction Activities for the Geostationary Lightning Mapper

    Science.gov (United States)

    Goodman, S. J.; Blakeslee, R. J.; Boccippio, D. J.; Christian, H. J.; Koshak, W. J.; Petersen, W. A.

    2005-01-01

    The GOES-R Geostationary Lightning Mapper (GLM) is a new instrument planned for GOES-R that will greatly improve storm hazard nowcasting and increase warning lead time day and night. Daytime detection of lightning is a particularly significant technological advance given that the solar-illuminated cloud-top signal can exceed the intensity of the lightning signal by a factor of one hundred. Our approach is detailed across three broad themes: Data Processing Algorithm Readiness, Forecast Applications, and Radiance Data Mining. These themes address how the data will be processed and distributed, and the algorithms and models for developing, producing, and using the data products. These pre-launch risk reduction activities will accelerate the operational and research use of the GLM data once GOES-R begins on-orbit operations. The GLM will provide unprecedented capabilities for tracking thunderstorms and earlier warning of impending severe and hazardous weather threats. By providing direct information on lightning initiation, propagation, extent, and rate, the GLM will also capture the updraft dynamics and life cycle of convective storms, as well as internal ice precipitation processes. The GLM provides information directly from the heart of the thunderstorm, as opposed to cloud-top only. Nowcasting applications enabled by the GLM data will expedite the warning and response time of emergency management systems, improve the dispatch of electric power utility repair crews, and improve airline routing around thunderstorms, thereby improving safety and efficiency, saving fuel, and reducing delays. The use of GLM data will assist the Bureau of Land Management (BLM) and the Forest Service in quickly detecting lightning ground strikes that have a high probability of causing fires. Finally, GLM data will help assess the role of thunderstorms and deep convection in global climate, and will improve regional air quality and global chemistry/climate modeling.

  3. STS-93 crew gathers for pre-launch breakfast in O&C Building

    Science.gov (United States)

    1999-01-01

    The STS-93 crew gathers a second time for a pre-launch breakfast in the Operations and Checkout Building before suiting up for launch. After Space Shuttle Columbia's July 20 launch attempt was scrubbed at the T-7 second mark in the countdown, the launch was rescheduled for Thursday, July 22, at 12:28 a.m. EDT. Seated from left are Mission Specialists Michel Tognini, of France, who represents the Centre National d'Etudes Spatiales (CNES), and Steven A. Hawley (Ph.D.), Commander Eileen M. Collins, Pilot Jeffrey S. Ashby, and Mission Specialist Catherine G. Coleman (Ph.D.). STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The new telescope is 20 to 50 times more sensitive than any previous X-ray telescope and is expected to unlock the secrets of supernovae, quasars and black holes. Collins is the first woman to serve as commander of a Shuttle mission. The target landing date is July 26, 1999, at 11:24 p.m. EDT.

  4. On the performance of pre-microRNA detection algorithms

    DEFF Research Database (Denmark)

    Saçar Demirci, Müşerref Duygu; Baumbach, Jan; Allmer, Jens

    2017-01-01

    We assess 13 ab initio pre-miRNA detection approaches using all relevant, published, and novel data sets while judging algorithm performance based on ten intrinsic performance measures. We present an extensible framework, izMiR, which allows for the unbiased comparison of existing algorithms, adding new...

  5. Closed Loop Guidance Trade Study for Space Launch System Block-1B Vehicle

    Science.gov (United States)

    Von der Porten, Paul; Ahmad, Naeem; Hawkins, Matt

    2018-01-01

    NASA is currently building the Space Launch System (SLS) Block-1 launch vehicle for the Exploration Mission 1 (EM-1) test flight. The design of the next evolution of SLS, Block-1B, is well underway. The Block-1B vehicle is more capable overall than Block-1; however, the relatively low thrust-to-weight ratio of the Exploration Upper Stage (EUS) presents a challenge to the Powered Explicit Guidance (PEG) algorithm used by Block-1. To handle the long burn durations (on the order of 1000 seconds) of EUS missions, two algorithms were examined. An alternative algorithm, OPGUID, was introduced, while modifications were made to PEG. A trade study was conducted to select the guidance algorithm for future SLS vehicles. The chosen algorithm needs to support a wide variety of mission operations: ascent burns to LEO, apogee raise burns, trans-lunar injection burns, hyperbolic Earth departure burns, and contingency disposal burns using the Reaction Control System (RCS). Additionally, the algorithm must be able to respond to a single engine failure scenario. Each algorithm was scored based on pre-selected criteria, including insertion accuracy, algorithmic complexity and robustness, extensibility for potential future missions, and flight heritage. Monte Carlo analysis was used to select the final algorithm. This paper covers the design criteria, approach, and results of this trade study, showing impacts and considerations when adapting launch vehicle guidance algorithms to a broader breadth of in-space operations.
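
    The selection process described (pre-selected, weighted criteria applied to each candidate guidance algorithm) can be sketched as a weighted sum. The criteria names follow the abstract, but the weights and per-candidate scores below are invented for illustration; the study's actual values and its Monte Carlo dispersion analysis are not reproduced here.

```python
# Criteria follow the abstract; weights and scores are illustrative only.
WEIGHTS = {
    "insertion_accuracy": 0.40,
    "complexity_and_robustness": 0.25,
    "extensibility": 0.20,
    "flight_heritage": 0.15,
}

def score(candidate):
    """Weighted sum of per-criterion scores (each on a 0-10 scale)."""
    return sum(WEIGHTS[c] * candidate[c] for c in WEIGHTS)

candidates = {
    "candidate_A": {"insertion_accuracy": 9, "complexity_and_robustness": 8,
                    "extensibility": 7, "flight_heritage": 9},
    "candidate_B": {"insertion_accuracy": 9, "complexity_and_robustness": 7,
                    "extensibility": 8, "flight_heritage": 5},
}

best = max(candidates, key=lambda name: score(candidates[name]))
```

    In the actual trade study each criterion's score would itself be backed by analysis (e.g. Monte Carlo insertion-accuracy dispersions) rather than a single assigned number.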

  6. Application of Space Environmental Observations to Spacecraft Pre-Launch Engineering and Spacecraft Operations

    Science.gov (United States)

    Barth, Janet L.; Xapsos, Michael

    2008-01-01

    This presentation focuses on the effects of the space environment on spacecraft systems and applying this knowledge to spacecraft pre-launch engineering and operations. Particle radiation, neutral gas particles, ultraviolet and x-rays, as well as micrometeoroids and orbital debris in the space environment have various effects on spacecraft systems, including degradation of microelectronic and optical components, physical damage, orbital decay, biasing of instrument readings, and system shutdowns. Space climate and weather must be considered during the mission life cycle (mission concept, mission planning, systems design, and launch and operations) to minimize and manage risk to both the spacecraft and its systems. A space environment model for use in the mission life cycle is presented.

  7. Genetic Algorithms for Development of New Financial Products

    Directory of Open Access Journals (Sweden)

    Eder Oliveira Abensur

    2007-06-01

    Full Text Available New Product Development (NPD) is recognized as a fundamental activity that has a relevant impact on the performance of companies. Despite the relevance of the financial market, there is a lack of work on new financial product development. The aim of this research is to propose the use of Genetic Algorithms (GA) as an alternative procedure for evaluating the most favorable combination of variables for the product launch. The paper focuses on: (i) determining the essential variables of the financial product studied (investment fund); (ii) determining how to evaluate the success of a new investment fund launch; and (iii) how GA can be applied to the financial product development problem. The proposed framework was tested using 4 years of real data from the Brazilian financial market, and the results suggest that this is an innovative development methodology, useful for designing complex financial products with many attributes.

  8. Algorithm integration using ADL (Algorithm Development Library) for improving CrIMSS EDR science product quality

    Science.gov (United States)

    Das, B.; Wilson, M.; Divakarla, M. G.; Chen, W.; Barnet, C.; Wolf, W.

    2013-05-01

    Algorithm Development Library (ADL) is a framework that mimics the operational system IDPS (Interface Data Processing Segment) that is currently being used to process data from instruments aboard the Suomi National Polar-orbiting Partnership (S-NPP) satellite. The satellite was launched successfully in October 2011. The Cross-track Infrared and Microwave Sounder Suite (CrIMSS) consists of the Advanced Technology Microwave Sounder (ATMS) and Cross-track Infrared Sounder (CrIS) instruments on board S-NPP. These instruments will also be on board JPSS (Joint Polar Satellite System), which will be launched in early 2017. The primary products of the CrIMSS Environmental Data Record (EDR) include global atmospheric vertical temperature, moisture, and pressure profiles (AVTP, AVMP and AVPP) and the Ozone IP (Intermediate Product from CrIS radiances). Several algorithm updates have recently been proposed by CrIMSS scientists that include fixes to the handling of forward modeling errors, a more conservative identification of clear scenes, indexing corrections for daytime products, and relaxed constraints between surface temperature and air temperature for daytime land scenes. We have integrated these improvements into the ADL framework. This work compares the results from ADL emulation of the future IDPS system incorporating all the suggested algorithm updates with the current official processing results by qualitative and quantitative evaluations. The results prove these algorithm updates improve science product quality.

  9. Development of radio frequency interference detection algorithms for passive microwave remote sensing

    Science.gov (United States)

    Misra, Sidharth

    Radio Frequency Interference (RFI) signals are man-made sources that are increasingly plaguing passive microwave remote sensing measurements. RFI is of insidious nature, with some signals weak enough to go undetected but strong enough to impact science measurements and their results. With the launch of the European Space Agency (ESA) Soil Moisture and Ocean Salinity (SMOS) satellite in November 2009 and the upcoming launches of the new NASA sea-surface salinity measuring Aquarius mission in June 2011 and the soil-moisture measuring Soil Moisture Active Passive (SMAP) mission around 2015, active steps are being taken to detect and mitigate RFI at L-band. An RFI detection algorithm was designed for the Aquarius mission. The algorithm performance was analyzed using kurtosis-based RFI ground-truth. The algorithm has been developed with several adjustable location-dependent parameters to control the detection statistics (false-alarm rate and probability of detection). The kurtosis statistical detection algorithm has been compared with the Aquarius pulse detection method. The comparative study determines the feasibility of the kurtosis detector for the SMAP radiometer as a primary RFI detection algorithm, in terms of detectability and data bandwidth. The kurtosis algorithm has superior detection capabilities for low duty-cycle radar-like pulses, which are more prevalent according to analysis of field campaign data. Most RFI algorithms developed have generally been optimized for performance with individual pulsed-sinusoidal RFI sources. A new RFI detection model is developed that takes into account multiple RFI sources within an antenna footprint. The performance of the kurtosis detection algorithm under such central-limit conditions is evaluated. The SMOS mission has a unique hardware system, and conventional RFI detection techniques cannot be applied. Instead, an RFI detection algorithm for SMOS is developed and applied in the angular domain. This algorithm compares
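
    The kurtosis test exploits the fact that thermal noise is Gaussian (kurtosis 3), while pulsed RFI drives the kurtosis above 3 and continuous-wave RFI drives it below. A minimal block-wise detector in that spirit follows; the block size, threshold, and synthetic data are invented for the example and do not reproduce the Aquarius implementation.

```python
import random

def kurtosis(samples):
    """Sample kurtosis m4/m2^2; equals 3 for Gaussian noise."""
    n = len(samples)
    mu = sum(samples) / n
    m2 = sum((x - mu) ** 2 for x in samples) / n
    m4 = sum((x - mu) ** 4 for x in samples) / n
    return m4 / (m2 * m2)

def flag_rfi(samples, block=1000, threshold=0.8):
    """Flag each block whose kurtosis deviates from the Gaussian value 3."""
    return [abs(kurtosis(samples[i:i + block]) - 3.0) > threshold
            for i in range(0, len(samples) - block + 1, block)]

random.seed(1)
clean = [random.gauss(0.0, 1.0) for _ in range(1000)]
pulsed = clean[:]
for i in range(0, 1000, 100):   # add low duty-cycle, radar-like pulses
    pulsed[i] += 8.0
```

    A few strong pulses raise the block kurtosis far above 3, which is why the abstract notes the detector's strength against low duty-cycle radar-like pulses.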

  10. Reproducible cancer biomarker discovery in SELDI-TOF MS using different pre-processing algorithms.

    Directory of Open Access Journals (Sweden)

    Jinfeng Zou

    BACKGROUND: There has been much interest in differentiating diseased and normal samples using biomarkers derived from mass spectrometry (MS) studies. However, biomarker identification for specific diseases has been hindered by irreproducibility. Specifically, a peak profile extracted from a dataset for biomarker identification depends on a data pre-processing algorithm. Until now, no widely accepted agreement has been reached. RESULTS: In this paper, we investigated the consistency of biomarker identification using differentially expressed (DE) peaks from peak profiles produced by three widely used average-spectrum-dependent pre-processing algorithms based on SELDI-TOF MS data for prostate and breast cancers. Our results revealed two important factors that affect the consistency of DE peak identification using different algorithms. One factor is that some DE peaks selected from one peak profile were not detected as peaks in other profiles, and the second factor is that the statistical power of identifying DE peaks in large peak profiles with many peaks may be low due to the large scale of the tests and the small number of samples. Furthermore, we demonstrated that the DE peak detection power in large profiles could be improved by the stratified false discovery rate (FDR) control approach and that the reproducibility of DE peak detection could thereby be increased. CONCLUSIONS: Comparing and evaluating pre-processing algorithms in terms of reproducibility can elucidate the relationship among different algorithms and also help in selecting a pre-processing algorithm. The DE peaks selected from small peak profiles with few peaks for a dataset tend to be reproducibly detected in large peak profiles, which suggests that a suitable pre-processing algorithm should be able to produce peaks sufficient for identifying useful and reproducible biomarkers.
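
    The stratified FDR idea can be illustrated with the Benjamini-Hochberg (BH) procedure applied within each stratum of peaks rather than to the whole profile at once. The sketch below is generic; the paper's actual stratification of SELDI-TOF peaks is not reproduced here.

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean list marking p-values rejected at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank                      # largest rank passing the BH line
    rejected = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k:
            rejected[i] = True
    return rejected

def stratified_fdr(pvals, strata, alpha=0.05):
    """Apply BH separately within each stratum (e.g. peak-intensity bands),
    which can recover power lost to the sheer scale of a large peak profile."""
    rejected = [False] * len(pvals)
    for s in set(strata):
        idx = [i for i, g in enumerate(strata) if g == s]
        for i, r in zip(idx, benjamini_hochberg([pvals[i] for i in idx], alpha)):
            rejected[i] = r
    return rejected
```

    Here a p-value of 0.03 survives BH inside its own small stratum but is lost when pooled with nine uninformative tests, which is exactly the power effect the abstract describes.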

  11. Space Launch System Development Status

    Science.gov (United States)

    Lyles, Garry

    2014-01-01

    Development of NASA's Space Launch System (SLS) heavy lift rocket is shifting from the formulation phase into the implementation phase in 2014, a little more than three years after formal program approval. Current development is focused on delivering a vehicle capable of launching 70 metric tons (t) into low Earth orbit. This "Block 1" configuration will launch the Orion Multi-Purpose Crew Vehicle (MPCV) on its first autonomous flight beyond the Moon and back in December 2017, followed by its first crewed flight in 2021. SLS can evolve to a 130-t lift capability and serve as a baseline for numerous robotic and human missions ranging from a Mars sample return to delivering the first astronauts to explore another planet. Benefits associated with its unprecedented mass and volume include reduced trip times and simplified payload design. Every SLS element achieved significant, tangible progress over the past year. Among the Program's many accomplishments are: manufacture of Core Stage test panels; testing of Solid Rocket Booster development hardware including thrust vector controls and avionics; planning for testing the RS-25 Core Stage engine; and more than 4,000 wind tunnel runs to refine vehicle configuration, trajectory, and guidance. The Program shipped its first flight hardware - the Multi-Purpose Crew Vehicle Stage Adapter (MSA) - to the United Launch Alliance for integration with the Delta IV heavy rocket that will launch an Orion test article in 2014 from NASA's Kennedy Space Center. Objectives of this Earth-orbit flight include validating the performance of Orion's heat shield and the MSA design, which will be manufactured again for SLS missions to deep space. The Program successfully completed Preliminary Design Review in 2013 and Key Decision Point C in early 2014. NASA has authorized the Program to move forward to Critical Design Review, scheduled for 2015 and a December 2017 first launch. The Program's success to date is due to prudent use of proven

  12. Close coupling of pre- and post-processing vision stations using inexact algorithms

    Science.gov (United States)

    Shih, Chi-Hsien V.; Sherkat, Nasser; Thomas, Peter D.

    1996-02-01

    Work has been reported using lasers to cut deformable materials. Although the use of a laser reduces material deformation, distortion due to mechanical feed misalignment persists. Changes in the lace pattern are also caused by the release of tension in the lace structure as it is cut. To tackle the problem of distortion due to material flexibility, the 2VMethod, together with the Piecewise Error Compensation Algorithm incorporating inexact algorithms (fuzzy logic, neural networks, and neural fuzzy techniques), is developed. A spring-mounted pen is used to emulate the distortion of the lace pattern caused by tactile cutting and feed misalignment. Using pre- and post-processing vision systems, it is possible to monitor the scalloping process and generate on-line information for the artificial intelligence engines. This overcomes the problems of lace distortion due to the trimming process. Applying the algorithms developed, the system can produce excellent results, much better than a human operator.

  13. Benefits of Government Incentives for Reusable Launch Vehicle Development

    Science.gov (United States)

    Shaw, Eric J.; Hamaker, Joseph W.; Prince, Frank A.

    1998-01-01

    Many exciting new opportunities in space, both government missions and business ventures, could be realized by a reduction in launch prices. Reusable launch vehicle (RLV) designs have the potential to lower launch costs dramatically from those of today's expendable and partially-expendable vehicles. Unfortunately, governments must budget to support existing launch capability, and so lack the resources necessary to completely fund development of new reusable systems. In addition, the new commercial space markets are too immature and uncertain to motivate the launch industry to undertake a project of this magnitude and risk. Low-cost launch vehicles will not be developed without a mature market to service; however, launch prices must be reduced in order for a commercial launch market to mature. This paper estimates and discusses the various benefits that may be reaped from government incentives for a commercial reusable launch vehicle program.

  15. Popular NREL-Developed Transportation Mobile App Launches on Android

    Science.gov (United States)

    May 23, 2017. The new Android version of the Alternative Fueling Station Locator App launched last week. The U.S

  16. A Study on Fuel Estimation Algorithms for a Geostationary Communication & Broadcasting Satellite

    Directory of Open Access Journals (Sweden)

    Jong Won Eun

    2000-12-01

    An algorithm has been developed to calculate the fuel budget for a geostationary communication and broadcasting satellite. It is quite essential that the pre-launch fuel budget estimation account for the deterministic transfer and drift orbit maneuver requirements. After on-station arrival, the calculation of satellite lifetime should be based on the estimation of remaining fuel and assessment of actual performance. These estimations stem from proper algorithms that produce the prediction of satellite lifetime. This paper concentrates on the fuel estimation method that was studied for calculation of the propellant budget by using the given algorithms. Applications of this method are discussed for a communication and broadcasting satellite.
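
    The deterministic part of such a fuel budget rests on the rocket equation: each maneuver's delta-v converts to a propellant mass, and the remaining propellant bounds the years of station-keeping. The sketch below illustrates that accounting with invented numbers; it is not the paper's algorithm or data.

```python
import math

G0 = 9.81  # m/s^2

def propellant_for_dv(wet_mass, dv, isp):
    """Propellant consumed by a maneuver of size dv (m/s), from the
    rocket equation, for a thruster of specific impulse isp (s)."""
    return wet_mass * (1.0 - math.exp(-dv / (isp * G0)))

def lifetime_years(fuel_kg, dry_kg, isp, dv_per_year):
    """Whole years of annual station-keeping delta-v supportable by the
    fuel remaining after transfer and drift-orbit maneuvers."""
    mass, years = dry_kg + fuel_kg, 0
    while True:
        dm = propellant_for_dv(mass, dv_per_year, isp)
        if mass - dm < dry_kg:      # next year's burn would exceed the fuel
            return years
        mass -= dm
        years += 1
```

    For example, with 300 kg of fuel remaining on a 1500 kg dry satellite, a 290 s thruster, and roughly 50 m/s of station-keeping delta-v per year, this model gives about a decade of remaining life.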

  17. MammaPrint Pre-screen Algorithm (MPA) reduces chemotherapy in ...

    African Journals Online (AJOL)

    MammaPrint Pre-screen Algorithm (MPA) reduces chemotherapy in patients with early-stage breast cancer. ... An implementation study was designed to take advantage of the fact that the 70-gene profile excludes analysis of hormone receptor and human epidermal growth factor receptor 2 (HER2) status, which form part of ...

  18. Planck pre-launch status: The optical system

    DEFF Research Database (Denmark)

    Tauber, J. A.; Nørgaard-Nielsen, Hans Ulrik; Ade, P. A. R.

    2010-01-01

    Planck is a scientific satellite that represents the next milestone in space-based research related to the cosmic microwave background, and in many other astrophysical fields. Planck was launched on 14 May 2009 and is now operational. The uncertainty in the optical response of its detectors ... based on the knowledge available at the time of launch. We also briefly describe the impact of the major systematic effects of optical origin, and the concept of in-flight optical calibration. Detailed discussions of related areas are provided in accompanying papers.

  19. The Application of the NASA Advanced Concepts Office, Launch Vehicle Team Design Process and Tools for Modeling Small Responsive Launch Vehicles

    Science.gov (United States)

    Threet, Grady E.; Waters, Eric D.; Creech, Dennis M.

    2012-01-01

    The Advanced Concepts Office (ACO) Launch Vehicle Team at the NASA Marshall Space Flight Center (MSFC) is recognized throughout NASA for launch vehicle conceptual definition and pre-phase A concept design evaluation. The Launch Vehicle Team has been instrumental in defining the vehicle trade space for many of NASA's high-level launch system studies, from the Exploration Systems Architecture Study (ESAS) through the Augustine Report, Constellation, and now the Space Launch System (SLS). The Launch Vehicle Team's approach to rapid turn-around and comparative analysis of multiple launch vehicle architectures has played a large role in narrowing the design options for future vehicle development. Recently the Launch Vehicle Team has been developing versions of their vetted tools used on large launch vehicles and repackaging the process and capability to apply to smaller, more responsive launch vehicles. Along this development path, the LV Team has evaluated trajectory tools and assumptions against sounding rocket trajectories and air-launch systems, begun altering subsystem mass estimating relationships to handle smaller vehicle components, and, as an additional development driver, begun an in-house small launch vehicle study. With the recent interest in small responsive launch systems and the known capability and response time of the ACO LV Team, ACO's launch vehicle assessment capability can be utilized to rapidly evaluate the vast and opportune trade space that small launch vehicles currently encompass. This would provide a great benefit to the customer in order to reduce that large trade space to a select few alternatives that best fit the customer's payload needs.

  20. NASA's Space Launch System Development Status

    Science.gov (United States)

    Lyles, Garry

    2014-01-01

    Development of the National Aeronautics and Space Administration's (NASA's) Space Launch System (SLS) heavy lift rocket is shifting from the formulation phase into the implementation phase in 2014, a little more than 3 years after formal program establishment. Current development is focused on delivering a vehicle capable of launching 70 metric tons (t) into low Earth orbit. This "Block 1" configuration will launch the Orion Multi-Purpose Crew Vehicle (MPCV) on its first autonomous flight beyond the Moon and back in December 2017, followed by its first crewed flight in 2021. SLS can evolve to a 130-t lift capability and serve as a baseline for numerous robotic and human missions ranging from a Mars sample return to delivering the first astronauts to explore another planet. Benefits associated with its unprecedented mass and volume include reduced trip times and simplified payload design. Every SLS element achieved significant, tangible progress over the past year. Among the Program's many accomplishments are: manufacture of core stage test barrels and domes; testing of Solid Rocket Booster development hardware including thrust vector controls and avionics; planning for RS-25 core stage engine testing; and more than 4,000 wind tunnel runs to refine vehicle configuration, trajectory, and guidance. The Program shipped its first flight hardware - the Multi-Purpose Crew Vehicle Stage Adapter (MSA) - to the United Launch Alliance for integration with the Delta IV heavy rocket that will launch an Orion test article in 2014 from NASA's Kennedy Space Center. The Program successfully completed Preliminary Design Review in 2013 and will complete Key Decision Point C in 2014. NASA has authorized the Program to move forward to Critical Design Review, scheduled for 2015 and a December 2017 first launch. The Program's success to date is due to prudent use of proven technology, infrastructure, and workforce from the Saturn and Space Shuttle programs, a streamlined management

  1. Band-pass filtering algorithms for adaptive control of compressor pre-stall modes in aircraft gas-turbine engine

    Science.gov (United States)

    Kuznetsova, T. A.

    2018-05-01

    The methods for increasing the adaptive properties of gas-turbine aircraft engines (GTE) to interference, based on the empowerment of automatic control systems (ACS), are analyzed. Flow pulsations in the suction and discharge lines of the compressor, which may cause stall, are considered as the interference. An algorithmic solution to the problem of controlling GTE pre-stall modes, adapted to the stability boundary, is proposed. The aim of the study is to develop band-pass filtering algorithms to provide the detection functions for compressor pre-stall modes in the ACS of a GTE. The characteristic feature of the pre-stall effect is an increase in the pressure pulsation amplitude over the impeller at multiples of the rotor frequency. The method used is based on a band-pass filter combining low-pass and high-pass digital filters. The impulse response of the high-pass filter is determined from a known low-pass filter impulse response by spectral inversion. The resulting transfer function of the second-order band-pass filter (BPF) corresponds to a stable system. Two circuit implementations of the BPF are synthesized. The designed band-pass filtering algorithms were tested in the MATLAB environment. Comparative analysis of the amplitude-frequency responses of the proposed implementations allows choosing the BPF scheme that provides the best filtering quality. The BPF reaction to a periodic sinusoidal signal, simulating the experimentally obtained pressure pulsation function in the pre-stall mode, was considered. The results of the model experiment demonstrated the effectiveness of applying band-pass filtering algorithms as part of the ACS to identify the pre-stall mode of the compressor by detecting the pressure fluctuation peaks that characterize the compressor's approach to the stability boundary.
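
    The filter construction described (a band-pass built from a known low-pass prototype, with the high-pass obtained by spectral inversion) can be sketched with FIR filters. Note the paper synthesizes a second-order IIR band-pass; the windowed-sinc FIR below, with an invented sample rate and band edges, only illustrates the spectral-inversion step.

```python
import math

def lowpass_fir(cutoff_hz, fs, ntaps=101):
    """Windowed-sinc (Hamming) low-pass FIR, normalized to unit DC gain."""
    fc = cutoff_hz / fs
    mid = (ntaps - 1) // 2
    h = []
    for n in range(ntaps):
        k = n - mid
        s = 2.0 * fc if k == 0 else math.sin(2.0 * math.pi * fc * k) / (math.pi * k)
        w = 0.54 - 0.46 * math.cos(2.0 * math.pi * n / (ntaps - 1))
        h.append(s * w)
    total = sum(h)
    return [x / total for x in h]

def spectral_invert(h):
    """High-pass from an odd-length low-pass: negate taps, add 1 at centre."""
    g = [-x for x in h]
    g[(len(g) - 1) // 2] += 1.0
    return g

def convolve(a, b):
    """Cascade two FIR filters by convolving their impulse responses."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def gain(h, f_hz, fs):
    """Magnitude response of FIR h at frequency f_hz."""
    re = sum(x * math.cos(2.0 * math.pi * f_hz * n / fs) for n, x in enumerate(h))
    im = sum(x * math.sin(2.0 * math.pi * f_hz * n / fs) for n, x in enumerate(h))
    return math.hypot(re, im)

# Pass roughly 80-300 Hz at fs = 2 kHz: a 300 Hz low-pass cascaded with the
# spectral inversion of an 80 Hz low-pass.
FS = 2000.0
bp = convolve(lowpass_fir(300.0, FS), spectral_invert(lowpass_fir(80.0, FS)))
```

    The cascade attenuates both DC drift and high-frequency noise while passing the band where the multiple-of-rotor-frequency pulsation peaks would appear.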

  2. A Reference Model for Virtual Machine Launching Overhead

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Hao; Ren, Shangping; Garzoglio, Gabriele; Timm, Steven; Bernabeu, Gerard; Chadwick, Keith; Noh, Seo-Young

    2016-07-01

    Cloud bursting is one of the key research topics in the cloud computing communities. A well designed cloud bursting module enables private clouds to automatically launch virtual machines (VMs) to public clouds when more resources are needed. One of the main challenges in developing a cloud bursting module is to decide when and where to launch a VM so that all resources are most effectively and efficiently utilized and the system performance is optimized. However, based on system operational data obtained from FermiCloud, a private cloud developed by the Fermi National Accelerator Laboratory for scientific workflows, the VM launching overhead is not a constant. It varies with physical resource utilization, such as CPU and I/O device utilizations, at the time when a VM is launched. Hence, to make judicious decisions as to when and where a VM should be launched, a VM launching overhead reference model is needed. In this paper, we first develop a VM launching overhead reference model based on operational data we have obtained on FermiCloud. Second, we apply the developed reference model on FermiCloud and compare calculated VM launching overhead values based on the model with measured overhead values on FermiCloud. Our empirical results on FermiCloud indicate that the developed reference model is accurate. We believe, with the guidance of the developed reference model, efficient resource allocation algorithms can be developed for cloud bursting process to minimize the operational cost and resource waste.
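
    The reference model maps resource utilization at launch time to expected VM launching overhead. A minimal version of such a model is an ordinary least-squares fit of overhead against CPU and I/O utilization; the linear functional form and the synthetic numbers below are assumptions for illustration, not FermiCloud's published model.

```python
def fit_overhead_model(samples):
    """Fit overhead ~ b0 + b1*cpu + b2*io by ordinary least squares
    (normal equations solved by Gaussian elimination with pivoting).
    samples: list of (cpu_util, io_util, measured_overhead_seconds)."""
    X = [[1.0, c, i] for c, i, _ in samples]
    y = [o for _, _, o in samples]
    A = [[sum(r[p] * r[q] for r in X) for q in range(3)] for p in range(3)]
    v = [sum(r[p] * yi for r, yi in zip(X, y)) for p in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c2 in range(col, 3):
                A[r][c2] -= f * A[col][c2]
            v[r] -= f * v[col]
    b = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                      # back substitution
        b[r] = (v[r] - sum(A[r][c] * b[c] for c in range(r + 1, 3))) / A[r][r]
    return b

def predict(b, cpu, io):
    """Predicted launching overhead (s) at the given utilizations."""
    return b[0] + b[1] * cpu + b[2] * io

# Synthetic calibration data: overhead = 20 + 30*cpu + 50*io seconds (assumed).
samples = [(c, i, 20.0 + 30.0 * c + 50.0 * i)
           for c, i in [(0.1, 0.1), (0.5, 0.2), (0.9, 0.6), (0.3, 0.8)]]
coeffs = fit_overhead_model(samples)
```

    A cloud bursting scheduler could then compare `predict` across candidate hosts and launch the VM where the expected overhead is lowest.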

  3. Progress Towards a 2012 Landsat Launch

    Science.gov (United States)

    Irons, Jim; Sabelhaus, Phil; Masek, Jeff; Cook, Bruce; Dabney, Phil; Loveland, Tom

    2012-01-01

    The Landsat Data Continuity Mission (LDCM) is on schedule for a December 2012 launch date. The mission is being managed by an interagency partnership between NASA and the U.S. Geological Survey (USGS). NASA leads the development and launch of the satellite observatory while the USGS leads ground system development. USGS will assume responsibility for operating the satellite and for collecting, archiving, and distributing the LDCM data following launch. When launched, the satellite will carry two sensors into orbit. The Operational Land Imager (OLI) will collect data for nine shortwave spectral bands with a spatial resolution of 30 m (with a 15 m panchromatic band). The Thermal Infrared Sensor (TIRS) will coincidently collect data for two thermal infrared bands with a spatial resolution of 100 m. The OLI is fully assembled and tested and has been shipped by its manufacturer, Ball Aerospace and Technology Corporation, to the Orbital Sciences Corporation (Orbital) facility where it is being integrated onto the LDCM spacecraft. Pre-launch testing indicates that OLI will meet all performance specifications with margin. TIRS is in development at the NASA Goddard Space Flight Center (GSFC) and is in final testing before shipping to the Orbital facility in January 2012. The ground data processing system is in development at the USGS Earth Resources Observation and Science (EROS) Center. The presentation will describe the LDCM satellite system, provide the status of system development, and present pre-launch performance data for OLI and TIRS. The USGS has committed to renaming the satellite as Landsat 8 following launch.

  4. STS-110/Atlantis/ISS 8A Pre-Launch On Orbit-Landing-Crew Egress

    Science.gov (United States)

    2002-01-01

    The crew of STS-110, which consists of Commander Michael Bloomfield, Pilot Stephen Frick, and Mission Specialists Rex Walheim, Ellen Ochoa, Lee Morin, Jerry Ross, and Steven Smith is introduced at the customary pre-flight meal. The narrator provides background information on the astronauts during suit-up. Each crew member is shown in the White Room before boarding Space Shuttle Atlantis, and some display signs to loved ones. Launch footage includes the following replays: Beach Tracker, VAB, Pad B, Tower 1, DLTR-3, Grandstand, Cocoa Beach DOAMS, Playalinda DOAMS, UCS-23, SLF Convoy, OTV-154, OTV-163, OTV-170 (mislabeled), and OTV-171 (mislabeled). After the launch, NASA administrator Sean O'Keefe gives a speech to the Launch Control Center, with political dignitaries present. While on-orbit, Atlantis docks with the International Space Station (ISS), and Canadarm 2 on the ISS lifts the S0 Truss out of the orbiter's payload bay. The video includes highlights of three extravehicular activities (EVAs). In the first, the S0 Truss is fastened to the Destiny Laboratory Module on the ISS. During the third EVA, Walheim and Smith assist in the checkout of the handcart on the S0 Truss. The Atlantis crew is shown gathered together with the Expedition 4 crew of the ISS, and again by itself after undocking. Replays of the landing include: VAB, Tower 1, Mid-field, Runway South End, Runway North End, Tower 2, Playalinda DOAMS, Cocoa Beach DOAMS, and Pilot Point of View (PPOV). After landing, Commander Bloomfield lets each of his crew members give a short speech.

  5. Commercial aspects of semi-reusable launch systems

    Science.gov (United States)

    Obersteiner, M. H.; Müller, H.; Spies, H.

    2003-07-01

    This paper presents a business planning model for a commercial space launch system. The financing model is based on market analyses and projections combined with market capture models. An operations model is used to derive the annual cash income. Parametric cost modeling and development and production schedules are used for quantifying the annual expenditures, the internal rate of return, the break-even point of positive cash flow and the respective prices per launch. Alternative consortia structures, cash flow methods, capture rates and launch prices are used to examine the sensitivity of the model. The model is then applied to a promising semi-reusable launcher concept, showing the general achievability of the commercial approach and the necessary pre-conditions.
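The cash-flow quantities named above (internal rate of return, break-even point of positive cash flow) can be computed with a short routine. The cash-flow figures below are hypothetical, and this sketch ignores consortium structure and capture-rate modeling entirely.

```python
def npv(rate, cashflows):
    """Net present value of annual cashflows, with cashflows[0] occurring at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0):
    """Internal rate of return by bisection on NPV (assumes one sign change in cashflows)."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

def break_even_year(cashflows):
    """First year in which cumulative (undiscounted) cash flow turns non-negative, or None."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical launcher programme: development outlays, then income from launch services.
flows = [-400, -300, -200, 150, 180, 200, 220, 240, 260, 280]
```

With a single sign change in the cash-flow sequence the IRR is unique, so plain bisection is sufficient.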

  6. Airborne campaigns for CryoSat pre-launch calibration and validation

    DEFF Research Database (Denmark)

    Hvidegaard, Sine Munk; Forsberg, René; Skourup, Henriette

    2010-01-01

    From 2003 to 2008, DTU Space, together with ESA and several international partners, carried out airborne and ground field campaigns in preparation for CryoSat validation, called CryoVEx (CryoSat Validation Experiments), covering the main ice caps in Greenland, Canada and Svalbard and sea ice in the Arctic Ocean. The main goal of the airborne surveys was to acquire coincident scanning laser and CryoSat-type radar elevation measurements of the surface, either sea ice or land ice. Selected lines have been surveyed along with detailed mapping of validation sites, coordinated with in-situ field work and helicopter electromagnetic surveying. This paper summarises the pre-launch campaigns and presents some of the results from the coincident airborne and ground observations.

  7. Medical Decision Algorithm for Pre-Hospital Trauma Care. Phase I.

    Science.gov (United States)

    1996-09-01

    Phase I report on a medical decision algorithm for pre-hospital trauma care. Principal Investigator: Donald K. Wedding, P.E., Ph.D. Contracting organization: Photonics Systems, Incorporated, Northwood, Ohio. The research addressed three areas: (1) data acquisition, (2) neural network design, and (3) system architecture design. In the first area of this research, a triage database

  8. An Online Tilt Estimation and Compensation Algorithm for a Small Satellite Camera

    Science.gov (United States)

    Lee, Da-Hyun; Hwang, Jai-hyuk

    2018-04-01

    In the case of a satellite camera designed to execute an Earth observation mission, even after a pre-launch precision alignment process has been carried out, misalignment will occur due to external factors during launch and in the operating environment. In particular, for high-resolution satellite cameras, which require submicron accuracy for alignment between optical components, misalignment is a major cause of image quality degradation. To compensate for this, most high-resolution satellite cameras undergo a precise realignment process called refocusing before and during the operation process. However, conventional Earth observation satellites execute refocusing only for de-space. Thus, in this paper, an online tilt estimation and compensation algorithm that can be utilized after de-space correction is proposed. Although the sensitivity of optical performance degradation due to misalignment is highest for de-space, the MTF can be increased further by correcting tilt after refocusing. The algorithm proposed in this research can be used to estimate the amount of tilt that occurs by taking star images, and it can also carry out automatic tilt corrections by employing a compensation mechanism that gives angular motion to the secondary mirror. Crucially, this algorithm is developed as an online processing system so that it can operate without communication with the ground.

  9. Planck pre-launch status: The Planck mission

    DEFF Research Database (Denmark)

    Tauber, J. A.; Mandoles, N.; Puget, J.-L.

    2010-01-01

    instruments, and of tests at fully integrated satellite level. It represents the best estimate before launch of the technical performance that the satellite and its payload will achieve in flight. In this paper, we summarise the main elements of the payload performance, which is described in detail...

  10. NASA Lewis Launch Collision Probability Model Developed and Analyzed

    Science.gov (United States)

    Bollenbacher, Gary; Guptill, James D

    1999-01-01

    There are nearly 10,000 tracked objects orbiting the earth. These objects encompass manned objects, active and decommissioned satellites, spent rocket bodies, and debris. They range from a few centimeters across to the size of the MIR space station. Anytime a new satellite is launched, the launch vehicle with its payload attached passes through an area of space in which these objects orbit. Although the population density of these objects is low, there always is a small but finite probability of collision between the launch vehicle and one or more of these space objects. Even though the probability of collision is very low, for some payloads even this small risk is unacceptable. To mitigate the small risk of collision associated with launching at an arbitrary time within the daily launch window, NASA performs a prelaunch mission assurance Collision Avoidance Analysis (or COLA). For the COLA of the Cassini spacecraft, the NASA Lewis Research Center conducted an in-house development and analysis of a model for launch collision probability. The model allows a minimum clearance criterion to be used with the COLA analysis to ensure an acceptably low probability of collision. If, for any given liftoff time, the nominal launch vehicle trajectory would pass a space object with less than the minimum required clearance, launch would not be attempted at that time. The model assumes that the nominal positions of the orbiting objects and of the launch vehicle can be predicted as a function of time, and therefore, that any tracked object that comes within close proximity of the launch vehicle can be identified. For any such pair, these nominal positions can be used to calculate a nominal miss distance. The actual miss distances may differ substantially from the nominal miss distance, due, in part, to the statistical uncertainty of the knowledge of the objects' positions. The model further assumes that these position uncertainties can be described with position covariance matrices.
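A minimal numerical illustration of this kind of calculation: treat the combined position uncertainty as a Gaussian in the encounter plane and estimate the probability that the actual miss distance falls inside a hard-body radius. The numbers and the Monte Carlo approach are illustrative assumptions, not the Lewis model itself.

```python
import numpy as np

def collision_probability(nominal_miss, combined_cov, hard_radius, n=200_000, seed=1):
    """Monte Carlo estimate of P(actual miss < hard_radius) in the encounter plane.

    nominal_miss : 2-vector, nominal relative position at closest approach (m)
    combined_cov : 2x2 covariance summing both objects' position uncertainties
    hard_radius  : combined body radius treated as a collision (m)
    """
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(nominal_miss, combined_cov, size=n)
    return np.mean(np.linalg.norm(samples, axis=1) < hard_radius)

# Hypothetical encounter: 500 m nominal miss, 200 m position sigma on each axis.
cov = np.diag([200.0 ** 2, 200.0 ** 2])
p_near = collision_probability([500.0, 0.0], cov, hard_radius=20.0)
p_far = collision_probability([5000.0, 0.0], cov, hard_radius=20.0)
```

A COLA-style screen would compare such probabilities (or the equivalent clearance distances) against a threshold for each candidate liftoff time.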

  11. Development of the Landsat Data Continuity Mission Cloud Cover Assessment Algorithms

    Science.gov (United States)

    Scaramuzza, Pat; Bouchard, M.A.; Dwyer, John L.

    2012-01-01

    The upcoming launch of the Operational Land Imager (OLI) will start the next era of the Landsat program. However, the Automated Cloud-Cover Assessment (ACCA) algorithm used on Landsat 7 requires a thermal band and is thus not suited for OLI. There will be a thermal instrument on the Landsat Data Continuity Mission (LDCM), the Thermal Infrared Sensor, which may not be available during all OLI collections. This illustrates a need for CCA for LDCM in the absence of thermal data. To research possibilities for full-resolution OLI cloud assessment, a global data set of 207 Landsat 7 scenes with manually generated cloud masks was created. It was used to evaluate the ACCA algorithm, showing that the algorithm correctly classified 79.9% of a standard test subset of 3.95 × 10⁹ pixels. The data set was also used to develop and validate two successor algorithms for use with OLI data: one derived from an off-the-shelf machine learning package and one based on ACCA but enhanced by a simple neural network. These comprehensive CCA algorithms were shown to correctly classify pixels as cloudy or clear 88.5% and 89.7% of the time, respectively.

  12. A Pre-Detection Based Anti-Collision Algorithm with Adjustable Slot Size Scheme for Tag Identification

    Directory of Open Access Journals (Sweden)

    Chiu-Kuo LIANG

    2015-06-01

    Full Text Available One of the research areas in RFID systems is tag anti-collision protocols: how to reduce identification time for a given number of tags in the field of an RFID reader. There are two types of tag anti-collision protocols for RFID systems: tree-based algorithms and slotted ALOHA-based algorithms. Many anti-collision algorithms have been proposed in recent years, especially tree-based protocols. However, challenges remain in enhancing system throughput and stability, because the underlying technologies face performance limitations when network density is high. In particular, tree-based protocols suffer from long identification delays. Recently, a Hybrid Hyper Query Tree (H2QT) protocol, which is a tree-based approach, was proposed, aiming to speed up tag identification in large-scale RFID systems. The main idea of H2QT is to track the tag response and try to predict the distribution of tag IDs in order to reduce collisions. In this paper, we propose a pre-detection tree-based algorithm, called the Adaptive Pre-Detection Broadcasting Query Tree algorithm (APDBQT), to avoid unnecessary queries. The proposed APDBQT protocol reduces not only collisions but idle cycles as well, by using a pre-detection scheme and an adjustable slot size mechanism. The simulation results show that the proposed technique provides superior performance in high-density environments. It is shown that APDBQT is effective in terms of increasing system throughput and minimizing identification delay.
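For readers unfamiliar with the tree-based family, a plain binary query tree (the baseline that H2QT and APDBQT improve upon) can be simulated in a few lines. The tag IDs and bit length are arbitrary, and no pre-detection or slot-size adaptation is modeled here.

```python
def query_tree_identify(tag_ids, bits=8):
    """Identify all tags with a basic binary query tree; returns (ids, query_count).

    The reader broadcasts a prefix; tags whose ID starts with it reply. Zero replies is
    an idle cycle, one reply identifies a tag, and more than one is a collision, after
    which the prefix is extended with '0' and '1' and both branches are queried.
    """
    tags = [format(t, f"0{bits}b") for t in tag_ids]
    identified, queries, stack = [], 0, [""]
    while stack:
        prefix = stack.pop()
        queries += 1
        responders = [t for t in tags if t.startswith(prefix)]
        if len(responders) == 1:
            identified.append(int(responders[0], 2))
        elif len(responders) > 1:
            stack.append(prefix + "1")
            stack.append(prefix + "0")
    return sorted(identified), queries

ids, n_queries = query_tree_identify([3, 40, 41, 200], bits=8)
```

Counting the idle and collision queries in such a run is exactly what motivates pre-detection schemes: queries that a reader could predict to be idle are wasted air time.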

  13. Technique applied in electrical power distribution for Satellite Launch Vehicle

    Directory of Open Access Journals (Sweden)

    João Maurício Rosário

    2010-09-01

    Full Text Available The Satellite Launch Vehicle electrical network, which is currently being developed in Brazil, is sub-divided for analysis into the following parts: Service Electrical Network, Controlling Electrical Network, Safety Electrical Network and Telemetry Electrical Network. During the pre-launching and launching phases, these electrical networks are associated electrically and mechanically with the structure of the vehicle. In order to succeed in the integration of these electrical networks, it is necessary to employ techniques of electrical power distribution appropriate to Launch Vehicle systems. This work presents the most important techniques to be considered in the characterization of the electrical power supply applied to Launch Vehicle systems. Such techniques are primarily designed to allow the electrical networks, when subjected to a single-phase fault to ground, to keep supplying power to the loads.

  14. Development process of muzzle flows including a gun-launched missile

    Directory of Open Access Journals (Sweden)

    Zhuo Changfei

    2015-04-01

    Full Text Available Numerical investigations on the launch process of a gun-launched missile, from the muzzle of a cannon to the free-flight stage, have been performed in this paper. The dynamic overlapped-grids approach is applied to deal with the problems of a moving gun-launched missile. The high-resolution upwind scheme (AUSMPW+) and a detailed reaction kinetics model are adopted to solve the chemical non-equilibrium Euler equations on dynamic grids. The development process and flow field structure of muzzle flows including a gun-launched missile are discussed in detail. The present numerical study confirms that complicated transient phenomena exist in the brief launching stage when the gun-launched missile moves from the muzzle of a cannon to the free-flight stage. The propellant gas flow, the initial ambient air flow and the moving missile mutually couple and interact. A complete flow field structure is formed during the launching stage, including the blast wave, base shock, reflected shock, incident shock, shear layer, primary vortex ring and triple point.

  15. Genetic algorithm for project time-cost optimization in fuzzy environment

    Directory of Open Access Journals (Sweden)

    Khan Md. Ariful Haque

    2012-12-01

    Full Text Available Purpose: The aim of this research is to develop a more realistic approach to solving the project time-cost optimization problem under uncertain conditions, with fuzzy time periods. Design/methodology/approach: Deterministic models for time-cost optimization are never efficient considering various uncertainty factors. To make such problems realistic, triangular fuzzy numbers and the concept of the α-cut method in fuzzy logic theory are employed to model the problem. Because of the NP-hard nature of the project scheduling problem, a Genetic Algorithm (GA) has been used as a searching tool. Finally, Dev-C++ 4.9.9.2 has been used to code this solver. Findings: The solution has been performed under different combinations of GA parameters, and after result analysis the optimum values of those parameters have been found for the best solution. Research limitations/implications: For demonstration of the application of the developed algorithm, a project on a new product (a pre-paid electric meter) launched under government finance has been chosen as a real case. The algorithm is developed under some assumptions. Practical implications: The proposed model leads decision makers to choose the desired solution under different risk levels. Originality/value: Reports reveal that project optimization problems have never been solved under multiple uncertainty conditions. Here, the function has been optimized using the Genetic Algorithm search technique, with varied levels of risk and fuzzy time periods.
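A compact sketch of the approach: triangular fuzzy activity durations reduced to crisp values via an α-cut, and a GA searching over per-activity crash levels to trade cost against a deadline penalty. All activity data, the α value, the penalty weight, and the GA settings are invented for illustration; the paper's solver is coded in Dev-C++.

```python
import random

random.seed(42)

# Hypothetical serial project: each activity has a triangular fuzzy duration (a, m, b)
# and a cost per crash level; higher levels are faster but costlier.
ACTIVITIES = [
    [((8, 10, 13), 100), ((5, 6, 8), 180)],
    [((6, 8, 11), 90), ((4, 5, 7), 160)],
    [((9, 12, 15), 120), ((6, 8, 10), 210)],
]
DEADLINE, PENALTY, ALPHA = 24, 50, 0.6

def alpha_cut_midpoint(tri, alpha):
    """Midpoint of the alpha-cut interval of a triangular fuzzy number (a, m, b)."""
    a, m, b = tri
    return ((a + alpha * (m - a)) + (b - alpha * (b - m))) / 2

def fitness(chromosome):
    """Total cost plus a penalty for exceeding the deadline (lower is better)."""
    dur = sum(alpha_cut_midpoint(ACTIVITIES[i][g][0], ALPHA) for i, g in enumerate(chromosome))
    cost = sum(ACTIVITIES[i][g][1] for i, g in enumerate(chromosome))
    return cost + PENALTY * max(0.0, dur - DEADLINE)

def ga(pop_size=30, generations=60, p_mut=0.1):
    """Elitist GA over crash-level chromosomes with one-point crossover and bit mutation."""
    pop = [[random.randrange(2) for _ in ACTIVITIES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(1, len(ACTIVITIES))
            child = p1[:cut] + p2[cut:]
            if random.random() < p_mut:
                j = random.randrange(len(child))
                child[j] = 1 - child[j]
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = ga()
```

Raising ALPHA narrows the α-cut interval toward the most-likely durations, which is how the risk level enters the decision.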

  16. Development process of muzzle flows including a gun-launched missile

    OpenAIRE

    Zhuo Changfei; Feng Feng; Wu Xiaosong

    2015-01-01

    Numerical investigations on the launch process of a gun-launched missile from the muzzle of a cannon to the free-flight stage have been performed in this paper. The dynamic overlapped-grids approach is applied to deal with the problems of a moving gun-launched missile. The high-resolution upwind scheme (AUSMPW+) and a detailed reaction kinetics model are adopted to solve the chemical non-equilibrium Euler equations on dynamic grids. The development process and flow field structure of m...

  17. Development of Constraint Force Equation Methodology for Application to Multi-Body Dynamics Including Launch Vehicle Stage Separation

    Science.gov (United States)

    Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.

    2016-01-01

    The objective of this report is to develop and implement a physics based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS(Registered Trademark) and Autolev, two different industry standard benchmark codes for multi-body dynamic analysis and simulations. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation for its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.

  18. Examination of Speed Contribution of Parallelization for Several Fingerprint Pre-Processing Algorithms

    Directory of Open Access Journals (Sweden)

    GORGUNOGLU, S.

    2014-05-01

    Full Text Available In the analysis of minutiae-based fingerprint systems, fingerprints need to be pre-processed. The pre-processing is carried out to enhance the quality of the fingerprint and to obtain more accurate minutiae points. Reducing the pre-processing time is important for identification and verification in real-time systems, and especially for databases holding large amounts of fingerprint information. Parallel processing and parallel CPU computing can be considered as the distribution of processes over a multi-core processor. This is done by using parallel programming techniques. Reducing the execution time is the main objective in parallel processing. In this study, pre-processing of a minutiae-based fingerprint system is implemented by parallel processing on multi-core computers using OpenMP, and on a graphics processor using CUDA, to improve execution time. The execution times and speedup ratios are compared with those of a single-core processor. The results show that by using parallel processing, execution time is substantially improved. The improvement ratios obtained for different pre-processing algorithms allowed us to make suggestions on the more suitable approaches for parallelization.
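The block-parallel decomposition used for such pre-processing can be sketched with Python's thread pool. This illustrates the decomposition pattern only (the paper uses OpenMP and CUDA); the normalization filter, block size, and worker count are assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def normalize_block(block, target_mean=128.0, target_std=40.0):
    """Per-block grey-level normalization, a common fingerprint pre-processing step."""
    std = block.std() or 1.0
    return (block - block.mean()) / std * target_std + target_mean

def preprocess(image, block=64, workers=4):
    """Split the image into blocks and normalize them in a thread pool.

    NumPy releases the GIL inside many array ops, so threads can overlap work; a
    pure-Python filter would instead need a ProcessPoolExecutor (or OpenMP/CUDA,
    as in the paper) to obtain real speedup.
    """
    tiles = [image[r:r + block, c:c + block]
             for r in range(0, image.shape[0], block)
             for c in range(0, image.shape[1], block)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        done = list(pool.map(normalize_block, tiles))
    # Reassemble the tiles in their original row-major order.
    per_row = image.shape[1] // block
    rows = [np.hstack(done[i:i + per_row]) for i in range(0, len(done), per_row)]
    return np.vstack(rows)

rng = np.random.default_rng(7)
img = rng.integers(0, 256, size=(256, 256)).astype(float)
out = preprocess(img)
```

Because the blocks are independent, the same decomposition maps directly onto an OpenMP parallel-for or a CUDA grid of thread blocks.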

  19. The Development of Several Electromagnetic Monitoring Strategies and Algorithms for Validating Pre-Earthquake Electromagnetic Signals

    Science.gov (United States)

    Bleier, T. E.; Dunson, J. C.; Roth, S.; Mueller, S.; Lindholm, C.; Heraud, J. A.

    2012-12-01

    QuakeFinder, a private research group in California, reports on the development of a 100+ station network consisting of 3-axis induction magnetometers and air conductivity sensors to collect and characterize pre-seismic electromagnetic (EM) signals. These signals are combined with daily infrared (IR) signals collected from the GOES weather satellite IR instrument to compare and correlate with the ground EM signals, both from actual earthquakes and from boulder stressing experiments. This presentation describes the efforts QuakeFinder has undertaken to automatically detect these pulse patterns using their historical data as a reference, and to develop other discriminative algorithms that can be used with air conductivity sensors and IR instruments from the GOES satellites. The overall big-picture results of the QuakeFinder experiment are presented. In 2007, QuakeFinder discovered the occurrence of strong uni-polar pulses in their magnetometer coil data that increased in tempo dramatically prior to the M5.1 earthquake at Alum Rock, California. Suggestions that these pulses might have been lightning or power-line arcing did not fit with the data actually recorded, as was reported in Bleier [2009]. A second earthquake then occurred near the same site on January 7, 2010, as was reported in Dunson [2011], and the pattern of pulse-count increases before the earthquake was similar to the 2007 event. There were fewer pulses, and their magnitude was decreased, both consistent with the fact that the earthquake was smaller (M4.0 vs M5.4) and farther away (7 km vs 2 km). At the same time, similar effects were observed at the QuakeFinder Tacna, Peru site before the May 5th, 2010 M6.2 earthquake and a cluster of several M4-5 earthquakes.

  20. JPSS-1 VIIRS Pre-Launch Response Versus Scan Angle Testing and Performance

    Science.gov (United States)

    Moyer, David; McIntire, Jeff; Oudrari, Hassan; McCarthy, James; Xiong, Xiaoxiong; De Luccia, Frank

    2016-01-01

    The Visible Infrared Imaging Radiometer Suite (VIIRS) instruments on board both the Suomi National Polar-orbiting Partnership (S-NPP) and the first Joint Polar Satellite System (JPSS-1) spacecraft, with launch dates of October 2011 and December 2016 respectively, are cross-track scanners with an angular swath of +/-56.06 deg. A four-mirror Rotating Telescope Assembly (RTA) is used for scanning, combined with a Half Angle Mirror (HAM) that directs light exiting from the RTA into the aft-optics. It has 14 Reflective Solar Bands (RSBs), seven Thermal Emissive Bands (TEBs) and a panchromatic Day Night Band (DNB). There are three internal calibration targets, the Solar Diffuser, the BlackBody and the Space View, that have fixed scan angles within the internal cavity of VIIRS. VIIRS has calibration requirements of 2% on RSB reflectance and as tight as 0.4% on TEB radiance, which requires the sensor's gain change across the scan, or Response Versus Scan angle (RVS), to be well quantified. A flow-down of the top-level calibration requirements puts constraints of 0.2%-0.3% on the characterization of the RVS, but there are no specified limitations on the magnitude of the response change across scan. The RVS change across scan angle can vary significantly between bands, with the RSBs having smaller changes of approximately 2% and some TEBs having approximately 10% variation. Within a band, the RVS has both detector and HAM-side dependencies that vary across scan. Errors in the RVS characterization will contribute to image banding and striping artifacts if their magnitudes are above the noise level of the detectors. The RVS was characterized pre-launch for both S-NPP and JPSS-1 VIIRS, and a comparison of the RVS curves between these two sensors will be discussed.

  1. Developing the science product algorithm testbed for Chinese next-generation geostationary meteorological satellites: Fengyun-4 series

    Science.gov (United States)

    Min, Min; Wu, Chunqiang; Li, Chuan; Liu, Hui; Xu, Na; Wu, Xiao; Chen, Lin; Wang, Fu; Sun, Fenglin; Qin, Danyu; Wang, Xi; Li, Bo; Zheng, Zhaojun; Cao, Guangzhen; Dong, Lixin

    2017-08-01

    Fengyun-4A (FY-4A), the first of the Chinese next-generation geostationary meteorological satellites, launched in 2016, offers several advances over the FY-2 series: more spectral bands, faster imaging, and infrared hyperspectral measurements. To support the major objective of developing the prototypes of FY-4 science algorithms, two science product algorithm testbeds, for imagers and sounders, have been developed by the scientists in the FY-4 Algorithm Working Group (AWG). Both testbeds, written in the FORTRAN and C programming languages for Linux or UNIX systems, have been tested successfully using Intel/g compilers. Some important FY-4 science products, including cloud mask, cloud properties, and temperature profiles, have been retrieved successfully using a proxy imager, the Himawari-8/Advanced Himawari Imager (AHI), and sounder data obtained from the Atmospheric InfraRed Sounder, thus demonstrating their robustness. In addition, in early 2016, the FY-4 AWG developed, based on the imager testbed, a near-real-time processing system for Himawari-8/AHI data for use by Chinese weather forecasters. Consequently, robust and flexible science product algorithm testbeds have provided essential and productive tools for popularizing FY-4 data and developing substantial improvements in FY-4 products.

  2. Web-based Weather Expert System (WES) for Space Shuttle Launch

    Science.gov (United States)

    Bardina, Jorge E.; Rajkumar, T.

    2003-01-01

    The Web-based Weather Expert System (WES) is a critical module of the Virtual Test Bed development to support 'go/no go' decisions for Space Shuttle operations in the Intelligent Launch and Range Operations program of NASA. The weather rules characterize certain aspects of the environment related to the launching or landing site, the time of day or night, the pad or runway conditions, the mission duration, the runway equipment and the landing type. Expert system rules are derived from weather contingency rules, which were developed over years by NASA. Backward chaining, a goal-directed inference method, is adopted because a particular consequence or goal clause is evaluated first and then chained backward through the rules. Once a rule is satisfied or true, that particular rule is fired and the decision is expressed. The expert system continuously verifies the rules against the past one hour of weather conditions and the decisions are made. The normal procedure of operations requires a formal pre-launch weather briefing held on Launch minus 1 day, which is a specific weather briefing for all areas of Space Shuttle launch operations. In this paper, the Web-based Weather Expert System of the Intelligent Launch and Range Operations program is presented.
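Backward chaining as described, evaluating the goal clause first and then chaining backward through the rules, can be shown with a toy rule base. The rules and facts here are invented stand-ins, not NASA's actual weather contingency rules.

```python
def backward_chain(goal, rules, facts, depth=0):
    """Return True if `goal` can be proven from `facts` via backward chaining.

    rules: list of (consequent, [antecedents]); facts: set of known-true propositions.
    """
    if goal in facts:
        return True
    if depth > 20:            # guard against cyclic rule sets
        return False
    for consequent, antecedents in rules:
        if consequent == goal and all(
                backward_chain(a, rules, facts, depth + 1) for a in antecedents):
            facts.add(goal)   # memoize proven subgoals
            return True
    return False

# Hypothetical toy weather rules in the spirit of launch-commit criteria.
RULES = [
    ("go_for_launch", ["wind_ok", "lightning_ok", "ceiling_ok"]),
    ("wind_ok", ["peak_wind_below_limit"]),
    ("lightning_ok", ["no_lightning_10nm"]),
    ("ceiling_ok", ["cloud_ceiling_above_limit"]),
]
FACTS = {"peak_wind_below_limit", "no_lightning_10nm", "cloud_ceiling_above_limit"}
decision = backward_chain("go_for_launch", RULES, set(FACTS))
```

Starting from the goal means only the rules relevant to the 'go/no go' question are ever evaluated, which is the attraction of backward over forward chaining here.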

  3. Reusable launch vehicle development research

    Science.gov (United States)

    1995-01-01

    NASA has generated a program approach for SSTO reusable launch vehicle (RLV) technology development which includes a follow-on to the Ballistic Missile Defense Organization's (BMDO) successful DC-X program, the DC-XA (Advanced). Also, a separate sub-scale flight demonstrator, designated the X-33, will be built and flight tested along with numerous ground-based technology programs. For this to be a successful effort, a balance between technical, schedule, and budgetary risks must be attained. The adoption of BMDO's 'fast track' management practices will be a key element in the eventual success of NASA's effort.

  4. Spatial dual-orthogonal (SDO) phase-shifting algorithm by pre-recomposing the interference fringe.

    Science.gov (United States)

    Wang, Yi; Li, Bingbo; Zhong, Liyun; Tian, Jindong; Lu, Xiaoxu

    2017-07-24

    In cases where the phase distribution of the interferogram is nonuniform and the background/modulation amplitude changes rapidly, current self-calibration algorithms with better performance, like principal components analysis (PCA) and the advanced iterative algorithm (AIA), cannot work well. In this study, from three or more phase-shifting interferograms with unknown phase shifts, we propose a spatial dual-orthogonal (SDO) phase-shifting algorithm with high accuracy that uses the spatial orthogonal property of the interference fringe, in which a new sequence of fringe patterns with uniform phase distribution can be constructed by pre-recomposing the original interferograms to determine their corresponding optimum combination coefficients, which are directly related to the phase shifts. Both simulation and experimental results show that using the proposed SDO algorithm, we can achieve an accurate phase from phase-shifting interferograms with nonuniform phase distribution, non-constant background and arbitrary phase shifts. In particular, it is found that the accuracy of phase retrieval with the proposed SDO algorithm is insensitive to the variation of the fringe pattern, and this provides a guarantee for high-accuracy phase measurement and its applications.
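For contrast with the SDO method, the classical least-squares retrieval with known phase shifts (the baseline that self-calibration algorithms generalize to unknown shifts) looks like this. The synthetic phase, background, and shift values are assumptions for illustration.

```python
import numpy as np

def retrieve_phase(frames, shifts):
    """Pixel-wise least-squares phase retrieval from phase-shifted interferograms.

    Model per pixel: I_k = a + b*cos(phi + delta_k)
                         = a + (b cos phi)*cos(delta_k) - (b sin phi)*sin(delta_k),
    so solving for [a, b cos phi, b sin phi] gives phi = atan2(b sin phi, b cos phi).
    """
    deltas = np.asarray(shifts)
    A = np.column_stack([np.ones_like(deltas), np.cos(deltas), -np.sin(deltas)])
    I = np.stack([f.ravel() for f in frames])        # shape (K, n_pixels)
    coef, *_ = np.linalg.lstsq(A, I, rcond=None)     # rows: a, b cos phi, b sin phi
    return np.arctan2(coef[2], coef[1]).reshape(frames[0].shape)

# Synthetic test object with a known phase and a non-constant background.
y, x = np.mgrid[0:64, 0:64] / 64.0
phi_true = 2 * np.pi * (x ** 2 + 0.5 * y)
background = 100 + 20 * x
shifts = [0.0, 2.1, 4.3]                             # arbitrary, but known here
frames = [background + 50 * np.cos(phi_true + d) for d in shifts]
phi_est = retrieve_phase(frames, shifts)
```

The result is the wrapped phase; when the shifts are unknown, as in the SDO setting, they become additional unknowns and this linear system alone is no longer sufficient.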

  5. Phase retrieval from diffraction data utilizing pre-determined partial information

    International Nuclear Information System (INIS)

    Kim, S.S.; Marathe, S.; Kim, S.N.; Kang, H.C.; Noh, D.Y.

    2007-01-01

    We developed a phase retrieval algorithm that utilizes pre-determined partial phase information to overcome an insufficient oversampling ratio in diffraction data. By implementing the Fourier modulus projection and a modified support projection incorporating the pre-determined information, generalized difference map and HIO (Hybrid Input-Output) algorithms are developed. Optical laser diffraction data, as well as simulated X-ray diffraction data, are used to illustrate the validity of the proposed algorithm, which revealed both the strengths and the limitations of the algorithm. The proposed algorithm can expand the applicability of diffraction-based image reconstruction.
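The projection machinery involved can be illustrated with the simplest member of this algorithm family, error reduction, using a support constraint as the pre-determined information. The object, support, and iteration count below are invented, and this sketch is not the paper's generalized difference map or HIO.

```python
import numpy as np

def error_reduction(measured_modulus, support, n_iter=500, seed=0):
    """Classic error-reduction phase retrieval (a simpler relative of HIO).

    Alternates between the Fourier-modulus projection and the object-domain
    projection; `support` is a boolean mask marking where the object may be
    non-zero, and positivity is also enforced.
    """
    rng = np.random.default_rng(seed)
    x = rng.random(measured_modulus.shape) * support
    for _ in range(n_iter):
        X = np.fft.fft2(x)
        X = measured_modulus * np.exp(1j * np.angle(X))   # enforce measured modulus
        x = np.real(np.fft.ifft2(X))
        x = np.where(support, np.maximum(x, 0), 0)        # support + positivity
    return x

# Synthetic object known a priori to be zero outside a small support region.
obj = np.zeros((32, 32))
obj[8:16, 10:20] = np.random.default_rng(3).random((8, 10)) + 0.5
modulus = np.abs(np.fft.fft2(obj))
rec = error_reduction(modulus, obj > 0)
```

HIO and difference-map variants replace the simple object-domain projection with feedback terms that escape the stagnation error reduction is prone to; the paper's contribution is, analogously, a stronger object-domain projection built from the pre-determined partial phases.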

  6. Bantam: A Systematic Approach to Reusable Launch Vehicle Technology Development

    Science.gov (United States)

    Griner, Carolyn; Lyles, Garry

    1999-01-01

    The Bantam technology project is focused on providing a low cost launch capability for very small (100 kilogram) NASA and University science payloads. The cost goal has been set at one million dollars per launch. The Bantam project, however, represents much more than a small payload launch capability. Bantam represents a unique, systematic approach to reusable launch vehicle technology development. This technology maturation approach will enable future highly reusable launch concepts in any payload class. These launch vehicle concepts of the future could deliver payloads for hundreds of dollars per pound, enabling dramatic growth in civil and commercial space enterprise. The National Aeronautics and Space Administration (NASA) has demonstrated a better, faster, and cheaper approach to science discovery in recent years. This approach is exemplified by the successful Mars Exploration Program led by the Jet Propulsion Laboratory (JPL) for the NASA Space Science Enterprise. The Bantam project represents an approach to space transportation technology maturation that is very similar to the Mars Exploration Program. The NASA Advanced Space Transportation Program (ASTP) and Future X Pathfinder Program will combine to systematically mature reusable space transportation technology from low technology readiness to system level flight demonstration. New reusable space transportation capability will be demonstrated at a small (Bantam) scale approximately every two years. Each flight demonstration will build on the knowledge derived from the previous flight tests. The Bantam scale flight demonstrations will begin with the flights of the X-34. The X-34 will demonstrate reusable launch vehicle technologies, including flight regimes up to Mach 8 and 250,000 feet; autonomous flight operations; all-weather operations; twenty-five flights in one year, with a surge capability of two flights in less than twenty-four hours; and safe abort. The Bantam project will build on this initial

  7. The Launch Systems Operations Cost Model

    Science.gov (United States)

    Prince, Frank A.; Hamaker, Joseph W. (Technical Monitor)

    2001-01-01

    One of NASA's primary missions is to reduce the cost of access to space while simultaneously increasing safety. A key component, and one of the least understood, is the recurring operations and support cost for reusable launch systems. In order to predict these costs, NASA, under the leadership of the Independent Program Assessment Office (IPAO), has commissioned the development of a Launch Systems Operations Cost Model (LSOCM). LSOCM is a tool to predict the operations & support (O&S) cost of new and modified reusable (and partially reusable) launch systems. The requirements are to predict the non-recurring cost for the ground infrastructure and the recurring cost of maintaining that infrastructure, performing vehicle logistics, and performing the O&S actions to return the vehicle to flight. In addition, the model must estimate the time required to cycle the vehicle through all of the ground processing activities. The current version of LSOCM is an amalgamation of existing tools, leveraging our understanding of shuttle operations cost with a means of predicting how the maintenance burden will change as the vehicle becomes more aircraft-like. The use of the Conceptual Operations Manpower Estimating Tool/Operations Cost Model (COMET/OCM) provides a solid point of departure based on shuttle and expendable launch vehicle (ELV) experience. The incorporation of the Reliability and Maintainability Analysis Tool (RMAT) as expressed by a set of response surface model equations gives a method for estimating how changing launch system characteristics affects cost and cycle time as compared to today's shuttle system. Plans are being made to improve the model. The development team will be spending the next few months devising a structured methodology that will enable verified and validated algorithms to give accurate cost estimates. To assist in this endeavor the LSOCM team is part of an Agency-wide effort to combine resources with other cost and operations professionals to

  8. Design of a Ground-Launched Ballistic Missile Interceptor Using a Genetic Algorithm

    National Research Council Canada - National Science Library

    Anderson, Murray

    1999-01-01

    ...) minimize maximum U-loading. In 50 generations the genetic algorithm was able to develop two basic types of external aerodynamic designs that performed nearly the same, with miss distances less than 1.0 foot...
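    The record gives no implementation details; a generic real-coded genetic algorithm of the kind used for such design optimization might look like this (a hypothetical sketch minimizing a stand-in objective, not the report's actual code; all parameter values are illustrative):

```python
import numpy as np

def ga_minimize(f, dim=4, pop=40, gens=60, bounds=(-2, 2), seed=1):
    """Real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, and elitism."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))
    best_x, best_f = None, np.inf
    for _ in range(gens):
        F = np.array([f(x) for x in X])
        if F.min() < best_f:
            best_f, best_x = float(F.min()), X[np.argmin(F)].copy()
        i, j = rng.integers(0, pop, (2, pop))      # binary tournaments
        parents = X[np.where(F[i] < F[j], i, j)]
        mates = parents[rng.permutation(pop)]
        w = rng.uniform(0, 1, (pop, 1))            # blend (arithmetic) crossover
        X = w * parents + (1 - w) * mates
        X += rng.normal(0, 0.05, X.shape)          # Gaussian mutation
        X = np.clip(X, lo, hi)
        X[0] = best_x                              # elitism
    return best_x, best_f
```

With a miss-distance-like objective in place of the toy function, 50 generations of such a loop is exactly the scale of search the abstract describes.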

  9. Multi-Stage Hybrid Rocket Conceptual Design for Micro-Satellites Launch using Genetic Algorithm

    Science.gov (United States)

    Kitagawa, Yosuke; Kitagawa, Koki; Nakamiya, Masaki; Kanazaki, Masahiro; Shimada, Toru

    The multi-objective genetic algorithm (MOGA) is applied to the multi-disciplinary conceptual design problem for a three-stage launch vehicle (LV) with a hybrid rocket engine (HRE). MOGA is an optimization tool used for multi-objective problems. The parallel coordinate plot (PCP), which is a data mining method, is employed in the post-process in MOGA for design knowledge discovery. A rocket that can deliver observing micro-satellites to the sun-synchronous orbit (SSO) is designed. It consists of an oxidizer tank containing liquid oxidizer, a combustion chamber containing solid fuel, a pressurizing tank and a nozzle. The objective functions considered in this study are to minimize the total mass of the rocket and to maximize the ratio of the payload mass to the total mass. To calculate the thrust and the engine size, the regression rate is estimated based on an empirical model for a paraffin (FT-0070) propellant. Several non-dominated solutions are obtained using MOGA, and design knowledge is discovered for the present hybrid rocket design problem using a PCP analysis. As a result, substantial knowledge on the design of an LV with an HRE is obtained for use in space transportation.
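    The non-dominated filtering at the heart of a MOGA, for the paper's two objectives (minimize total mass, maximize payload-mass ratio), can be sketched as follows (a generic sketch; the function and variable names are assumptions, not the authors' code):

```python
import numpy as np

def pareto_front(designs):
    """Indices of non-dominated designs, where each design is a pair
    (total_mass, payload_ratio): mass is minimized, ratio maximized."""
    # negate the ratio so both objectives become minimizations
    P = np.asarray(designs, dtype=float) * np.array([1.0, -1.0])
    front = []
    for i, p in enumerate(P):
        # p is dominated if some other point is <= in all objectives
        # and strictly < in at least one
        dominated = np.any(np.all(P <= p, axis=1) & np.any(P < p, axis=1))
        if not dominated:
            front.append(i)
    return front
```

The surviving non-dominated solutions are the ones a parallel coordinate plot would then be used to mine for design knowledge.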

  10. Quality Control Algorithms and Proposed Integration Process for Wind Profilers Used by Launch Vehicle Systems

    Science.gov (United States)

    Decker, Ryan; Barbre, Robert E., Jr.

    2011-01-01

    Winds impact space launch vehicles through design, certification, and day-of-launch (DOL) steering commands, including the development of "knockdowns" of load indicators and the temporal uncertainty of flight winds. Wind databases are currently built from weather balloons and include discrete profiles and profile-pair datasets. Issues with this approach include: (1) larger vehicles operate near design limits during ascent; (2) sample sizes are limited to roughly 150 discrete profiles per month and 110-217 seasonal 2.0- and 3.5-hour pairs; and (3) balloon rise time (one hour) and drift (up to 100 n mi) limit representativeness. Advantages of the alternative approach using the Doppler Radar Wind Profiler (DRWP) are: (1) a larger sample size; (2) flexibility for assessing trajectory changes due to winds; and (3) better representation of flight winds.

  11. Foreign launch competition growing

    Science.gov (United States)

    Brodsky, R. F.; Wolfe, M. G.; Pryke, I. W.

    1986-07-01

    A survey is given of progress made by other nations in providing or preparing to provide satellite launch services. The European Space Agency has four generations of Ariane vehicles, with a fifth recently approved; a second launch facility in French Guiana that has become operational has raised the possible Ariane launch rate to 10 per year, although a May failure of an Ariane 2 put launches on hold. The French Hermes spaceplane and the British HOTOL are discussed. Under the auspices of the Italian National Space Plan, the Iris orbital transfer vehicle is being developed; China's Long March vehicles and the Soviet Proton and SL-4 vehicles are also discussed. The Soviets moreover are apparently developing not only a Saturn V-class heavy lift vehicle with a 150,000-kg capacity (about five times the largest U.S. capacity) but also a space shuttle and a spaceplane. Four Japanese launch vehicles and some vehicles in an Indian program are also ready to provide launch services. In this new, tough market for launch services, the customers barely outnumber the suppliers. The competition develops just as the Challenger and Titan disasters place the U.S. at a disadvantage and underline the hard work ahead to recoup its heretofore leading position in launch services.

  12. NPOESS Tools for Rapid Algorithm Updates

    Science.gov (United States)

    Route, G.; Grant, K. D.; Hughes, B.; Reed, B.

    2009-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system: the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes both NPP and NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. Northrop Grumman Aerospace Systems Algorithms and Data Products (A&DP) organization is responsible for the algorithms that produce the Environmental Data Records (EDRs), including their quality aspects. As the Calibration and Validation activities move forward following both the NPP launch and subsequent NPOESS launches, rapid algorithm updates may be required. Raytheon and Northrop Grumman have developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity.

  13. Launch Vehicle Manual Steering with Adaptive Augmenting Control In-flight Evaluations Using a Piloted Aircraft

    Science.gov (United States)

    Hanson, Curt

    2014-01-01

    An adaptive augmenting control algorithm for the Space Launch System (SLS) has been developed at the Marshall Space Flight Center as part of the launch vehicle's baseline flight control system. A prototype version of the SLS flight control software was hosted on a piloted aircraft at the Armstrong Flight Research Center to demonstrate the adaptive controller on a full-scale, realistic application in a relevant flight environment. Concerns regarding adverse interactions between the adaptive controller and a proposed manual steering mode were investigated by giving the pilot trajectory deviation cues and pitch rate command authority.

  14. An Innovative Hybrid Model Based on Data Pre-Processing and Modified Optimization Algorithm and Its Application in Wind Speed Forecasting

    Directory of Open Access Journals (Sweden)

    Ping Jiang

    2017-07-01

    Full Text Available Wind speed forecasting plays an irreplaceable role in the high-efficiency operation of wind farms and is significant in wind-related engineering studies. Back-propagation (BP) algorithms have been comprehensively employed to forecast time series that are nonlinear, irregular, and unstable. However, a single model usually overlooks the importance of data pre-processing and parameter optimization, which results in weak forecasting performance. In this paper, a more precise and robust model that combines data pre-processing, a BP neural network, and a modified artificial-intelligence optimization algorithm is proposed, avoiding the limitations of the individual algorithms. The novel model not only improves forecasting accuracy but also retains the advantages of the firefly algorithm (FA) while overcoming the disadvantage the FA exhibits in the later stages of optimization. To verify the forecasting performance of the presented hybrid model, 10-min wind speed data from Penglai city, Shandong province, China, were analyzed in this study. The simulations revealed that the proposed hybrid model significantly outperforms other single metaheuristics.
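    A bare-bones firefly algorithm, the optimizer the hybrid model builds on, can be sketched as follows (a textbook sketch on a toy objective, not the paper's modified FA; all parameter values are illustrative):

```python
import numpy as np

def firefly_minimize(f, dim=2, n=20, iters=100, bounds=(-5, 5), seed=0):
    """Standard firefly algorithm: each firefly moves toward every
    brighter (lower-loss) firefly, plus a cooled random walk."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (n, dim))
    F = np.array([f(x) for x in X])
    beta0, gamma, alpha = 1.0, 0.1, 0.3
    for t in range(iters):
        a = alpha * 0.97**t                       # cool the random walk
        for i in range(n):
            for j in range(n):
                if F[j] < F[i]:                   # j is brighter than i
                    r2 = np.sum((X[i] - X[j])**2)
                    X[i] += beta0 * np.exp(-gamma * r2) * (X[j] - X[i]) \
                            + a * rng.uniform(-0.5, 0.5, dim)
                    X[i] = np.clip(X[i], lo, hi)
                    F[i] = f(X[i])
    b = int(np.argmin(F))
    return X[b], float(F[b])
```

In the paper's setting, `f` would score a candidate set of BP network parameters by forecasting error; the "modified" FA addresses the weak late-stage convergence that the fixed random-walk term causes in this plain version.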

  15. Throttleable GOX/ABS launch assist hybrid rocket motor for small scale air launch platform

    Science.gov (United States)

    Spurrier, Zachary S.

    Aircraft-based space-launch platforms allow operational flexibility and offer the potential for significant propellant savings for small-to-medium orbital payloads. The NASA Armstrong Flight Research Center's Towed Glider Air-Launch System (TGALS) is a small-scale flight research project investigating the feasibility for a remotely-piloted, towed, glider system to act as a versatile air launch platform for nano-scale satellites. Removing the crew from the launch vehicle means that the system does not have to be human rated, and offers a potential for considerable cost savings. Utah State University is developing a small throttled launch-assist system for the TGALS platform. This "stage zero" design allows the TGALS platform to achieve the required flight path angle for the launch point, a condition that the TGALS cannot achieve without external propulsion. Throttling is required in order to achieve and sustain the proper launch attitude without structurally overloading the airframe. The hybrid rocket system employs gaseous-oxygen and acrylonitrile butadiene styrene (ABS) as propellants. This thesis summarizes the development and testing campaign, and presents results from the clean-sheet design through ground-based static fire testing. Development of the closed-loop throttle control system is presented.

  16. Evolution of the Florida Launch Site Architecture: Embracing Multiple Customers, Enhancing Launch Opportunities

    Science.gov (United States)

    Colloredo, Scott; Gray, James A.

    2011-01-01

    The impending conclusion of the Space Shuttle Program and the Constellation Program cancellation unveiled in the FY2011 President's budget created a large void for human spaceflight capability, and specifically launch activity, from the Florida Launch Site (FLS). This void created an opportunity to re-architect the launch site to be more accommodating to future NASA heavy-lift and commercial space industry needs. The goal is to evolve the heritage capabilities into a more affordable and flexible launch complex. This case study will discuss the FLS architecture evolution, from the trade studies to select primary launch site locations for future customers, to improving infrastructure; promoting environmental remediation and compliance; improving offline processing, manufacturing, and recovery; developing range interface and control services with the US Air Force; and developing modernization efforts for the launch pad, Vehicle Assembly Building, Mobile Launcher, and supporting infrastructure. The architecture studies will steer how to best invest limited modernization funding from initiatives like the 21st Century Space Launch Complex (21st CSLC) and other potential funding.

  17. NASA's Space Launch System: Developing the World's Most Powerful Solid Booster

    Science.gov (United States)

    Priskos, Alex

    2016-01-01

    NASA's Journey to Mars has begun. Indicative of that challenge, this will be a multi-decadal effort requiring the development of technology, operational capability, and experience. The first steps are under way with more than 15 years of continuous human operations aboard the International Space Station (ISS) and development of commercial cargo and crew transportation capabilities. NASA is making progress on the transportation required for deep space exploration - the Orion crew spacecraft and the Space Launch System (SLS) heavy-lift rocket that will launch Orion and large components such as in-space stages, habitat modules, landers, and other hardware necessary for deep-space operations. SLS is a key enabling capability and is designed to evolve with mission requirements. The initial configuration of SLS - Block 1 - will be capable of launching more than 70 metric tons (t) of payload into low Earth orbit, greater mass than any other launch vehicle in existence. By enhancing the propulsion elements and enlarging the payload fairings, future SLS variants will launch 130 t into space, an unprecedented capability that simplifies hardware design and in-space operations, reduces travel times, and enhances the odds of mission success. SLS will be powered by four liquid-fuel RS-25 engines and two solid-propellant five-segment boosters, both based on space shuttle technologies. This paper will focus on development of the booster, which will provide more than 75 percent of total vehicle thrust at liftoff. Each booster is more than 17 stories tall, 3.6 meters (m) in diameter, and weighs 725,000 kilograms (kg). While the SLS booster appears similar to the shuttle booster, it incorporates several changes. The additional propellant segment provides additional booster performance. Parachutes and other hardware associated with recovery operations have been deleted and the booster designated as expendable for affordability reasons. The new motor incorporates new avionics, new propellant

  18. Brand Launching and Sustaining in a developing country : The case study of Honda on Vietnam Motorcycle Market

    OpenAIRE

    Nguyen, Thi Bich Ngoc; Nguyen, Thi Xuan Thu

    2009-01-01

      Abstract. Date: May 29th, 2009. Course: Master Thesis EFO705, International Marketing. Tutor: Daniel Tolstoy. Authors: Thi Bich Ngoc Nguyen, Thi Xuan Thu Nguyen. Title: Brand Launching and Sustaining in a Developing Country: The Case Study of Honda on the Vietnam Motorcycle Market. Purpose: The project investigates brand launching and sustaining in a developing country through a study of how Honda has successfully launched and sustained its brand on the motorcycle market of Vietnam. Problems: Hond...

  19. Pre-Launch Calibration and Performance Study of the Polarcube 3u Temperature Sounding Radiometer Mission

    Science.gov (United States)

    Periasamy, L.; Gasiewski, A. J.; Sanders, B. T.; Rouw, C.; Alvarenga, G.; Gallaher, D. W.

    2016-12-01

    The positive impact of passive microwave observations of tropospheric temperature, water vapor and surface variables on short-term weather forecasts has been clearly demonstrated in recent forecast anomaly growth studies. The development of a fleet of such passive microwave sensors, especially at V-band and higher frequencies, in low earth orbit using 3U and 6U CubeSats could help accomplish the aforementioned objectives at low system cost and risk as well as provide for regularly updated radiometer technology. The University of Colorado's 3U CubeSat, PolarCube, is intended to serve as a demonstrator for such a fleet of passive sounders and imagers. PolarCube supports MiniRad, an eight-channel, double-sideband 118.7503 GHz passive microwave sounder. The mission is focused primarily on sounding in Arctic and Antarctic regions with the following key remote sensing science and engineering objectives: (i) collect coincident tropospheric temperature profiles above sea ice, open polar ocean, and partially open areas to develop joint sea ice concentration and lower tropospheric temperature mapping capabilities in clear and cloudy atmospheric conditions, in conjunction with data from existing passive microwave sensors operating at complementary bands; and (ii) assess the capabilities of small passive microwave satellite sensors for environmental monitoring in support of the future development of inexpensive Earth science missions. Performance data of the payload/spacecraft from pre-launch calibration will be presented. This will include: (i) characterization of the antenna sub-system, comprising an offset 3D-printed feedhorn and spinning parabolic reflector, and the impact of the antenna efficiencies on radiometer performance; (ii) characterization of MiniRad's RF front-end and IF back-end with respect to temperature fluctuations and their impact on atmospheric temperature weighting functions and receiver sensitivity; (iii) results from roof

  20. Multisensor data fusion algorithm development

    Energy Technology Data Exchange (ETDEWEB)

    Yocky, D.A.; Chadwick, M.D.; Goudy, S.P.; Johnson, D.K.

    1995-12-01

    This report presents a two-year LDRD research effort into multisensor data fusion. We approached the problem by addressing the available types of data, preprocessing that data, and developing fusion algorithms using that data. The report reflects these three distinct areas. First, the possible data sets for fusion are identified. Second, automated registration techniques for imagery data are analyzed. Third, two fusion techniques are presented. The first fusion algorithm is based on the two-dimensional discrete wavelet transform. Using test images, the wavelet algorithm is compared against intensity modulation and intensity-hue-saturation image fusion algorithms that are available in commercial software. The wavelet approach outperforms the other two fusion techniques by preserving spectral/spatial information more precisely. The wavelet fusion algorithm was also applied to Landsat Thematic Mapper and SPOT panchromatic imagery data. The second algorithm is based on a linear-regression technique. We analyzed the technique using the same Landsat and SPOT data.
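    The wavelet fusion idea, averaging the approximation coefficients and keeping the stronger detail coefficients, can be sketched with a one-level Haar transform (a minimal stand-in for the report's 2-D discrete wavelet transform pipeline; the max-abs fusion rule shown is a common choice, not necessarily the authors' exact one):

```python
import numpy as np

def haar2(img):
    """One-level 2-D Haar transform (image dimensions must be even)."""
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    return ((a + b + c + d) / 4,   # LL: approximation
            (a - b + c - d) / 4,   # LH
            (a + b - c - d) / 4,   # HL
            (a - b - c + d) / 4)   # HH

def ihaar2(ll, lh, hl, hh):
    """Exact inverse of haar2."""
    img = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    img[0::2, 0::2] = ll + lh + hl + hh
    img[0::2, 1::2] = ll - lh + hl - hh
    img[1::2, 0::2] = ll + lh - hl - hh
    img[1::2, 1::2] = ll - lh - hl + hh
    return img

def fuse_wavelet(img1, img2):
    """Fuse two registered images: average the approximations and keep
    the larger-magnitude detail coefficient at each position."""
    c1, c2 = haar2(img1), haar2(img2)
    ll = (c1[0] + c2[0]) / 2
    details = [np.where(np.abs(a) >= np.abs(b), a, b)
               for a, b in zip(c1[1:], c2[1:])]
    return ihaar2(ll, *details)
```

The detail-preserving rule is what lets such fusion keep spatial structure from a high-resolution panchromatic band while averaging in the spectral content of the other image.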

  1. Developments in the Aerosol Layer Height Retrieval Algorithm for the Copernicus Sentinel-4/UVN Instrument

    Science.gov (United States)

    Nanda, Swadhin; Sanders, Abram; Veefkind, Pepijn

    2016-04-01

    The Sentinel-4 mission is part of the European Commission's Copernicus programme, the goal of which is to provide geo-information to manage environmental assets and to observe, understand, and mitigate the effects of the changing climate. The Sentinel-4/UVN instrument design is motivated by the need to monitor trace gas concentrations and aerosols in the atmosphere from a geostationary orbit. The on-board instrument is a high-resolution UV-VIS-NIR (UVN) spectrometer system that provides hourly radiance measurements over Europe and northern Africa with a spatial sampling of 8 km. The main application area of Sentinel-4/UVN is air quality. One of the data products being developed for Sentinel-4/UVN is the Aerosol Layer Height (ALH). The goal is to determine the height of aerosol plumes with a resolution of better than 0.5-1 km. The ALH product thus targets aerosol layers in the free troposphere, such as desert dust, volcanic ash, and biomass burning plumes. KNMI is assigned the development of the ALH algorithm. Its heritage is the ALH algorithm developed by Sanders and De Haan (ATBD, 2016) for the TROPOMI instrument on board the Sentinel-5 Precursor mission, which is to be launched in June or July 2016 (tentative date). The retrieval algorithm designed so far for the aerosol height product is based on the absorption characteristics of the oxygen A-band (759-770 nm). New aspects for Sentinel-4/UVN include the higher spectral resolution (0.116 nm compared to 0.4 nm for TROPOMI) and hourly observation from the geostationary orbit. The algorithm uses optimal estimation to obtain a spectral fit of the reflectance across the absorption band, assuming a single uniform layer of fixed width to represent the aerosol vertical distribution. The state vector includes, among other elements, the height of this layer and its aerosol optical
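    The optimal-estimation fit at the core of such a retrieval can be sketched as a single Gauss-Newton update (a generic Rodgers-style sketch, not the operational ALH code; the symbols and the toy linear forward model below are assumptions):

```python
import numpy as np

def oe_update(x, y, forward, jacobian, xa, Sa_inv, Se_inv):
    """One Gauss-Newton step of a MAP (optimal estimation) fit:
    x      current state (e.g. layer height, optical thickness)
    y      measured reflectance spectrum
    xa     a priori state, with inverse covariance Sa_inv
    Se_inv inverse measurement-error covariance."""
    K = jacobian(x)                                   # d forward / d x
    A = K.T @ Se_inv @ K + Sa_inv
    g = K.T @ Se_inv @ (y - forward(x)) - Sa_inv @ (x - xa)
    return x + np.linalg.solve(A, g)
```

For a linear forward model this single step lands exactly on the analytic MAP solution; for the nonlinear O2 A-band model the step is iterated to convergence.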

  2. Optimization of a truck-drone in tandem delivery network using k-means and genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Ferrandez, S. M.; Harbison, T.; Weber, T.; Sturges, R.; Rich, R.

    2016-07-01

    The purpose of this paper is to investigate the effectiveness of implementing unmanned aerial delivery vehicles in delivery networks. We investigate the reduction in overall delivery time, energy, and costs for a truck-drone network by comparing the in-tandem system with a stand-alone delivery effort. The objectives are (1) to investigate the time, energy, and costs associated with a truck-drone delivery network compared to a standalone truck or drone, (2) to propose an optimization algorithm that determines the optimal number of launch sites and locations given delivery requirements and drones per truck, and (3) to develop closed-form mathematical estimations for the optimal number of launch locations, the optimal total time, and the associated cost for the system. The algorithm computes the minimal delivery time by using K-means clustering to find launch locations and a genetic algorithm to solve the truck route between those locations as a traveling salesman problem (TSP); the optimal solution is the minimum of the resulting parabolic convex cost function. Results show improvements for in-tandem delivery efforts as opposed to standalone systems; further, multiple drones per truck are more optimal and contribute to savings in both energy and time. We sampled various initialization variables to derive closed-form mathematical solutions for the problem. Ultimately, this provides the necessary analysis of an integrated truck-drone delivery system that a company could implement to maximize deliveries while minimizing time and energy; the closed-form solutions serve as close estimators of final cost and time. (Author)
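    The two building blocks, K-means to place launch locations and a route optimizer over those locations, can be sketched as follows (for brevity, exhaustive search stands in for the paper's genetic-algorithm TSP solver, and all names are illustrative):

```python
import numpy as np
from itertools import permutations

def kmeans(points, k, iters=50):
    """Lloyd's algorithm with deterministic farthest-point initialization."""
    centers = [points[0]]
    for _ in range(k - 1):
        d = np.min([((points - c)**2).sum(1) for c in centers], axis=0)
        centers.append(points[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None])**2).sum(-1), axis=1)
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

def best_tour(sites, depot):
    """Shortest closed route depot -> every launch site -> depot.
    Exhaustive over permutations; a GA replaces this for larger instances."""
    def length(order):
        stops = [depot] + [sites[i] for i in order] + [depot]
        return sum(np.linalg.norm(np.subtract(a, b))
                   for a, b in zip(stops, stops[1:]))
    return min(permutations(range(len(sites))), key=length)
```

Clustering delivery points first keeps the TSP instance small (one stop per launch site), which is what makes the tandem truck-drone formulation tractable.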

  3. Behaviors study of image registration algorithms in image guided radiation therapy

    International Nuclear Information System (INIS)

    Zou Lian; Hou Qing

    2008-01-01

    Objective: To study the behaviors of image registration algorithms and analyze the elements that influence registration performance. Methods: Pre-known corresponding coordinates were assigned for the reference image and the moving image, and the influence of region-of-interest (ROI) selection, transformation-function initial parameters, and coupled parameter spaces on registration results was studied with an in-house software platform. Results: ROI selection had a manifest influence on registration performance; an improperly chosen ROI resulted in a bad registration. Selecting transformation-function initial parameters based on pre-known information could improve the accuracy of image registration. Coupled parameter spaces enhanced the dependence of the registration algorithm on ROI selection. Conclusions: For clinical IGRT it is necessary to obtain a ROI selection strategy (depending on the specific commercial software) correlated to tumor sites. Three suggestions for image registration technique developers are: automatic selection of the initial parameters of the transformation function based on pre-known information, development of specific registration algorithms for specific image features, and assembly of real-time registration algorithms according to tumor sites selected by the software user. (authors)
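    As a concrete example of the kind of registration step whose behavior such a study probes, integer-shift registration by FFT phase correlation can be sketched as follows (a generic method chosen for illustration; the record does not specify which algorithms the platform implements):

```python
import numpy as np

def phase_correlation_shift(ref, mov):
    """Integer (dy, dx) such that mov ~= np.roll(ref, (dy, dx), axis=(0, 1))."""
    R = np.fft.fft2(mov) * np.conj(np.fft.fft2(ref))
    R /= np.abs(R) + 1e-12                     # keep only the phase difference
    corr = np.fft.ifft2(R).real                # delta peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape                           # map large shifts to negatives
    return (int(dy) - h if dy > h // 2 else int(dy),
            int(dx) - w if dx > w // 2 else int(dx))
```

Restricting `ref` and `mov` to a chosen ROI before calling such a routine is exactly where the ROI-sensitivity the authors report enters.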

  4. Evaluation of a metal artifact reduction algorithm applied to post-interventional flat detector CT in comparison to pre-treatment CT in patients with acute subarachnoid haemorrhage

    International Nuclear Information System (INIS)

    Mennecke, Angelika; Svergun, Stanislav; Doerfler, Arnd; Struffert, Tobias; Scholz, Bernhard; Royalty, Kevin

    2017-01-01

    Metal artefacts can impair accurate diagnosis of haemorrhage using flat detector CT (FD-CT), especially after aneurysm coiling. Within this work we evaluate a prototype metal artefact reduction algorithm by comparison of the artefact-reduced and the non-artefact-reduced FD-CT images to pre-treatment FD-CT and multi-slice CT images. Twenty-five patients with acute aneurysmal subarachnoid haemorrhage (SAH) were selected retrospectively. FD-CT and multi-slice CT before endovascular treatment as well as FD-CT data sets after treatment were available for all patients. The algorithm was applied to post-treatment FD-CT. The effect of the algorithm was evaluated utilizing the pre-post concordance of a modified Fisher score, a subjective image quality assessment, the range of the Hounsfield units within three ROIs, and the pre-post slice-wise Pearson correlation. The pre-post concordance of the modified Fisher score, the subjective image quality, and the pre-post correlation of the ranges of the Hounsfield units were significantly higher for artefact-reduced than for non-artefact-reduced images. Within the metal-affected slices, the pre-post slice-wise Pearson correlation coefficient was higher for artefact-reduced than for non-artefact-reduced images. The overall diagnostic quality of the artefact-reduced images was improved and reached the level of the pre-interventional FD-CT images. The metal-unaffected parts of the image were not modified. (orig.)
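    The pre-post slice-wise Pearson correlation used here as an evaluation metric can be sketched as follows (an illustrative implementation; names and the volume layout are assumptions):

```python
import numpy as np

def slicewise_pearson(vol_pre, vol_post):
    """Pearson r between corresponding axial slices of two
    co-registered volumes stored as (slice, row, col) arrays."""
    rs = []
    for a, b in zip(vol_pre, vol_post):
        a = a.ravel() - a.mean()
        b = b.ravel() - b.mean()
        rs.append(float(a @ b / np.sqrt((a @ a) * (b @ b))))
    return np.array(rs)
```

Comparing these per-slice correlations with and without artefact reduction, restricted to the metal-affected slices, is the quantitative comparison the abstract describes.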

  5. Evaluation of a metal artifact reduction algorithm applied to post-interventional flat detector CT in comparison to pre-treatment CT in patients with acute subarachnoid haemorrhage

    Energy Technology Data Exchange (ETDEWEB)

    Mennecke, Angelika; Svergun, Stanislav; Doerfler, Arnd; Struffert, Tobias [University of Erlangen-Nuremberg, Department of Neuroradiology, Erlangen (Germany); Scholz, Bernhard [Siemens Healthcare GmbH, Forchheim (Germany); Royalty, Kevin [Siemens Medical Solutions, USA, Inc., Hoffman Estates, IL (United States)

    2017-01-15

    Metal artefacts can impair accurate diagnosis of haemorrhage using flat detector CT (FD-CT), especially after aneurysm coiling. Within this work we evaluate a prototype metal artefact reduction algorithm by comparison of the artefact-reduced and the non-artefact-reduced FD-CT images to pre-treatment FD-CT and multi-slice CT images. Twenty-five patients with acute aneurysmal subarachnoid haemorrhage (SAH) were selected retrospectively. FD-CT and multi-slice CT before endovascular treatment as well as FD-CT data sets after treatment were available for all patients. The algorithm was applied to post-treatment FD-CT. The effect of the algorithm was evaluated utilizing the pre-post concordance of a modified Fisher score, a subjective image quality assessment, the range of the Hounsfield units within three ROIs, and the pre-post slice-wise Pearson correlation. The pre-post concordance of the modified Fisher score, the subjective image quality, and the pre-post correlation of the ranges of the Hounsfield units were significantly higher for artefact-reduced than for non-artefact-reduced images. Within the metal-affected slices, the pre-post slice-wise Pearson correlation coefficient was higher for artefact-reduced than for non-artefact-reduced images. The overall diagnostic quality of the artefact-reduced images was improved and reached the level of the pre-interventional FD-CT images. The metal-unaffected parts of the image were not modified. (orig.)

  6. NPP/NPOESS Tools for Rapid Algorithm Updates

    Science.gov (United States)

    Route, G.; Grant, K. D.; Hughes, R.

    2010-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system: the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS Preparatory Project (NPP) and NPOESS satellites will carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes both NPP and NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. The Northrop Grumman Aerospace Systems (NGAS) Algorithms and Data Products (A&DP) organization is responsible for the algorithms that produce the Environmental Data Records (EDRs), including their quality aspects. As the Calibration and Validation (Cal/Val) activities move forward following both the NPP launch and subsequent NPOESS launches, rapid algorithm updates may be required. Raytheon and Northrop Grumman have developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity.

  7. DEVELOPMENT OF 2D HUMAN BODY MODELING USING THINNING ALGORITHM

    Directory of Open Access Journals (Sweden)

    K. Srinivasan

    2010-11-01

    Monitoring the behavior and activities of people in video surveillance has gained many applications in computer vision. This paper proposes a new approach to modeling the human body in 2D view for activity analysis using a thinning algorithm. The first step of this work is background subtraction, which is achieved by a frame-differencing algorithm. A thinning algorithm is then used to find the skeleton of the human body. After thinning, thirteen feature points, such as terminating points, intersecting points, and shoulder, elbow, and knee points, are extracted. This work represents the body model in three different ways: a stick-figure model, a patch model, and a rectangle body model. The activities of humans are analyzed with the help of the 2D model for pre-defined poses from monocular video data. Finally, the time consumption and efficiency of the proposed algorithm are evaluated.
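
    The frame-differencing background subtraction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name and threshold value are assumptions:

```python
import numpy as np

def frame_difference_mask(frame, background, threshold=25):
    """Foreground mask by absolute frame differencing (grayscale uint8 frames)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

# Toy example: a 5x5 static background and a frame with one bright moving pixel.
background = np.zeros((5, 5), dtype=np.uint8)
frame = background.copy()
frame[2, 2] = 200  # moving-object pixel

mask = frame_difference_mask(frame, background)
print(mask.sum())  # 1 (one foreground pixel detected)
```

    The resulting binary mask would then be passed to the thinning (skeletonization) step to extract the body skeleton.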

  8. Trends in the commercial launch services industry

    Science.gov (United States)

    Haase, Ethan E.

    2001-02-01

    The market for space launch services has undergone significant development in the last two decades and is poised to change even further. With the introduction of new players in the market, and the development of new vehicles by existing providers, competition has increased. At the same time, customer payloads have been changing as satellites grow in size and capability. Amidst these changes, launch delays have become a concern in the industry, and launch service providers have developed different solutions to avoid delays and satisfy customer needs. This analysis discusses these trends in the launch services market and their drivers. Focus is given to the market for medium, intermediate, and heavy launch services, which generally includes launches of GEO communication satellites, large government payloads, and NGSO constellations.

  9. Developing Pre-service Teachers' Technology Integration ...

    African Journals Online (AJOL)

    Developing Pre-service Teachers' Technology Integration Competencies in Science and Mathematics Teaching: Experiences from Tanzania and Uganda. ... This study investigated the ICT integration practices in pre-service teacher education in the School of Education at Makerere University (College of Education and ...

  10. Space Launch System Ascent Flight Control Design

    Science.gov (United States)

    Orr, Jeb S.; Wall, John H.; VanZwieten, Tannen S.; Hall, Charles E.

    2014-01-01

    A robust and flexible autopilot architecture for NASA's Space Launch System (SLS) family of launch vehicles is presented. The SLS configurations represent a potentially significant increase in complexity and performance capability when compared with other manned launch vehicles. It was recognized early in the program that a new, generalized autopilot design should be formulated to fulfill the needs of this new space launch architecture. The present design concept is intended to leverage existing NASA and industry launch vehicle design experience and maintain the extensibility and modularity necessary to accommodate multiple vehicle configurations while relying on proven and flight-tested control design principles for large boost vehicles. The SLS flight control architecture combines a digital three-axis autopilot with traditional bending filters to support robust active or passive stabilization of the vehicle's bending and sloshing dynamics using optimally blended measurements from multiple rate gyros on the vehicle structure. The algorithm also relies on a pseudo-optimal control allocation scheme to maximize the performance capability of multiple vectored engines while accommodating throttling and engine failure contingencies in real time with negligible impact to stability characteristics. The architecture supports active in-flight disturbance compensation through the use of nonlinear observers driven by acceleration measurements. Envelope expansion and robustness enhancement are obtained through the use of a multiplicative forward gain modulation law based upon a simple model reference adaptive control scheme.
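
    The control allocation idea mentioned above can be illustrated with a minimum-norm (pseudo-inverse) scheme. The effectiveness matrix, actuator layout, and function names below are hypothetical and purely illustrative, not the SLS implementation:

```python
import numpy as np

# Hypothetical effectiveness matrix B: maps 4 actuator deflections to
# 3 body torques (pitch, yaw, roll). The numbers are illustrative only.
B = np.array([
    [1.0, 1.0, 0.0, 0.0],    # pitch torque per unit deflection
    [0.0, 0.0, 1.0, 1.0],    # yaw
    [0.5, -0.5, 0.5, -0.5],  # roll from differential gimballing
])

def allocate(torque_cmd, B):
    """Minimum-norm (pseudo-inverse) allocation of actuator deflections."""
    return np.linalg.pinv(B) @ torque_cmd

def allocate_with_failure(torque_cmd, B, failed):
    """Re-allocate after zeroing the columns of failed actuators."""
    B_f = B.copy()
    B_f[:, failed] = 0.0
    return np.linalg.pinv(B_f) @ torque_cmd

cmd = np.array([1.0, 0.2, 0.0])
u = allocate(cmd, B)
print(np.allclose(B @ u, cmd))  # True: achieved torque matches the command

u_f = allocate_with_failure(cmd, B, failed=[0])
print(np.allclose(B @ u_f, cmd))  # True: still achievable with actuator 0 out
```

    Because the pseudo-inverse yields the minimum-norm solution, zeroing a failed actuator's column automatically redistributes the command across the remaining actuators, which is the kind of real-time failure accommodation the abstract describes.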

  11. Construction environment education development activity for children pre-school

    OpenAIRE

    MA. TRAN THI THUY NGA; MA. PHAM THI YEN

    2015-01-01

    Education motor development contribute to the comprehensive development of pre-school children. Building educational environment for young athletes develop in pre-school is one of many issues of concern in the current stage of pre-school education in Vietnam.

  12. Optimization of a truck-drone in tandem delivery network using k-means and genetic algorithm

    Directory of Open Access Journals (Sweden)

    Sergio Mourelo Ferrandez

    2016-04-01

    Purpose: The purpose of this paper is to investigate the effectiveness of implementing unmanned aerial delivery vehicles in delivery networks. We investigate the notion of reduced overall delivery time, energy, and costs for a truck-drone network by comparing the in-tandem system with a stand-alone delivery effort. The objectives are (1) to investigate the time, energy, and costs associated with a truck-drone delivery network compared to a standalone truck or drone, (2) to propose an optimization algorithm that determines the optimal number of launch sites and their locations given delivery requirements and drones per truck, and (3) to develop mathematical formulations for closed-form estimates of the optimal number of launch locations, the optimal total time, and the associated cost of the system. Design/methodology/approach: The algorithm computes the minimal delivery time using K-means clustering to find launch locations and a genetic algorithm to solve the truck route between those launch locations as a traveling salesman problem (TSP). The optimal solution is determined by finding the minimum of the parabolic convex cost function. Findings: Results show improvements with in-tandem delivery efforts as opposed to standalone systems. Further, multiple drones per truck are more optimal and contribute to savings in both energy and time. For this, we sampled various initialization variables to derive closed-form mathematical solutions for the problem. Originality/value: Ultimately, this provides the necessary analysis of an integrated truck-drone delivery system which could be implemented by a company in order to maximize deliveries while minimizing time and energy. Closed-form mathematical solutions can be used as
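
    The two-stage optimization described above (K-means to choose launch locations, then a genetic algorithm for the truck route) can be sketched as follows. The cluster count, GA parameters, and mutation-only evolution are simplifying assumptions for illustration, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(points, k, iters=50):
    """Plain Lloyd's algorithm: returns k launch-site locations (centroids)."""
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return centers

def route_length(order, sites):
    """Total closed-tour length visiting the sites in the given order."""
    path = sites[order]
    return (np.linalg.norm(np.diff(path, axis=0), axis=1).sum()
            + np.linalg.norm(path[-1] - path[0]))

def ga_tsp(sites, pop=60, gens=200):
    """Tiny permutation GA for the truck-route TSP: keeps the best half each
    generation and mutates survivors by swapping two stops (crossover omitted
    for brevity; mutation-only evolution suffices for small site counts)."""
    n = len(sites)
    population = [rng.permutation(n) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda o: route_length(o, sites))
        survivors = population[: pop // 2]
        children = []
        for parent in survivors:
            child = parent.copy()
            i, j = rng.choice(n, 2, replace=False)
            child[i], child[j] = child[j], child[i]
            children.append(child)
        population = survivors + children
    return min(population, key=lambda o: route_length(o, sites))

# Synthetic delivery demand: 40 random delivery points, 5 launch sites.
deliveries = rng.uniform(0, 10, size=(40, 2))
sites = kmeans(deliveries, k=5)
best = ga_tsp(sites)
print(route_length(best, sites))
```

    In the paper's setting, drones would be dispatched to nearby deliveries from each launch site while the truck drives the GA-optimized route between sites.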

  13. In-Flight Suppression of a De-Stabilized F/A-18 Structural Mode Using the Space Launch System Adaptive Augmenting Control System

    Science.gov (United States)

    Wall, John; VanZwieten, Tannen; Gilligan, Eric; Miller, Chris; Hanson, Curtis; Orr, Jeb

    2015-01-01

    Adaptive Augmenting Control (AAC) has been developed for NASA's Space Launch System (SLS) family of launch vehicles and implemented as a baseline part of its flight control system (FCS). To raise the technical readiness level of the SLS AAC algorithm, the Launch Vehicle Adaptive Control (LVAC) flight test program was conducted in which the SLS FCS prototype software was employed to control the pitch axis of Dryden's specially outfitted F/A-18, the Full Scale Advanced Systems Test Bed (FAST). This presentation focuses on a set of special test cases which demonstrate the successful mitigation of the unstable coupling of an F/A-18 airframe structural mode with the SLS FCS.

  14. DEVELOPMENT OF A NEW ALGORITHM FOR KEY AND S-BOX GENERATION IN BLOWFISH ALGORITHM

    Directory of Open Access Journals (Sweden)

    TAYSEER S. ATIA

    2014-08-01

    The Blowfish algorithm is a block cipher: a strong, simple algorithm used to encrypt data in blocks of 64 bits. The key and S-box generation process in this algorithm requires time and memory, which makes the algorithm inconvenient for smart cards or applications that must change the secret key frequently. In this paper a new key and S-box generation process was developed based on the Self-Synchronization Stream Cipher (SSS) algorithm, whose key generation process was modified for use with the Blowfish algorithm. Test results show that the generation process requires relatively little time and reasonably low memory, which enhances the algorithm and opens the possibility of wider usage.
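
    As a rough illustration of keystream-driven, key-dependent S-box generation, the toy sketch below uses iterated SHA-256 in place of the SSS cipher described in the paper; it shows only the general idea of filling a bijective 256-entry S-box from a stream of key-dependent bytes:

```python
import hashlib

def keystream(key: bytes):
    """Toy keystream: iterated SHA-256 of the key. Illustrative only; the
    paper uses the SSS stream cipher, not this construction."""
    state = hashlib.sha256(key).digest()
    while True:
        for b in state:
            yield b
        state = hashlib.sha256(state).digest()

def generate_sbox(key: bytes):
    """Fill a 256-entry S-box via a Fisher-Yates shuffle driven by the
    keystream, so the box is key-dependent but always a permutation."""
    ks = keystream(key)
    sbox = list(range(256))
    for i in range(255, 0, -1):
        j = next(ks) % (i + 1)
        sbox[i], sbox[j] = sbox[j], sbox[i]
    return sbox

sbox = generate_sbox(b"secret key")
print(sorted(sbox) == list(range(256)))  # True: the S-box is bijective
```

    The appeal of such schemes for smart cards is that the box is derived on the fly from a short key rather than stored, trading a small amount of computation for memory.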

  15. Space Launch Systems Block 1B Preliminary Navigation System Design

    Science.gov (United States)

    Oliver, T. Emerson; Park, Thomas; Anzalone, Evan; Smith, Austin; Strickland, Dennis; Patrick, Sean

    2018-01-01

    NASA is currently building the Space Launch Systems (SLS) Block 1 launch vehicle for the Exploration Mission 1 (EM-1) test flight. In parallel, NASA is also designing the Block 1B launch vehicle. The Block 1B vehicle is an evolution of the Block 1 vehicle and extends the capability of the NASA launch vehicle. This evolution replaces the Interim Cryogenic Propulsive Stage (ICPS) with the Exploration Upper Stage (EUS). As the vehicle evolves to provide greater lift capability, increased robustness for manned missions, and the capability to execute more demanding missions, so must the SLS Integrated Navigation System evolve to support those missions. This paper describes the preliminary navigation system design for the SLS Block 1B vehicle. The evolution of the navigation hardware and algorithms from an inertial-only navigation system for Block 1 ascent flight to a tightly coupled GPS-aided inertial navigation system for Block 1B is described. The Block 1 GN&C system has been designed to meet a LEO insertion target with a specified accuracy. The Block 1B vehicle navigation system is designed to support the Block 1 LEO target accuracy as well as trans-lunar or trans-planetary injection accuracy. Additionally, the Block 1B vehicle is designed to support human exploration and thus to minimize the probability of Loss of Crew (LOC) through high-quality inertial instruments and robust algorithm design, including Fault Detection, Isolation, and Recovery (FDIR) logic.

  16. To develop a universal gamut mapping algorithm

    International Nuclear Information System (INIS)

    Morovic, J.

    1998-10-01

    When a colour image from one colour reproduction medium (e.g. nature, a monitor) needs to be reproduced on another (e.g. on a monitor or in print) and these media have different colour ranges (gamuts), it is necessary to have a method for mapping between them. If such a gamut mapping algorithm can be used under a wide range of conditions, it can also be incorporated in an automated colour reproduction system and considered to be in some sense universal. In terms of preliminary work, a colour reproduction system was implemented, for which a new printer characterisation model (including grey-scale correction) was developed. Methods were also developed for calculating gamut boundary descriptors and for calculating gamut boundaries along given lines from them. The gamut mapping solution proposed in this thesis is a gamut compression algorithm developed with the aim of being accurate and universally applicable. It was arrived at by way of an evolutionary gamut mapping development strategy for the purposes of which five test images were reproduced between a CRT and printed media obtained using an inkjet printer. Initially, a number of previously published algorithms were chosen and psychophysically evaluated whereby an important characteristic of this evaluation was that it also considered the performance of algorithms for individual colour regions within the test images used. New algorithms were then developed on their basis, subsequently evaluated and this process was repeated once more. In this series of experiments the new GCUSP algorithm, which consists of a chroma-dependent lightness compression followed by a compression towards the lightness of the reproduction cusp on the lightness axis, gave the most accurate and stable performance overall. The results of these experiments were also useful for improving the understanding of some gamut mapping factors - in particular gamut difference. 
In addition to looking at accuracy, the pleasantness of reproductions obtained
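
    The chroma-dependent lightness compression toward the cusp that characterizes GCUSP can be illustrated with a minimal sketch; the linear weighting below is an assumption chosen for clarity, not the published algorithm:

```python
def compress_lightness(L, C, L_cusp, C_max):
    """Chroma-dependent compression of lightness toward the reproduction
    cusp lightness: neutral colours (C = 0) keep their lightness, while
    high-chroma colours are pulled toward L_cusp. Illustrative only."""
    w = min(C / C_max, 1.0)          # 0 for neutrals, 1 at maximum chroma
    return (1.0 - w) * L + w * L_cusp

# Neutral colour unchanged; a fully saturated colour maps to the cusp lightness.
print(compress_lightness(80.0, 0.0, L_cusp=50.0, C_max=100.0))    # 80.0
print(compress_lightness(80.0, 100.0, L_cusp=50.0, C_max=100.0))  # 50.0
```

    The intuition is that a printer gamut's most chromatic colours lie near the cusp lightness, so pulling saturated colours toward it preserves chroma at the cost of some lightness fidelity, while leaving the grey axis untouched.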

  17. Pre-emptive resource-constrained multimode project scheduling using genetic algorithm: A dynamic forward approach

    Directory of Open Access Journals (Sweden)

    Aidin Delgoshaei

    2016-09-01

    Purpose: Resource over-allocation is a big concern for project engineers when scheduling project activities. In practice, over-allocation is frequently discovered only after a project has been scheduled, rendering the schedule useless, and modifying an over-allocated schedule is very complicated and demands a lot of effort and time. In this paper, a new and fast tracking method is proposed for scheduling large-scale projects, which can help project engineers schedule rapidly and with more confidence. Design/methodology/approach: This article proposes a forward approach for maximizing net present value (NPV) in the multi-mode resource-constrained project scheduling problem with discounted positive cash flows (MRCPSP-DCF). The progress payment method is used and all resources are considered pre-emptible. The proposed approach maximizes NPV using unscheduled resources through the resource calendar in forward mode, and a genetic algorithm is applied to solve the resulting problem. Findings: The findings show that the proposed method is an effective way to maximize NPV in MRCPSP-DCF problems when activity splitting is allowed. The proposed algorithm is very fast and can schedule experimental cases with 1000 variables and 100 resources in a few seconds. The results are compared with a branch-and-bound method and a simulated annealing algorithm, and the proposed genetic algorithm is found to provide results of better quality. The algorithm is then applied to scheduling a hospital project in practice. Originality/value: The method can be used alone or as a macro in Microsoft Office Project® software to schedule MRCPSP-DCF problems or to modify resource over-allocated activities after scheduling a project. This can help project engineers schedule project activities rapidly and with more accuracy in practice.
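
    The NPV objective being maximized above can be illustrated with discounted progress payments; the discount rate and cash-flow values below are arbitrary examples, not data from the paper:

```python
def npv(cash_flows, rate):
    """Discounted net present value of (period, amount) progress payments."""
    return sum(amount / (1.0 + rate) ** t for t, amount in cash_flows)

# Two schedules for the same payments: completing milestones earlier
# moves the cash flows forward in time and therefore raises NPV, which
# is why the forward scheduling approach favours early completion.
early = [(1, 100.0), (2, 100.0)]
late = [(3, 100.0), (4, 100.0)]
print(npv(early, 0.1) > npv(late, 0.1))  # True
```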

  18. DIDACTIC TOOLS FOR THE STUDENTS’ ALGORITHMIC THINKING DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    T. P. Pushkaryeva

    2017-01-01

    Introduction. Modern engineers must possess a high potential of cognitive abilities, in particular algorithmic thinking (AT). In this regard, the training of future experts (university graduates of technical specialities) has to provide knowledge of the principles and ways of designing various algorithms, and the ability to analyze them and to choose the most optimal variants for implementation in engineering activity. For full formation of AT skills it is necessary to consider all channels of psychological perception and cogitative processing of educational information: visual, auditory, and kinesthetic. The aim of the present research is the theoretical basis of the design, development, and use of resources for the successful development of AT during educational training in programming. Methodology and research methods. The methodology of the research involves the basic theses of cognitive psychology and the information approach to organizing the educational process. The research used the following methods: analysis; modeling of cognitive processes; design of training tools that take into account mentality and the peculiarities of information perception; and diagnostics of the efficiency of the didactic tools. Results. A three-level model for training future engineers in programming, aimed at the development of AT skills, was developed. The model includes three components: aesthetic, simulative, and conceptual. Stages of mastering a new discipline are allocated. It is proved that for the development of AT skills when training in programming it is necessary to use kinesthetic tools at the stage of mental algorithmic map formation, and algorithmic animation and algorithmic mental maps at the stage of algorithmic model and conceptual image formation. Kinesthetic tools for the development of students’ AT skills when training in algorithmization and programming are designed.
Using of kinesthetic training simulators in educational process provide the effective development of algorithmic style of

  19. An ATR architecture for algorithm development and testing

    Science.gov (United States)

    Breivik, Gøril M.; Løkken, Kristin H.; Brattli, Alvin; Palm, Hans C.; Haavardsholm, Trym

    2013-05-01

    A research platform with four cameras in the infrared and visible spectral domains is under development at the Norwegian Defence Research Establishment (FFI). The platform will be mounted on a high-speed jet aircraft and will primarily be used for image acquisition and for development and test of automatic target recognition (ATR) algorithms. The sensors on board produce large amounts of data, the algorithms can be computationally intensive and the data processing is complex. This puts great demands on the system architecture; it has to run in real-time and at the same time be suitable for algorithm development. In this paper we present an architecture for ATR systems that is designed to be flexible, generic and efficient. The architecture is module-based so that certain parts, e.g. specific ATR algorithms, can be exchanged without affecting the rest of the system. The modules are generic and can be used in various ATR system configurations. A software framework in C++ that handles large data flows in non-linear pipelines is used for implementation. The framework exploits several levels of parallelism and lets the hardware processing capacity be fully utilised. The ATR system is under development and has reached a first level that can be used for segmentation algorithm development and testing. The implemented system consists of several modules, and although their content is still limited, the segmentation module includes two different segmentation algorithms that can be easily exchanged. We demonstrate the system by applying the two segmentation algorithms to infrared images from sea trial recordings.

  20. High-Glass-Transition-Temperature Polyimides Developed for Reusable Launch Vehicle Applications

    Science.gov (United States)

    Chuang, Kathy; Ardent, Cory P.

    2002-01-01

    Polyimide composites have been traditionally used for high-temperature applications in aircraft engines at temperatures up to 550 F (288 C) for thousands of hours. However, as NASA shifts its focus toward the development of advanced reusable launch vehicles, there is an urgent need for lightweight polymer composites that can sustain 600 to 800 F (315 to 427 C) for short excursions (hundreds of hours). To meet critical vehicle weight targets, it is essential that one use lightweight, high-temperature polymer matrix composites in propulsion components such as turbopump housings, ducts, engine supports, and struts. Composite materials in reusable launch vehicle components will heat quickly during launch and reentry. Conventional composites, consisting of layers of fabric or fiber-reinforced lamina, would either blister or encounter catastrophic delamination under high heating rates above 300 C. This blistering and delamination are the result of a sudden volume expansion within the composite due to the release of absorbed moisture and gases generated by the degradation of the polymer matrix. Researchers at the NASA Glenn Research Center and the Boeing Company (Long Beach, CA) recently demonstrated a successful approach for preventing this delamination--the use of three-dimensional stitched composites fabricated by resin infusion.

  1. Development and Optimization of a Tridyne Pressurization System for Pressure Fed Launch Vehicles

    National Research Council Canada - National Science Library

    Chakroborty, Shyama; Wollen, Mark; Malany, Lee

    2006-01-01

    Over the recent years, Microcosm has been pursuing the development of a Tridyne-based pressurization system and its implementation in the Scorpius family of launch vehicles to obtain substantial gain in payload to orbit...

  2. International Launch Vehicle Selection for Interplanetary Travel

    Science.gov (United States)

    Ferrone, Kristine; Nguyen, Lori T.

    2010-01-01

    In developing a mission strategy for interplanetary travel, the first step is to consider launch capabilities which provide the basis for fundamental parameters of the mission. This investigation focuses on the numerous launch vehicles of various characteristics available and in development internationally with respect to upmass, launch site, payload shroud size, fuel type, cost, and launch frequency. This presentation will describe launch vehicles available and in development worldwide, then carefully detail a selection process for choosing appropriate vehicles for interplanetary missions focusing on international collaboration, risk management, and minimization of cost. The vehicles that fit the established criteria will be discussed in detail with emphasis on the specifications and limitations related to interplanetary travel. The final menu of options will include recommendations for overall mission design and strategy.

  3. Second Generation Reusable Launch Vehicle Development and Global Competitiveness of US Space Transportation Industry: Critical Success Factors Assessment

    Science.gov (United States)

    Enyinda, Chris I.

    2002-01-01

    In response to the unrelenting call in both public and private sector fora to reduce the high cost associated with space transportation, many innovative partially or fully RLV (Reusable Launch Vehicle) designs (X-34-37) were initiated. This call is directed at all levels of space missions including scientific, military, and commercial, and at all aspects of the missions such as nonrecurring development, manufacture, launch, and operations. According to Wertz, for over thirty years the cost of space access has remained exceedingly high. The consensus in the popular press is that to decrease the current astronomical cost of access to space, safer, more reliable, and more economically viable second generation RLVs (SGRLV) must be developed. Countries such as Brazil, India, Japan, and Israel are now gearing up to enter the global launch market with their own commercial space launch vehicles. NASA and the US space launch industry cannot afford to lag behind. Developing SGRLVs will immeasurably improve the US's space transportation capabilities by helping the US to regain the global commercial space markets while supporting the transportation capabilities of NASA's space missions. Developing the SGRLVs will provide affordable commercial space transportation that will assure the competitiveness of the US commercial space transportation industry in the 21st century. Commercial space launch systems are having difficulty obtaining financing because of the high cost and risk involved. Access to key financial markets is necessary for commercial space ventures. However, public sector programs in the form of tax incentives and credits, as well as loan guarantees, are not yet available. The purpose of this paper is to stimulate discussion and assess the critical success factors germane to RLV development and US global competitiveness.

  4. An Enhanced TIMESAT Algorithm for Estimating Vegetation Phenology Metrics from MODIS Data

    Science.gov (United States)

    Tan, Bin; Morisette, Jeffrey T.; Wolfe, Robert E.; Gao, Feng; Ederer, Gregory A.; Nightingale, Joanne; Pedelty, Jeffrey A.

    2012-01-01

    An enhanced TIMESAT algorithm was developed for retrieving vegetation phenology metrics from 250 m and 500 m spatial resolution Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation indexes (VI) over North America. MODIS VI data were pre-processed using snow-cover and land surface temperature data, and temporally smoothed with the enhanced TIMESAT algorithm. An objective third derivative test was applied to define key phenology dates and retrieve a set of phenology metrics. This algorithm has been applied to two MODIS VIs: Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index (EVI). In this paper, we describe the algorithm and use EVI as an example to compare three sets of TIMESAT algorithm/MODIS VI combinations: a) original TIMESAT algorithm with original MODIS VI, b) original TIMESAT algorithm with pre-processed MODIS VI, and c) enhanced TIMESAT and pre-processed MODIS VI. All retrievals were compared with ground phenology observations, some made available through the National Phenology Network. Our results show that for MODIS data in middle to high latitude regions, snow and land surface temperature information is critical in retrieving phenology metrics from satellite observations. The results also show that the enhanced TIMESAT algorithm can better accommodate growing season start and end dates that vary significantly from year to year. The TIMESAT algorithm improvements contribute to more spatial coverage and more accurate retrievals of the phenology metrics. Among three sets of TIMESAT/MODIS VI combinations, the start of the growing season metric predicted by the enhanced TIMESAT algorithm using pre-processed MODIS VIs has the best associations with ground observed vegetation greenup dates.
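
    The objective third-derivative test described above can be sketched on a synthetic VI season. The smoothing is assumed to have been done already, and the logistic green-up/senescence curve below is synthetic, not MODIS data or the operational TIMESAT code:

```python
import numpy as np

def third_derivative_dates(vi, dates):
    """Candidate phenology transition dates: zero crossings of the third
    derivative of an (already smoothed) VI time series."""
    d3 = np.gradient(np.gradient(np.gradient(vi)))
    crossings = np.where(np.diff(np.sign(d3)) != 0)[0]
    return dates[crossings]

# Synthetic growing season on 8-day composites: logistic green-up
# around day 120 and senescence around day 280.
t = np.arange(0, 365, 8)
vi = 1 / (1 + np.exp(-(t - 120) / 15)) - 1 / (1 + np.exp(-(t - 280) / 15))

candidates = third_derivative_dates(vi, t)
print(candidates)
```

    In the enhanced algorithm, such candidate dates (where the curvature of the VI curve changes fastest) are filtered into the start, peak, and end-of-season metrics that are then compared against ground phenology observations.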

  5. Development of a Virtual Environment for Catapult Launch Officers

    Science.gov (United States)

    2015-03-01

    the duties of a launch officer. Analysis of the data gathered from the job task analysis produced a flowchart that can be represented as a finite state machine. Learners pass through a sequence of skill levels when learning a skill, as shown in Table 3.1. These skill levels are: novice, advanced beginner, competence, proficiency, and expertise.

  6. Critical function monitoring system algorithm development

    International Nuclear Information System (INIS)

    Harmon, D.L.

    1984-01-01

    Accurate critical function status information is a key to operator decision-making during events threatening nuclear power plant safety. The Critical Function Monitoring System provides continuous critical function status monitoring by use of algorithms which mathematically represent the processes by which an operating staff would determine critical function status. This paper discusses in detail the systematic design methodology employed to develop adequate Critical Function Monitoring System algorithms

  7. Launch Control System Software Development System Automation Testing

    Science.gov (United States)

    Hwang, Andrew

    2017-01-01

    The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and Space Launch System, the next generation manned rocket currently in development. This system requires high quality testing that will measure and test the capabilities of the system. For the past two years, the Exploration and Operations Division at Kennedy Space Center (KSC) has assigned a group including interns and full-time engineers to develop automated tests to save the project time and money. The team worked on automating the testing process for the SCCS GUI that would use streamed simulated data from the testing servers to produce data, plots, statuses, etc. to the GUI. The software used to develop automated tests included an automated testing framework and an automation library. The automated testing framework has a tabular-style syntax, which means the functionality of a line of code must have the appropriate number of tabs for the line to function as intended. The header section contains either paths to custom resources or the names of libraries being used. The automation library contains functionality to automate anything that appears on a desired screen with the use of image recognition software to detect and control GUI components. The data section contains any data values strictly created for the current testing file. The body section holds the tests that are being run. The function section can include any number of functions that may be used by the current testing file or any other file that resources it. The resources and body section are required for all test files; the data and function sections can be left empty if the data values and functions being used are from a resourced library or another file. 
To help equip the automation team with better tools, the Project Lead of the Automated Testing Team, Jason Kapusta, assigned the task to install and train an optical character recognition (OCR

  8. Scale development for pre-service mathematics teachers ...

    African Journals Online (AJOL)

    The purpose of this study is to develop a scale to determine pre-service mathematics teachers' perceptions related to their pedagogical content knowledge. Firstly, a preliminary perception scale of pedagogical content knowledge was constructed and then administered to 112 pre-service mathematics teachers who were ...

  9. Modeling the Virtual Machine Launching Overhead under Fermicloud

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, Gabriele [Fermilab]; Wu, Hao [Fermilab]; Ren, Shangping [IIT, Chicago]; Timm, Steven [Fermilab]; Bernabeu, Gerard [Fermilab]; Noh, Seo-Young [KISTI, Daejeon]

    2014-11-12

    FermiCloud is a private cloud developed by the Fermi National Accelerator Laboratory for scientific workflows. The Cloud Bursting module of FermiCloud enables it, when more computational resources are needed, to automatically launch virtual machines to available resources such as public clouds. One of the main challenges in developing the cloud bursting module is to decide when and where to launch a VM so that all resources are most effectively and efficiently utilized and the system performance is optimized. However, based on FermiCloud’s system operational data, the VM launching overhead is not a constant. It varies with physical resource (CPU, memory, I/O device) utilization at the time when a VM is launched. Hence, to make judicious decisions as to when and where a VM should be launched, a VM launch overhead reference model is needed. This paper develops such a reference model based on operational data obtained on FermiCloud and uses the model to guide the cloud bursting process.
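
    A minimal sketch of such a launch-overhead reference model, assuming a simple linear dependence on CPU, memory, and I/O utilization at launch time. The data values below are synthetic placeholders, not FermiCloud measurements:

```python
import numpy as np

# Hypothetical operational samples: (cpu_util, mem_util, io_util) at launch
# time versus observed VM launch overhead in seconds.
X = np.array([
    [0.1, 0.2, 0.1],
    [0.5, 0.4, 0.3],
    [0.8, 0.7, 0.6],
    [0.3, 0.3, 0.2],
    [0.9, 0.8, 0.9],
])
y = np.array([20.0, 45.0, 80.0, 33.0, 110.0])

# Linear reference model: overhead ~ b0 + b1*cpu + b2*mem + b3*io,
# fitted by least squares on the operational samples.
A = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predicted_overhead(cpu, mem, io):
    return float(coef @ np.array([1.0, cpu, mem, io]))

# A busier host predicts a larger launch overhead, which would steer the
# cloud bursting module toward launching the VM on a less loaded resource.
print(predicted_overhead(0.9, 0.9, 0.9) > predicted_overhead(0.1, 0.1, 0.1))
```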

  10. Applied economic model development algorithm for electronics company

    Directory of Open Access Journals (Sweden)

    Mikhailov I.

    2017-01-01

    The purpose of this paper is to report experience gained in creating methods and algorithms that simplify the development of applied decision support systems. It describes an algorithm that is the result of two years of research and has had more than a year of practical verification. In electronic-component testing, the moment of contract conclusion is when the greatest managerial mistakes can be made: at this stage it is difficult to achieve a realistic assessment of the time limit and the wage fund for the future work. Creating an estimating model is one way to solve this problem, and the article presents an algorithm for building such models. The algorithm is illustrated through the development of an analytical model that serves to estimate the amount of work. The paper lists the algorithm’s stages and explains their meaning in terms of the participants’ goals. Implementation of the algorithm has made possible a twofold acceleration in the development of these models and the fulfilment of management’s requirements, and the resulting models have produced a significant economic effect. A new set of tasks was identified for further theoretical study.

  11. Sensor Data Quality and Angular Rate Down-Selection Algorithms on SLS EM-1

    Science.gov (United States)

    Park, Thomas; Smith, Austin; Oliver, T. Emerson

    2018-01-01

    The NASA Space Launch System Block 1 launch vehicle is equipped with an Inertial Navigation System (INS) and multiple Rate Gyro Assemblies (RGAs) that are used in the Guidance, Navigation, and Control (GN&C) algorithms. The INS provides the inertial position, velocity, and attitude of the vehicle along with both angular rate and specific force measurements. Additionally, multiple sets of co-located rate gyros supply angular rate data. The collection of angular rate data, taken along the launch vehicle, is used to separate vehicle motion from flexible-body dynamics. Since the system architecture uses redundant sensors, the capability was developed to evaluate the health (or validity) of the independent measurements. A suite of Sensor Data Quality (SDQ) algorithms is responsible for assessing the angular rate data from the redundant sensors. When failures are detected, SDQ takes the appropriate action and disqualifies or removes faulted sensors from forward processing. Additionally, the SDQ algorithms contain logic for down-selecting the angular rate data used by the GN&C software from the set of healthy measurements. This paper explores the trades and analyses that were performed in selecting the set of robust fault-detection algorithms included in the GN&C flight software. These trades included both an assessment of hardware-provided health and status data and an evaluation of different algorithms based on time to detection, types of failures detected, and probability of false positives. We then provide an overview of the algorithms used for both fault detection and measurement down-selection. We next discuss the role of trajectory design, flexible-body models, and vehicle response to off-nominal conditions in setting the detection thresholds. Lastly, we present lessons learned from software integration and hardware-in-the-loop testing.
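A median-based fault-detection and down-selection scheme of this general kind can be sketched as follows. This is a hedged illustration, not the SLS flight software: the channel names, the fixed threshold, and the use of the median for both disqualification and down-selection are assumptions made for the example.

```python
# Hedged sketch (not the SLS flight code): compare each redundant rate
# measurement against the median of the set, disqualify channels deviating
# by more than a threshold, and pass the median of the healthy channels
# downstream as the down-selected rate.
from statistics import median

def sdq_select(rates, threshold):
    """rates: dict of sensor name -> angular rate (deg/s)."""
    m = median(rates.values())
    healthy = {name: r for name, r in rates.items() if abs(r - m) <= threshold}
    faulted = sorted(set(rates) - set(healthy))
    return median(healthy.values()), faulted

rate, bad = sdq_select({"ins": 1.02, "rga1": 1.00, "rga2": 0.98, "rga3": 9.99},
                       threshold=0.5)
print(rate, bad)  # 1.0 ['rga3']
```

In a real system the threshold would come from trajectory design and flex-model analysis, as the abstract notes, and hardware health words would be checked before the statistical test.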

  12. Development of Educational Support System for Algorithm using Flowchart

    Science.gov (United States)

    Ohchi, Masashi; Aoki, Noriyuki; Furukawa, Tatsuya; Takayama, Kanta

    Information technology has become indispensable for business and industrial development. However, the shortage of software developers has become a social problem. To solve this problem, it is necessary to develop and deploy an environment for learning algorithms and programming languages. In this paper, we describe an algorithm study support system for programmers based on flowcharts. Since the proposed system uses a Graphical User Interface (GUI), it becomes easy for a programmer to understand the algorithms in programs.

  13. Preparation and Launch of the JEM ISS Elements - A NASA Mission Manager's Perspective

    Science.gov (United States)

    Higginbotham, Scott A.

    2016-01-01

    The pre-flight launch site preparations and launch of the Japanese Experiment Module (JEM) elements of the International Space Station required an intense multi-year, international collaborative effort between US and Japanese personnel at the Kennedy Space Center (KSC). This presentation will provide a brief overview of KSC, a brief overview of the ISS, and a summary of the author's experience managing the NASA team that supported and conducted the JEM element operations.

  14. A Conversation on Data Mining Strategies in LC-MS Untargeted Metabolomics: Pre-Processing and Pre-Treatment Steps

    Directory of Open Access Journals (Sweden)

    Fidele Tugizimana

    2016-11-01

    Full Text Available Untargeted metabolomic studies generate information-rich, high-dimensional, and complex datasets that remain challenging to handle and fully exploit. Despite the remarkable progress in the development of tools and algorithms, the “exhaustive” extraction of information from these metabolomic datasets is still a non-trivial undertaking. A conversation on data mining strategies for maximal information extraction from metabolomic data is needed. Using a liquid chromatography-mass spectrometry (LC-MS)-based untargeted metabolomic dataset, this study explored the influence of collection parameters in the data pre-processing step, of scaling and data transformation on the statistical models generated, and of feature selection thereafter. Data obtained in positive mode from an LC-MS-based untargeted metabolomic study (sorghum plants responding dynamically to infection by a fungal pathogen) were used. Raw data were pre-processed with MarkerLynx™ software (Waters Corporation, Manchester, UK). Here, two parameters were varied: the intensity threshold (50–100 counts) and the mass tolerance (0.005–0.01 Da). After pre-processing, the datasets were imported into SIMCA (Umetrics, Umeå, Sweden) for further data cleaning and statistical modeling. In addition, different scaling (unit variance, Pareto, etc.) and data transformation (log and power) methods were explored. The results showed that the pre-processing parameters (or algorithms) influence the output dataset with regard to the number of defined features. Furthermore, the study demonstrates that the pre-treatment of data prior to statistical modeling affects the subspace approximation outcome: e.g., the amount of variation in X-data that the model can explain and predict. The pre-processing and pre-treatment steps subsequently influence the number of statistically significant extracted/selected features (variables). Thus, as informed by the results, to maximize the value of untargeted metabolomic data
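Two of the pre-treatment steps discussed above can be sketched in a few lines, assuming a feature matrix laid out as rows = samples and columns = features. Pareto scaling mean-centres each feature and divides by the square root of its standard deviation; a log transform compresses large intensities before modeling.

```python
# Minimal sketch of two standard metabolomics pre-treatments: Pareto scaling
# and a log transform. The layout (rows = samples, columns = features) and
# the +1 offset in the log transform are conventional choices, not the
# paper's exact settings.
import math

def pareto_scale(X):
    """Mean-centre each column and divide by sqrt(std dev); needs >= 2 rows."""
    n = len(X)
    cols = list(zip(*X))
    means = [sum(c) / n for c in cols]
    sds = [math.sqrt(sum((v - m) ** 2 for v in c) / (n - 1))
           for c, m in zip(cols, means)]
    return [[(v - m) / math.sqrt(s) if s else v - m
             for v, m, s in zip(row, means, sds)] for row in X]

def log_transform(X, base=10):
    """log(x + 1) per element, to tame large peak intensities."""
    return [[math.log(v + 1, base) for v in row] for row in X]
```

Pareto scaling sits between no scaling (large peaks dominate) and unit-variance scaling (noise features are inflated), which is why the choice measurably changes the subspace models, as the study reports.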

  15. Launch Vehicle Manual Steering with Adaptive Augmenting Control In-flight Evaluations of Adverse Interactions Using a Piloted Aircraft

    Science.gov (United States)

    Hanson, Curt; Miller, Chris; Wall, John H.; Vanzwieten, Tannen S.; Gilligan, Eric; Orr, Jeb S.

    2015-01-01

    An adaptive augmenting control algorithm for the Space Launch System (SLS) has been developed at the Marshall Space Flight Center as part of the launch vehicle's baseline flight control system. A prototype version of the SLS flight control software was hosted on a piloted aircraft at the Armstrong Flight Research Center to demonstrate the adaptive controller on a full-scale, realistic application in a relevant flight environment. Concerns regarding adverse interactions between the adaptive controller and a proposed manual steering mode were investigated by giving the pilot trajectory-deviation cues and pitch-rate command authority. Two NASA research pilots flew a total of twenty-five constant pitch-rate trajectories using a prototype manual steering mode with and without adaptive control.
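The general idea of a multiplicative adaptive augmenting gain can be sketched in a highly simplified form. This is emphatically not the SLS control law: the error signal, adaptation rate, leakage term, and bounds below are all illustrative assumptions, chosen only to show the raise-on-error / bleed-back-to-nominal behaviour described in the literature.

```python
# Highly simplified, hedged sketch of a bounded multiplicative adaptive gain:
# the loop gain is driven up when tracking error persists, pulled back toward
# its nominal value of 1.0 by a leakage term, and hard-limited so the
# augmentation cannot destabilize the nominal design. Constants are invented.
def adapt_gain(k, err, dt, a=2.0, leak=0.1, k_min=0.5, k_max=2.0):
    k = k + dt * (a * err * err - leak * (k - 1.0))
    return min(max(k, k_min), k_max)

k = 1.0
for _ in range(100):            # persistent error drives the gain up...
    k = adapt_gain(k, err=0.3, dt=0.01)
for _ in range(5000):           # ...and leakage recovers the nominal gain
    k = adapt_gain(k, err=0.0, dt=0.01)
print(round(k, 2))  # 1.0
```

The piloted-aircraft experiments in the paper were probing exactly this interaction: whether a pilot's manual pitch-rate commands would look like persistent error to the adaptive element and drive the gain where it should not go.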

  16. National Launch System comparative economic analysis

    Science.gov (United States)

    Prince, A.

    1992-01-01

    Results are presented from an analysis of the economic benefits (or losses), in the form of life-cycle cost savings, resulting from the development of the National Launch System (NLS) family of launch vehicles. The analysis was carried out by comparing various NLS-based architectures with the current Shuttle/Titan IV fleet. The basic methodology behind this NLS analysis was to develop a set of annual payload requirements for Space Station Freedom and LEO, to design launch vehicle architectures around these requirements, and to perform life-cycle cost analyses on all of the architectures. An SEI requirement was included. Launch failure costs were estimated and combined with the relative reliability assumptions to measure the effects of losses. Based on the analysis, a Shuttle/NLS architecture evolving into a pressurized-logistics-carrier/NLS architecture appears to offer the best long-term cost benefit.
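The comparison methodology described above (recurring costs plus reliability-weighted failure losses) can be framed as a one-line cost model. The numbers below are invented placeholders, not the study's figures; only the structure of the calculation follows the abstract.

```python
# Illustrative life-cycle cost comparison: development cost, plus flights
# times recurring cost, plus expected failure losses, where the expected
# loss is (1 - reliability) * flights * cost per failure. All figures are
# invented ($M) and stand in for the study's estimates.
def life_cycle_cost(dev, flights, recurring, reliability, failure_loss):
    expected_failures = (1.0 - reliability) * flights
    return dev + flights * recurring + expected_failures * failure_loss

shuttle_titan = life_cycle_cost(dev=0, flights=120, recurring=800,
                                reliability=0.97, failure_loss=3000)
nls = life_cycle_cost(dev=12000, flights=120, recurring=400,
                      reliability=0.99, failure_loss=3000)
print(round(shuttle_titan), round(nls))  # 106800 63600
```

With enough flights, lower recurring cost and higher reliability amortize a large development bill, which is the shape of the conclusion the study reaches for the NLS architectures.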

  17. STS-93 crew have breakfast before launch in O&C Building

    Science.gov (United States)

    1999-01-01

    The STS-93 crew gathers a third time for a pre-launch breakfast in the Operations and Checkout Building before suiting up for launch. After Space Shuttle Columbia's July 22 launch attempt was scrubbed due to the weather, the launch was rescheduled for Friday, July 23, at 12:24 a.m. EDT. Seated from left are Mission Specialists Catherine G. Coleman (Ph.D.) and Steven A. Hawley (Ph.D.); Commander Eileen M. Collins; Mission Specialist Michel Tognini, of France, who represents the Centre National d'Etudes Spatiales (CNES); and Pilot Jeffrey S. Ashby. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. Collins is the first woman to serve as commander of a Shuttle mission. The target landing date is July 27, 1999, at 11:20 p.m. EDT.

  18. VEGA, a small launch vehicle

    Science.gov (United States)

    Duret, François; Fabrizi, Antonio

    1999-09-01

    Several studies have been performed in Europe aiming to promote the full development of a small launch vehicle to put one-ton-class spacecraft into orbit. But during the last ten years, the European workforce was mainly oriented towards the qualification of the heavy-class ARIANE 5 launch vehicle. Moreover, given the limited visibility of this small market segment compared with the geosatcom market, no proposal was attractive enough to obtain a clear go-ahead, i.e. a financial commitment, from the potentially interested authorities. The situation is now rapidly evolving. Several European states, among them Italy and France, are now convinced of the necessity of such a transportation system, an important argument in promoting small missions using small satellites. The application market will mainly comprise scientific experiments and Earth observation; some telecommunications applications may also be envisaged, such as the placement of little-LEO constellation satellites or the replacement after failure of big-LEO constellation satellites. FIAT AVIO and AEROSPATIALE have proposed to their national agencies the development of such a small launch vehicle, named VEGA. The paper presents the history of the industrial proposal and the present status of the project: mission spectrum, technical definition, launch service and performance, target development plan and target recurring costs, as well as the industrial organisation for development, procurement, marketing and operations.

  19. Quality function deployment in launch operations

    Science.gov (United States)

    Portanova, P. L.; Tomei, E. J., Jr.

    1990-11-01

    The goal of the Advanced Launch System (ALS) is a more efficient launch capability that provides a highly reliable and operable system at substantially lower cost than current launch systems. Total Quality Management (TQM) principles are being emphasized throughout the ALS program. A continuous improvement philosophy is directed toward satisfying users' and customer's requirements in terms of quality, performance, schedule, and cost. Quality Function Deployment (QFD) is interpreted as the voice of the customer (or user), and it is an important planning tool in translating these requirements throughout the whole process of design, development, manufacture, and operations. This report explores the application of QFD methodology to launch operations, including the modification and addition of events (operations planning) in the engineering development cycle, and presents an informal status of study results to date. QFD is a technique for systematically analyzing the customer's (Space Command) perceptions of what constitutes a highly reliable and operable system and functionally breaking down those attributes to identify the critical characteristics that determine an efficient launch system capability. In applying the principle of QFD, a series of matrices or charts are developed with emphasis on the one commonly known as the House of Quality (because of its roof-like format), which identifies and translates the most critical information.

  20. Simultaneous data pre-processing and SVM classification model selection based on a parallel genetic algorithm applied to spectroscopic data of olive oils.

    Science.gov (United States)

    Devos, Olivier; Downey, Gerard; Duponchel, Ludovic

    2014-04-01

    Classification is an important task in chemometrics. For several years now, support vector machines (SVMs) have proven to be powerful for infrared spectral data classification. However, such methods require the optimisation of parameters in order to control the risk of overfitting and the complexity of the boundary. Furthermore, it is established that the prediction ability of classification models can be improved by using pre-processing to remove unwanted variance in the spectra. In this paper we propose a new methodology based on a genetic algorithm (GA) for the simultaneous optimisation of SVM parameters and pre-processing (GENOPT-SVM). The method has been tested for the discrimination of the geographical origin of Italian olive oil (Ligurian and non-Ligurian) on the basis of near infrared (NIR) or mid infrared (FTIR) spectra. Different classification models (PLS-DA, SVM with mean-centred data, GENOPT-SVM) were tested and statistically compared using McNemar's test. For the two datasets, SVM with optimised pre-processing gave models with higher accuracy than those obtained with PLS-DA on pre-processed data. In the case of the NIR dataset, most of this accuracy improvement (86.3% compared with 82.8% for PLS-DA) occurred using only a single pre-processing step. For the FTIR dataset, three optimised pre-processing steps were required to obtain an SVM model with a significant accuracy improvement (82.2%) compared to the one obtained with PLS-DA (78.6%). Furthermore, this study demonstrates that even SVM models have to be developed on the basis of well-corrected spectral data in order to obtain higher classification rates. Copyright © 2013 Elsevier Ltd. All rights reserved.
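The joint optimisation idea can be sketched with a small elitist GA whose chromosome encodes both the pre-processing choice and the SVM hyper-parameter exponents. This is a schematic stand-in for GENOPT-SVM, not the paper's implementation: a toy fitness function replaces the cross-validated SVM accuracy (an actual SVM, e.g. scikit-learn's `SVC`, could be dropped in), and the pre-processing labels and parameter ranges are assumptions.

```python
# Schematic GENOPT-style GA: a chromosome is (pre-processing choice,
# log10(C), log10(gamma)), so pre-processing and SVM parameters are
# optimised simultaneously. The fitness is a toy surrogate for
# cross-validated classification accuracy.
import random

PREPROC = ["none", "snv", "first_derivative", "msc"]

def fitness(chrom):
    prep, log_c, log_gamma = chrom
    # Toy surrogate: pretend "snv" with C=10**2, gamma=10**-3 is optimal.
    return (-(PREPROC.index(prep) - 1) ** 2
            - (log_c - 2) ** 2 - (log_gamma + 3) ** 2)

def random_chrom(rng):
    return (rng.choice(PREPROC), rng.randint(-2, 4), rng.randint(-5, 1))

def mutate(chrom, rng):
    prep, c, g = chrom
    i = rng.randrange(3)
    if i == 0:
        prep = rng.choice(PREPROC)
    elif i == 1:
        c = min(4, max(-2, c + rng.choice([-1, 1])))
    else:
        g = min(1, max(-5, g + rng.choice([-1, 1])))
    return (prep, c, g)

def genopt(generations=60, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [random_chrom(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]        # keep the best half...
        pop = elite + [mutate(rng.choice(elite), rng) for _ in elite]
    return max(pop, key=fitness)

print(genopt())  # typically converges to ('snv', 2, -3) under the toy fitness
```

Because the elite is carried over unchanged, the best fitness is monotone non-decreasing, which is what makes this simple scheme reliable on a small mixed discrete/integer search space like this one.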

  1. Developing an Enhanced Lightning Jump Algorithm for Operational Use

    Science.gov (United States)

    Schultz, Christopher J.; Petersen, Walter A.; Carey, Lawrence D.

    2009-01-01

    Overall goals: 1. Build on the lightning jump framework set through previous studies. 2. Understand what typically occurs in nonsevere convection with respect to increases in lightning. 3. Ultimately develop a lightning jump algorithm for use with the Geostationary Lightning Mapper (GLM). 4. Lightning jump algorithm configurations were developed (2σ, 3σ, Threshold 10 and Threshold 8). 5. Algorithms were tested on a population of 47 nonsevere and 38 severe thunderstorms. Results indicate that the 2σ algorithm performed best over the entire thunderstorm sample set, with a POD of 87%, a FAR of 35%, a CSI of 59% and an HSS of 75%.
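The 2σ configuration is commonly described as flagging a jump when the latest rate of change of the total flash rate exceeds the recent mean by two standard deviations. The sketch below follows that common description in simplified form; the window handling and minimum-history rule are assumptions, not the operational algorithm's exact settings.

```python
# Sketch of a 2-sigma lightning jump test: compute the flash-rate rate of
# change (DFRDT) over consecutive periods and flag a jump when the latest
# value exceeds the mean of the recent history by sigma_level standard
# deviations. A simplification of the operational algorithm.
from statistics import mean, stdev

def lightning_jump(flash_rates, sigma_level=2.0):
    """flash_rates: total flash rates per period, oldest first."""
    dfrdt = [b - a for a, b in zip(flash_rates, flash_rates[1:])]
    if len(dfrdt) < 3:          # need some history to estimate a spread
        return False
    history, latest = dfrdt[:-1], dfrdt[-1]
    return latest > mean(history) + sigma_level * stdev(history)

print(lightning_jump([10, 12, 11, 13, 12, 40]))  # True: abrupt jump to 40
print(lightning_jump([10, 12, 11, 13, 12, 13]))  # False: ordinary variation
```

Tuning sigma_level is exactly the POD/FAR trade the abstract reports: a lower level catches more severe storms but raises the false alarm ratio in nonsevere convection.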

  2. OLYMPUS system and development of its pre-processor

    International Nuclear Information System (INIS)

    Okamoto, Masao; Takeda, Tatsuoki; Tanaka, Masatoshi; Asai, Kiyoshi; Nakano, Koh.

    1977-08-01

    The OLYMPUS SYSTEM developed by K. V. Roberts et al. was converted and introduced into the FACOM 230/75 computer system of the JAERI Computing Center, and a pre-processor was also developed for it. The OLYMPUS SYSTEM is very useful for the development, standardization and exchange of programs in thermonuclear fusion research and plasma physics. The pre-processor developed by the present authors is not only essential for the JAERI OLYMPUS SYSTEM, but also useful in the manipulation, creation and correction of program files. (auth.)

  3. Overview of GX launch services by GALEX

    Science.gov (United States)

    Sato, Koji; Kondou, Yoshirou

    2006-07-01

    Galaxy Express Corporation (GALEX) is a launch service company in Japan formed to develop a medium-size rocket, the GX rocket, and to provide commercial launch services for medium/small low Earth orbit (LEO) and Sun-synchronous orbit (SSO) payloads, with future potential for small geostationary transfer orbit (GTO) payloads. It is GALEX's view that small/medium LEO/SSO payloads constitute a medium-sized but stable launch market owing to the nature of the missions. The GX rocket is a two-stage vehicle with a well flight-proven liquid oxygen (LOX)/kerosene booster and a LOX/liquefied natural gas (LNG) upper stage. The LOX/LNG propulsion system, under development by the Japan Aerospace Exploration Agency (JAXA), is robust, offers performance comparable to other propulsion systems, and has future potential for wider applications such as exploration programs. The GX rocket is being developed jointly by industry, applying a business-oriented approach to realize competitive launch services in which well flight-proven hardware and necessary new technology are introduced as much as possible. It is GALEX's goal to offer “Easy Access to Space”: highly reliable and user-friendly launch services at a competitive price. GX commercial launches will start in Japanese fiscal year (JFY) 2007-2008.

  4. Evaluation of train-speed control algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Slavik, M.M. [BKS Advantech (Pty.) Ltd., Pretoria (South Africa)

    2000-07-01

    A relatively simple and fast simulator has been developed and used for the preliminary testing of train cruise-control algorithms. The simulation is done in software on a PC. The simulator is used to gauge the consequences and feasibility of a cruise-control strategy prior to more elaborate testing and evaluation. The tool was used to design and pre-test a train cruise-control algorithm called NSS, which does not require knowledge of the exact train mass, vertical alignment, or actual braking force. Only continuous measurements of the train's speed and electrical current are required. With this modest input, the NSS algorithm effected speed changes smoothly and efficiently over a wide range of operating conditions. (orig.)
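A controller that uses only speed and current measurements can be sketched as a notch-stepping rule. This is a hedged illustration in the spirit of the description above (the NSS algorithm itself is not published in this abstract): the deadband, notch range, and current-limit behaviour are invented parameters.

```python
# Hedged sketch of a mass- and grade-agnostic cruise controller: step the
# traction/brake notch from the speed error alone, and back off whenever
# the measured current shows the drive is near its limit. No train mass,
# alignment, or braking-force model is needed.
def nss_step(notch, speed, target_speed, current, current_limit,
             deadband=0.5, n_min=-8, n_max=8):
    error = target_speed - speed
    if current > current_limit:      # protect the drive regardless of error
        notch -= 1
    elif error > deadband:           # too slow: notch up
        notch += 1
    elif error < -deadband:          # too fast: notch down
        notch -= 1
    return min(max(notch, n_min), n_max)

notch = nss_step(0, speed=70.0, target_speed=80.0,
                 current=300.0, current_limit=900.0)
print(notch)  # 1: accelerate toward the target speed
```

Because each call moves the notch by at most one step, speed changes are inherently smooth, which matches the behaviour the abstract reports.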

  5. CubeSat Launch Initiative

    Science.gov (United States)

    Higginbotham, Scott

    2016-01-01

    The National Aeronautics and Space Administration (NASA) recognizes the tremendous potential that CubeSats (very small satellites) have to inexpensively demonstrate advanced technologies, collect scientific data, and enhance student engagement in Science, Technology, Engineering, and Mathematics (STEM). The CubeSat Launch Initiative (CSLI) was created to provide launch opportunities for CubeSats developed by academic institutions, non-profit entities, and NASA centers. This presentation will provide an overview of the CSLI, its benefits, and its results.

  6. Development of a Novel Locomotion Algorithm for Snake Robot

    International Nuclear Information System (INIS)

    Khan, Raisuddin; Billah, Md Masum; Watanabe, Mitsuru; Shafie, A A

    2013-01-01

    A novel algorithm for snake robot locomotion is developed and analyzed in this paper. Serpentine locomotion is one of the best-known gaits for snake robots in disaster-recovery missions that require navigating narrow spaces. Other gaits, such as concertina or rectilinear, may also be suitable for narrow spaces, but are highly inefficient if used in open spaces, where the reduced friction makes snake movement difficult. A novel locomotion algorithm is proposed based on modifications to the multi-link snake robot; the modifications include alterations to the snake segments as well as elements that mimic the scales on the underside of a snake's body. Using the developed algorithm, the snake robot is able to navigate narrow spaces, overcoming the limitations of other gaits in narrow-space navigation.
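A common baseline for serpentine gaits, against which modified algorithms like the one above are usually compared, is the serpenoid curve: each joint follows a phase-shifted sinusoid so a travelling wave runs down the body. The sketch below is that generic baseline with illustrative parameter values, not the paper's modified locomotion algorithm.

```python
# Generic serpenoid-curve joint commands for a serpentine gait: joint i
# tracks amplitude * sin(omega * t + phase * i) + offset, producing a
# travelling body wave. The offset term steers the robot. Parameter values
# are illustrative.
import math

def serpentine_joint_angles(n_joints, t, amplitude=0.5, omega=2.0,
                            phase=0.6, offset=0.0):
    return [amplitude * math.sin(omega * t + phase * i) + offset
            for i in range(n_joints)]

angles = serpentine_joint_angles(8, t=0.0)
print([round(a, 3) for a in angles])
```

Narrow-space gaits typically shrink the amplitude and increase the phase lag so the body wave fits the corridor width, which is the kind of adaptation the proposed algorithm automates.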

  7. Gemini News Service Re-launch | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Gemini closed in 2002 after three decades of operation due, in part, to the high ... and, if appropriate, a business plan for re-launching Gemini News Service at the ...

  8. Magnetic Launch Assist Demonstration Test

    Science.gov (United States)

    2001-01-01

    This image shows a 1/9 subscale model vehicle clearing the Magnetic Launch Assist System, formerly referred to as the Magnetic Levitation (MagLev) system, test track during a demonstration test conducted at the Marshall Space Flight Center (MSFC). Engineers at MSFC have developed and tested Magnetic Launch Assist technologies. To launch spacecraft into orbit, a Magnetic Launch Assist System would use magnetic fields to levitate and accelerate a vehicle along a track at very high speeds. Similar to high-speed trains and roller coasters that use high-strength magnets to lift and propel a vehicle a couple of inches above a guideway, a launch-assist system would electromagnetically drive a space vehicle along the track. A full-scale, operational track would be about 1.5 miles long and capable of accelerating a vehicle to 600 mph in 9.5 seconds. This track is an advanced linear induction motor. Induction motors are common in fans, power drills, and sewing machines. Instead of spinning in a circular motion to turn a shaft or gears, a linear induction motor produces thrust in a straight line. Mounted on concrete pedestals, the track is 100 feet long, about 2 feet wide, and about 1.5 feet high. The major advantage of launch assist for NASA launch vehicles is a lighter take-off weight, with smaller landing gear and wings and less propellant, resulting in significant cost savings. The US Navy and the British MOD (Ministry of Defence) are planning to use magnetic launch assist as the aircraft launch system on their next-generation aircraft carriers. The US Army is considering using this technology for launching target drones for anti-aircraft training.
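The quoted performance figures can be sanity-checked with unit conversions alone: 600 mph in 9.5 s implies roughly a 2.9 g average acceleration over about 0.8 miles of powered track, comfortably inside the stated 1.5-mile length. The constant-acceleration assumption is a simplification.

```python
# Back-of-envelope check of the quoted track performance, assuming constant
# acceleration: convert 600 mph to m/s, divide by 9.5 s, and integrate for
# the distance covered during the acceleration run.
MPH_TO_MS = 0.44704
v = 600 * MPH_TO_MS            # ~268.2 m/s
a = v / 9.5                    # average acceleration, m/s^2
g_load = a / 9.81              # in units of g
distance_m = 0.5 * a * 9.5 ** 2
print(round(a, 1), round(g_load, 1), round(distance_m / 1609.34, 2))
# 28.2 2.9 0.79
```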

  9. 3-D image pre-processing algorithms for improved automated tracing of neuronal arbors.

    Science.gov (United States)

    Narayanaswamy, Arunachalam; Wang, Yu; Roysam, Badrinath

    2011-09-01

    The accuracy and reliability of automated neurite tracing systems is ultimately limited by image quality as reflected in the signal-to-noise ratio, contrast, and image variability. This paper describes a novel combination of image processing methods that operate on images of neurites captured by confocal and widefield microscopy, and produce synthetic images that are better suited to automated tracing. The algorithms are based on the curvelet transform (for denoising curvilinear structures and local orientation estimation), perceptual grouping by scalar voting (for elimination of non-tubular structures and improvement of neurite continuity while preserving branch points), adaptive focus detection, and depth estimation (for handling widefield images without deconvolution). The proposed methods are fast, and capable of handling large images. Their ability to handle images of unlimited size derives from automated tiling of large images along the lateral dimension, and processing of 3-D images one optical slice at a time. Their speed derives in part from the fact that the core computations are formulated in terms of the Fast Fourier Transform (FFT), and in part from parallel computation on multi-core computers. The methods are simple to apply to new images since they require very few adjustable parameters, all of which are intuitive. Examples of pre-processing DIADEM Challenge images are used to illustrate improved automated tracing resulting from our pre-processing methods.
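The lateral-tiling idea that lets the pipeline handle images of unlimited size can be sketched as a span generator. This is an illustration of the tiling concept, not the paper's code: the tile size and overlap below are invented parameters, and overlap is used so that structures crossing a seam are seen whole in at least one tile.

```python
# Sketch of lateral tiling for arbitrarily large images: cover [0, length)
# with fixed-size tiles that overlap by a set margin (overlap must be
# smaller than the tile size). Each span can then be processed
# independently, e.g. one optical slice and one tile at a time.
def tile_spans(length, tile, overlap):
    """Return (start, stop) spans covering [0, length) with the given overlap."""
    step = tile - overlap
    spans, start = [], 0
    while True:
        stop = min(start + tile, length)
        spans.append((start, stop))
        if stop == length:
            return spans
        start += step

print(tile_spans(1000, tile=400, overlap=64))
# [(0, 400), (336, 736), (672, 1000)]
```

Since tiles are independent, they map naturally onto the multi-core parallelism the abstract mentions, with results merged by discarding each tile's overlap margin.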

  10. Space Launch System Accelerated Booster Development Cycle

    Science.gov (United States)

    Arockiam, Nicole; Whittecar, William; Edwards, Stephen

    2012-01-01

    With the retirement of the Space Shuttle, NASA is seeking to reinvigorate the national space program and recapture the public's interest in human space exploration by developing missions to the Moon, near-Earth asteroids, Lagrange points, Mars, and beyond. The would-be successor to the Space Shuttle, NASA's Constellation Program, planned to take humans back to the Moon by 2020, but due to budgetary constraints was cancelled in 2010 in search of a more "affordable, sustainable, and realistic" concept. Following a number of studies, the much-anticipated Space Launch System (SLS) was unveiled in September of 2011. The SLS core architecture consists of a cryogenic first stage with five Space Shuttle Main Engines (SSMEs) and a cryogenic second stage using a new J-2X engine. The baseline configuration employs two 5-segment solid rocket boosters to achieve a 70-metric-ton payload capability, but a new, more capable booster system will be required to attain the goal of 130 metric tons to orbit. To this end, NASA's Marshall Space Flight Center recently released a NASA Research Announcement (NRA) entitled "Space Launch System (SLS) Advanced Booster Engineering Demonstration and/or Risk Reduction." The increased emphasis on affordability is evident in the language used in the NRA, which is focused on risk reduction "leading to an affordable Advanced Booster that meets the evolved capabilities of SLS" and "enabling competition" to "enhance SLS affordability." The purpose of the work presented in this paper is to perform an independent assessment of the elements that make up an affordable and realistic path forward for the SLS booster system, utilizing advanced design methods and technology evaluation techniques. The goal is to identify elements that will enable a more sustainable development program by exploring the trade space of heavy-lift booster systems and focusing on affordability, operability, and reliability at the system and subsystem levels. For this study

  11. Life Cycle Analysis of Dedicated Nano-Launch Technologies

    Science.gov (United States)

    Zapata, Edgar; McCleskey, Carey (Editor); Martin, John; Lepsch, Roger; Ternani, Tosoc

    2014-01-01

    Recent technology advancements have enabled the development of small cheap satellites that can perform useful functions in the space environment. Currently, the only low cost option for getting these payloads into orbit is through ride share programs - small satellites awaiting the launch of a larger satellite, and then riding along on the same launcher. As a result, these small satellite customers await primary payload launches and a backlog exists. An alternative option would be dedicated nano-launch systems built and operated to provide more flexible launch services, higher availability, and affordable prices. The potential customer base that would drive requirements or support a business case includes commercial, academia, civil government and defense. Further, NASA technology investments could enable these alternative game changing options. With this context, in 2013 the Game Changing Development (GCD) program funded a NASA team to investigate the feasibility of dedicated nano-satellite launch systems with a recurring cost of less than $2 million per launch for a 5 kg payload to low Earth orbit. The team products would include potential concepts, technologies and factors for enabling the ambitious cost goal, exploring the nature of the goal itself, and informing the GCD program technology investment decision making process. This paper provides an overview of the life cycle analysis effort that was conducted in 2013 by an inter-center NASA team. This effort included the development of reference nano-launch system concepts, developing analysis processes and models, establishing a basis for cost estimates (development, manufacturing and launch) suitable to the scale of the systems, and especially, understanding the relationship of potential game changing technologies to life cycle costs, as well as other factors, such as flights per year.
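The stated cost goal can be put into per-kilogram terms, and the sensitivity of per-flight cost to flight rate (a key driver in any launch life-cycle analysis) sketched in a few lines. The $2M / 5 kg figures come from the abstract; the development cost, amortization period, and flight rates below are illustrative assumptions, not the study's numbers.

```python
# Back-of-envelope framing of the nano-launch cost goal: the recurring
# target is $2M for 5 kg ($400k/kg), and the amortized development cost
# per flight falls with flights per year. Development cost and rates are
# invented for illustration.
def cost_per_launch(dev_cost, amortization_years, flights_per_year,
                    recurring_cost):
    return recurring_cost + dev_cost / (amortization_years * flights_per_year)

goal_per_kg = 2_000_000 / 5
print(goal_per_kg)  # 400000.0 $/kg
for rate in (4, 12, 52):
    print(rate, cost_per_launch(150e6, 10, rate, 2e6))
```

The loop makes the life-cycle point concrete: at low flight rates the amortized development cost dominates the recurring cost, which is why flights per year appears alongside technology choices among the factors the study examines.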

  12. PRE-INVESTMENT ASSESSMENT OF ACADEMIC SCIENCE DEVELOPMENT EXPERIENCE: CASE VMNK YAMAL

    Directory of Open Access Journals (Sweden)

    Chechetkina E. V.

    2015-09-01

    Full Text Available In this article the author analyzes the possibilities for evaluating the effectiveness of an innovative project at the pre-investment stage and systematizes approaches to such analysis. In addition, a general procedural algorithm for assessing a scientific project's potential efficiency is suggested. The specific aim of the paper is to overcome the uncertainty and incomplete information typical of innovative projects by ranking the key indicators according to the time and place of their occurrence in the company's different fields of activity, analyzing the potential and prospects of using the innovation, and accounting for other factors that influence the pre-investment analysis. Experience with the effectiveness evaluation of scientific projects proposed for implementation at OAO «Gazprom», in collaboration with the IPGG SB RAS, is presented.

  13. Startup development: strategic point of view

    OpenAIRE

    ZOPUNYAN Y.S.

    2015-01-01

    The article analyzes different strategies of startup development. The author considers the phases of startup development (pre-startup, launch, and post-startup) and analyzes the factors that affect it. The paper presents conclusions about the key elements of a startup's strategic development.

  14. COSMOS Launch Services

    Science.gov (United States)

    Kalnins, Indulis

    2002-01-01

    COSMOS-3M is a two-stage launcher with liquid-propellant rocket engines. Since the 1960s, COSMOS has launched satellites of up to 1,500 kg into both circular low Earth orbits and elliptical orbits with high inclination. Direct SSO ascent is available from the Plesetsk launch site. The very high number of 759 launches and an achieved success rate of 97.4% make this space transportation system one of the most reliable and successful launchers in the world. The German small-satellite company OHB System has cooperated since 1994 with the COSMOS manufacturer POLYOT, Omsk, in Russia. Together they created the joint venture COSMOS International and successfully launched five German and Italian satellites in 1999 and 2000. The next commercial launches are contracted for 2002 and 2003. In 2005-2007 COSMOS will also be used for the new German reconnaissance satellite launches. This paper provides an overview of the COSMOS-3M launcher: its heritage and performance, examples of scientific and commercial primary and piggyback payload launches, the launch service organization, and international cooperation. The main points of the COSMOS launch service business strategy are depicted, and the current and future position of COSMOS in the worldwide market of launch services is outlined.

  15. Model based development of engine control algorithms

    NARCIS (Netherlands)

    Dekker, H.J.; Sturm, W.L.

    1996-01-01

    Model-based development of engine control systems has several advantages. Development time and costs are strongly reduced because much of the development and optimization work is carried out by simulating both the engine and the control system. After optimizing the control algorithm it can be executed

  16. B&W PWR advanced control system algorithm development

    International Nuclear Information System (INIS)

    Winks, R.W.; Wilson, T.L.; Amick, M.

    1992-01-01

    This paper discusses the algorithm development of an Advanced Control System for the B&W Pressurized Water Reactor (PWR) nuclear power plant. The paper summarizes the history of the project, describes the operation of the algorithm, and presents transient results from a simulation of the plant and control system. The history covers the steps in the development process and the roles played by the utility owners, B&W Nuclear Service Company (BWNS), Oak Ridge National Laboratory (ORNL), and the Foxboro Company. The algorithm description is a brief overview of the features of the control system. The transient results show the operation of the algorithm in a normal power-maneuvering mode and in a moderately large upset following a feedwater pump trip.

  17. Artificial intelligent decision support for low-cost launch vehicle integrated mission operations

    Science.gov (United States)

    Szatkowski, Gerard P.; Schultz, Roger

    1988-01-01

    The feasibility, benefits, and risks associated with Artificial Intelligence (AI) Expert Systems applied to low-cost space expendable launch vehicle systems are reviewed. This study is in support of the joint USAF/NASA effort to define the next generation of a heavy-lift Advanced Launch System (ALS), which will provide economical and routine access to space. The significant technical goals of the ALS program include a 10-fold reduction in cost per pound to orbit, launch processing in under 3 weeks, and higher reliability and safety standards than current expendables. Knowledge-based system techniques are being explored for the purpose of automating decision support processes in onboard and ground systems for pre-launch checkout and in-flight operations. Issues such as satisfying real-time requirements, providing safety validation, hardware and Data Base Management System (DBMS) interfacing, system synergistic effects, human interfaces, and ease of maintainability affect the viability of expert systems as a useful tool.

  18. Solving Optimization Problems via Vortex Optimization Algorithm and Cognitive Development Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Ahmet Demir

    2017-01-01

    In fields which require finding the most appropriate value, optimization has become a vital approach for producing effective solutions. With the use of optimization techniques, many different fields in modern life have found solutions to their real-world problems. In this context, classical optimization techniques have enjoyed considerable popularity, but over time more advanced optimization problems required the use of more effective techniques. At this point, Computer Science took an important role in providing software-based techniques to improve the associated literature. Today, intelligent optimization techniques based on Artificial Intelligence are widely used for optimization problems. The objective of this paper is to provide a comparative study on the employment of classical optimization solutions and Artificial Intelligence solutions, enabling readers to gauge the potential of intelligent optimization techniques. Two recently developed intelligent optimization algorithms, the Vortex Optimization Algorithm (VOA) and the Cognitive Development Optimization Algorithm (CoDOA), have been used to solve some multidisciplinary optimization problems provided in the source book Thomas' Calculus, 11th Edition, and the obtained results have been compared with classical optimization solutions.
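As a hedged illustration of the kind of comparison described above (the abstract does not give the VOA or CoDOA update rules), a minimal greedy random-search metaheuristic can be checked against the classical calculus solution of a textbook minimisation problem:

```python
import random

def f(x):
    # Textbook objective (illustrative, not from the paper): minimum at x = 2
    return x**2 - 4*x + 5

def random_search(f, lo, hi, iters=10000, seed=1):
    # Minimal population-free metaheuristic standing in for VOA/CoDOA,
    # whose exact update rules are not given in the abstract.
    rng = random.Random(seed)
    best_x = rng.uniform(lo, hi)
    best_y = f(best_x)
    step = (hi - lo) / 2
    for _ in range(iters):
        cand = min(hi, max(lo, best_x + rng.gauss(0, step)))
        y = f(cand)
        if y < best_y:
            best_x, best_y = cand, y
        step *= 0.999  # shrink the search neighbourhood over time
    return best_x, best_y

# Classical solution: f'(x) = 2x - 4 = 0  =>  x = 2, f(2) = 1
x, y = random_search(f, -10, 10)
print(x, y)  # both should land close to the analytic optimum (2, 1)
```

Comparing the returned point against the analytic optimum is exactly the kind of check the paper performs, only with far simpler machinery.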

  19. A Proposed Criterion for Launch Ramp Availability

    National Research Council Canada - National Science Library

    Dalzell, J

    2003-01-01

    The project under which the present report was produced has as an objective the development of methods for the evaluation and comparison of stern-launch and side-launch systems for small boat deployment from USCG cutters...

  20. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    Science.gov (United States)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.
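The integrated-commodity idea can be sketched as a tiny stochastic simulation. All quantities, variability ranges, and the scrub probability below are made-up placeholders for illustration, not values from the GSDO study or the Arena model:

```python
import random

# Each countdown attempt draws commodity usage with process variability;
# a scrub forces a turnaround that consumes extra commodity before the
# next attempt. Units and numbers are placeholders.
USAGE_PER_ATTEMPT = {"LH2": 100.0, "LOX": 120.0, "GN2": 30.0, "GHe": 10.0}
SCRUB_EXTRA = {"LH2": 40.0, "LOX": 45.0, "GN2": 12.0, "GHe": 4.0}

def campaign(p_scrub=0.3, max_attempts=5, seed=None):
    rng = random.Random(seed)
    used = {k: 0.0 for k in USAGE_PER_ATTEMPT}
    for attempt in range(1, max_attempts + 1):
        for k, v in USAGE_PER_ATTEMPT.items():
            used[k] += v * rng.uniform(0.9, 1.1)   # process variability
        if rng.random() >= p_scrub:                # launch succeeds
            return attempt, used
        for k, v in SCRUB_EXTRA.items():           # scrub turnaround cost
            used[k] += v * rng.uniform(0.9, 1.1)
    return None, used                              # campaign exhausted

attempts, used = campaign(seed=42)
print(attempts, used)
```

Running many such campaigns would give a distribution of commodity demand per launch, which is the quantity the study's DES model assesses against ground-system capacity.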

  1. Intelligent inversion method for pre-stack seismic big data based on MapReduce

    Science.gov (United States)

    Yan, Xuesong; Zhu, Zhixin; Wu, Qinghua

    2018-01-01

    Seismic exploration is a method of oil exploration that uses seismic information; that is, by inversion of seismic data, useful reservoir parameters can be obtained to carry out exploration effectively. Pre-stack data are characterised by large data volume and abundant information, and their inversion yields rich reservoir-parameter information. Owing to the large amount of pre-stack seismic data, existing single-machine environments can no longer meet the computational needs; thus, a fast and efficient method for solving the inversion problem of pre-stack seismic data is urgently needed. Optimisation of the elastic parameters using a genetic algorithm easily falls into a local optimum, which degrades the inversion, especially for the density. Therefore, an intelligent optimisation algorithm is proposed in this paper and used for the elastic-parameter inversion of pre-stack seismic data. The algorithm improves the population initialisation strategy by using the Gardner formula and improves the genetic operators, and it obtains better inversion results in a model test with logging data: all of the elastic parameters obtained by inversion fit the logging curves of the theoretical model well, which effectively improves the inversion precision of the density. The algorithm was implemented with a MapReduce model to solve the seismic big-data inversion problem, and the experimental results show that the parallel model can effectively reduce the running time of the algorithm.
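The Gardner-formula initialisation mentioned above can be sketched as follows. The coefficients are the commonly quoted Gardner values (density ≈ 0.31·Vp^0.25 in g/cc for Vp in m/s); the jitter and population size are illustrative assumptions, not values from the paper:

```python
import random

def gardner_density(vp, a=0.31, b=0.25):
    # Gardner's empirical velocity-density relation (rho in g/cc, vp in m/s);
    # used here to seed the initial population so that density estimates
    # start in a physically plausible region instead of at random.
    return a * vp**b

def init_population(vp_profile, size=20, jitter=0.05, seed=0):
    # Each individual is a candidate density profile, one value per layer,
    # jittered around the Gardner prediction to keep diversity in the GA.
    rng = random.Random(seed)
    pop = []
    for _ in range(size):
        individual = [gardner_density(vp) * rng.uniform(1 - jitter, 1 + jitter)
                      for vp in vp_profile]
        pop.append(individual)
    return pop

vp = [2500.0, 3000.0, 3500.0]   # layer P-velocities in m/s (toy profile)
pop = init_population(vp)
print(pop[0])                    # densities in a plausible ~2.1-2.5 g/cc band
```

Starting the genetic algorithm from Gardner-consistent candidates is what narrows the search and improves the density inversion relative to purely random initialisation.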

  2. Magnetic Launch Assist Experimental Track

    Science.gov (United States)

    1999-01-01

    In this photograph, a futuristic spacecraft model sits atop a carrier on the Magnetic Launch Assist System, formerly known as the Magnetic Levitation (MagLev) System, experimental track at the Marshall Space Flight Center (MSFC). Engineers at MSFC have developed and tested Magnetic Launch Assist technologies that would use magnetic fields to levitate and accelerate a vehicle along a track at very high speeds. Similar to high-speed trains and roller coasters that use high-strength magnets to lift and propel a vehicle a couple of inches above a guideway, a Magnetic Launch Assist system would electromagnetically drive a space vehicle along the track. A full-scale, operational track would be about 1.5-miles long and capable of accelerating a vehicle to 600 mph in 9.5 seconds. This track is an advanced linear induction motor. Induction motors are common in fans, power drills, and sewing machines. Instead of spinning in a circular motion to turn a shaft or gears, a linear induction motor produces thrust in a straight line. Mounted on concrete pedestals, the track is 100-feet long, about 2-feet wide, and about 1.5-feet high. The major advantage of launch assist for NASA launch vehicles is that it reduces take-off weight, landing gear and wing size, and propellant load, resulting in significant cost savings. The US Navy and the British MOD (Ministry of Defense) are planning to use magnetic launch assist for their next generation aircraft carriers as the aircraft launch system. The US Army is considering using this technology for launching target drones for anti-aircraft training.
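As a quick sanity check on the full-scale figures quoted above (600 mph in 9.5 seconds on a ~1.5-mile track), the implied mean acceleration and acceleration run length are easy to compute, assuming constant acceleration:

```python
# Kinematics check of the quoted full-scale Magnetic Launch Assist figures.
MPH_TO_MS = 0.44704

v = 600 * MPH_TO_MS   # final speed in m/s (~268 m/s)
t = 9.5               # acceleration time, s
a = v / t             # mean acceleration, assumed constant
d = 0.5 * a * t**2    # distance covered during the acceleration run

print(round(a, 1), "m/s^2 =", round(a / 9.81, 1), "g")
print(round(d), "m used of the ~2400 m (1.5 mi) track")
```

The run works out to roughly 28 m/s² (about 2.9 g) over about 1.3 km, which is consistent with an acceleration segment fitting inside the quoted 1.5-mile track with margin for release and deceleration hardware.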

  3. Rationales for the Lightning Launch Commit Criteria

    Science.gov (United States)

    Willett, John C. (Editor); Merceret, Francis J. (Editor); Krider, E. Philip; O'Brien, T. Paul; Dye, James E.; Walterscheid, Richard L.; Stolzenburg, Maribeth; Cummins, Kenneth; Christian, Hugh J.; Madura, John T.

    2016-01-01

    Since natural and triggered lightning are demonstrated hazards to launch vehicles, payloads, and spacecraft, NASA and the Department of Defense (DoD) follow the Lightning Launch Commit Criteria (LLCC) for launches from Federal Ranges. The LLCC were developed to prevent future instances of a rocket intercepting natural lightning or triggering a lightning flash during launch from a Federal Range. NASA and DoD utilize the Lightning Advisory Panel (LAP) to establish and develop robust rationale from which the criteria originate. The rationale document also contains appendices that provide additional scientific background, including detailed descriptions of the theory and observations behind the rationales. The LLCC in whole or part are used across the globe due to the rigor of the documented criteria and associated rationale. The Federal Aviation Administration (FAA) adopted the LLCC in 2006 for commercial space transportation and the criteria were codified in the FAA's Code of Federal Regulations (CFR) for Safety of an Expendable Launch Vehicle (Appendix G to 14 CFR Part 417, (G417)) and renamed Lightning Flight Commit Criteria in G417.

  4. 14 CFR 420.21 - Launch site location review-launch site boundary.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Launch site location review-launch site boundary. 420.21 Section 420.21 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION... travels given a worst-case launch vehicle failure in the launch area. An applicant must clearly and...

  5. Aircraft operability methods applied to space launch vehicles

    Science.gov (United States)

    Young, Douglas

    1997-01-01

    The commercial space launch market requirement for low vehicle operations costs necessitates the application of methods and technologies developed and proven for complex aircraft systems. The "building in" of reliability and maintainability, which is applied extensively in the aircraft industry, has yet to be applied to the maximum extent possible on launch vehicles. Use of vehicle system and structural health monitoring, automated ground systems, and diagnostic design methods derived from aircraft applications supports the goal of achieving low-cost launch vehicle operations. Transforming these operability techniques to space applications, where diagnostic effectiveness has significantly different metrics, is critical to the success of future launch systems. These concepts will be discussed with reference to broad launch vehicle applicability. Lessons learned and techniques used in the adaptation of these methods will be outlined, drawing from recent aircraft programs and implementation on phase 1 of the X-33/RLV technology development program.

  6. JPSS CGS Tools For Rapid Algorithm Updates

    Science.gov (United States)

    Smith, D. C.; Grant, K. D.

    2011-12-01

    The National Oceanic and Atmospheric Administration (NOAA) and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation civilian weather and environmental satellite system: the Joint Polar Satellite System (JPSS). JPSS will contribute the afternoon orbit component and ground processing system of the restructured National Polar-orbiting Operational Environmental Satellite System (NPOESS). As such, JPSS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the ground processing component of both POES and the Defense Meteorological Satellite Program (DMSP) replacement known as the Defense Weather Satellite System (DWSS), managed by the Department of Defense (DoD). The JPSS satellites will carry a suite of sensors designed to collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground processing system for JPSS is known as the JPSS Common Ground System (JPSS CGS), and consists of a Command, Control, and Communications Segment (C3S) and the Interface Data Processing Segment (IDPS). Both are developed by Raytheon Intelligence and Information Systems (IIS). The Interface Data Processing Segment will process NPOESS Preparatory Project, Joint Polar Satellite System and Defense Weather Satellite System satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. Under NPOESS, Northrop Grumman Aerospace Systems Algorithms and Data Products (A&DP) organization was responsible for the algorithms that produce the EDRs, including their quality aspects. For JPSS, that responsibility has transferred to NOAA's Center for Satellite Applications & Research (STAR). As the Calibration and Validation (Cal/Val) activities move forward following both the NPP launch and subsequent JPSS and DWSS launches, rapid algorithm updates may be required. Raytheon and

  7. Optimization model of conventional missile maneuvering route based on improved Floyd algorithm

    Science.gov (United States)

    Wu, Runping; Liu, Weidong

    2018-04-01

    Missile combat plays a crucial role in victory under high-tech conditions. According to the characteristics of the maneuver tasks of conventional missile units in combat operations, the factors influencing road maneuvering are analyzed, including road distance, road conflicts, launching device speed, position requirements, launch device deployment, and concealment. A shortest-time optimization model was built to treat road conflicts and strategies for resolving them. The results suggest that, when resolving a road conflict, waiting at a node is better than detouring by another route. In this study, we analyzed the deficiency of the traditional Floyd algorithm, which may exclude the optimal way of resolving a road conflict, put forward an improved Floyd algorithm, and designed an algorithm flow that outperforms the traditional Floyd algorithm. Finally, through a numerical example, the model and the algorithm were shown to be reliable and effective.
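The baseline the paper improves on is the classical Floyd (Floyd-Warshall) all-pairs shortest-path algorithm. A minimal sketch with path reconstruction is below; the conflict-handling improvements are not specified in the abstract, so only the standard algorithm is shown:

```python
INF = float("inf")

def floyd_warshall(dist):
    # Classical all-pairs shortest paths on a distance matrix; the paper's
    # improvement layers road-conflict handling on top of this baseline.
    n = len(dist)
    d = [row[:] for row in dist]
    nxt = [[j if d[i][j] < INF else None for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
                    nxt[i][j] = nxt[i][k]
    return d, nxt

def path(nxt, i, j):
    # Reconstruct the node sequence of a shortest route from i to j.
    if nxt[i][j] is None:
        return []
    p = [i]
    while i != j:
        i = nxt[i][j]
        p.append(i)
    return p

# Toy road network: the route 0 -> 1 -> 3 beats the direct 0 -> 3 road.
g = [[0, 2, 9, 10],
     [2, 0, 4, 3],
     [9, 4, 0, 2],
     [10, 3, 2, 0]]
d, nxt = floyd_warshall(g)
print(d[0][3], path(nxt, 0, 3))   # 5 [0, 1, 3]
```

Keeping the `nxt` successor matrix is what allows the maneuvering route itself, not just its length, to be recovered for each launch unit.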

  8. The Cost-Optimal Size of Future Reusable Launch Vehicles

    Science.gov (United States)

    Koelle, D. E.

    2000-07-01

    The paper answers the question: what is the optimum vehicle size, in terms of LEO payload capability, for a future reusable launch vehicle? It is shown that there exists an optimum vehicle size that results in minimum specific transportation cost. The optimum vehicle size depends on the total annual cargo mass (LEO equivalent) envisaged, which defines at the same time the optimum number of launches per year (LpA). Based on the TRANSCOST-Model algorithms, a wide range of vehicle sizes (from 20 to 100 Mg payload in LEO) as well as launch rates (from 2 to 100 per year) have been investigated. A design chart shows how much the vehicle size and the launch rate influence the specific transportation cost (in MYr/Mg and US$/kg). The comparison with actual ELVs (Expendable Launch Vehicles) and semi-reusable vehicles (a combination of a reusable first stage with an expendable second stage) shows that there exists only one economic solution for an essential reduction of space transportation cost: the fully reusable vehicle concept, with rocket propulsion and vertical take-off. The single-stage configuration (SSTO) has the best economic potential; its feasibility is not only a matter of technology level but also of the vehicle size as such. Increasing the vehicle size (launch mass) reduces the technology requirements because the law of scale provides a better mass fraction and payload fraction, practically at no cost. The optimum vehicle design (after specification of the payload capability) requires a trade-off between lightweight (and more expensive) technology vs. more conventional (and cheaper) technology. It is shown that the use of more conventional technology and acceptance of a somewhat larger vehicle is the more cost-effective and less risky approach.
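The economic logic above can be illustrated with a toy specific-cost curve. These are NOT the actual TRANSCOST equations; the functional form (fixed annual cost amortized over the launch rate, plus a per-flight cost that grows sub-linearly with vehicle size) and all coefficients are placeholders chosen only to show why specific cost falls with launch rate:

```python
def specific_cost(payload_mg, launches_per_year,
                  fixed_annual=500.0,     # fixed annual cost, placeholder units
                  per_flight_coeff=10.0,  # per-flight cost at 20 Mg, placeholder
                  scale_exp=0.7):         # sub-linear cost growth with size
    # Per-flight cost grows slower than payload (economy of scale in size).
    per_flight = per_flight_coeff * (payload_mg / 20.0) ** scale_exp
    total_annual = fixed_annual + per_flight * launches_per_year
    annual_cargo = payload_mg * launches_per_year
    return total_annual / annual_cargo    # cost per Mg to LEO

for lpa in (2, 10, 50, 100):
    print(lpa, round(specific_cost(50.0, lpa), 2))
```

With any such structure, the fixed-cost share per flight shrinks as the launch rate rises, reproducing the qualitative trend of the paper's design chart.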

  9. Development of a Smart Release Algorithm for Mid-Air Separation of Parachute Test Articles

    Science.gov (United States)

    Moore, James W.

    2011-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) project is currently developing an autonomous method to separate a capsule-shaped parachute test vehicle from an air-drop platform for use in the test program to develop and validate the parachute system for the Orion spacecraft. The CPAS project seeks to perform air-drop tests of an Orion-like boilerplate capsule. Delivery of the boilerplate capsule to the test condition has proven to be a critical and complicated task. In the current concept, the boilerplate vehicle is extracted from an aircraft on top of a Type V pallet and then separated from the pallet in mid-air. The attitude of the vehicles at separation is critical to avoiding re-contact and successfully deploying the boilerplate into a heatshield-down orientation. Neither the pallet nor the boilerplate has an active control system. However, the attitude of the mated vehicle as a function of time is somewhat predictable. CPAS engineers have designed an avionics system to monitor the attitude of the mated vehicle as it is extracted from the aircraft and command a release when the desired conditions are met. The algorithm includes contingency capabilities designed to release the test vehicle before undesirable orientations occur. The algorithm was verified with simulation and ground testing. The pre-flight development and testing is discussed and limitations of ground testing are noted. The CPAS project performed a series of three drop tests as a proof-of-concept of the release technique. These tests helped to refine the attitude instrumentation and software algorithm to be used on future tests. The drop tests are described in detail and the evolution of the release system with each test is described.
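The release logic described above, monitor attitude, command release inside a target window, and force a contingency release before an undesirable orientation, can be sketched as follows. The window limits, rate condition, and thresholds are illustrative assumptions, not CPAS flight values:

```python
# Hedged sketch of an attitude-window release decision.
def release_decision(pitch_deg, pitch_rate_dps,
                     target=(-10.0, 10.0), unsafe_pitch=45.0):
    lo, hi = target
    if abs(pitch_deg) >= unsafe_pitch:
        # Contingency: release before the orientation becomes undesirable.
        return "CONTINGENCY_RELEASE"
    if lo <= pitch_deg <= hi and pitch_rate_dps < 0:
        # Nominal: attitude inside the target window, nose falling through
        # toward the heatshield-down orientation.
        return "RELEASE"
    return "HOLD"

print(release_decision(5.0, -2.0))    # nominal release
print(release_decision(50.0, 1.0))    # contingency release
print(release_decision(20.0, -1.0))   # keep waiting
```

A real implementation would run this check against filtered IMU data at a fixed rate during extraction; the point here is only the window-plus-contingency structure.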

  10. Testing Algorithmic Skills in Traditional and Non-Traditional Programming Environments

    Science.gov (United States)

    Csernoch, Mária; Biró, Piroska; Máth, János; Abari, Kálmán

    2015-01-01

    The Testing Algorithmic and Application Skills (TAaAS) project was launched in the 2011/2012 academic year to test first year students of Informatics, focusing on their algorithmic skills in traditional and non-traditional programming environments, and on the transference of their knowledge of Informatics from secondary to tertiary education. The…

  11. Maternal pre-pregnancy obesity and neuropsychological development in pre-school children: a prospective cohort study.

    Science.gov (United States)

    Casas, Maribel; Forns, Joan; Martínez, David; Guxens, Mònica; Fernandez-Somoano, Ana; Ibarluzea, Jesus; Lertxundi, Nerea; Murcia, Mario; Rebagliato, Marisa; Tardon, Adonina; Sunyer, Jordi; Vrijheid, Martine

    2017-10-01

    Background: Maternal pre-pregnancy obesity may impair infant neuropsychological development, but it is unclear whether intrauterine or confounding factors drive this association. Methods: We assessed whether maternal pre-pregnancy obesity was associated with neuropsychological development in 1,827 Spanish children. At 5 years, cognitive and psychomotor development was assessed using the McCarthy Scales of Children's Abilities, attention deficit hyperactivity disorder (ADHD) symptoms using the criteria of the Diagnostic and Statistical Manual of Mental Disorders, and autism spectrum disorder symptoms using the Childhood Asperger Syndrome Test. Models were adjusted for sociodemographic factors and maternal intelligence quotient. We used paternal obesity as a negative-control exposure, as it involves the same sources of confounding as maternal obesity. Results: The percentages of obese mothers and fathers were 8% and 12%, respectively. In unadjusted models, children of obese mothers had lower scores than children of normal-weight mothers on all McCarthy subscales. After adjustment, only the verbal subscale remained statistically significantly reduced (β: -2.8; 95% confidence interval: -5.3, -0.2). No associations were observed for obese fathers. Maternal and paternal obesity were both associated with an increase in ADHD-related symptoms. Parental obesity was not associated with autism symptoms. Conclusion: Maternal pre-pregnancy obesity was associated with a reduction in offspring verbal scores at pre-school age.

  12. Development of pre-critical excore detector linear subchannel calibration method

    International Nuclear Information System (INIS)

    Choi, Yoo Sun; Goo, Bon Seung; Cha, Kyun Ho; Lee, Chang Seop; Kim, Yong Hee; Ahn, Chul Soo; Kim, Man Soo

    2001-01-01

    An improved pre-critical excore detector linear subchannel calibration method has been developed to improve the applicability of the pre-critical calibration method. The existing method does not always guarantee the accuracy of pre-critical calibration because the calibration results of the previous cycle are not reflected in the current cycle calibration. The developed method has the desirable feature that calibration error is not propagated into following cycles, since the calibration data determined in the previous cycle are incorporated into the current cycle calibration. The pre-critical excore detector linear calibration was tested for YGN unit 3 and UCN unit 3 to evaluate its characteristics and accuracy.

  13. Google Chrome OS: Cultural influence on product launch strategy between India and developed countries

    OpenAIRE

    Santhosh, Arjun

    2011-01-01

    In recent times, product launch has become a vital deciding factor in the success of a product. The significance of product launch becomes even higher if the product is radically new and different from existing products in the market. The aim of this dissertation is to look into the possible factors which might influence the product launch of the Google Chrome Operating System, which has radical concepts and design. The essential variations which might be needed for the successful launch in India as c...

  14. Space gravitational wave detector DECIGO/pre-DECIGO

    Science.gov (United States)

    Musha, Mitsuru

    2017-09-01

    Gravitational waves (GW) are ripples in gravitational fields caused by the motion of mass, such as the inspiral and merger of black-hole binaries or the explosion of supernovae, predicted by A. Einstein in his general theory of relativity. In Japan, besides the ground-based GW detector KAGRA, the space gravitational wave detector DECIGO is also promoted for detecting GW in a lower frequency range. DECIGO (DECi-hertz Interferometer Gravitational-wave Observatory) consists of 3 satellites forming a 1000-km triangle-shaped Fabry-Perot laser interferometer with a designed strain sensitivity of Δl/l. A milestone mission for DECIGO named Pre-DECIGO has been planned, which has almost the same configuration as DECIGO with a shorter arm length of 100 km. Pre-DECIGO is aimed at detecting GW from mergers of black-hole binaries with less sensitivity than DECIGO, and also at feasibility tests of key technologies for realizing DECIGO. Pre-DECIGO is now being designed and developed for launch in the late 2020s, with the financial support of JAXA and JSPS. In our presentation, we will review the DECIGO project and show the design and current status of Pre-DECIGO.

  15. Reusable Launch Vehicle Technology Program

    Science.gov (United States)

    Freeman, Delma C., Jr.; Talay, Theodore A.; Austin, R. Eugene

    1997-01-01

    Industry/NASA reusable launch vehicle (RLV) technology program efforts are underway to design, test, and develop technologies and concepts for viable commercial launch systems that also satisfy national needs at acceptable recurring costs. Significant progress has been made in understanding the technical challenges of fully reusable launch systems and the accompanying management and operational approaches for achieving a low cost program. This paper reviews the current status of the RLV technology program including the DC-XA, X-33 and X-34 flight systems and associated technology programs. It addresses the specific technologies being tested that address the technical and operability challenges of reusable launch systems including reusable cryogenic propellant tanks, composite structures, thermal protection systems, improved propulsion and subsystem operability enhancements. The recently concluded DC-XA test program demonstrated some of these technologies in ground and flight test. Contracts were awarded recently for both the X-33 and X-34 flight demonstrator systems. The Orbital Sciences Corporation X-34 flight test vehicle will demonstrate an air-launched reusable vehicle capable of flight to speeds of Mach 8. The Lockheed-Martin X-33 flight test vehicle will expand the test envelope for critical technologies to flight speeds of Mach 15. A propulsion program to test the X-33 linear aerospike rocket engine using a NASA SR-71 high speed aircraft as a test bed is also discussed. The paper also describes the management and operational approaches that address the challenge of new cost effective, reusable launch vehicle systems.

  16. Supporting pre-service science teachers in developing culturally relevant pedagogy

    Science.gov (United States)

    Krajeski, Stephen

    This study employed a case study methodology to investigate a near-authentic intervention program designed to support the development of culturally relevant pedagogy and its impact on pre-service science teachers' notions of culturally relevant pedagogy. The unit of analysis for this study was the discourse of pre-service science teachers enrolled in a second semester science methods course, which was the site of the intervention program. Data for this study was collected from videos of classroom observations, audio recordings of personal interviews, and artifacts created by the pre-service science teachers during the class. To determine how effective science teacher certification programs are at supporting the development of culturally relevant pedagogy without an immersion aspect, two research questions were investigated: 1) How do pre-service science teachers view and design pedagogy while participating in an intervention designed to support the development of culturally relevant pedagogy? 2) How do pre-service science teachers view the importance of culturally relevant pedagogy for supporting student learning? How do their practices in the field change these initial views?

  17. Space Launch System Spacecraft and Payload Elements: Progress Toward Crewed Launch and Beyond

    Science.gov (United States)

    Schorr, Andrew A.; Smith, David Alan; Holcomb, Shawn; Hitt, David

    2017-01-01

    While significant and substantial progress continues to be accomplished toward readying the Space Launch System (SLS) rocket for its first test flight, work is already underway on preparations for the second flight - using an upgraded version of the vehicle - and beyond. Designed to support human missions into deep space, SLS is the most powerful human-rated launch vehicle the United States has ever undertaken, and is one of three programs being managed by the National Aeronautics and Space Administration's (NASA's) Exploration Systems Development division. The Orion spacecraft program is developing a new crew vehicle that will support human missions beyond low Earth orbit (LEO), and the Ground Systems Development and Operations (GSDO) program is transforming Kennedy Space Center (KSC) into a next-generation spaceport capable of supporting not only SLS but also multiple commercial users. Together, these systems will support human exploration missions into the proving ground of cislunar space and ultimately to Mars. For its first flight, SLS will deliver a near-term heavy-lift capability for the nation with its 70-metric-ton (t) Block 1 configuration. Each element of the vehicle now has flight hardware in production in support of the initial flight of the SLS, which will propel Orion around the moon and back. Encompassing hardware qualification, structural testing to validate hardware compliance and analytical modeling, progress is on track to meet the initial targeted launch date. In Utah and Mississippi, booster and engine testing are verifying upgrades made to proven shuttle hardware. At Michoud Assembly Facility (MAF) in Louisiana, the world's largest spacecraft welding tool is producing tanks for the SLS core stage. Providing the Orion crew capsule/launch vehicle interface and in-space propulsion via a cryogenic upper stage, the Spacecraft/Payload Integration and Evolution (SPIE) element serves a key role in achieving SLS goals and objectives. 
The SPIE element

  18. Non-contact and contact measurement system for detecting projectile position in electromagnetic launch bore

    Science.gov (United States)

    Xu, Weidong; Yuan, Weiqun; Xu, Rong; Zhao, Hui; Cheng, Wenping; Zhang, Dongdong; Zhao, Ying; Yan, Ping

    2017-12-01

    This paper introduces a new measurement system for measuring the position of a projectile within a rapid-fire electromagnetic launching system. The measurement system contains both non-contact laser shading and metal-fiber contact measurement devices. Two projectiles are placed in the rapid-fire electromagnetic launch bore, one in the main accelerating segment and the other in the pre-loading segment. The projectile placed in the main accelerating segment is shot first, and then the other is loaded into the main segment from the pre-loading segment. The main driving current (I-main) can only be discharged again when the second projectile has arrived at the key position (the projectile position corresponding to the discharging time) in the main accelerating segment. It is therefore important to detect when the second projectile arrives at the key position in the main accelerating segment. The B-dot probe is the most widely used sensor for detecting the position of a projectile in an electromagnetic launch bore. However, the B-dot signal is affected by the driving current amplitude and the projectile velocity, and in rapid-fire mode there is no current in the main accelerating segment when the second projectile moves into it, so the B-dot signal is invalid for detecting the key position. Due to the presence of a high-intensity magnetic field, a high current, a high-temperature aluminum attachment, smoke, and strong vibrations, it is very difficult to detect the projectile position in the bore accurately, so other measurement methods need to be researched and developed to achieve high reliability. A measurement system based on a laser (non-contact) and metal fibers (contact) has been designed, and its integrated output signal is described in this paper.

  19. Launch Pad in a Box

    Science.gov (United States)

    Mantovani, James; Tamasy, Gabor; Mueller, Rob; Townsend, Van; Sampson, Jeff; Lane, Mike

    2016-01-01

    NASA Kennedy Space Center (KSC) is developing a new deployable launch system capability to support a small class of launch vehicles for NASA and commercial space companies to test and launch their vehicles. The deployable launch pad concept was first demonstrated on a smaller scale at KSC in 2012 in support of NASA Johnson Space Center's Morpheus Lander Project. The main objective of the Morpheus Project was to test a prototype planetary lander as a vertical takeoff and landing test-bed for advanced spacecraft technologies using a hazard field that KSC had constructed at the Shuttle Landing Facility (SLF). A steel pad for launch or landing was constructed using a modular design that allowed it to be reconfigurable and expandable. A steel flame trench was designed as an optional module that could be easily inserted in place of any modular steel plate component. The concept of a transportable modular launch and landing pad may also be applicable to planetary surfaces where the effects of rocket exhaust plume on surface regolith is problematic for hardware on the surface that may either be damaged by direct impact of high speed dust particles, or impaired by the accumulation of dust (e.g., solar array panels and thermal radiators). During the Morpheus free flight campaign in 2013-14, KSC performed two studies related to rocket plume effects. One study compared four different thermal ablatives that were applied to the interior of a steel flame trench that KSC had designed and built. The second study monitored the erosion of a concrete landing pad following each landing of the Morpheus vehicle on the same pad located in the hazard field. All surfaces of a portable flame trench that could be directly exposed to hot gas during launch of the Morpheus vehicle were coated with four types of ablatives. All ablative products had been tested by NASA KSC and/or the manufacturer. 
The ablative thicknesses were measured periodically following the twelve Morpheus free flight tests

  20. Development and Application of a Portable Health Algorithms Test System

    Science.gov (United States)

    Melcher, Kevin J.; Fulton, Christopher E.; Maul, William A.; Sowers, T. Shane

    2007-01-01

    This paper describes the development and initial demonstration of a Portable Health Algorithms Test (PHALT) System that is being developed by researchers at the NASA Glenn Research Center (GRC). The PHALT System was conceived as a means of evolving the maturity and credibility of algorithms developed to assess the health of aerospace systems. Comprising an integrated hardware-software environment, the PHALT System allows systems health management algorithms to be developed in a graphical programming environment; to be tested and refined using system simulation or test data playback; and finally, to be evaluated in a real-time hardware-in-the-loop mode with a live test article. In this paper, PHALT System development is described through the presentation of a functional architecture, followed by the selection and integration of hardware and software. Also described is an initial real-time hardware-in-the-loop demonstration that used sensor data qualification algorithms to diagnose and isolate simulated sensor failures in a prototype Power Distribution Unit test-bed. Success of the initial demonstration is highlighted by the correct detection of all sensor failures and the absence of any real-time constraint violations.
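The sensor data qualification step described above lends itself to a simple illustration. The sketch below (hypothetical limit values and function names, not GRC's implementation) flags samples that violate range or rate-of-change limits, the kind of check used to detect stuck or failed sensors:

```python
# Hypothetical sketch of a sensor data qualification check of the kind the
# PHALT demonstration exercised. Limit values here are illustrative only.

def qualify_samples(samples, lo, hi, max_rate):
    """Return a list of (index, reason) flags for suspect samples."""
    flags = []
    for i, value in enumerate(samples):
        if not (lo <= value <= hi):
            flags.append((i, "range"))          # outside physical limits
        elif i > 0 and abs(value - samples[i - 1]) > max_rate:
            flags.append((i, "rate"))           # implausibly fast change
    return flags

# Example: a spike at index 3 (out of range) and a jump at index 5.
flags = qualify_samples([28.1, 28.3, 28.2, 99.9, 28.4, 30.9],
                        lo=0.0, hi=50.0, max_rate=1.0)
```

A real qualification suite would add persistence logic so that a single noisy sample does not immediately declare a sensor failed.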

  1. Assessing Upper-Level Winds on Day-of-Launch

    Science.gov (United States)

    Bauman, William H., III; Wheeler, Mark M.

    2012-01-01

    On the day-of-launch, the 45th Weather Squadron Launch Weather Officers (LWOs) monitor the upper-level winds for their launch customers, including NASA's Launch Services Program (LSP). During launch operations, the payload launch team sometimes asks the LWO whether they expect the upper-level winds to change during the countdown, but the LWOs did not have the capability to quickly retrieve or display the upper-level observations and compare them to the numerical weather prediction model point forecasts. The LWOs requested the Applied Meteorology Unit (AMU) develop a capability, in the form of a graphical user interface (GUI), that would allow them to plot upper-level wind speed and direction observations from the Kennedy Space Center Doppler Radar Wind Profilers and Cape Canaveral Air Force Station rawinsondes, and then overlay model point forecast profiles on the observation profiles to assess the performance of these models and graphically display them to the launch team. The AMU developed an Excel-based capability for the LWOs to assess the model forecast upper-level winds and compare them to observations. They did so by creating a GUI in Excel that allows the LWOs to first initialize the models by comparing the 0-hour model forecasts to the observations and then to display model forecasts in 3-hour intervals from the current time through 12 hours.
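The 0-hour initialization check the GUI performs can be sketched as follows (a hedged Python illustration, not the AMU's Excel implementation; all profile values are invented): interpolate the model point forecast to the observed altitudes and score the wind-speed agreement.

```python
# Sketch: compare an observed wind-speed profile against a model point
# forecast by interpolating the model to the observation altitudes and
# computing the RMS speed difference. Values are hypothetical.
import math

def rms_speed_error(obs_alt, obs_spd, mdl_alt, mdl_spd):
    """RMS wind-speed difference, model interpolated to observation altitudes."""
    errs = []
    for z, s in zip(obs_alt, obs_spd):
        # linear interpolation of the model profile to altitude z
        for (z0, s0), (z1, s1) in zip(zip(mdl_alt, mdl_spd),
                                      zip(mdl_alt[1:], mdl_spd[1:])):
            if z0 <= z <= z1:
                frac = (z - z0) / (z1 - z0)
                errs.append(s - (s0 + frac * (s1 - s0)))
                break
    return math.sqrt(sum(e * e for e in errs) / len(errs))

# 0-hour "initialization" check: a small RMS error suggests the model run
# represents the observed profile well.
err = rms_speed_error([1000, 2000, 3000], [10.0, 15.0, 22.0],
                      [500, 1500, 2500, 3500], [9.0, 12.0, 18.0, 24.0])
```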

  2. Diagram of Saturn V Launch Vehicle

    Science.gov (United States)

    1971-01-01

    This is a good cutaway diagram of the Saturn V launch vehicle showing the three stages, the instrument unit, and the Apollo spacecraft. The chart on the right presents the basic technical data in clear detail. The Saturn V is the largest and most powerful launch vehicle in the United States. The towering 363-foot Saturn V was a multistage, multiengine launch vehicle standing taller than the Statue of Liberty. Altogether, the Saturn V engines produced as much power as 85 Hoover Dams. Development of the Saturn V was the responsibility of the Marshall Space Flight Center at Huntsville, Alabama, directed by Dr. Wernher von Braun.

  3. Battery algorithm verification and development using hardware-in-the-loop testing

    Science.gov (United States)

    He, Yongsheng; Liu, Wei; Koch, Brian J.

    Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO4) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate battery algorithm future development and improvement, and to reduce hybrid/electric vehicle system development time and costs.
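As a simple illustration of the kind of embedded SOC algorithm such a HIL rig verifies, the sketch below implements plain coulomb counting (the production algorithms are more sophisticated, typically model-based with voltage correction; all values here are invented):

```python
# Illustrative coulomb-counting SOC estimator: integrate measured cell
# current into a state-of-charge trace. Positive current = discharge.

def coulomb_count(soc0, currents_a, dt_s, capacity_ah):
    """Return the SOC trace produced by integrating the current samples."""
    soc, trace = soc0, [soc0]
    for i_a in currents_a:
        soc -= (i_a * dt_s) / (capacity_ah * 3600.0)  # Ah drawn / Ah capacity
        soc = min(max(soc, 0.0), 1.0)                 # clamp to [0, 1]
        trace.append(soc)
    return trace

# A 1C discharge of a 2 Ah cell for one hour drains 100% of capacity.
trace = coulomb_count(soc0=1.0, currents_a=[2.0] * 3600,
                      dt_s=1.0, capacity_ah=2.0)
```

In a HIL test, a trace like this would be compared sample-by-sample against the embedded algorithm's SOC output while the real cell follows the same current profile.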

  4. Battery algorithm verification and development using hardware-in-the-loop testing

    Energy Technology Data Exchange (ETDEWEB)

    He, Yongsheng [General Motors Global Research and Development, 30500 Mound Road, MC 480-106-252, Warren, MI 48090 (United States); Liu, Wei; Koch, Brian J. [General Motors Global Vehicle Engineering, Warren, MI 48090 (United States)

    2010-05-01

    Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO4) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate battery algorithm future development and improvement, and to reduce hybrid/electric vehicle system development time and costs. (author)

  5. The Importance of Post-Launch, On-Orbit Absolute Radiometric Calibration for Remote Sensing Applications

    Science.gov (United States)

    Kuester, M. A.

    2015-12-01

    Remote sensing is a powerful tool for monitoring changes on the surface of the Earth at a local or global scale. The use of data sets from different sensors across many platforms, or even a single sensor over time, can bring a wealth of information when exploring anthropogenic changes to the environment. For example, variations in crop yield and health for a specific region can be detected by observing changes in the spectral signature of the particular species under study. However, changes in the atmosphere, sun illumination, and viewing geometries during image capture can result in inconsistent image data, hindering automated information extraction. Additionally, an incorrect spectral radiometric calibration will lead to false or misleading results. It is therefore critical that the data being used are normalized and calibrated on a regular basis to ensure that physically derived variables are as close to truth as is possible. Although most earth observing sensors are well-calibrated in a laboratory prior to launch, a change in the radiometric response of the system is inevitable due to thermal, mechanical, or electrical effects caused during the rigors of launch or by the space environment itself. Outgassing and exposure to ultraviolet radiation will also have an effect on the sensor's filter responses. Pre-launch lamps and other laboratory calibration systems can also fall short in representing the actual output of the Sun. Differences in the results for science variables in some example cases (e.g., geology, agriculture) derived using pre- versus post-launch calibration will be presented using DigitalGlobe's WorldView-3 super-spectral sensor, with bands in the visible and near infrared, as well as in the shortwave infrared. Important defects caused by an incomplete (i.e., pre-launch only) calibration will be discussed using validation data where available. 
In addition, the benefits of using a well-validated surface reflectance product will be

  6. Fast algorithm of adaptive Fourier series

    Science.gov (United States)

    Gao, You; Ku, Min; Qian, Tao

    2018-05-01

    Adaptive Fourier decomposition (AFD, precisely 1-D AFD or Core-AFD) originated with the goal of positive-frequency representations of signals. It achieved that goal and at the same time offered fast decompositions of signals. Several types of AFDs then arose. AFD merged with the greedy algorithm idea and, in particular, motivated the so-called pre-orthogonal greedy algorithm (Pre-OGA), which was proven to be the most efficient greedy algorithm. The cost of the advantages of the AFD-type decompositions is, however, high computational complexity due to the involvement of maximal selections of the dictionary parameters. The present paper offers a formulation of the 1-D AFD algorithm that builds the FFT algorithm into it. Accordingly, the algorithm complexity is reduced from the original $\mathcal{O}(M N^2)$ to $\mathcal{O}(M N\log_2 N)$, where $N$ denotes the number of discretization points on the unit circle and $M$ denotes the number of points in $[0,1)$. This greatly enhances the applicability of AFD. Experiments are carried out to show the high efficiency of the proposed algorithm.
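The complexity reduction rests on replacing direct $\mathcal{O}(N^2)$ evaluation of the discretized sums with an FFT. A generic, self-contained illustration of that trade (not the paper's AFD code) compares a radix-2 Cooley-Tukey FFT against the direct DFT:

```python
# Radix-2 Cooley-Tukey FFT (N a power of two) versus the direct O(N^2) DFT.
# Both compute the same transform; the FFT does it in O(N log2 N) operations.
import cmath

def dft_direct(x):                     # O(N^2)
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def fft(x):                            # O(N log2 N), len(x) a power of two
    N = len(x)
    if N == 1:
        return x[:]
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * N
    for k in range(N // 2):
        t = cmath.exp(-2j * cmath.pi * k / N) * odd[k]
        out[k] = even[k] + t
        out[k + N // 2] = even[k] - t
    return out

x = [complex(n % 5, 0) for n in range(16)]
X_fast, X_slow = fft(x), dft_direct(x)
max_err = max(abs(a - b) for a, b in zip(X_fast, X_slow))
```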

  7. Launching a nuclear power programme in a developing country - Technical and Scientific Support Organisations (TSO) in capacity building

    International Nuclear Information System (INIS)

    Ngotho, E.M.

    2010-01-01

    The need for involvement of Technical and Scientific Support Organisations (TSO) in developing countries intending to launch a nuclear power programme (NPP) cannot be overemphasized. At an International Conference on Topical Issues in Nuclear Installation Safety held in 2008 in Mumbai, India, I presented a paper entitled 'Launching a Nuclear Power Programme - a third world country's perspective' - IAEA-CN-158/9. I pointed out some real constraints encountered by a developing country while trying to introduce a nuclear power programme. These were inadequate base infrastructure, financial incapability, and a lack of skilled manpower. Granted, there are areas where the role of TSOs is minimal, such as carrying the actual cost of infrastructure, but their input in the areas of technology, evaluation, assessment, and skills development cannot be gainsaid. (author)

  8. Online nutrition and T2DM continuing medical education course launched on state-level medical association.

    Science.gov (United States)

    Hicks, Kristen K; Murano, Peter S

    2017-01-01

    The purpose of this research study was to determine whether a 1-hour online continuing medical education (CME) course focused on nutrition for type 2 diabetes would result in a gain in nutrition knowledge by practicing physicians. A practicing physician and dietitian collaborated to develop an online CME course (both webinar and self-study versions) on type 2 diabetes. This 1-hour accredited course was launched through the state-level medical association's education library, available to all physicians. Physicians (n=43) registered for the course, and of those, 31 completed the course in its entirety. A gain in knowledge was found when comparing pre- versus post-test scores related to the online nutrition CME (P …). Online CME courses launched via state-level medical associations offer convenient continuing education to assist practicing physicians in addressing patient nutrition and lifestyle concerns related to chronic disease. The present diabetes CME one-credit course allowed physicians to develop basic nutrition care concepts on this topic to better assist patients.

  9. Shape Memory Alloy (SMA)-Based Launch Lock

    Science.gov (United States)

    Badescu, Mircea; Bao, Xiaoqi; Bar-Cohen, Yoseph

    2014-01-01

    Most NASA missions require the use of a launch lock for securing moving components during the launch or securing the payload before release. A launch lock is a device used to prevent unwanted motion and secure the controlled components. Current launch locks are based on pyrotechnic, electromechanical, or NiTi-driven pin pullers; they are mostly one-time-use mechanisms that are usually bulky and involve a relatively high mass. Generally, piezoelectric actuation provides high-precision, nanometer-level accuracy, but it relies on friction to generate displacement. During launch, the generated vibrations can release the normal force between the actuator components, allowing the shaft's free motion, which could result in damage to the actuated structures or instruments. This problem is common to other linear actuators that consist of a ball screw mechanism. The authors are exploring the development of a novel launch lock mechanism that is activated by a shape memory alloy (SMA) material ring, a rigid element, and an SMA ring holding flexure. The proposed design and analytical model will be described and discussed in this paper.

  10. The reusable launch vehicle technology program

    Science.gov (United States)

    Cook, S.

    1995-01-01

    Today's launch systems have major shortcomings that will increase in significance in the future, and thus are principal drivers for seeking major improvements in space transportation. They are too costly; insufficiently reliable, safe, and operable; and increasingly losing market share to international competition. For the United States to continue its leadership in the human exploration and wide-ranging utilization of space, the first order of business must be to achieve low-cost, reliable transportation to Earth orbit. NASA's Access to Space Study, in 1993, recommended the development of a fully reusable single-stage-to-orbit (SSTO) rocket vehicle as an Agency goal. The goal of the Reusable Launch Vehicle (RLV) technology program is to mature the technologies essential for a next-generation reusable launch system capable of reliably serving National space transportation needs at substantially reduced costs. The primary objectives of the RLV technology program are to (1) mature the technologies required for the next-generation system, (2) demonstrate the capability to achieve low development and operational cost, and rapid launch turnaround times, and (3) reduce business and technical risks to encourage significant private investment in the commercial development and operation of the next-generation system. Developing and demonstrating the technologies required for an SSTO rocket is a focus of the program because past studies indicate that it has the best potential for achieving the lowest space access cost while acting as an RLV technology driver (since it also encompasses the technology requirements of reusable rocket vehicles in general).

  12. Post launch calibration and testing of the Advanced Baseline Imager on the GOES-R satellite

    Science.gov (United States)

    Lebair, William; Rollins, C.; Kline, John; Todirita, M.; Kronenwetter, J.

    2016-05-01

    The Geostationary Operational Environmental Satellite R (GOES-R) series is the planned next generation of operational weather satellites for the United States' National Oceanic and Atmospheric Administration. The first launch of the GOES-R series is planned for October 2016. The GOES-R series satellites and instruments are being developed by the National Aeronautics and Space Administration (NASA). One of the key instruments on the GOES-R series is the Advanced Baseline Imager (ABI). The ABI is a multi-channel, visible through infrared, passive imaging radiometer. The ABI will provide moderate spatial and spectral resolution at high temporal and radiometric resolution to accurately monitor rapidly changing weather. Initial on-orbit calibration and performance characterization is crucial to establishing the baseline used to maintain performance throughout mission life. A series of tests has been planned to establish the post-launch performance and to establish the parameters needed to process the data in the Ground Processing Algorithm. The large number of detectors for each channel required to provide the needed temporal coverage presents unique challenges for accurately calibrating ABI and minimizing striping. This paper discusses the planned tests to be performed on ABI over the six-month Post Launch Test period and the expected performance as it relates to ground tests.

  13. Development of morphing algorithms for HistFactory using information geometry

    Energy Technology Data Exchange (ETDEWEB)

    Bandyopadhyay, Anjishnu; Brock, Ian [University of Bonn (Germany); Cranmer, Kyle [New York University (United States)

    2016-07-01

    Many statistical analyses are based on likelihood fits. In any likelihood fit we try to incorporate all uncertainties, both systematic and statistical. We generally have distributions for the nominal and ±1σ variations of a given uncertainty. Using that information, HistFactory morphs the distributions for any arbitrary value of the given uncertainties. In this talk, a new morphing algorithm will be presented, which is based on information geometry. The algorithm uses the information about the difference between various probability distributions. Subsequently, we map this information onto geometrical structures and develop the algorithm on the basis of different geometrical properties. Apart from varying all nuisance parameters together, this algorithm can also probe both small (< 1σ) and large (> 2σ) variations. It will also be shown how this algorithm can be used for interpolating other forms of probability distributions.
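For context, the baseline HistFactory vertical morphing (interpolation code 0) is piecewise linear between the nominal template and the ±1σ templates, bin by bin; the information-geometry algorithm of the talk is an alternative to this. A minimal sketch of the baseline, with invented bin contents:

```python
# Piecewise-linear vertical morphing between nominal and +/-1 sigma
# histogram templates, evaluated bin by bin at nuisance-parameter value
# alpha (in units of sigma). Bin contents are hypothetical.

def morph_piecewise_linear(nominal, up, down, alpha):
    """Morphed histogram for nuisance-parameter value alpha."""
    morphed = []
    for n, u, d in zip(nominal, up, down):
        if alpha >= 0:
            morphed.append(n + alpha * (u - n))   # interpolate toward +1 sigma
        else:
            morphed.append(n + alpha * (n - d))   # interpolate toward -1 sigma
    return morphed

nominal = [100.0, 80.0, 60.0]
up      = [110.0, 85.0, 66.0]   # +1 sigma variation
down    = [ 92.0, 74.0, 55.0]   # -1 sigma variation
half_up = morph_piecewise_linear(nominal, up, down, 0.5)
```

By construction, alpha = 0 reproduces the nominal template and alpha = ±1 reproduces the variation templates; the kink at alpha = 0 is one motivation for smoother interpolation schemes.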

  14. System Engineering Processes at Kennedy Space Center for Development of SLS and Orion Launch Systems

    Science.gov (United States)

    Schafer, Eric; Stambolian, Damon; Henderson, Gena

    2013-01-01

    There are over 40 subsystems being developed for the future SLS and Orion Launch Systems at Kennedy Space Center. These subsystems are developed at the Kennedy Space Center Engineering Directorate. The Engineering Directorate at Kennedy Space Center follows a comprehensive design process which requires several different product deliverables during each phase of each of the subsystems. This presentation describes the process, with examples of where it has been applied.

  15. The advanced launch system: Application of total quality management principles to low-cost space transportation system development

    Science.gov (United States)

    Wolfe, M. G.; Rothwell, T. G.; Rosenberg, D. A.; Oliver, M. B.

    Recognizing that a major inhibitor of man's rapid expansion of the use of space is the high cost (direct and induced) of space transportation, the U.S. has embarked on a major national program to radically reduce the cost of placing payloads into orbit while, at the same time, making equally radical improvements in launch system operability. The program is entitled "The Advanced Launch System" (ALS) and is a joint Department of Defense/National Aeronautics and Space Administration (DoD/NASA) program which will provide launch capability in the post-2000 timeframe. It is currently in Phase II (System Definition), which began in January 1989, and will serve as a major source of U.S. launch system technology over the next several years. The ALS is characterized by a new approach to space system design, development, and operation. The practices that are being implemented by the ALS are expected to affect the management and technical operation of all future launch systems. In this regard, the two most significant initiatives being implemented on the ALS program are the practices of Total Quality Management (TQM) and the Unified Information System (Unis). TQM is a DoD initiative to improve the quality of the DoD acquisition system, contractor management systems, and the technical disciplines associated with the design, development, and operation of major systems. TQM has been mandated for all new programs and affects the way every group within the system currently does business. In order to implement the practices of TQM, new methods are needed. A program on the scale of the ALS generates vast amounts of information which must be used effectively to make sound decisions. Unis is an information network that will connect all ALS participants throughout all phases of the ALS development. 
Unis is providing support for project management and system design, and in following phases will provide decision support for launch operations, computer integrated manufacturing, automated

  16. NASA Crew and Cargo Launch Vehicle Development Approach Builds on Lessons from Past and Present Missions

    Science.gov (United States)

    Dumbacher, Daniel L.

    2006-01-01

    The United States (US) Vision for Space Exploration, announced in January 2004, outlines the National Aeronautics and Space Administration's (NASA) strategic goals and objectives, including retiring the Space Shuttle and replacing it with new space transportation systems for missions to the Moon, Mars, and beyond. The Crew Exploration Vehicle (CEV) that the new human-rated Crew Launch Vehicle (CLV) lofts into space early next decade will initially ferry astronauts to the International Space Station (ISS). Toward the end of the next decade, a heavy-lift Cargo Launch Vehicle (CaLV) will deliver the Earth Departure Stage (EDS) carrying the Lunar Surface Access Module (LSAM) to low-Earth orbit (LEO), where it will rendezvous with the CEV launched on the CLV and return astronauts to the Moon for the first time in over 30 years. This paper outlines how NASA is building these new space transportation systems on a foundation of legacy technical and management knowledge, using extensive experience gained from past and ongoing launch vehicle programs to maximize its design and development approach, with the objective of reducing total life cycle costs through operational efficiencies such as hardware commonality. For example, the CLV in-line configuration is composed of a 5-segment Reusable Solid Rocket Booster (RSRB), which is an upgrade of the current Space Shuttle 4-segment RSRB, and a new upper stage powered by the liquid oxygen/liquid hydrogen (LOX/LH2) J-2X engine, which is an evolution of the J-2 engine that powered the Apollo Program's Saturn V second and third stages in the 1960s and 1970s. The CaLV configuration consists of a propulsion system composed of two 5-segment RSRBs and a 33-foot core stage that will provide the LOX/LH2 needed for five commercially available RS-68 main engines. The J-2X also will power the EDS. The Exploration Launch Projects, managed by the Exploration Launch Office located at NASA's Marshall Space Flight Center, is leading the design

  17. Development of a New Data Tool for Computing Launch and Landing Availability with Respect to Surface Weather

    Science.gov (United States)

    Burns, K. Lee; Altino, Karen

    2008-01-01

    The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun on a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the existing tool.
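The core climatological computation described above reduces to the fraction of historical observations that violate a constraint. A minimal sketch (hypothetical data and constraint value; not the APRA code):

```python
# Climatological exceedance probability: the fraction of historical
# observations that violate a vehicle weather constraint. The peak-wind
# record and the 24 kt constraint below are invented for illustration.

def exceedance_probability(observations, limit):
    """Climatological probability that an observation exceeds the limit."""
    hits = sum(1 for v in observations if v > limit)
    return hits / len(observations)

# Hypothetical hourly peak winds (knots) from a short climatological record:
peak_winds = [12, 18, 25, 31, 9, 22, 27, 35, 14, 19]
p_violate = exceedance_probability(peak_winds, limit=24.0)
launch_availability = 1.0 - p_violate
```

A tool like APRA additionally stratifies such records by season and time of day, so the probability reflects the conditions of the planned launch window rather than the whole climatology.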

  18. Space Logistics: Launch Capabilities

    Science.gov (United States)

    Furnas, Randall B.

    1989-01-01

    The current maximum launch capabilities for the United States are shown. The predicted Earth-to-orbit requirements for the United States are presented. Contrasting the two indicates the strong national need for a major increase in Earth-to-orbit lift capability. Approximate weights for planned payloads are shown. NASA is studying the following options to meet the need for a new heavy-lift capability by the mid to late 1990s: (1) Shuttle-C for the near term (including growth versions); and (2) the Advanced Launch System (ALS) for the long term. The current baseline two-engine Shuttle-C has a 15 x 82 ft payload bay and an expected lift capability of 82,000 lb to low Earth orbit. Several options are being considered which have expanded-diameter payload bays. A three-engine Shuttle-C with an expected lift of 145,000 lb to LEO is being evaluated as well. The Advanced Launch System (ALS) is a potential joint development between the Air Force and NASA. This program is focused toward long-term launch requirements, specifically beyond the year 2000. The basic approach is to develop a family of vehicles with the same high reliability as the Shuttle system, yet offering a much greater lift capability at a greatly reduced cost (per pound of payload). The ALS unmanned family of vehicles will provide a low-end lift capability equivalent to Titan IV, and a high-end lift capability greater than the Soviet Energia, if requirements for such a high-end vehicle are defined. In conclusion, the planning of the next generation space telescope should not be constrained to the current launch vehicles. New vehicle designs will be driven by the needs of anticipated heavy users.

  19. Algorithm development for Maxwell's equations for computational electromagnetism

    Science.gov (United States)

    Goorjian, Peter M.

    1990-01-01

    A new algorithm has been developed for solving Maxwell's equations for the electromagnetic field. It solves the equations in the time domain with central, finite differences. The time advancement is performed implicitly, using an alternating direction implicit procedure. The space discretization is performed with finite volumes, using curvilinear coordinates with electromagnetic components along those directions. Sample calculations are presented of scattering from a metal pin, a square and a circle to demonstrate the capabilities of the new algorithm.
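The alternating direction implicit (ADI) time advancement used by the algorithm can be illustrated on a simpler model problem, 2-D diffusion with zero boundaries (a Peaceman-Rachford sketch, not the paper's curvilinear finite-volume Maxwell solver):

```python
# Peaceman-Rachford ADI for u_t = u_xx + u_yy on a square grid with zero
# Dirichlet boundaries: each half step is implicit in one direction
# (tridiagonal solves) and explicit in the other. r = dt/dx^2.

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_step(u, r):
    """One Peaceman-Rachford time step; u is an n-by-n list of lists."""
    n = len(u)
    def sweep(grid):
        # implicit along each row ((1+r) diagonal, -r/2 off-diagonals);
        # the other direction is treated explicitly in the right-hand side
        out = [row[:] for row in grid]
        for i in range(1, n - 1):
            m = n - 2
            a, b, c = [-r / 2] * m, [1 + r] * m, [-r / 2] * m
            d = [grid[i][j] + (r / 2) * (grid[i - 1][j] - 2 * grid[i][j] + grid[i + 1][j])
                 for j in range(1, n - 1)]
            out[i][1:n - 1] = thomas(a, b, c, d)
        return out
    half = sweep(u)
    # second half step: transpose, sweep the other direction, transpose back
    half_t = [list(col) for col in zip(*half)]
    return [list(col) for col in zip(*sweep(half_t))]

# A hot spot at the center of a 9x9 grid spreads and decays in one step.
n = 9
u = [[0.0] * n for _ in range(n)]
u[4][4] = 1.0
u = adi_step(u, r=0.5)
```

The appeal of ADI, for Maxwell's equations as for this model problem, is that each implicit half step costs only a set of independent tridiagonal solves rather than one large 2-D system.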

  20. Launch Control Network Engineer

    Science.gov (United States)

    Medeiros, Samantha

    2017-01-01

    The Spaceport Command and Control System (SCCS) is being built at the Kennedy Space Center in order to successfully launch NASA’s revolutionary vehicle that allows humans to explore further into space than ever before. During my internship, I worked with the Network, Firewall, and Hardware teams that are all contributing to the huge SCCS network project effort. I learned the SCCS network design and the several concepts that are running in the background. I also updated and designed documentation for physical networks that are part of SCCS. This includes being able to assist and build physical installations as well as configurations. I worked with the network design for vehicle telemetry interfaces to the Launch Control System (LCS); this allows the interface to interact with other systems at other NASA locations. This network design includes the Space Launch System (SLS), Interim Cryogenic Propulsion Stage (ICPS), and the Orion Multipurpose Crew Vehicle (MPCV). I worked on the network design and implementation in the Customer Avionics Interface Development and Analysis (CAIDA) lab.

  1. AeroADL: applying the integration of the Suomi-NPP science algorithms with the Algorithm Development Library to the calibration and validation task

    Science.gov (United States)

    Houchin, J. S.

    2014-09-01

    A common problem for the off-line validation of calibration algorithms and algorithm coefficients is being able to run science data through the exact same software used for on-line calibration of that data. The Joint Polar Satellite System (JPSS) program solved part of this problem by making the Algorithm Development Library (ADL) available, which allows the operational algorithm code to be compiled and run on a desktop Linux workstation using flat file input and output. However, this solved only part of the problem, as the toolkit and methods to initiate the processing of data through the algorithms were geared specifically toward the algorithm developer, not the calibration analyst. In algorithm development mode, a limited number of sets of test data are staged for the algorithm once, and then run through the algorithm over and over as the software is developed and debugged. In calibration analyst mode, we are continually running new data sets through the algorithm, which requires significant effort to stage each of those data sets for the algorithm without additional tools. AeroADL solves this second problem by providing a set of scripts that wrap the ADL tools, providing an efficient means to stage and process an input data set, to override static calibration coefficient look-up tables (LUTs) with experimental versions of those tables, and to manage a library containing multiple versions of each of the static LUT files in such a way that the correct set of LUTs required for each algorithm is automatically provided to the algorithm without analyst effort. Using AeroADL, The Aerospace Corporation's analyst team has demonstrated the ability to quickly and efficiently perform analysis tasks for both the VIIRS and OMPS sensors with minimal training on the software tools.
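The LUT-library bookkeeping that such scripts automate might look like the following sketch (all names, versions, and file layouts are invented for illustration; this is not AeroADL's actual interface): given an algorithm and any experimental overrides, resolve the full set of LUT files to stage for a run.

```python
# Hypothetical sketch of LUT-library resolution: each algorithm declares the
# LUTs it needs with default versions; the analyst may override individual
# tables with experimental versions for a single run.

LUT_LIBRARY = {
    # LUT name -> {version: filename}  (all entries hypothetical)
    "dark_offset":    {"v1": "dark_offset_v1.bin", "v2": "dark_offset_v2.bin"},
    "gain":           {"v1": "gain_v1.bin"},
    "solar_spectrum": {"v1": "solar_v1.bin"},
}
ALGORITHM_LUTS = {
    # algorithm -> (LUT name, default version) pairs it requires
    "viirs_sdr": [("dark_offset", "v1"), ("gain", "v1")],
    "omps_sdr":  [("solar_spectrum", "v1")],
}

def resolve_luts(algorithm, overrides=None):
    """Return {LUT name: filename} to stage, applying experimental overrides."""
    overrides = overrides or {}
    staged = {}
    for name, default in ALGORITHM_LUTS[algorithm]:
        version = overrides.get(name, default)
        staged[name] = LUT_LIBRARY[name][version]
    return staged

# The analyst swaps in an experimental dark-offset table for one run:
files = resolve_luts("viirs_sdr", overrides={"dark_offset": "v2"})
```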

  2. Space commercialization: Launch vehicles and programs; Symposium on Space Commercialization: Roles of Developing Countries, Nashville, TN, Mar. 5-10, 1989, Technical Papers

    International Nuclear Information System (INIS)

    Shahrokhi, F.; Greenberg, J.S.; Al-saud, Turki.

    1990-01-01

The present volume on progress in astronautics and aeronautics discusses the advent of commercial space, broad-based space education as a prerequisite for space commercialization, and obstacles to space commercialization in the developing world. Attention is given to NASA directions in space propulsion for the year 2000 and beyond, possible uses of the external tank in orbit, power from the space shuttle and from space for use on earth, Long March launch vehicles in the 1990s, the establishment of a center for advanced space propulsion, Pegasus as a key to low-cost space applications, legal problems of developing countries' access to space launch vehicles, and international law of responsibility for remote sensing. Also discussed are low-cost satellites and satellite launch vehicles, the satellite launch systems of China; Raumkurier, the German recovery program; and the Ariane transfer vehicle as logistic support to Space Station Freedom.

  3. DEVELOPMENT OF A PEDESTRIAN INDOOR NAVIGATION SYSTEM BASED ON MULTI-SENSOR FUSION AND FUZZY LOGIC ESTIMATION ALGORITHMS

    Directory of Open Access Journals (Sweden)

    Y. C. Lai

    2015-05-01

Full Text Available This paper presents a pedestrian indoor navigation system based on multi-sensor fusion and fuzzy logic estimation algorithms. The proposed navigation system is a self-contained dead-reckoning navigation, meaning that no outside signal is required. To achieve this self-contained capability, a portable and wearable inertial measurement unit (IMU) has been developed. Its sensors are low-cost inertial sensors, an accelerometer and a gyroscope, based on micro-electro-mechanical systems (MEMS). There are two types of IMU module, handheld and waist-mounted. Low-cost MEMS sensors suffer from various errors resulting from manufacturing imperfections and other effects. Therefore, a sensor calibration procedure based on the scalar calibration and least squares methods has been introduced in this study to improve the accuracy of the inertial sensors. With the calibrated data acquired from the inertial sensors, the step length and strength of the pedestrian are estimated by multi-sensor fusion and fuzzy logic estimation algorithms. The developed multi-sensor fusion algorithm provides the number of walking steps and the strength of each step in real time. The estimated step count and per-step strength are then fed into the proposed fuzzy logic estimation algorithm to estimate the user's step lengths. Since both walking length and direction are required for dead-reckoning navigation, the walking direction is calculated by integrating the angular rate acquired by the gyroscope of the developed IMU module. Both the walking length and direction are calculated on the IMU module and transmitted over Bluetooth to a smartphone, which performs the dead-reckoning navigation in a self-developed app. Because dead-reckoning navigation accumulates error, a particle filter and a pre-loaded map of the indoor environment have been applied in the app of the proposed navigation system.
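The dead-reckoning update described in this abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the function names and the fixed sample interval are assumptions.

```python
import math

# Illustrative dead-reckoning update: heading comes from integrating the
# gyro yaw rate, and the 2-D position advances by the estimated step
# length along the current heading at each detected step.
def integrate_heading(heading, yaw_rates, dt):
    """Integrate angular-rate samples (rad/s) taken at interval dt (s)."""
    for w in yaw_rates:
        heading += w * dt
    return heading

def dead_reckoning_step(x, y, heading, step_length):
    """Advance the position by one step along the current heading."""
    return (x + step_length * math.cos(heading),
            y + step_length * math.sin(heading))
```

In the paper's system the step length would come from the fuzzy logic estimator and the yaw rates from the waist- or hand-mounted IMU; a particle filter constrained by the floor map would then correct the accumulated drift.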

  4. Development of a Pedestrian Indoor Navigation System Based on Multi-Sensor Fusion and Fuzzy Logic Estimation Algorithms

    Science.gov (United States)

    Lai, Y. C.; Chang, C. C.; Tsai, C. M.; Lin, S. Y.; Huang, S. C.

    2015-05-01

This paper presents a pedestrian indoor navigation system based on multi-sensor fusion and fuzzy logic estimation algorithms. The proposed navigation system is a self-contained dead-reckoning navigation, meaning that no outside signal is required. To achieve this self-contained capability, a portable and wearable inertial measurement unit (IMU) has been developed. Its sensors are low-cost inertial sensors, an accelerometer and a gyroscope, based on micro-electro-mechanical systems (MEMS). There are two types of IMU module, handheld and waist-mounted. Low-cost MEMS sensors suffer from various errors resulting from manufacturing imperfections and other effects. Therefore, a sensor calibration procedure based on the scalar calibration and least squares methods has been introduced in this study to improve the accuracy of the inertial sensors. With the calibrated data acquired from the inertial sensors, the step length and strength of the pedestrian are estimated by multi-sensor fusion and fuzzy logic estimation algorithms. The developed multi-sensor fusion algorithm provides the number of walking steps and the strength of each step in real time. The estimated step count and per-step strength are then fed into the proposed fuzzy logic estimation algorithm to estimate the user's step lengths. Since both walking length and direction are required for dead-reckoning navigation, the walking direction is calculated by integrating the angular rate acquired by the gyroscope of the developed IMU module. Both the walking length and direction are calculated on the IMU module and transmitted over Bluetooth to a smartphone, which performs the dead-reckoning navigation in a self-developed app. Because dead-reckoning navigation accumulates error, a particle filter and a pre-loaded map of the indoor environment have been applied to the app of the proposed navigation system to extend its

  5. PEGASUS - A Flexible Launch Solution for Small Satellites with Unique Requirements

    Science.gov (United States)

    Richards, B. R.; Ferguson, M.; Fenn, P. D.

require the benefits inherent in a mobile platform. In this regard Pegasus is no different from a ground-launched vehicle in that it repeatedly launches from a fixed location at each range, albeit a location that is not on land. However, Pegasus can also offer services that avoid many of the restrictions inherent in being constrained to a particular launch site, few of which are trivial. These include inclination restrictions, large plane changes required to achieve low-inclination orbits from high-latitude launch sites, politically inopportune launch locations, and low-frequency launch opportunities for missions that require phasing. Pegasus has repeatedly demonstrated this flexibility through the course of 31 flights, including 17 consecutive successes dating back to 1996, originating from seven different locations around the world, including two outside the United States. Recently, Pegasus launched NASA's HETE-2 satellite in an operation that included satellite integration and vehicle mate in California, pre-launch staging operations from Kwajalein Island in the South Pacific, and launch operations controlled from over 7000 miles away in Florida. Pegasus has also used the Canary Islands as a launch point with the associated control room in Spain, and Florida as a launch point for a mission controlled from Virginia. This paper discusses the operational uniqueness of the Pegasus launch vehicle and the activities associated with establishing low-cost, flexible-inclination, low-risk launch operations that utilize Pegasus' greatest asset: its mobility.

  6. Development and validation of an algorithm for laser application in wound treatment

    Directory of Open Access Journals (Sweden)

    Diequison Rite da Cunha

    2017-12-01

Full Text Available ABSTRACT Objective: To develop and validate an algorithm for laser wound therapy. Method: Methodological study and literature review. For the development of the algorithm, a review of the past ten years was performed in the Health Sciences databases. The algorithm was evaluated by 24 participants: nurses, physiotherapists, and physicians. For data analysis, Cronbach's alpha coefficient and the chi-square test of independence were used. The significance level of the statistical tests was set at 5% (p < 0.05). Results: The professionals' responses regarding the readability of the algorithm were: 41.70%, great; 41.70%, good; 16.70%, regular. With regard to whether the algorithm was sufficient for supporting decisions related to wound evaluation and wound cleaning, 87.5% said yes to both questions. Regarding whether the algorithm contained enough information to support the choice of laser parameters, 91.7% said yes. The questionnaire showed reliability by Cronbach's alpha coefficient (α = 0.962). Conclusion: The developed and validated algorithm proved reliable for wound evaluation, wound cleaning, and the use of laser therapy in wounds.
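The reliability figure quoted above (Cronbach's alpha = 0.962) comes from the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), which can be computed directly for any questionnaire matrix:

```python
# Cronbach's alpha for internal-consistency reliability.
# items: list of per-item score lists, one list per questionnaire item,
# each containing one score per respondent.
def cronbach_alpha(items):
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # total score per respondent across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

For perfectly correlated items the formula gives alpha = 1; values near the paper's 0.962 indicate that the questionnaire items measure the construct consistently.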

  7. Development of a B-flavor tagging algorithm for the Belle II experiment

    Energy Technology Data Exchange (ETDEWEB)

    Abudinen, Fernando; Li Gioi, Luigi [Max-Planck-Institut fuer Physik Muenchen (Germany); Gelb, Moritz [Karlsruher Institut fuer Technologie (Germany)

    2015-07-01

The high-luminosity super B factory SuperKEKB will allow a precision measurement of the time-dependent CP violation parameters in the B-meson system. The analysis requires the reconstruction of one of the two exclusively produced neutral B mesons in a CP eigenstate and the determination of the flavor of the other one. Because of the large number of possible decay modes, full reconstruction of the tagging B is not feasible. Consequently, inclusive methods that utilize flavor-specific signatures of B decays are employed. The algorithm is based on multivariate methods and follows the approach adopted by BaBar. It proceeds in three steps: the track level, where the most probable target track is selected for each decay category; the event level, where the flavor-specific signatures of the selected targets are analyzed; and the combiner, where the results of all categories are combined into the final output. The framework has been completed, reaching a tagging efficiency of ca. 25%. A comprehensive optimization is being launched in order to increase the efficiency, including studies of the categories, the method-specific parameters, and the kinematic variables. An overview of the algorithm is presented together with results at the current status.

  8. Launch vehicle operations cost reduction through artificial intelligence techniques

    Science.gov (United States)

    Davis, Tom C., Jr.

    1988-01-01

    NASA's Kennedy Space Center has attempted to develop AI methods in order to reduce the cost of launch vehicle ground operations as well as to improve the reliability and safety of such operations. Attention is presently given to cost savings estimates for systems involving launch vehicle firing-room software and hardware real-time diagnostics, as well as the nature of configuration control and the real-time autonomous diagnostics of launch-processing systems by these means. Intelligent launch decisions and intelligent weather forecasting are additional applications of AI being considered.

  9. Solving the pre-marshalling problem to optimality with A* and IDA*

    DEFF Research Database (Denmark)

    Tierney, Kevin; Pacino, Dario; Voß, Stefan

    2017-01-01

We present a novel solution approach to the container pre-marshalling problem using the A* and IDA* algorithms combined with several novel branching and symmetry-breaking rules that significantly increase the number of pre-marshalling instances that can be solved to optimality. A* and IDA* are graph search algorithms that use heuristics combined with a complete graph search to find optimal solutions to problems. The container pre-marshalling problem is a key problem for container terminals seeking to reduce delays of inter-modal container transports. The goal of the container pre...
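The IDA* side of the approach can be illustrated with a generic skeleton; the paper's pre-marshalling-specific branching and symmetry-breaking rules are not reproduced here, and the simple path-based cycle check below is a simplification.

```python
# Generic IDA*: iteratively deepen a cost bound on f = g + h, where h is
# an admissible lower bound on the remaining cost (for pre-marshalling,
# a lower bound on the number of remaining container moves).
def ida_star(start, is_goal, successors, h):
    bound = h(start)

    def search(node, g, bound, path):
        f = g + h(node)
        if f > bound:
            return f, None          # prune; report the exceeded f-value
        if is_goal(node):
            return f, path
        minimum = float("inf")
        for cost, child in successors(node):
            if child in path:
                continue            # simple cycle check
            t, found = search(child, g + cost, bound, path + [child])
            if found is not None:
                return t, found
            minimum = min(minimum, t)
        return minimum, None

    while True:
        t, found = search(start, 0, bound, [start])
        if found is not None:
            return found            # optimal path, start included
        if t == float("inf"):
            return None             # exhausted: no solution
        bound = t                   # deepen to the smallest exceeded f
```

Each iteration repeats a bounded depth-first search with a slightly larger threshold, which keeps memory linear in the solution depth, the property that makes IDA* attractive for deep pre-marshalling instances.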

  10. System Engineering Processes at Kennedy Space Center for Development of the SLS and Orion Launch Systems

    Science.gov (United States)

    Schafer, Eric J.

    2012-01-01

There are over 40 subsystems being developed for the future SLS and Orion launch systems at Kennedy Space Center. These subsystems, developed by the Kennedy Space Center Engineering Directorate, follow a comprehensive design process that requires several different product deliverables during each phase of each subsystem. This paper describes this process and gives an example of where the process has been applied.

  11. Commercial launch systems: A risky investment?

    Science.gov (United States)

    Dupnick, Edwin; Skratt, John

    1996-03-01

A myriad of evolutionary paths connect the current state of government-dominated space launch operations to true commercial access to space. Every potential path requires the investment of private capital sufficient to fund the commercial venture with a perceived risk/return ratio acceptable to the investors. What is the private sector willing to invest? Does government participation reduce financial risk? How viable is a commercial launch system without government participation and support? We examine the interplay between various forms of government participation in commercial launch system development, alternative launch system designs, life cycle cost estimates, and typical industry risk-aversion levels. The boundaries of this n-dimensional envelope are examined with an ECON-developed business financial model which provides for the parametric assessment and interaction of SSTO design variables (including various operational scenarios) with financial variables (including debt/equity assumptions and commercial enterprise burden rates on various functions). We overlay this structure with observations from previous ECON research which characterize financial risk-aversion levels for selected industrial sectors in terms of acceptable initial lump-sum investments, cumulative investments, probability of failure, payback periods, and ROI. The financial model allows the construction of parametric tradeoffs based on ranges of variables which can be said to actually encompass the "true" cost of operations, and determines what level of "true" cost can be tolerated by private capitalization.

  12. Development of hybrid artificial intelligent based handover decision algorithm

    Directory of Open Access Journals (Sweden)

    A.M. Aibinu

    2017-04-01

Full Text Available The possibility of seamless handover remains a mirage despite the plethora of existing handover algorithms. The underlying factor responsible for this has been traced to the handover decision module in the handover process. Hence, in this paper, a novel hybrid artificial-intelligence handover decision algorithm is developed. The developed model is a hybrid of an Artificial Neural Network (ANN)-based prediction model and fuzzy logic. On accessing the network, the Received Signal Strength (RSS) is acquired over a period of time to form a time series. The data are then fed to the newly proposed k-step-ahead ANN-based RSS prediction system for estimation of the prediction model coefficients. The synaptic weights and adaptive coefficients of the trained ANN are then used to compute the k-step-ahead ANN-based RSS prediction model coefficients. The predicted RSS value is then codified as fuzzy sets and, in conjunction with other measured network parameters, fed into the fuzzy logic controller to finalize the handover decision process. The performance of the newly developed k-step-ahead ANN-based RSS prediction algorithm was evaluated using simulated and real data acquired from available mobile communication networks. Results obtained in both cases show that the proposed algorithm is capable of predicting the RSS value ahead to within about ±0.0002 dB. The cascaded effect of the complete handover decision module was also evaluated; results show that the newly proposed hybrid approach reduces the ping-pong effect associated with other handover techniques.
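The prediction-then-decision pipeline described above can be sketched as follows. Note the substitutions: a least-squares linear extrapolation stands in for the paper's ANN predictor, and a single RSS threshold stands in for the fuzzy logic controller, so this illustrates only the shape of the pipeline.

```python
# Stand-in for the k-step-ahead RSS predictor: fit a least-squares line
# to the RSS history (dBm) and extrapolate it k samples into the future.
def predict_rss(history, k):
    n = len(history)
    xs = range(n)
    mx = (n - 1) / 2
    my = sum(history) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, history))
    varx = sum((x - mx) ** 2 for x in xs)
    slope = cov / varx
    return my + slope * (n - 1 + k - mx)

# Stand-in for the fuzzy controller: trigger handover when the predicted
# RSS is about to fall below a serving-cell threshold.
def handover_decision(predicted_rss, threshold=-90.0):
    return predicted_rss < threshold
```

The benefit of deciding on the *predicted* rather than the measured RSS is exactly the one the abstract claims: the handover fires before the signal actually degrades, and short dips do not cause ping-pong switching.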

  13. Hybrid adaptive ascent flight control for a flexible launch vehicle

    Science.gov (United States)

    Lefevre, Brian D.

For the purpose of maintaining dynamic stability and improving guidance command tracking performance under off-nominal flight conditions, a hybrid adaptive control scheme is selected and modified for use as a launch vehicle flight controller. This architecture merges a model reference adaptive approach, which utilizes both direct and indirect adaptive elements, with a classical dynamic inversion controller. This structure is chosen for a number of reasons: the properties of the reference model can be easily adjusted to tune the desired handling qualities of the spacecraft; the indirect adaptive element (which consists of an online parameter identification algorithm) continually refines the estimates of the evolving characteristic parameters utilized in the dynamic inversion; and the direct adaptive element (which consists of a neural network) augments the linear feedback signal to compensate for any nonlinearities in the vehicle dynamics. The combination of these elements enables the control system to retain the nonlinear capabilities of an adaptive network while relying heavily on the linear portion of the feedback signal to dictate the dynamic response under most operating conditions. To begin the analysis, the ascent dynamics of a launch vehicle with a single first-stage rocket motor (typical of the Ares I spacecraft) are characterized. The dynamics are then linearized with assumptions that are appropriate for a launch vehicle, so that the resulting equations may be inverted by the flight controller in order to compute the control signals necessary to generate the desired response from the vehicle. Next, the development of the hybrid adaptive launch vehicle ascent flight control architecture is discussed in detail. Alterations of the generic hybrid adaptive control architecture include the incorporation of a command conversion operation which transforms guidance input from quaternion form (as provided by NASA) to the body-fixed angular rate commands needed by the

  14. The Profile Envision and Splice Tool (PRESTO): Developing an Atmospheric Wind Analysis Tool for Space Launch Vehicles Using Python

    Science.gov (United States)

    Orcutt, John M.; Barbre, Robert E., Jr.; Brenton, James C.; Decker, Ryan K.

    2017-01-01

Tropospheric winds are an important driver of the design and operation of space launch vehicles. Multiple types of weather balloons and Doppler Radar Wind Profiler (DRWP) systems capable of measuring atmospheric winds exist at NASA's Kennedy Space Center (KSC), co-located on the United States Air Force's (USAF) Eastern Range (ER) at Cape Canaveral Air Force Station (CCAFS). Meteorological data gathered by these instruments are being used in the design of NASA's Space Launch System (SLS) and other space launch vehicles, and will be used on the day of launch (DOL) of SLS to aid loads and trajectory analyses. For SLS day-of-launch needs, the balloons have the required altitude coverage but take over an hour to reach maximum altitude and can drift far from the vehicle's path, while the DRWPs have the needed spatial and temporal resolutions but do not provide complete altitude coverage. Therefore, the Natural Environments Branch (EV44) at Marshall Space Flight Center (MSFC) developed the Profile Envision and Splice Tool (PRESTO) to combine balloon profiles and profiles from multiple DRWPs, filter the spliced profile to a common wavelength, and allow the operator to generate output files as well as to visualize the inputs and the spliced profile for SLS DOL operations. PRESTO was developed in Python, taking advantage of NumPy and SciPy for the splicing procedure, matplotlib for the visualization, and Tkinter for the graphical user interface (GUI). This paper describes in detail the Python implementation of the splicing, filtering, and visualization methodology used in PRESTO.
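The splice-and-filter idea behind PRESTO can be sketched with NumPy. The function names, the moving-average stand-in for the wavelength filter, and the rule of taking DRWP data up to its ceiling and balloon data above it are assumptions for illustration, not PRESTO's actual implementation.

```python
import numpy as np

def splice_profiles(drwp_alt, drwp_wind, balloon_alt, balloon_wind, grid):
    """Merge two wind profiles onto a common altitude grid (m).

    DRWP data fill the grid up to the DRWP ceiling; the balloon profile
    fills in above it. Each source is linearly interpolated to the grid.
    """
    ceiling = drwp_alt.max()
    low = grid[grid <= ceiling]
    high = grid[grid > ceiling]
    return np.concatenate([
        np.interp(low, drwp_alt, drwp_wind),
        np.interp(high, balloon_alt, balloon_wind),
    ])

def smooth(profile, window=3):
    """Moving average standing in for PRESTO's common-wavelength filter."""
    kernel = np.ones(window) / window
    return np.convolve(profile, kernel, mode="same")
```

A real implementation would also blend the two sources across the seam and filter to a specific vertical wavelength (e.g., with a SciPy low-pass design) rather than a fixed-width moving average.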

  15. Expendable launch vehicle studies

    Science.gov (United States)

    Bainum, Peter M.; Reiss, Robert

    1995-01-01

Analytical support studies of expendable launch vehicles concentrate on the stability of the dynamics during launch, especially in or near the region of maximum dynamic pressure. The in-plane dynamic equations of a generic launch vehicle with multiple flexible bending and fuel sloshing modes are developed and linearized. The information from LeRC about the grids, masses, and modes is incorporated into the model. The eigenvalues of the plant are analyzed for several modeling factors: use of a diagonal mass matrix, the uniform beam assumption, inclusion of aerodynamics, and the interaction between the aerodynamics and the flexible bending motion. Preliminary PID, LQR, and LQG control designs with sensor and actuator dynamics for this system, along with simulations, are also conducted. The initial comparison of PD (proportional-derivative) and full-state-feedback LQR (linear quadratic regulator) control shows that the split-weighted LQR controller performs better than the PD. In order to meet both the performance and robustness requirements, an H-infinity robust controller for the expendable launch vehicle is developed. The simulation indicates that both the performance and robustness of the H-infinity controller are better than those of the PID and LQG controllers. The modelling and analysis support studies team has continued development of methodology, using eigensensitivity analysis, to solve three classes of discrete eigenvalue equations. In the first class, the matrix elements are non-linear functions of the eigenvector. All non-linear periodic motion can be cast in this form. Here the eigenvector is comprised of the coefficients of complete basis functions spanning the response space and the eigenvalue is the frequency. The second class of eigenvalue problems studied is the quadratic eigenvalue problem. Solutions for linear viscously damped structures or viscoelastic structures can be reduced to this form. Particular attention is paid to

  16. Convergence and accommodation development is pre-programmed in premature infants

    Science.gov (United States)

    Horwood, Anna M; Toor, Sonia S; Riddell, Patricia M

    2015-01-01

    Purpose This study investigated whether vergence and accommodation development in pre-term infants is pre-programmed or is driven by experience. Methods 32 healthy infants, born at mean 34 weeks gestation (range 31.2-36 weeks) were compared with 45 healthy full-term infants (mean 40.0 weeks) over a 6 month period, starting at 4-6 weeks post-natally. Simultaneous accommodation and convergence to a detailed target were measured using a Plusoptix PowerRefII infra-red photorefractor as a target moved between 0.33m and 2m. Stimulus/response gains and responses at 0.33m and 2m were compared by both corrected (gestational) age and chronological (post-natal) age. Results When compared by their corrected age, pre-term and full-term infants showed few significant differences in vergence and accommodation responses after 6-7 weeks of age. However, when compared by chronological age, pre-term infants’ responses were more variable, with significantly reduced vergence gains, reduced vergence response at 0.33m, reduced accommodation gain, and increased accommodation at 2m, compared to full-term infants between 8-13 weeks after birth. Conclusions When matched by corrected age, vergence and accommodation in pre-term infants show few differences from full-term infants’ responses. Maturation appears pre-programmed and is not advanced by visual experience. Longer periods of immature visual responses might leave pre-term infants more at risk of development of oculomotor deficits such as strabismus. PMID:26275135

  17. Reusable launch vehicles, enabling technology for the development of advanced upper stages and payloads

    International Nuclear Information System (INIS)

    Metzger, John D.

    1998-01-01

In the near future there will be classes of upper stages and payloads that will require initial operation in a high-earth orbit to reduce the probability of an inadvertent reentry that could result in a detrimental impact on humans and the biosphere. A nuclear propulsion system, such as was being developed under the Space Nuclear Thermal Propulsion (SNTP) Program, is an example of such a potential payload. This paper uses the results of a reusable launch vehicle (RLV) study to demonstrate the potential importance of an RLV for testing and implementing an advanced upper stage (AUS) or payload in a safe orbit in a cost-effective and reliable manner. The RLV is a horizontal-takeoff, horizontal-landing (HTHL), two-stage-to-orbit (TSTO) vehicle. The results of the study show that an HTHL vehicle is cost-effective because it implements airplane-like operation, infrastructure, and flight operations. The first stage of the TSTO is powered by rocket-based combined-cycle (RBCC) engines; the second stage is powered by a LOX/LH rocket engine. The TSTO configuration is used since it most effectively utilizes the capability of the RBCC engine. The analysis uses the NASA code POST (Program to Optimize Simulated Trajectories) to determine trajectories and weight in high-earth orbit for AUS/advanced payloads. Cost and reliability of an RLV versus current-generation expendable launch vehicles are presented.

  18. Development and Evaluation of Algorithms for Breath Alcohol Screening.

    Science.gov (United States)

    Ljungblad, Jonas; Hök, Bertil; Ekström, Mikael

    2016-04-01

Breath alcohol screening is important for traffic safety, access control, and other areas of health promotion. A family of sensor devices useful for these purposes is being developed and evaluated. This paper focuses on algorithms for the determination of breath alcohol concentration in diluted breath samples, using carbon dioxide to compensate for the dilution. The examined algorithms make use of signal averaging, weighting, and personalization to reduce estimation errors. Evaluation has been performed using data from a previously conducted human study. It is concluded that these features in combination significantly reduce the random error compared to the signal averaging algorithm alone.
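The CO2 dilution compensation described above can be sketched as follows. The nominal alveolar CO2 reference value and the plain averaging are assumptions for illustration, not the paper's calibrated algorithm.

```python
# Alveolar (end-expiratory) air has a roughly known CO2 concentration, so
# the ratio of that reference to the measured CO2 gives the dilution
# factor by which the alcohol signal must be scaled.
ALVEOLAR_CO2 = 4.2  # %vol, nominal value (assumption, not calibrated)

def estimate_brac(alcohol_signal, co2_signal):
    """Estimate breath alcohol concentration from diluted samples.

    Each (alcohol, CO2) sample pair is scaled by ALVEOLAR_CO2 / co2;
    averaging the per-sample estimates reduces random error, which is
    the signal-averaging feature evaluated in the paper.
    """
    estimates = [a * ALVEOLAR_CO2 / c
                 for a, c in zip(alcohol_signal, co2_signal)]
    return sum(estimates) / len(estimates)
```

The paper's weighting and personalization features would replace the plain mean with, respectively, samples weighted by CO2 level (stronger breath contribution, higher weight) and a per-subject alveolar CO2 reference.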

  19. Motion Cueing Algorithm Development: Human-Centered Linear and Nonlinear Approaches

    Science.gov (United States)

    Houck, Jacob A. (Technical Monitor); Telban, Robert J.; Cardullo, Frank M.

    2005-01-01

    While the performance of flight simulator motion system hardware has advanced substantially, the development of the motion cueing algorithm, the software that transforms simulated aircraft dynamics into realizable motion commands, has not kept pace. Prior research identified viable features from two algorithms: the nonlinear "adaptive algorithm", and the "optimal algorithm" that incorporates human vestibular models. A novel approach to motion cueing, the "nonlinear algorithm" is introduced that combines features from both approaches. This algorithm is formulated by optimal control, and incorporates a new integrated perception model that includes both visual and vestibular sensation and the interaction between the stimuli. Using a time-varying control law, the matrix Riccati equation is updated in real time by a neurocomputing approach. Preliminary pilot testing resulted in the optimal algorithm incorporating a new otolith model, producing improved motion cues. The nonlinear algorithm vertical mode produced a motion cue with a time-varying washout, sustaining small cues for longer durations and washing out large cues more quickly compared to the optimal algorithm. The inclusion of the integrated perception model improved the responses to longitudinal and lateral cues. False cues observed with the NASA adaptive algorithm were absent. The neurocomputing approach was crucial in that the number of presentations of an input vector could be reduced to meet the real time requirement without degrading the quality of the motion cues.
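The washout behavior discussed above can be illustrated with the simplest classical element: a first-order high-pass filter that passes motion onsets to the simulator and "washes out" sustained inputs, returning the platform toward neutral. This is a textbook sketch, not the optimal or nonlinear algorithm from the paper, and the discretization and coefficients are illustrative.

```python
# Discrete first-order high-pass washout:
#   y[n] = a * (y[n-1] + x[n] - x[n-1]),  a = 1 / (1 + omega * dt)
# where omega is the break frequency (rad/s) and dt the sample period (s).
def highpass_washout(signal, dt, omega):
    a = 1.0 / (1.0 + omega * dt)
    out = [signal[0]]          # pass the initial onset through
    prev_x = signal[0]
    for x in signal[1:]:
        out.append(a * (out[-1] + x - prev_x))
        prev_x = x
    return out
```

For a sustained (step) input the output decays geometrically toward zero, which is exactly the washout the abstract refers to; the cited algorithms shape this behavior with vestibular models and, in the nonlinear case, make the washout rate depend on cue magnitude.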

  20. POST-LAUNCHING MONITORING ACTIVITIES FOR NEW TRANSACTIONAL BANKING PRODUCTS ADDRESSED TO SMES (CONSIDERATIONS

    Directory of Open Access Journals (Sweden)

    Giuca Simona-Mihaela

    2014-07-01

Full Text Available The current paper aims to provide guidelines for post-launch monitoring activities and steps related to new transactional banking products addressed to SMEs. While the pre-launch activities have the purpose of accurately defining the objectives, assumptions, and estimations, the purpose of the post-launch plan is to identify whether the final objectives of a product launch have been met, to analyze results so as to identify an efficient action plan for overcoming any lack of results, and, most important, to identify opportunities for optimizing the products and properly communicating the value proposition. This paper also presents schemes for monitoring the results from a business case and for motivating the sales force, an essential step in increasing sales. Alternatives for incentive campaigns are therefore presented, as sustainable campaigns intended to achieve an expected success rate. As additional guidance for the sales force, some scenarios and post-sales actions are presented, together with an example of portfolio analysis considering potential per client. Using the methods and details presented in the current paper, one can understand the importance of monitoring results after launching a new transactional product addressed to SMEs and how to do so, design an incentive scheme, and define actions to be taken in order to increase revenues from a newly launched transactional product.

  1. Minimum Cost Nanosatellite Launch System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Delta Velocity Corporation proposes the development of a very low cost, highly responsive nanosat launch system. We propose to develop an integrated propulsion...

  2. A study of metaheuristic algorithms for high dimensional feature selection on microarray data

    Science.gov (United States)

    Dankolo, Muhammad Nasiru; Radzi, Nor Haizan Mohamed; Sallehuddin, Roselina; Mustaffa, Noorfa Haszlinna

    2017-11-01

Microarray systems enable experts to examine gene profiles at the molecular level using machine learning algorithms, increasing the potential for classification and diagnosis of many diseases at the gene expression level. However, numerous difficulties may affect the efficiency of machine learning algorithms, including the vast number of gene features contained in the original data, many of which may be unrelated to the intended analysis. Therefore, feature selection must be performed during data pre-processing. Many feature selection algorithms have been developed and applied to microarray data, including metaheuristic optimization algorithms. This paper discusses the application of metaheuristic algorithms for feature selection on microarray datasets. The study reveals that these algorithms yield interesting results with limited resources, thereby saving the computational expense of the machine learning algorithms.
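The wrapper-style search these metaheuristics perform can be illustrated with a minimal hill climber over a binary feature mask. The fitness function is supplied by the caller (in a real study, cross-validated classifier accuracy, possibly penalized by the number of selected genes); the hill climber itself is a deliberately simple stand-in for the surveyed metaheuristics.

```python
import random

# Toy wrapper feature selection: a binary mask marks selected features;
# the search flips one bit at a time and keeps the flip whenever the
# caller-supplied fitness does not decrease.
def hill_climb_select(n_features, fitness, iterations=200, seed=0):
    rng = random.Random(seed)
    mask = [rng.random() < 0.5 for _ in range(n_features)]
    best = fitness(mask)
    for _ in range(iterations):
        i = rng.randrange(n_features)
        mask[i] = not mask[i]
        score = fitness(mask)
        if score >= best:
            best = score
        else:
            mask[i] = not mask[i]  # revert a worsening flip
    return mask, best
```

Population-based metaheuristics (genetic algorithms, particle swarm, and similar methods covered by the survey) replace the single mask with a population of masks and recombine good ones, which matters when the fitness landscape over tens of thousands of gene features is highly multimodal.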

  3. Tabletop Experimental Track for Magnetic Launch Assist

    Science.gov (United States)

    2000-01-01

Marshall Space Flight Center's (MSFC's) Advanced Space Transportation Program has developed the Magnetic Launch Assist System, formerly known as Magnetic Levitation (MagLev) technology, which could give a space vehicle a running start to break free from Earth's gravity. A Magnetic Launch Assist system would use magnetic fields to levitate and accelerate a vehicle along a track at speeds up to 600 mph; the vehicle would then shift to rocket engines for launch into orbit. Similar to high-speed trains and roller coasters that use high-strength magnets to lift and propel a vehicle a couple of inches above a guideway, a Magnetic Launch Assist system would electromagnetically propel a space vehicle along the track. The tabletop experimental track for the system shown in this photograph is 44 feet long, with 22 feet of powered acceleration and 22 feet of passive braking. A 10-pound carrier with permanent magnets on its sides swiftly glides by copper coils, producing a levitation force. The track uses a linear synchronous motor, meaning the track is synchronized to turn the coils on just before the carrier reaches them and off once the carrier passes. Sensors are positioned along the side of the track to determine the carrier's position so the appropriate drive coils can be energized. MSFC engineers have conducted tests on the indoor track and on a 50-foot outdoor track. The major advantage of launch assist for NASA launch vehicles is that it reduces take-off weight, landing gear, wing size, and propellant load, resulting in significant cost savings. The US Navy and the British MOD (Ministry of Defence) are planning to use magnetic launch assist as the aircraft launch system for their next-generation aircraft carriers. The US Army is considering using this technology for launching target drones for anti-aircraft training.

  4. Launch Vehicle Demonstrator Using Shuttle Assets

    Science.gov (United States)

    Threet, Grady E., Jr.; Creech, Dennis M.; Philips, Alan D.; Water, Eric D.

    2011-01-01

    The Marshall Space Flight Center Advanced Concepts Office (ACO) has the leading role for NASA's preliminary conceptual launch vehicle design and performance analysis. Over the past several years the ACO Earth-to-Orbit Team has evaluated thousands of launch vehicle concept variations for a multitude of studies, including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). NASA plans to continue human space exploration and space station utilization, and launch vehicles for heavy lift cargo and crew will be needed. One of the current leading concepts for future heavy lift capability is an inline one-and-a-half-stage concept using solid rocket boosters (SRBs) and based on current Shuttle technology and elements. Potentially, the quickest and most cost-effective path towards an operational vehicle of this configuration is a demonstrator vehicle fabricated from existing Shuttle assets and relying upon the existing STS launch infrastructure. Such a demonstrator would yield valuable proof-of-concept data and would provide a working test platform allowing for validated systems integration. Using Shuttle hardware such as existing RS-25D engines and partial MPS, propellant tanks derived from the External Tank (ET) design and tooling, and four-segment SRBs could reduce the associated upfront development costs and schedule when compared to a concept relying on new propulsion technology and engine designs. There are several other potential benefits to this demonstrator concept. Since a concept of this type would be based on man-rated, flight-proven hardware components, this demonstrator has the potential to evolve into the first iteration of heavy lift crew or cargo and serve as a baseline for block upgrades.
This vehicle could also serve as a demonstration

  5. Launch Vehicle Control Center Architectures

    Science.gov (United States)

    Watson, Michael D.; Epps, Amy; Woodruff, Van; Vachon, Michael Jacob; Monreal, Julio; Williams, Randall; McLaughlin, Tom

    2014-01-01

    This analysis is a survey of control center architectures of the NASA Space Launch System (SLS), United Launch Alliance (ULA) Atlas V and Delta IV, and the European Space Agency (ESA) Ariane 5. Each of these control center architectures has similarities in basic structure and differences in the functional distribution of responsibilities for the phases of operations: (a) Launch vehicles in the international community vary greatly in configuration and process; (b) Each launch site has a unique processing flow based on the specific configurations; (c) Launch and flight operations are managed through a set of control centers associated with each launch site; however, flight operations may be conducted from a different control center than the launch center; and (d) The engineering support centers are primarily located at the design center, with a small engineering support team at the launch site.

  6. A Low-Cost Launch Assistance System for Orbital Launch Vehicles

    Directory of Open Access Journals (Sweden)

    Oleg Nizhnik

    2012-01-01

    Full Text Available The author reviews the state of the art of non-rocket launch assistance systems (LASs) for spaceflight, focusing on air launch options. The author proposes an alternative, technologically feasible LAS based on a combination of approaches: air launch, a high-altitude balloon, and a tethered LAS. The proposed LAS can be implemented with existing off-the-shelf hardware, delivering 7 kg to low-Earth orbit for 5200 USD per kg. The proposed design can deliver a larger price reduction and larger orbital payloads with future advances in aerostats, ropes, electrical motors, and terrestrial power networks.

  7. Algorithms for boundary detection in radiographic images

    International Nuclear Information System (INIS)

    Gonzaga, Adilson; Franca, Celso Aparecido de

    1996-01-01

    Edge detection techniques applied to digital radiographic images are discussed. Several algorithms have been implemented, and the results are displayed to enhance boundaries or hide details. An algorithm applied to a pre-processed image with enhanced contrast is proposed, and the results are discussed.
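    As a generic illustration of the kind of gradient-based boundary detector discussed above (not the authors' algorithm), a 3x3 Sobel operator computes horizontal and vertical intensity gradients and combines them into an edge magnitude:

```python
def sobel_magnitude(img):
    """Gradient magnitude of a grayscale image (list of rows) using the
    3x3 Sobel kernels; the one-pixel border is left at zero."""
    h, w = len(img), len(img[0])
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    out = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = sum(gx_k[a][b] * img[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            gy = sum(gy_k[a][b] * img[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            out[i][j] = (gx * gx + gy * gy) ** 0.5
    return out
```

On a synthetic image with a vertical step edge, the magnitude peaks symmetrically on the two columns straddling the step and is zero in flat regions.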

  8. Digital signal processing algorithms for nuclear particle spectroscopy

    International Nuclear Information System (INIS)

    Zejnalova, O.; Zejnalov, Sh.; Hambsch, F.J.; Oberstedt, S.

    2007-01-01

    Digital signal processing algorithms for nuclear particle spectroscopy are described along with a digital pile-up elimination method applicable to equidistantly sampled detector signals pre-processed by a charge-sensitive preamplifier. The signal processing algorithms are provided as recursive one- or multi-step procedures which can be easily programmed using modern computer programming languages. The influence of the number of bits of the sampling analogue-to-digital converter on the final signal-to-noise ratio of the spectrometer is considered. Algorithms for a digital shaping-filter amplifier, for a digital pile-up elimination scheme and for ballistic deficit correction were investigated using a high purity germanium detector. The pile-up elimination method was originally developed for fission fragment spectroscopy using a Frisch-grid back-to-back double ionization chamber and was mainly intended for pile-up elimination in the case of high alpha-radioactivity of the fissile target. The developed pile-up elimination method affects only the electronic noise generated by the preamplifier. Therefore the influence of the pile-up elimination scheme on the final resolution of the spectrometer is investigated in terms of the distance between pile-up pulses. The efficiency of the developed algorithms is compared with other signal processing schemes published in the literature
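    A common recursive shaping filter of the kind described above can be sketched as follows (a generic trapezoidal shaper, not the authors' exact scheme): the exponential preamplifier tail is first deconvolved into an impulse, then two recursive moving sums turn the impulse into a trapezoid whose flat top is insensitive to ballistic deficit. The sample-period/decay-constant ratio and filter widths below are illustrative.

```python
import math

def trapezoidal_shaper(v, k, m, tau):
    """Shape an exponential pulse (decay constant `tau`, in samples) into a
    trapezoid.  Step 1: pole-zero deconvolution turns the exponential tail
    into an impulse.  Steps 2-3: two recursive box (moving-sum) filters of
    widths k and m (m < k) produce a trapezoid whose flat top spans
    k - m + 1 samples at height amplitude * m."""
    a = math.exp(-1.0 / tau)                       # decay factor per sample
    n = len(v)
    x = [v[0]] + [v[i] - a * v[i - 1] for i in range(1, n)]

    def box(sig, width):                           # recursive moving sum
        out, acc = [], 0.0
        for i, s in enumerate(sig):
            acc += s
            if i >= width:
                acc -= sig[i - width]
            out.append(acc)
        return out

    return box(box(x, k), m)
```

Feeding it a clean exponential of amplitude 2 with k=8, m=4 produces a flat top at height 8 that returns to baseline, which is easy to verify numerically.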

  9. Pre-Extreme Automotive Anti-Lock Brake Systems

    Directory of Open Access Journals (Sweden)

    V. G. Ivanov

    2004-01-01

    Full Text Available Designing systems that ensure the active safety of automobiles with intelligent functions requires new control principles for wheel and vehicle operation. One such principle is a pre-extreme control strategy. Its aim is to keep the wheel operating in the pre-extreme, stable region of the «tire grip coefficient vs. wheel slip coefficient» dependence. The simplest realization of pre-extreme control in automotive anti-lock brake systems consists of threshold and gradient algorithms. A comparative analysis of these algorithms, based on simulation results of bus braking with various anti-lock brake systems, has revealed their high efficiency.
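    The threshold algorithm can be illustrated with a toy quarter-car braking simulation (all dynamics, gains and numbers below are invented for illustration, not the paper's bus model): brake pressure is released whenever slip exceeds a threshold set below the grip peak, so the wheel stays in the stable, pre-extreme region, whereas an uncontrolled brake drives the wheel to lock.

```python
def mu(slip):
    """Toy tire-grip curve: grip rises to a peak at 15% slip, then falls
    along the unstable branch."""
    peak = 0.15
    if slip <= peak:
        return 0.9 * slip / peak
    return max(0.9 - 1.5 * (slip - peak), 0.3)

def max_slip_during_stop(threshold, dt=0.001, steps=3000):
    """Largest slip seen while braking from 30 m/s.  The threshold logic
    releases brake pressure whenever slip exceeds `threshold`, i.e. before
    the grip peak (pre-extreme control)."""
    v = w = 30.0                       # vehicle and (scaled) wheel speeds, m/s
    g, brake, worst = 9.81, 0.0, 0.0
    for _ in range(steps):
        slip = 0.0 if v <= 0.0 else max(0.0, (v - w) / v)
        worst = max(worst, slip)
        # release above threshold, otherwise ramp pressure back up
        brake = 0.0 if slip > threshold else min(brake + 40.0 * dt, 12.0)
        f = mu(slip) * g               # tire force per unit mass
        v = max(v - f * dt, 0.0)
        w = max(w + 5.0 * (f - brake) * dt, 0.0)
        if v <= 0.1:
            break
    return worst
```

With a pre-extreme threshold of 0.12 the slip stays near the stable region; with the threshold effectively disabled the wheel locks (slip approaches 1).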

  10. Method for Producing Launch/Landing Pads and Structures Project

    Science.gov (United States)

    Mueller, Robert P. (Compiler)

    2015-01-01

    Current plans for deep space exploration include building landing/launch pads capable of withstanding the rocket blast of spacecraft much larger than those of the Apollo days. The proposed concept will develop lightweight launch and landing pad materials from in-situ resources, utilizing regolith to produce controllable porous cast metallic foam brick/tile shapes. These shapes can be used to lay a landing/launch platform, as a construction material, or as more complex parts of mechanical assemblies.

  11. Development of the algorithm for obtaining 3-dimensional information using the structured light

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Dong Uk; Lee, Jae Hyub; Kim, Chung Soo [Korea University of Technology and Education, Cheonan (Korea)

    1998-03-01

    The utilization of robots in atomic power plants or nuclear-related facilities has grown rapidly. In order to perform preassigned jobs using robots in nuclear-related facilities, advanced technology for extracting 3D information of objects is essential. We have studied an algorithm to extract 3D information of objects using laser slit light and a camera, and developed the following hardware system and algorithms. (1) We have designed and fabricated the hardware system, which consists of a laser light and two cameras and can be easily installed on the robot. (2) In order to reduce the occlusion problem when measuring 3D information using laser slit light and a camera, we have studied a system with laser slit light and two cameras and developed an algorithm to synthesize the 3D information obtained from the two cameras. (3) For easy use of the obtained 3D information, we expressed it in a digital distance image format and developed an algorithm to interpolate the 3D information of points which are not obtained. (4) In order to simplify calibration of the camera parameters, we have also designed and fabricated an LED plate, and developed an algorithm detecting the center position of each LED automatically. We can verify the efficiency of the developed algorithms and hardware system through experimental results. 16 refs., 26 figs., 1 tab. (Author)
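    The underlying laser-slit triangulation can be illustrated with a minimal pinhole-camera formula (a hypothetical simplified geometry, not the report's calibrated two-camera system): the depth of a lit point is where the camera ray through its image pixel intersects the known laser plane.

```python
import math

def slit_depth(pixel_x, focal_px, baseline, laser_angle):
    """Triangulate the depth Z of a laser-slit point seen by a camera.
    Toy geometry: the laser plane is offset by `baseline` along the camera
    x-axis and tilted by `laser_angle` from the optical axis, so the plane
    satisfies X = baseline - Z * tan(laser_angle).  The camera ray through
    image coordinate `pixel_x` (principal point at 0) satisfies
    X = Z * pixel_x / focal_px.  Equating the two and solving for Z:"""
    return baseline / (pixel_x / focal_px + math.tan(laser_angle))
```

Sanity check: with a laser plane parallel to the optical axis (angle 0) at X = 0.1 m and a 500 px focal length, a point imaged at pixel 50 lies at Z = 1 m, and one imaged at pixel 25 lies at Z = 2 m.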

  12. Launching technological innovations

    DEFF Research Database (Denmark)

    Talke, Katrin; Salomo, Søren

    2009-01-01

    have received less attention. This study considers the interdependencies between strategic, internally and externally, directed tactical launch activities and investigates both direct and indirect performance effects. The analysis is based upon data from 113 technological innovations launched...

  13. A Dual Launch Robotic and Human Lunar Mission Architecture

    Science.gov (United States)

    Jones, David L.; Mulqueen, Jack; Percy, Tom; Griffin, Brand; Smitherman, David

    2010-01-01

    This paper describes a comprehensive lunar exploration architecture developed by Marshall Space Flight Center's Advanced Concepts Office that features a science-based surface exploration strategy and a transportation architecture that uses two launches of a heavy lift launch vehicle to deliver human and robotic mission systems to the moon. The principal advantage of the dual launch lunar mission strategy is the reduced cost and risk resulting from the development of just one launch vehicle system. The dual launch lunar mission architecture may also enhance opportunities for commercial and international partnerships by using expendable launch vehicle services for robotic missions or development of surface exploration elements. Furthermore, this architecture is particularly suited to the integration of robotic and human exploration to maximize science return. For surface operations, an innovative dual-mode rover is presented that is capable of performing robotic science exploration as well as transporting human crew conducting surface exploration. The dual-mode rover can be deployed to the lunar surface to perform precursor science activities, collect samples, scout potential crew landing sites, and meet the crew at a designated landing site. With this approach, the crew is able to evaluate the robotically collected samples to select the best samples for return to Earth to maximize the scientific value. The rovers can continue robotic exploration after the crew leaves the lunar surface. The transportation system for the dual launch mission architecture uses a lunar-orbit-rendezvous strategy. Two heavy lift launch vehicles depart from Earth within a six-hour period to transport the lunar lander and crew elements separately to lunar orbit. In lunar orbit, the crew transfer vehicle docks with the lander and the crew boards the lander for descent to the surface. After the surface mission, the crew returns to the orbiting transfer vehicle for the return to Earth.
This

  14. Launch Vehicle Manual Steering with Adaptive Augmenting Control:In-Flight Evaluations of Adverse Interactions Using a Piloted Aircraft

    Science.gov (United States)

    Hanson, Curt; Miller, Chris; Wall, John H.; VanZwieten, Tannen S.; Gilligan, Eric T.; Orr, Jeb S.

    2015-01-01

    An Adaptive Augmenting Control (AAC) algorithm for the Space Launch System (SLS) has been developed at the Marshall Space Flight Center (MSFC) as part of the launch vehicle's baseline flight control system. A prototype version of the SLS flight control software was hosted on a piloted aircraft at the Armstrong Flight Research Center to demonstrate the adaptive controller on a full-scale realistic application in a relevant flight environment. Concerns regarding adverse interactions between the adaptive controller and a potential manual steering mode were also investigated by giving the pilot trajectory deviation cues and pitch rate command authority, which is the subject of this paper. Two NASA research pilots flew a total of 25 constant pitch rate trajectories using a prototype manual steering mode with and without adaptive control, evaluating six different nominal and off-nominal test case scenarios. Pilot comments and PIO ratings were given following each trajectory and correlated with aircraft state data and internal controller signals post-flight.

  15. State Machine Modeling of the Space Launch System Solid Rocket Boosters

    Science.gov (United States)

    Harris, Joshua A.; Patterson-Hine, Ann

    2013-01-01

    The Space Launch System is a Shuttle-derived heavy-lift vehicle currently in development to serve as NASA's premiere launch vehicle for space exploration. The Space Launch System is a multistage rocket with two Solid Rocket Boosters and multiple payloads, including the Multi-Purpose Crew Vehicle. Planned Space Launch System destinations include near-Earth asteroids, the Moon, Mars, and Lagrange points. The Space Launch System is a complex system with many subsystems, requiring considerable systems engineering and integration. To this end, state machine analysis offers a method to support engineering and operational efforts, identify and avert undesirable or potentially hazardous system states, and evaluate system requirements. Finite State Machines model a system as a finite number of states, with transitions between states controlled by state-based and event-based logic. State machines are a useful tool for understanding complex system behaviors and evaluating "what-if" scenarios. This work contributes to a state machine model of the Space Launch System developed at NASA Ames Research Center. The Space Launch System Solid Rocket Booster avionics and ignition subsystems are modeled using MATLAB/Stateflow software. This model is integrated into a larger model of Space Launch System avionics used for verification and validation of Space Launch System operating procedures and design requirements. This includes testing both nominal and off-nominal system states and command sequences.
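    The finite-state-machine idea can be sketched with a minimal transition-table implementation. The states and events below ("safed", "armed", "ignited") are invented for illustration; the actual model is built in MATLAB/Stateflow and is far more detailed.

```python
class StateMachine:
    """Minimal event-driven finite state machine: a transition table maps
    (state, event) pairs to next states; any unlisted pair is illegal."""
    def __init__(self, transitions, start):
        self.transitions = transitions
        self.state = start
        self.history = [start]

    def fire(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            raise ValueError(f"illegal event {event!r} in state {self.state!r}")
        self.state = self.transitions[key]
        self.history.append(self.state)
        return self.state

# Hypothetical booster-ignition states, for illustration only.
SRB_TRANSITIONS = {
    ("safed", "arm"): "armed",
    ("armed", "safe"): "safed",
    ("armed", "ignition_command"): "ignited",
}
```

A "what-if" check then amounts to firing event sequences and asserting that hazardous ones (e.g. an ignition command while safed) are rejected.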

  16. Achieving a Launch on Demand Capability

    Science.gov (United States)

    Greenberg, Joel S.

    2002-01-01

    The ability to place payloads [satellites] into orbit as and when required, often referred to as launch on demand, continues to be an elusive and largely unfulfilled goal. But what is the value of achieving launch on demand [LOD], and what metrics are appropriate? Achievement of a desired level of LOD capability must consider transportation system throughput, alternative transportation systems that comprise the transportation architecture, transportation demand, reliability and failure recovery characteristics of the alternatives, schedule guarantees, launch delays, payload integration schedules, procurement policies, and other factors. Measures of LOD capability should relate to the objective of the transportation architecture: the placement of payloads into orbit as and when required. Launch on demand capability must be defined in probabilistic terms such as the probability of not incurring a delay in excess of T when it is determined that it is necessary to place a payload into orbit. Three specific aspects of launch on demand are considered: [1] the ability to recover from adversity [i.e., a launch failure] and to keep up with the steady-state demand for placing satellites into orbit [this has been referred to as operability and resiliency], [2] the ability to respond to the requirement to launch a satellite when the need arises unexpectedly either because of an unexpected [random] on-orbit satellite failure that requires replacement or because of the sudden recognition of an unanticipated requirement, and [3] the ability to recover from adversity [i.e., a launch failure] during the placement of a constellation into orbit. The objective of this paper is to outline a formal approach for analyzing alternative transportation architectures in terms of their ability to provide a LOD capability. The economic aspect of LOD is developed by establishing a relationship between scheduling and the elimination of on-orbit spares while achieving the desired level of on
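    The probabilistic framing above, the probability of incurring a delay in excess of T, can be sketched with a toy Monte Carlo model. All rates and durations below are invented for illustration and bear no relation to the paper's analysis:

```python
import random

def prob_delay_exceeds(T, failure_rate=0.1, recovery_days=180,
                       integration_days=30, trials=20000, seed=42):
    """Monte Carlo sketch: probability that time-to-orbit exceeds T days,
    given a random payload-integration delay and a chance that a launch
    failure forces a stand-down/recovery period."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        delay = rng.uniform(0, integration_days)   # integration/scheduling delay
        if rng.random() < failure_rate:            # launch failure -> recovery
            delay += recovery_days
        if delay > T:
            exceed += 1
    return exceed / trials
```

The estimated exceedance probability falls monotonically with T and hits zero beyond the worst-case total delay, which is the kind of curve an LOD requirement would be written against.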

  17. Algorithm for the automatic computation of the modified Anderson-Wilkins acuteness score of ischemia from the pre-hospital ECG in ST-segment elevation myocardial infarction

    DEFF Research Database (Denmark)

    Fakhri, Yama; Sejersten-Ripa, Maria; Schoos, Mikkel Malby

    2017-01-01

    BACKGROUND: The acuteness score (based on the modified Anderson-Wilkins score) estimates the acuteness of ischemia based on ST-segment, Q-wave and T-wave measurements obtained from the electrocardiogram (ECG) in patients with ST Elevation Myocardial Infarction (STEMI). The score (range 1 (least ...) ... the acuteness score. METHODS: We scored 50 pre-hospital ECGs from STEMI patients, manually and by the automated algorithm. We assessed the reliability between the manual and automated algorithm by the intraclass correlation coefficient (ICC) and a Bland-Altman plot. RESULTS: The ICC was 0.84 (95% CI 0.72-0.91) ... ECGs, all within the upper (1.46) and lower (-1.12) limits ...
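    The Bland-Altman agreement check used in the study can be computed as follows. This is a generic sketch with made-up paired scores, not the study data; the 1.96-standard-deviation convention gives the 95% limits of agreement quoted in such analyses.

```python
def bland_altman_limits(a, b):
    """Mean difference (bias) and 95% limits of agreement for two paired
    raters/methods: bias +/- 1.96 * SD of the pairwise differences."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5  # sample SD
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

A manual-vs-automated comparison would then check that most pairwise differences fall inside the returned limits.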

  18. Fabrication and testing of W7-X pre-series target elements

    International Nuclear Information System (INIS)

    Boscary, J; Boeswirth, B; Greuner, H; Grigull, P; Missirlian, M; Plankensteiner, A; Schedler, B; Friedrich, T; Schlosser, J; Streibl, B; Traxler, H

    2007-01-01

    The assembly of the highly-loaded target plates of the WENDELSTEIN 7-X (W7-X) divertor requires the fabrication of 890 target elements (TEs). The plasma-facing material is CFC NB31 flat tiles bonded to a CuCrZr copper alloy water-cooled heat sink. The elements are designed to remove a stationary heat flux of up to 10 MW m⁻² and a power of up to 100 kW. Before launching the serial fabrication, pre-series activities aimed at qualifying the design, the manufacturing route and the non-destructive examinations (NDEs). High heat flux (HHF) tests performed on full-scale pre-series TEs resulted in an improvement of the design of the bond between tiles and heat sink to reduce stresses during operation. As a consequence, additional pre-series TEs are being fabricated to be tested in the HHF facility GLADIS. NDEs of this bond based on thermography methods are being developed to define acceptance criteria suitable for serial fabrication

  19. A Multi-Faceted View of GPM GV in the Post-Launch Era

    Science.gov (United States)

    Petersen, W. A.

    2015-12-01

    NASA GPM Ground Validation (GV) activities in the early post-launch era have focused on: a) intercomparison of early version satellite products to GV data originating from NOAA Q3, WSR-88D, and Tier-1 research ground radar (GR) and instrument networks; b) continued physical validation of GPM algorithms using recent field campaign and site-specific datasets (warm and cold season); and c) development and use of rainfall products for hydrologic validation and bridging-validation of Level II and Level III satellite products (IMERG). Intercomparisons of GPM products with Q3 rainfall and WSR-88D ground-radar (GR) data over CONUS exhibit reasonable agreement. For example, DPR radar reflectivities geo-matched to reflectivity profiles from ~60 GRs generally differ by 2 dB or less. Occasional low biases do appear in the rainwater portion of DPR Ku-band convective reflectivity profiles. In stratiform precipitation, DPR-diagnosed reflectivity and rain drop size distributions are frequently very similar to those retrieved from GR products. DPR and Combined algorithm rain rate products compare reasonably well to each other and to Q3 at CONUS scales. GPROF2014 radiometer-based rain rates compare well to Q3 in a spatial sense (correlations of ~0.6), but GMI estimates appear to be slightly low-biased relative to Q3 and to DPR and Combined algorithm products. The last NASA GPM GV-led field effort, OLYMPEX, will occur from Nov 2015 to Jan 2016. OLYMPEX is designed to study cold-season precipitation processes and hydrology in the orographic and oceanic domains of western Washington State.
In addition to occasional field campaigns like OLYMPEX, continuous field measurements using multi-parameter radar and instrument networks targeted to direct validation and specific problems in physical validation (e.g., path-integrated attenuation and the impacts of drop size distribution, non-uniform beam filling and multiple scattering) are also being collected under coincident GPM core overpasses at

  20. Comparison between iterative wavefront control algorithm and direct gradient wavefront control algorithm for adaptive optics system

    Science.gov (United States)

    Cheng, Sheng-Yi; Liu, Wen-Jin; Chen, Shan-Qiu; Dong, Li-Zhi; Yang, Ping; Xu, Bing

    2015-08-01

    Among all kinds of wavefront control algorithms in adaptive optics systems, the direct gradient wavefront control algorithm is the most widespread and common method. This control algorithm obtains the actuator voltages directly from wavefront slopes through a pre-measured relational matrix between the deformable mirror actuators and the Hartmann wavefront sensor, with excellent real-time performance and stability. However, as the numbers of sub-apertures in the wavefront sensor and of deformable mirror actuators increase, the matrix operation in the direct gradient algorithm takes too much time, which becomes a major factor limiting the control effect of adaptive optics systems. In this paper we apply an iterative wavefront control algorithm to high-resolution adaptive optics systems, in which the voltages of each actuator are obtained through iteration, offering great advantages in computation and storage. For an AO system with thousands of actuators, the computational complexity is about O(n²)-O(n³) for the direct gradient wavefront control algorithm, but about O(n)-O(n^(3/2)) for the iterative wavefront control algorithm, where n is the number of actuators of the AO system. The greater the numbers of sub-apertures and deformable mirror actuators, the more significant the advantage the iterative wavefront control algorithm exhibits. Project supported by the National Key Scientific and Research Equipment Development Project of China (Grant No. ZDYZ2013-2), the National Natural Science Foundation of China (Grant No. 11173008), and the Sichuan Provincial Outstanding Youth Academic Technology Leaders Program, China (Grant No. 2012JQ0012).
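    The contrast between the two approaches can be sketched with a tiny iterative solver: where the direct method applies a precomputed (and expensive-to-form) reconstructor matrix to the slope vector, an iterative method such as conjugate gradients solves the same linear system using only matrix-vector products per iteration. This is a generic sketch on a 2x2 system, not the paper's AO-specific implementation.

```python
def matvec(A, x):
    """Dense matrix-vector product (the only operation the iteration needs)."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def conjugate_gradient(A, b, iters=50, tol=1e-10):
    """Iteratively solve A v = b for symmetric positive definite A, without
    ever forming A^{-1}; cost per iteration is one matvec, O(n) storage."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual b - A x, with x = 0
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(iters):
        Ap = matvec(A, p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x
```

For an n x n SPD system, conjugate gradients converges in at most n iterations in exact arithmetic, which is where the favorable complexity of iterative wavefront reconstruction comes from.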

  1. A Developed Artificial Bee Colony Algorithm Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Ye Jin

    2018-04-01

    Full Text Available The Artificial Bee Colony (ABC) algorithm is a bionic intelligent optimization method. The cloud model is an uncertainty conversion model between a qualitative concept T̃, expressed in natural language, and its quantitative expression; it integrates probability theory and fuzzy mathematics. A developed ABC algorithm based on the cloud model is proposed to enhance the accuracy of the basic ABC algorithm and avoid getting trapped in local optima by introducing a new selection mechanism, replacing the onlooker bees' search formula, and changing the scout bees' updating formula. Experiments on CEC15 show that the new algorithm has a faster convergence speed and higher accuracy than the basic ABC and some cloud-model-based ABC variants.
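    The basic ABC loop that the paper modifies can be sketched as follows: employed bees refine known food sources, onlooker bees reinforce the better ones proportionally to fitness, and scout bees abandon exhausted sources. This is a minimal version without the cloud-model modifications; all parameters are illustrative, and the objective is assumed non-negative.

```python
import random

def abc_minimize(f, dim, bounds, n_food=10, limit=20, iters=100, seed=0):
    """Basic Artificial Bee Colony minimizer (employed / onlooker / scout
    phases) for a non-negative objective f over a box [lo, hi]^dim."""
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    trials = [0] * n_food

    def neighbor(i):
        """Perturb one coordinate of source i toward/away from a random peer."""
        k = rng.choice([j for j in range(n_food) if j != i])
        d = rng.randrange(dim)
        cand = foods[i][:]
        cand[d] += rng.uniform(-1, 1) * (foods[i][d] - foods[k][d])
        cand[d] = min(max(cand[d], lo), hi)
        return cand

    def try_improve(i):
        cand = neighbor(i)
        if f(cand) < f(foods[i]):        # greedy selection
            foods[i], trials[i] = cand, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                          # employed bees
            try_improve(i)
        weights = [1.0 / (1.0 + f(s)) for s in foods]    # fitness -> probability
        for _ in range(n_food):                          # onlooker bees
            try_improve(rng.choices(range(n_food), weights=weights)[0])
        for i in range(n_food):                          # scout bees
            if trials[i] > limit:
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                trials[i] = 0
    return min(foods, key=f)
```

On the 2-D sphere function the colony closes in on the origin within a modest iteration budget.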

  2. Automated Development of Accurate Algorithms and Efficient Codes for Computational Aeroacoustics

    Science.gov (United States)

    Goodrich, John W.; Dyson, Rodger W.

    1999-01-01

    The simulation of sound generation and propagation in three space dimensions with realistic aircraft components is a very large time-dependent computation with fine details. Simulations in open domains with embedded objects require accurate and robust algorithms for propagation, for artificial inflow and outflow boundaries, and for the definition of geometrically complex objects. The development, implementation, and validation of methods for solving these demanding problems are being pursued to support the NASA pillar goals for reducing aircraft noise levels. Our goal is to provide algorithms which are sufficiently accurate and efficient to produce usable results rapidly enough to allow design engineers to study the effects on sound levels of design changes in propulsion systems, and in the integration of propulsion systems with airframes. There is a lack of design tools for these purposes at this time. Our technical approach to this problem combines the development of new algorithms with the use of Mathematica and Unix utilities to automate algorithm development, code implementation, and validation. We use explicit methods to ensure effective implementation by domain decomposition for SPMD parallel computing. There are several orders of magnitude difference in the computational efficiencies of the algorithms which we have considered. We currently have new artificial inflow and outflow boundary conditions that are stable, accurate, and unobtrusive, with implementations that match the accuracy and efficiency of the propagation methods. The artificial numerical boundary treatments have been proven to have solutions which converge to the full open domain problems, so that the error from the boundary treatments can be driven as low as is required. The purpose of this paper is to briefly present a method for developing highly accurate algorithms for computational aeroacoustics, the use of computer automation in this process, and a brief survey of the algorithms that

  3. NASA's Launch Propulsion Systems Technology Roadmap

    Science.gov (United States)

    McConnaughey, Paul K.; Femminineo, Mark G.; Koelfgen, Syri J.; Lepsch, Roger A; Ryan, Richard M.; Taylor, Steven A.

    2012-01-01

    Safe, reliable, and affordable access to low-Earth orbit (LEO) is necessary for all of the United States (US) space endeavors. In 2010, NASA's Office of the Chief Technologist commissioned 14 teams to develop technology roadmaps that could be used to guide the Agency's and US technology investment decisions for the next few decades. The Launch Propulsion Systems Technology Area (LPSTA) team was tasked to address the propulsion technology challenges for access to LEO. The developed LPSTA roadmap addresses technologies that enhance existing solid or liquid propulsion technologies and their related ancillary systems, or significantly advance the technology readiness level (TRL) of less mature systems such as air-breathing, unconventional, and other launch technologies. In developing this roadmap, the LPSTA team consulted previous NASA, military, and industry studies as well as subject matter experts to develop their assessment of this field, which has fundamental technological and strategic impacts for US space capabilities.

  4. Search for 'Little Higgs' and reconstruction algorithms developments in Atlas

    International Nuclear Information System (INIS)

    Rousseau, D.

    2007-05-01

    This document summarizes developments of framework and reconstruction algorithms for the ATLAS detector at the LHC. A library of reconstruction algorithms has been developed in a more and more complex environment. The reconstruction software originally designed on an optimistic Monte-Carlo simulation, has been confronted with a more detailed 'as-built' simulation. The 'Little Higgs' is an effective theory which can be taken for granted, or as an opportunity to study heavy resonances. In several cases, these resonances can be detected in original channels like tZ, ZH or WH. (author)

  5. Development of Base Transceiver Station Selection Algorithm for ...

    African Journals Online (AJOL)

    TEMS) equipment was carried out on the existing BTSs, and a linear algorithm optimization program based on the spectral link efficiency of each BTS was developed, the output of this site optimization gives the selected number of base station sites ...

  6. Developing and Implementing the Data Mining Algorithms in RAVEN

    International Nuclear Information System (INIS)

    Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea; Rabiti, Cristian

    2015-01-01

    The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. To post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in the dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics code model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameter values. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e. recognizing patterns in the data. This report focuses on development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.

  7. Developing and Implementing the Data Mining Algorithms in RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Ramazan Sonat [Idaho National Lab. (INL), Idaho Falls, ID (United States); Maljovec, Daniel Patrick [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. To post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in the dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics code model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameter values. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e. recognizing patterns in the data. This report focuses on development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.

  8. WIPP - Pre-Licensing and Operations: Developer and Regulator Perspectives

    International Nuclear Information System (INIS)

    Peake, Tom; Patterson, R.

    2014-01-01

    The Waste Isolation Pilot Plant (WIPP) is a disposal system for defense-related transuranic (TRU) radioactive waste. Developed by the Department of Energy (DOE), WIPP is located in southeastern New Mexico, where radioactive waste is disposed of 2,150 feet underground in an ancient layer of salt, with a total capacity of 6.2 million cubic feet of waste. Congress authorized the development and construction of WIPP in 1980 for the express purpose of providing a research and development facility to demonstrate the safe disposal of radioactive wastes resulting from the defense activities and programs of the United States. This paper presents a historical review of the site's development, site operations (waste disposal operations started in 1999), communications between the US EPA and DOE, the chronology of pre-licensing and pre-operations, the operational phase and its regulatory challenges, and the lessons learned after 12 years of operations.

  9. Machine Learning Algorithms Outperform Conventional Regression Models in Predicting Development of Hepatocellular Carcinoma

    Science.gov (United States)

    Singal, Amit G.; Mukherjee, Ashin; Elmunzer, B. Joseph; Higgins, Peter DR; Lok, Anna S.; Zhu, Ji; Marrero, Jorge A; Waljee, Akbar K

    2015-01-01

    Background: Predictive models for hepatocellular carcinoma (HCC) have been limited by modest accuracy and lack of validation. Machine learning algorithms offer a novel methodology, which may improve HCC risk prognostication among patients with cirrhosis. Our study's aim was to develop and compare predictive models for HCC development among cirrhotic patients, using conventional regression analysis and machine learning algorithms. Methods: We enrolled 442 patients with Child A or B cirrhosis at the University of Michigan between January 2004 and September 2006 (UM cohort) and prospectively followed them until HCC development, liver transplantation, death, or study termination. Regression analysis and machine learning algorithms were used to construct predictive models for HCC development, which were tested on an independent validation cohort from the Hepatitis C Antiviral Long-term Treatment against Cirrhosis (HALT-C) Trial. Both models were also compared to the previously published HALT-C model. Discrimination was assessed using receiver operating characteristic curve analysis, and diagnostic accuracy was assessed with net reclassification improvement and integrated discrimination improvement statistics. Results: After a median follow-up of 3.5 years, 41 patients developed HCC. The UM regression model had a c-statistic of 0.61 (95% CI 0.56-0.67), whereas the machine learning algorithm had a c-statistic of 0.64 (95% CI 0.60-0.69) in the validation cohort. The machine learning algorithm had significantly better diagnostic accuracy as assessed by net reclassification improvement, and also outperformed the previously published HALT-C model (p=0.047). Conclusion: Machine learning algorithms improve the accuracy of risk stratification for patients with cirrhosis and can be used to identify patients at high risk of developing HCC. PMID:24169273
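    The discrimination metric reported above, the c-statistic, can be computed directly from paired risk scores and outcomes. The sketch below uses made-up scores for eight hypothetical patients, not the study's data.

```python
def c_statistic(scores, labels):
    """Concordance (c-) statistic: the probability that a randomly chosen
    case (label 1) receives a higher risk score than a randomly chosen
    control (label 0); ties count one half."""
    cases = [s for s, y in zip(scores, labels) if y == 1]
    controls = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((c > d) + 0.5 * (c == d) for c in cases for d in controls)
    return wins / (len(cases) * len(controls))

# Made-up risk scores from two hypothetical models on the same 8 patients
labels = [1, 1, 1, 0, 0, 0, 0, 0]
regression = [0.7, 0.4, 0.3, 0.6, 0.5, 0.2, 0.1, 0.1]
ml_model = [0.8, 0.7, 0.5, 0.6, 0.4, 0.3, 0.2, 0.1]
print(c_statistic(regression, labels), c_statistic(ml_model, labels))
```

    A higher c-statistic means the model ranks eventual HCC cases above non-cases more often, which is exactly the comparison the abstract reports.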

  10. Developing pre-service science teachers' pedagogical content knowledge by using training program

    Science.gov (United States)

    Udomkan, Watinee; Suwannoi, Paisan

    2018-01-01

    A training program was developed to enhance pre-service science teachers' pedagogical content knowledge (PCK). Through it, pre-service science teachers come to understand the science curriculum, knowledge of assessment in science, knowledge of students' understanding of science, and instructional strategies and orientations towards science teaching, which together are conceptualized as PCK [5]. This study examined the understandings and practices, including PCK, of five pre-service science teachers. The participants demonstrated their PCK through the stages of the training program: writing content representations (CoRes), preparing lesson plans, micro-teaching, and actual teaching, respectively. All of the pre-service science teachers' performances were collected through classroom observations, and the teachers were then interviewed. The results showed that the pre-service science teachers progressively developed the knowledge components of PCK, with micro-teaching being the key activity for developing it. However, they had some difficulties in their classroom teaching: they lacked sufficient ability to design appropriate instructional strategies and assessment activities, and blending content with pedagogy was also a matter of great concern. The implication of this study is that science educators can enhance pre-service science teachers' PCK by fostering better understanding of instructional strategies, assessment activities, and the blending of content and pedagogy in the classroom.

  11. Eddy Seeding in the Labrador Sea: a Submerged Autonomous Launching Platform (SALP) Application

    Science.gov (United States)

    Furey, Heather H.; Femke de Jong, M.; Bower, Amy S.

    2013-04-01

    A simplified Submerged Autonomous Launch Platform (SALP) was used to release profiling floats into warm-core Irminger Rings (IRs) in order to investigate their vertical structure and evolution in the Labrador Sea from September 2007 to September 2009. IRs are thought to play an important role in restratification after convection in the Labrador Sea. The SALP is designed to release surface drifters or subsurface floats serially from a traditional ocean mooring, using real-time ocean measurements as criteria for launch. The original prototype instrument used properties measured at multiple depths, with information relayed to the SALP controller via acoustic modems. In our application, two SALP carousels were attached at 500 meters to a heavily-instrumented deep-water mooring, in the path of recently shed IRs off the west Greenland shelf. A release algorithm was designed to use temperature and pressure measured at the SALP depth only, releasing one or two APEX profiling drifters each time an IR passed the mooring, with limited historical observations used to set the release thresholds. Mechanically and electronically, the SALP worked well: out of eleven releases, there was only one malfunction, when a float was caught in the cage after the burn-wire had triggered. However, the effort to trap floats in eddies met with only limited success, due to problems with the release algorithm and float ballasting. Out of seven floats launched from the platform using oceanographic criteria, four were released during warm-water events that were not related to passing IRs. Also, after release it took the APEX floats on average about 2.6 days to adjust from their initial ballast depth, about 600 meters, to their park point of 300 meters, leaving each float below the trapped core of water in the IRs. The other mooring instruments (at depths of 100 to 3000 m) revealed that 12 IRs passed the mooring in the 2-year monitoring period. With this independent information, we were able to assess and improve
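    The single-depth release criterion described above can be sketched as a simple predicate on the sensor readings. All threshold values here are illustrative assumptions, not the deployed configuration.

```python
def should_release(temp_c, pressure_dbar, floats_left,
                   warm_threshold_c=4.5, p_min=450.0, p_max=550.0):
    """Sketch of a single-point release criterion: trigger a float launch
    when the ~500 m sensor sees anomalously warm water (a passing Irminger
    Ring core) and a float is still loaded. All thresholds are illustrative,
    not the deployed configuration."""
    at_depth = p_min <= pressure_dbar <= p_max  # mooring can blow down in strong flow
    return floats_left > 0 and at_depth and temp_c >= warm_threshold_c

print(should_release(5.1, 502.0, 4))  # warm anomaly at depth, floats left -> True
```

    The abstract's finding that four of seven releases fired on unrelated warm-water events shows the weakness of a fixed temperature threshold at a single depth.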

  12. Mass media and the development of pre-reading of preschool children

    OpenAIRE

    GALATÍKOVÁ, Zuzana

    2016-01-01

    This thesis maps mass media, especially television broadcasting and electronic devices with connection to the Internet, in the lives of pre-school children, and investigates the relationship between mass media and the development of initial reading skills. The theoretical part analyses existing literature relevant to pre-school child development, elementary reading, and mass media, while the empirical research makes an independent investigation into this phenomenon in society using questionnaires f...

  13. NTR-Enhanced Lunar-Base Supply using Existing Launch Fleet Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    John D. Bess; Emily Colvin; Paul G. Cummings

    2009-06-01

    During the summer of 2006, students at the Center for Space Nuclear Research sought to augment the current NASA lunar exploration architecture with a nuclear thermal rocket (NTR). An additional study investigated the possible use of an NTR with existing launch vehicles to provide 21 metric tons of supplies to the lunar surface in support of a lunar outpost. Current cost estimates show that the complete mission cost for an NTR-enhanced assembly of Delta-IV and Atlas V vehicles may cost 47-86% more than the estimated Ares V launch cost of $1.5B; however, development costs for the current NASA architecture have not been assessed. The additional cost of coordinating the rendezvous of four to six launch vehicles with an in-orbit assembly facility also needs more thorough analysis and review. Future trends in launch vehicle use will also significantly impact the results from this comparison. The utility of multiple launch vehicles allows for the development of a more robust and lower risk exploration architecture.
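    The quoted 47-86% premium over the $1.5B Ares V estimate works out as follows (simple arithmetic on the figures given in the abstract):

```python
# Arithmetic behind the quoted range: 47-86% more than the $1.5B estimate,
# i.e. roughly $2.2B to $2.8B for the NTR-enhanced mission.
ares_v_cost = 1.5e9  # estimated Ares V launch cost, USD
low, high = 1.47 * ares_v_cost, 1.86 * ares_v_cost
print(low, high)
```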

  14. NTR-Enhanced Lunar-Base Supply using Existing Launch Fleet Capabilities

    International Nuclear Information System (INIS)

    Bess, John D.; Colvin, Emily; Cummings, Paul G.

    2009-01-01

    During the summer of 2006, students at the Center for Space Nuclear Research sought to augment the current NASA lunar exploration architecture with a nuclear thermal rocket (NTR). An additional study investigated the possible use of an NTR with existing launch vehicles to provide 21 metric tons of supplies to the lunar surface in support of a lunar outpost. Current cost estimates show that the complete mission cost for an NTR-enhanced assembly of Delta-IV and Atlas V vehicles may cost 47-86% more than the estimated Ares V launch cost of $1.5B; however, development costs for the current NASA architecture have not been assessed. The additional cost of coordinating the rendezvous of four to six launch vehicles with an in-orbit assembly facility also needs more thorough analysis and review. Future trends in launch vehicle use will also significantly impact the results from this comparison. The utility of multiple launch vehicles allows for the development of a more robust and lower risk exploration architecture.

  15. Development and application of a model for the analysis of trades between space launch system operations and acquisition costs

    Science.gov (United States)

    Nix, Michael B.

    2005-12-01

    Early design decisions in the development of space launch systems determine the costs to acquire and operate launch systems. Some sources indicate that as much as 90% of life cycle costs are fixed by the end of the critical design review phase. System characteristics determined by these early decisions are major factors in the acquisition cost of flight hardware elements and facilities and influence operations costs through the amount of maintenance and support labor required to sustain system function. Operations costs are also dependent on post-development management decisions regarding how much labor will be deployed to meet requirements of market demand and ownership profit. The ability to perform early trade-offs between these costs is vital to the development of systems that have the necessary capacity to provide service and are profitable to operate. An Excel-based prototype model was developed for making early analyses of trade-offs between the costs to operate a space launch system and to acquire the necessary assets to meet a given set of operational requirements. The model, integrating input from existing models and adding missing capability, allows the user to make such trade-offs across a range of operations concepts (required flight rates, staffing levels, shifts per workday, workdays per week and per year, unreliability, wearout and depot maintenance) and the number, type and capability of assets (flight hardware elements, processing and supporting facilities and infrastructure). The costs and capabilities of hypothetical launch systems can be modeled as a function of interrelated turnaround times and labor resource levels, and asset loss and retirement. The number of flight components and facilities required can be calculated and the operations and acquisition costs compared for a specified scenario. Findings, based on the analysis of a hypothetical two stage to orbit, reusable, unmanned launch system, indicate that the model is suitable for the

  16. Development of target-tracking algorithms using neural network

    Energy Technology Data Exchange (ETDEWEB)

    Park, Dong Sun; Lee, Joon Whaoan; Yoon, Sook; Baek, Seong Hyun; Lee, Myung Jae [Chonbuk National University, Chonjoo (Korea)

    1998-04-01

    The use of remote-control robot systems in atomic power plants and other nuclear-related facilities is growing rapidly, to protect workers from high-radiation environments. Such applications require complete stability of the robot system, so precisely tracking the robot is essential for the whole system. This research pursues that goal by developing appropriate algorithms for remote-control robot systems. A neural network tracking system is designed and tested to trace a robot endpoint. This model aims to utilize the excellent capabilities of neural networks: nonlinear mapping between inputs and outputs, learning capability, and generalization capability. The neural tracker consists of two networks, one for position detection and one for prediction. Tracking algorithms are developed and tested for the two models. Results of the experiments show that both models are promising as real-time target-tracking systems for remote-control robot systems. (author). 10 refs., 47 figs.
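    As a simplified stand-in for the prediction network, a one-step-ahead position predictor can be fit by least squares. The actual work trains neural networks; this linear sketch, on a synthetic constant-velocity track, only illustrates the tracking-by-prediction idea.

```python
import numpy as np

# Simplified stand-in for the paper's "prediction" network: a least-squares
# one-step-ahead predictor of the endpoint position from its two previous
# samples, fit on a synthetic constant-velocity track.
t = np.arange(50, dtype=float)
x = 2.0 * t + 3.0                       # synthetic endpoint track

X = np.column_stack([x[:-2], x[1:-1]])  # features: positions at steps t-1 and t
y = x[2:]                               # target: position at step t+1
w, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ w                            # near-exact on this linear track
print(np.round(w, 6))
```

    A neural predictor replaces the linear map with a learned nonlinear one, which is what makes it useful when the endpoint motion is not this simple.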

  17. Launch vehicle selection model

    Science.gov (United States)

    Montoya, Alex J.

    1990-01-01

    Over the next 50 years, humans will be heading for the Moon and Mars to build scientific bases to gain further knowledge about the universe and to develop rewarding space activities. These large scale projects will last many years and will require large amounts of mass to be delivered to Low Earth Orbit (LEO). It will take a great deal of planning to complete these missions in an efficient manner. The planning of a future Heavy Lift Launch Vehicle (HLLV) will significantly impact the overall multi-year launching cost for the vehicle fleet depending upon when the HLLV will be ready for use. It is desirable to develop a model in which many trade studies can be performed. In one sample multi-year space program analysis, the total launch vehicle cost of implementing the program was reduced from 50 percent to 25 percent. This indicates how critical it is to reduce space logistics costs. A linear programming model has been developed to answer such questions. The model is now in its second phase of development, and this paper will address the capabilities of the model and its intended uses. The main emphasis over the past year was to make the model user friendly and to incorporate additional realistic constraints that are difficult to represent mathematically. We have developed a methodology in which the user has to be knowledgeable about the mission model and the requirements of the payloads. We have found a representation that will cut down the solution space of the problem by inserting some preliminary tests to eliminate some infeasible vehicle solutions. The paper will address the handling of these additional constraints and the methodology for incorporating new costing information utilizing learning curve theory. The paper will review several test cases that will explore the preferred vehicle characteristics and the preferred period of construction, i.e., within the next decade, or in the first decade of the next century. 
Finally, the paper will explore the interaction
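    The kind of fleet-selection trade the model performs can be caricatured as a tiny search for the cheapest mix of launchers meeting a mass-to-LEO requirement. Vehicle names and all figures below are invented for illustration; the real model is a linear program with many more constraints.

```python
from itertools import product

# Toy version of the model's trade study: choose how many of each launcher
# to fly so the total mass delivered to LEO meets the program requirement
# at minimum cost. Vehicle names and figures are invented for illustration.
vehicles = {"HLLV": (80.0, 700.0), "MLV": (20.0, 180.0)}  # (tonnes to LEO, $M/flight)
required_mass_t = 200.0

best = None
for counts in product(range(11), repeat=len(vehicles)):  # brute force, small fleet
    mass = sum(n * m for n, (m, _) in zip(counts, vehicles.values()))
    cost = sum(n * c for n, (_, c) in zip(counts, vehicles.values()))
    if mass >= required_mass_t and (best is None or cost < best[0]):
        best = (cost, dict(zip(vehicles, counts)))

print(best)  # cheapest mix meeting the 200 t requirement
```

    Even this toy shows why mixed fleets can win: here the cheapest feasible plan combines both vehicles rather than using either alone.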

  18. Technical and Economical Feasibility of SSTO and TSTO Launch Vehicles

    Science.gov (United States)

    Lerch, Jens

    This paper discusses whether it is more cost effective to launch to low Earth orbit in one or two stages, assuming current or near-future technologies. First the paper provides an overview of the current state of the launch market and the hurdles to introducing new launch vehicles capable of significantly lowering the cost of access to space, and discusses possible routes to solve those problems. It is assumed that reducing the complexity of launchers by reducing the number of stages and engines, and introducing reusability, will result in lower launch costs. A number of operational and historic launch vehicle stages capable of near single stage to orbit (SSTO) performance are presented, and the necessary steps to modify them into an expendable SSTO launcher and an optimized two stage to orbit (TSTO) launcher are shown through parametric analysis. Then a ballistic reentry and recovery system is added to show that reusable SSTO and TSTO vehicles are also within the current state of the art. The development and recurring costs of the SSTO and the TSTO systems are estimated and compared. This analysis shows whether it is more economical to develop and operate expendable or reusable SSTO or TSTO systems under different assumptions for launch rate and initial investment.
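    The core of any parametric SSTO/TSTO comparison is the Tsiolkovsky rocket equation. The sketch below uses assumed masses, a 10% structural fraction, and a 450 s Isp, not the paper's vehicle designs, to show why staging buys delta-v for the same payload and structure technology.

```python
from math import log

# Parametric SSTO vs TSTO comparison via the Tsiolkovsky rocket equation.
# Masses, the 10% structural fraction, and the 450 s Isp are assumptions
# for illustration, not the paper's vehicle designs.
g0, isp = 9.81, 450.0
ve = g0 * isp                       # effective exhaust velocity, m/s

def stage_dv(m0, mf):
    """Ideal delta-v of one burn from initial mass m0 to final mass mf."""
    return ve * log(m0 / mf)

payload = 5.0                       # tonnes
# SSTO: one 95 t stage, 10% of stage mass is structure
dv_ssto = stage_dv(95.0 + payload, 0.1 * 95.0 + payload)

# TSTO: 70 t first stage + 25 t second stage, same structural fraction
dv1 = stage_dv(100.0 + payload, 100.0 + payload - 0.9 * 70.0)  # burn stage-1 propellant
dv2 = stage_dv(25.0 + payload, 0.1 * 25.0 + payload)           # after dropping stage 1
print(round(dv_ssto), round(dv1 + dv2))  # staging buys substantial delta-v
```

    The flip side, which the paper weighs against this, is that the two-stage vehicle carries two sets of engines and structure to develop, build, and (if reusable) recover.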

  19. Development of web-based reliability data analysis algorithm model and its application

    International Nuclear Information System (INIS)

    Hwang, Seok-Won; Oh, Ji-Yong; Moosung-Jae

    2010-01-01

    For this study, a database model of plant reliability was developed for the effective acquisition and management of plant-specific data that can be used in various applications of plant programs as well as in Probabilistic Safety Assessment (PSA). Through the development of a web-based reliability data analysis algorithm, this approach systematically gathers specific plant data such as component failure history, maintenance history, and shift diary. First, for the application of the developed algorithm, this study reestablished the raw data types, data deposition procedures and features of the Enterprise Resource Planning (ERP) system process. The component codes and system codes were standardized to make statistical analysis between different types of plants possible. This standardization contributes to the establishment of a flexible database model that allows the customization of reliability data for the various applications depending on component types and systems. In addition, this approach makes it possible for users to perform trend analyses and data comparisons for the significant plant components and systems. The validation of the algorithm is performed through a comparison of the importance measure value (Fussell-Vesely) of the mathematical calculation and that of the algorithm application. The development of a reliability database algorithm is one of the best approaches for providing systematic management of plant-specific reliability data with transparency and continuity. This proposed algorithm reinforces the relationships between raw data and application results so that it can provide a comprehensive database that offers everything from basic plant-related data to final customized data.

  20. Development of web-based reliability data analysis algorithm model and its application

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Seok-Won, E-mail: swhwang@khnp.co.k [Korea Hydro and Nuclear Power Co. Ltd., Jang-Dong 25-1, Yuseong-Gu, 305-343 Daejeon (Korea, Republic of); Oh, Ji-Yong [Korea Hydro and Nuclear Power Co. Ltd., Jang-Dong 25-1, Yuseong-Gu, 305-343 Daejeon (Korea, Republic of); Moosung-Jae [Department of Nuclear Engineering Hanyang University 17 Haengdang, Sungdong, Seoul (Korea, Republic of)

    2010-02-15

    For this study, a database model of plant reliability was developed for the effective acquisition and management of plant-specific data that can be used in various applications of plant programs as well as in Probabilistic Safety Assessment (PSA). Through the development of a web-based reliability data analysis algorithm, this approach systematically gathers specific plant data such as component failure history, maintenance history, and shift diary. First, for the application of the developed algorithm, this study reestablished the raw data types, data deposition procedures and features of the Enterprise Resource Planning (ERP) system process. The component codes and system codes were standardized to make statistical analysis between different types of plants possible. This standardization contributes to the establishment of a flexible database model that allows the customization of reliability data for the various applications depending on component types and systems. In addition, this approach makes it possible for users to perform trend analyses and data comparisons for the significant plant components and systems. The validation of the algorithm is performed through a comparison of the importance measure value (Fussell-Vesely) of the mathematical calculation and that of the algorithm application. The development of a reliability database algorithm is one of the best approaches for providing systematic management of plant-specific reliability data with transparency and continuity. This proposed algorithm reinforces the relationships between raw data and application results so that it can provide a comprehensive database that offers everything from basic plant-related data to final customized data.
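    The Fussell-Vesely importance measure used for validation can be computed from minimal cut sets under the rare-event approximation. The cut sets and basic-event probabilities below are illustrative, not taken from the study.

```python
def cut_set_prob(cut_set, probs):
    """Probability that every basic event in one minimal cut set occurs."""
    p = 1.0
    for event in cut_set:
        p *= probs[event]
    return p

def fussell_vesely(component, cut_sets, probs):
    """Fussell-Vesely importance under the rare-event approximation: the
    share of total system risk carried by cut sets containing the component."""
    total = sum(cut_set_prob(cs, probs) for cs in cut_sets)
    part = sum(cut_set_prob(cs, probs) for cs in cut_sets if component in cs)
    return part / total

# Illustrative minimal cut sets and basic-event failure probabilities
probs = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4, "power": 1e-5}
cut_sets = [{"pump_A", "pump_B"}, {"valve"}, {"power"}]
print(round(fussell_vesely("valve", cut_sets, probs), 4))  # single-event cut set dominates
```

    Comparing such a hand calculation against the database algorithm's output is the style of validation the abstract describes.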

  1. Pre-Service Teachers' Perceptions on Tpack Development after Designing Educational Games

    Science.gov (United States)

    Sancar Tokmak, Hatice

    2015-01-01

    This qualitative case study aimed to investigate Early Childhood Education (ECE) pre-service teachers' perception of development in their technological, pedagogical, content knowledge (TPACK) after designing educational computer games for young children. Participants included 21 ECE pre-service teachers enrolled in the course Instructional…

  2. Evolved Expendable Launch Vehicle (EELV)

    Science.gov (United States)

    2015-12-15

    FY13+ Phase I Buy. Contractor: United Launch Services, LLC. Contractor Location: 9501 East Panorama Circle, Centennial, CO 80112.
    FY12 EELV Launch Services (ELS5). Contractor: United Launch Services, LLC. Contractor Location: 9501 East Panorama Circle, Centennial, CO 80112.

  3. Development of an inter-layer solute transport algorithm for SOLTR computer program. Part 1. The algorithm

    International Nuclear Information System (INIS)

    Miller, I.; Roman, K.

    1979-12-01

    In order to perform studies of the influence of regional groundwater flow systems on the long-term performance of potential high-level nuclear waste repositories, it was determined that an adequate computer model would have to consider the full three-dimensional flow system. Golder Associates' SOLTR code, while three-dimensional, has an overly simple algorithm for simulating the passage of radionuclides from one aquifer to another above or below it. Part 1 of this report describes the algorithm developed to provide SOLTR with an improved capability for simulating interaquifer transport.

  4. Evaluation of Two Absolute Radiometric Normalization Algorithms for Pre-processing of Landsat Imagery

    Institute of Scientific and Technical Information of China (English)

    Xu Hanqiu

    2006-01-01

    In order to evaluate radiometric normalization techniques, two image normalization algorithms for absolute radiometric correction of Landsat imagery were quantitatively compared in this paper: the Illumination Correction Model proposed by Markham and Irish, and the Illumination and Atmospheric Correction Model developed by the Remote Sensing and GIS Laboratory of Utah State University. Relative noise, correlation coefficient and slope value, derived from pseudo-invariant features identified in the multitemporal images, were used as the criteria for the evaluation and comparison. The differences between the normalized multitemporal images were significantly reduced when the seasons of the multitemporal images were different; however, there was no significant difference between the normalized and unnormalized images under similar seasonal conditions. Furthermore, the correction results of the two algorithms are similar when the images are relatively clear, with a uniform atmospheric condition. Therefore, the radiometric normalization procedures should be carried out if the multitemporal images have a significant seasonal difference.
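    The slope and correlation criteria can be derived from pseudo-invariant features (PIFs) by a simple linear fit between dates. The digital numbers below are invented for illustration.

```python
import numpy as np

# Evaluation criteria from pseudo-invariant features: regress the
# reference-date values on the subject-date values over the PIFs and read
# off the slope and correlation coefficient. Values are invented.
subject = np.array([20.0, 35.0, 50.0, 64.0, 80.0])     # PIF values, date 1
reference = np.array([31.0, 52.0, 74.0, 95.0, 119.0])  # same PIFs, date 2

slope, intercept = np.polyfit(subject, reference, 1)
r = np.corrcoef(subject, reference)[0, 1]
normalized = slope * subject + intercept               # date 1 mapped to date 2 scale
print(round(float(slope), 3), round(float(r), 4))
```

    Stable PIFs should give a slope near the inter-date gain and a correlation near 1; after a good normalization the residual scatter around the fit is the "relative noise" criterion.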

  5. Mentally-Retarded Children of a Pre-School Age and the Development of Movement Skills

    OpenAIRE

    Morávková, Šárka

    2006-01-01

    The diploma work covers the issues of children with mental retardation at pre-school age, aimed at the development of movement abilities. It focuses on the relationship between the pre-school child with mental retardation and the possibilities of developing the child's motor skills in the context of organized pre-school education. The theoretical part of the diploma work presents the developmental specifics of the individual with mental retardation, describing mainly the movement development of the chil...

  6. Powered Explicit Guidance Modifications and Enhancements for Space Launch System Block-1 and Block-1B Vehicles

    Science.gov (United States)

    Von der Porten, Paul; Ahmad, Naeem; Hawkins, Matt; Fill, Thomas

    2018-01-01

    NASA is currently building the Space Launch System (SLS) Block-1 launch vehicle for the Exploration Mission 1 (EM-1) test flight. NASA is also currently designing the next evolution of SLS, the Block-1B. The Block-1 and Block-1B vehicles will use the Powered Explicit Guidance (PEG) algorithm (of Space Shuttle heritage) for closed loop guidance. To accommodate vehicle capabilities and design for future evolutions of SLS, modifications were made to PEG for Block-1 to handle multi-phase burns, provide PEG with updated propulsion information, and react to a core stage engine out. In addition, due to the relatively low thrust-to-weight ratio of the Exploration Upper Stage (EUS) and EUS carrying out Lunar Vicinity and Earth Escape missions, certain enhancements to the Block-1 PEG algorithm are needed to perform Block-1B missions, to account for long burn arcs and to target translunar and hyperbolic orbits. This paper describes the design and implementation of modifications to the Block-1 PEG algorithm as compared to Space Shuttle. Furthermore, this paper illustrates challenges posed by the Block-1B vehicle and the required PEG enhancements. These improvements make PEG capable for use on the SLS Block-1B vehicle as part of the Guidance, Navigation, and Control (GN&C) System.

  7. Use of Probabilistic Engineering Methods in the Detailed Design and Development Phases of the NASA Ares Launch Vehicle

    Science.gov (United States)

    Fayssal, Safie; Weldon, Danny

    2008-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the international Space Station, to the moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1] we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support the detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I Launch Vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had major impact on the design and manufacturing of the Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvement for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.

  8. Space Launch System for Exploration and Science

    Science.gov (United States)

    Klaus, K.

    2013-12-01

    Introduction: The Space Launch System (SLS) is the most powerful rocket ever built and provides a critical heavy-lift launch capability enabling diverse deep space missions. The exploration class vehicle launches larger payloads farther in our solar system and faster than ever before. The vehicle's 5 m to 10 m fairing allows utilization of existing systems, which reduces development risks, size limitations and cost. SLS lift capacity and superior performance shorten mission travel time. Enhanced capabilities enable a myriad of missions including human exploration, planetary science, astrophysics, heliophysics, planetary defense and commercial space exploration endeavors. Human Exploration: SLS is the first heavy-lift launch vehicle capable of transporting crews beyond low Earth orbit in over four decades. Its design maximizes use of common elements and heritage hardware to provide a low-risk, affordable system that meets Orion mission requirements. SLS provides a safe and sustainable deep space pathway to Mars in support of NASA's human spaceflight mission objectives. The SLS enables the launch of large gateway elements beyond the moon. Leveraging a low-energy transfer that reduces required propellant mass, components are then brought back to a desired cislunar destination. SLS provides a significant mass margin that can be used for additional consumables or secondary payloads. SLS lowers risks for the Asteroid Retrieval Mission by reducing mission time and improving mass margin. SLS lift capacity allows for additional propellant enabling a shorter return or the delivery of a secondary payload, such as a gateway component, to cislunar space. SLS enables human return to the moon. The intermediate SLS capability allows both crew and cargo to fly to translunar orbit at the same time, which will simplify mission design and reduce launch costs. 
Science Missions: A single SLS launch to Mars will enable sample collection at multiple, geographically dispersed locations and a

  9. Pre-Service Physics Teachers' Views on Designing and Developing Physics Digital Stories

    Science.gov (United States)

    Kocakaya, Serhat; Kotluk, Nihat; Karakoyun, Ferit

    2016-01-01

    The aim of this study is to determine the pre-service physics teachers' views on the effect of designing and developing physics digital stories (DST) on improving their 21st century skills. The study is a qualitative research carried out with 13 pre-service physics teachers, who participated in the course of designing and developing DST, during 6…

  10. Assessment of Atmospheric Algorithms to Retrieve Vegetation in Natural Protected Areas Using Multispectral High Resolution Imagery

    Directory of Open Access Journals (Sweden)

    Javier Marcello

    2016-09-01

    The precise mapping of vegetation cover in semi-arid areas is a complex task, as this type of environment consists of sparse vegetation composed mainly of small shrubs. The launch of high resolution satellites, with additional spectral bands and the ability to alter the viewing angle, offers a useful technology for this objective. In this context, atmospheric correction is a fundamental step in the pre-processing of such remote sensing imagery and, consequently, different algorithms have been developed for this purpose over the years. They are commonly categorized as image-based methods or as more advanced physical models based on radiative transfer theory. Despite the relevance of this topic, few comparative studies covering several methods have been carried out using high resolution data or applied specifically to vegetation cover. In this work, the performance of five representative atmospheric correction algorithms (DOS, QUAC, FLAASH, ATCOR and 6S) has been assessed, using high resolution Worldview-2 imagery and field spectroradiometer data collected simultaneously, with the goal of identifying the most appropriate techniques. The study also included a detailed analysis of the influence of parameterization on the final results of the correction, the aerosol model and its optical thickness being important parameters to adjust properly. The effects of the corrections were studied at vegetation and soil sites belonging to different protected semi-arid ecosystems (high mountain and coastal areas). In summary, the superior performance of model-based algorithms, 6S in particular, has been demonstrated, achieving reflectance estimations very close to the in-situ measurements (RMSE of between 2% and 3%). Finally, an example of the importance of atmospheric correction for vegetation estimation in these natural areas is presented, allowing the robust mapping of species and the analysis of multitemporal variations.
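    The simplest of the five methods compared, dark-object subtraction (DOS), can be sketched in a few lines: the darkest pixel in a band is assumed to owe its value to additive atmospheric path radiance. The band values below are made-up digital numbers, not Worldview-2 data.

```python
import numpy as np

# Minimal dark-object subtraction (DOS): assume the darkest pixel in each
# band should be near zero, so its value estimates the additive atmospheric
# (path-radiance) signal, which is then subtracted from the whole band.
# Made-up digital numbers for a single band.
band = np.array([[58.0, 120.0, 200.0],
                 [61.0,  95.0, 140.0],
                 [66.0,  70.0,  80.0]])

dark_object = band.min()        # haze estimate for this band
corrected = band - dark_object  # path-radiance term removed
print(corrected.min(), corrected.max())  # 0.0 142.0
```

    Physical models such as 6S go much further, modeling the aerosol type and optical thickness explicitly, which is why the study finds them more accurate at the cost of heavier parameterization.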

  11. Space Launch System (SLS) Mission Planner's Guide

    Science.gov (United States)

    Smith, David Alan

    2017-01-01

    The purpose of this Space Launch System (SLS) Mission Planner's Guide (MPG) is to provide future payload developers/users with sufficient insight to support preliminary SLS mission planning. Consequently, this SLS MPG is not intended to be a payload requirements document; rather, it organizes and details SLS interfaces/accommodations in a manner similar to that of current Expendable Launch Vehicle (ELV) user guides to support early feasibility assessment. As with ELV programs, once a payload is approved to fly on SLS, its specific requirements will be defined in unique documentation.

  12. NASA Exploration Launch Projects Overview: The Crew Launch Vehicle and the Cargo Launch Vehicle Systems

    Science.gov (United States)

    Snoddy, Jimmy R.; Dumbacher, Daniel L.; Cook, Stephen A.

    2006-01-01

    The U.S. Vision for Space Exploration (January 2004) serves as the foundation for the National Aeronautics and Space Administration's (NASA) strategic goals and objectives. As the NASA Administrator outlined during his confirmation hearing in April 2005, these include: 1) Flying the Space Shuttle as safely as possible until its retirement, not later than 2010. 2) Bringing a new Crew Exploration Vehicle (CEV) into service as soon as possible after Shuttle retirement. 3) Developing a balanced overall program of science, exploration, and aeronautics at NASA, consistent with the redirection of the human space flight program to focus on exploration. 4) Completing the International Space Station (ISS) in a manner consistent with international partner commitments and the needs of human exploration. 5) Encouraging the pursuit of appropriate partnerships with the emerging commercial space sector. 6) Establishing a lunar return program having the maximum possible utility for later missions to Mars and other destinations. In spring 2005, the Agency commissioned a team of aerospace subject matter experts to perform the Exploration Systems Architecture Study (ESAS). The ESAS team performed in-depth evaluations of a number of space transportation architectures and provided recommendations based on their findings. The ESAS analysis focused on a human-rated Crew Launch Vehicle (CLV) for astronaut transport and a heavy lift Cargo Launch Vehicle (CaLV) to carry equipment, materials, and supplies for lunar missions and, later, the first human journeys to Mars. After several months of intense study utilizing safety and reliability, technical performance, budget, and schedule figures of merit in relation to design reference missions, the ESAS design options were unveiled in summer 2005. As part of NASA's systems engineering approach, these point of departure architectures have been refined through trade studies during the ongoing design phase leading to the development phase that

  13. Experimental Results and Issues on Equalization for Nonlinear Memory Channel: Pre-Cursor Enhanced Ram-DFE Canceler

    Science.gov (United States)

    Yuan, Lu; LeBlanc, James

    1998-01-01

    This thesis investigates the effects of the High Power Amplifier (HPA) and the filters over a satellite or telemetry channel. The Volterra series expression is presented for the nonlinear channel with memory, and the algorithm is based on the finite-state machine model. A RAM-based algorithm operating on the receiver side, the Pre-cursor Enhanced RAM-DFE Canceler (PERC), is developed. A high-order modulation scheme, 16-QAM, is used for simulation; the results show that PERC provides an efficient and reliable method to transmit data on the bandlimited nonlinear channel. The contribution of the PERC algorithm is that it includes both pre-cursors and post-cursors as the RAM address lines, and it suggests a new way to make decisions on the pre-addresses. Compared with the RAM-DFE structure that only includes post-addresses, the BER versus Eb/N0 performance of PERC is substantially enhanced. Experiments are performed for PERC algorithms with different parameters on AWGN channels, and the results are compared and analyzed. The investigation of this thesis includes software simulation and hardware verification. Hardware is set up to collect actual TWT data. Simulations on both the software-generated data and the real-world data are performed. Practical limitations are considered for the hardware-collected data. Simulation results verified the reliability of the PERC algorithm. This work was conducted at NMSU in the Center for Space Telemetering and Telecommunications Systems in the Klipsch School of Electrical and Computer Engineering.
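    The core idea of a RAM-based canceler can be sketched briefly: the RAM is addressed by the surrounding symbol decisions (both pre-cursor and post-cursor, as PERC proposes) and stores a running estimate of the nonlinear ISI seen at that address, which is then subtracted before the symbol decision. The channel values and table layout below are illustrative, not taken from the thesis:

    ```python
    # Hedged sketch of a RAM-based canceler in the spirit of PERC.
    from collections import defaultdict

    class RamCanceler:
        def __init__(self):
            self.ram = defaultdict(float)   # address -> ISI estimate
            self.count = defaultdict(int)

        def address(self, pre, post):
            # RAM address lines: tentative future (pre-cursor) and past
            # (post-cursor) symbol decisions, as in the PERC structure
            return (tuple(pre), tuple(post))

        def train(self, pre, post, received, ideal):
            a = self.address(pre, post)
            self.count[a] += 1
            # running average of the distortion observed at this address
            self.ram[a] += (received - ideal - self.ram[a]) / self.count[a]

        def equalize(self, pre, post, received):
            return received - self.ram[self.address(pre, post)]

    eq = RamCanceler()
    # toy nonlinear channel: the ISI depends on the neighboring symbols
    eq.train(pre=[1], post=[-1], received=1.30, ideal=1.0)
    eq.train(pre=[1], post=[-1], received=1.34, ideal=1.0)
    print(round(eq.equalize([1], [-1], 1.32), 2))   # → 1.0
    ```

    A post-cursor-only RAM-DFE uses the same table but omits the `pre` half of the address, which is exactly the comparison made in the abstract.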

  14. Epitrochoid Power-law Nozzle Concept for Reducing Launch Architecture Propulsion Costs

    Science.gov (United States)

    2010-11-16

    Merlin 1C vacuum engine; c. Energia booster RD-170; Zenit RD-171; Atlas V RD-180; Angara RD-191. 4. Develop a new propulsion system to incorporate...the four liquid boosters of the Energia launch vehicle designed to launch the Soviet Buran space shuttle. In parallel with the Buran development, a

  15. Unsupervised classification of neocortical activity patterns in neonatal and pre-juvenile rodents

    Directory of Open Access Journals (Sweden)

    Nicole Cichon

    2014-05-01

    Full Text Available Flexible communication within the brain, which relies on oscillatory activity, is not confined to adult neuronal networks. Experimental evidence has documented the presence of discontinuous patterns of oscillatory activity already during early development. Their highly variable spatial and time-frequency organization has been related to region specificity. However, it might equally be due to the absence of unitary criteria for classifying the early activity patterns, since they have mainly been characterized by visual inspection. Therefore, robust and unbiased methods for categorizing these discontinuous oscillations are needed for increasingly complex data sets from different labs. Here, we introduce an unsupervised detection and classification algorithm for the discontinuous activity patterns of rodents during early development. For this, time windows with discontinuous oscillations vs. epochs of network silence were first identified. In a second step, the major features of the detected events were identified and processed by principal component analysis to decide on their contribution to the classification of different oscillatory patterns. Finally, these patterns were categorized using an unsupervised cluster algorithm. The results were validated on manually characterized neonatal spindle bursts, which ubiquitously entrain neocortical areas of rats and mice, and on prelimbic nested gamma spindle bursts. Moreover, the algorithm led to satisfactory results for oscillatory events that, due to the increased similarity of their features, were more difficult to classify, e.g. during the pre-juvenile developmental period. Based on a linear classification, the optimal number of features to consider increased with the difficulty of detection. This algorithm allows the comparison of neonatal and pre-juvenile oscillatory patterns in their spatial and temporal organization. It might represent a first step for the unbiased elucidation of activity patterns
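    The two-step pipeline described above (feature reduction by PCA, then unsupervised clustering) can be sketched on toy event features. The features, cluster count, and k-means choice are placeholders for illustration; the paper's actual feature set and cluster algorithm are not reproduced here:

    ```python
    # Sketch: project detected-event features with PCA, then group them with a
    # minimal k-means step. Event rows are invented examples.
    import numpy as np

    def pca_project(features, n_components=2):
        x = features - features.mean(axis=0)
        # principal axes from the SVD of the centered feature matrix
        _, _, vt = np.linalg.svd(x, full_matrices=False)
        return x @ vt[:n_components].T

    def kmeans(points, centers, iters=10):
        for _ in range(iters):
            labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
            for k in range(len(centers)):
                if np.any(labels == k):
                    centers[k] = points[labels == k].mean(axis=0)
        return labels

    # toy events: [dominant frequency (Hz), duration (s), amplitude]
    events = np.array([[8, 1.0, 30], [9, 1.2, 28],     # spindle-burst-like
                       [40, 0.3, 5], [42, 0.4, 6]])    # gamma-like
    proj = pca_project(events)
    labels = kmeans(proj, centers=proj[[0, 2]].copy())
    print(labels)   # the two slow events and the two fast events separate
    ```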

  16. U.S. advanced launch vehicle technology programs : Quarterly Launch Report : special report

    Science.gov (United States)

    1996-01-01

    U.S. firms and U.S. government agencies are jointly investing in advanced launch vehicle technology. This Special Report summarizes U.S. launch vehicle technology programs and highlights the changing roles of government and industry players in pick...

  17. Statistical methods for launch vehicle guidance, navigation, and control (GN&C) system design and analysis

    Science.gov (United States)

    Rose, Michael Benjamin

    A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical
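    The central comparison in this record, propagating the same linearized dynamics once with linear covariance analysis versus many times with Monte Carlo, can be illustrated on a scalar system. The dynamics and noise levels below are invented, but the structure (one deterministic covariance propagation against thousands of sampled trajectories) mirrors the study's methodology:

    ```python
    # Sketch: linear covariance vs. Monte Carlo dispersion for x' = A*x + noise.
    import random

    A = 0.9        # scalar state transition (stand-in for linearized dynamics)
    Q = 0.04       # process noise variance
    P0 = 1.0       # initial state variance
    STEPS = 20

    # Linear covariance: one deterministic propagation of the variance
    p = P0
    for _ in range(STEPS):
        p = A * p * A + Q

    # Monte Carlo: many noisy trajectories, sample variance at the end
    random.seed(0)
    finals = []
    for _ in range(20000):
        x = random.gauss(0.0, P0 ** 0.5)
        for _ in range(STEPS):
            x = A * x + random.gauss(0.0, Q ** 0.5)
        finals.append(x)
    mean = sum(finals) / len(finals)
    mc_var = sum((f - mean) ** 2 for f in finals) / (len(finals) - 1)

    print(abs(mc_var - p) / p < 0.10)   # the two dispersions agree closely
    ```

    The speed advantage reported in the abstract comes from exactly this asymmetry: the covariance recursion runs once, while the Monte Carlo loop repeats the full simulation thousands of times.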

  18. Launch vehicle design and GNC sizing with ASTOS

    Science.gov (United States)

    Cremaschi, Francesco; Winter, Sebastian; Rossi, Valerio; Wiegand, Andreas

    2018-03-01

    The European Space Agency (ESA) is currently involved in several activities related to launch vehicle designs (Future Launcher Preparatory Program, Ariane 6, VEGA evolutions, etc.). Within these activities, ESA has identified the importance of developing a simulation infrastructure capable of supporting the multi-disciplinary design and preliminary guidance navigation and control (GNC) design of different launch vehicle configurations. Astos Solutions has developed the multi-disciplinary optimization and launcher GNC simulation and sizing tool (LGSST) under ESA contract. The functionality is integrated in the Analysis, Simulation and Trajectory Optimization Software for space applications (ASTOS) and is intended to be used from the early design phases up to phase B1 activities. ASTOS shall enable the user to perform detailed vehicle design tasks and assessment of GNC systems, covering all aspects of rapid configuration and scenario management, sizing of stages, trajectory-dependent estimation of structural masses, rigid and flexible body dynamics, navigation, guidance and control, worst case analysis, launch safety analysis, performance analysis, and reporting.

  19. Algorithms for optimal dyadic decision trees

    Energy Technology Data Exchange (ETDEWEB)

    Hush, Don [Los Alamos National Laboratory; Porter, Reid [Los Alamos National Laboratory

    2009-01-01

    A new algorithm for constructing optimal dyadic decision trees was recently introduced, analyzed, and shown to be very effective for low dimensional data sets. This paper enhances and extends this algorithm by: introducing an adaptive grid search for the regularization parameter that guarantees optimal solutions for all relevant tree sizes, revising the core tree-building algorithm so that its run time is substantially smaller for most regularization parameter values on the grid, and incorporating new data structures and data pre-processing steps that provide significant run time enhancement in practice.
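    The role of the regularization grid can be sketched as minimizing a penalized risk, error plus a complexity penalty, over candidate trees for each grid value. The candidate trees and error values below are hypothetical; a real adaptive grid derives its breakpoints from the trees themselves so that no optimal tree size is skipped:

    ```python
    # Sketch: sweeping a regularization grid traces out the whole
    # error/complexity trade-off among candidate trees.

    # (size in leaves, empirical error) for a nested family of candidate trees
    candidates = [(1, 0.40), (2, 0.25), (4, 0.15), (8, 0.12)]

    def best_tree(lam):
        """Minimize penalized risk: error + lam * size."""
        return min(candidates, key=lambda t: t[1] + lam * t[0])

    # each lambda on the grid selects one optimal tree size
    selected = {best_tree(lam)[0] for lam in [0.0, 0.01, 0.05, 0.2]}
    print(sorted(selected))   # → [1, 2, 4, 8]
    ```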

  20. Launching Garbage-Bag Balloons.

    Science.gov (United States)

    Kim, Hy

    1997-01-01

    Presents a modification of a procedure for making and launching hot air balloons made out of garbage bags. Student instructions for balloon construction, launching instructions, and scale diagrams are included. (DDR)

  1. Tsunami early warning in the Mediterranean: role, structure and tricks of pre-computed tsunami simulation databases and matching/forecasting algorithms

    Science.gov (United States)

    Armigliato, Alberto; Pagnoni, Gianluca; Tinti, Stefano

    2014-05-01

    The general idea that pre-computed simulated scenario databases can play a key role in conceiving tsunami early warning systems is commonly accepted by now. But it was only in the last decade that it started to be applied to the Mediterranean region, gaining particular impetus from initiatives like the GDACS and from recently concluded EU-funded projects such as TRIDEC and NearToWarn. With reference to these two projects, and with the possibility of further developing this research line in the frame of the FP7 ASTARTE project, we discuss some results we obtained regarding two major topics, namely the strategies applicable to building the tsunami scenario database and the design and performance assessment of a timely and "reliable" elementary-scenario combination algorithm to be run in real time. As for the first theme, we take advantage of the experience gained in the test areas of Western Iberia, Rhodes (Greece) and Cyprus to illustrate the criteria with which a "Matching Scenario Database" (MSDB) can be built. These involve 1) the choice of the main tectonic tsunamigenic sources (or areas), 2) their tessellation with matrices of elementary faults whose dimensions depend heavily on the particular area studied and must be a compromise between the need to represent the tsunamigenic area in sufficient detail and the need to limit the number of scenarios to be simulated, 3) the computation of the scenarios themselves, 4) the choice of the relevant simulation outputs and the standardisation of their formats. Regarding the matching/forecast algorithm, we want it to select and combine the MSDB elements based on the initial earthquake magnitude and location estimate, and to produce a forecast of (at least) the tsunami arrival time, amplitude and period at the closest tide-level sensors and in all needed forecast points. We discuss the performance of the algorithm in terms of the time needed to produce the forecast after the earthquake is detected. In particular, we analyse the
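    The matching step that makes such databases fast can be sketched as a linear superposition: pre-computed unit-slip responses of the elementary faults are combined with weights derived from the earthquake estimate. The unit amplitudes, weights, and fault names below are invented placeholders, not values from the MSDB described in the record:

    ```python
    # Hedged sketch of elementary-scenario combination at one forecast point.

    # peak amplitudes (m) at one forecast point for three unit-slip elementary faults,
    # taken from the pre-computed database (hypothetical values)
    unit_peaks = {"fault_A": 0.10, "fault_B": 0.25, "fault_C": 0.05}

    def combine(weights):
        """Superpose elementary scenarios: sum of slip-weighted unit responses."""
        return sum(weights[f] * unit_peaks[f] for f in weights)

    # hypothetical slip distribution inferred from the magnitude/location estimate
    weights = {"fault_A": 2.0, "fault_B": 4.0, "fault_C": 0.0}
    print(round(combine(weights), 2))   # → 1.2 (forecast peak amplitude, m)
    ```

    Because the expensive simulations were done offline, the real-time cost of the forecast reduces to this lookup-and-sum, which is what makes the timing requirement achievable.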

  2. Much Lower Launch Costs Make Resupply Cheaper than Recycling for Space Life Support

    Science.gov (United States)

    Jones, Harry W.

    2017-01-01

    The development of commercial launch vehicles by SpaceX has greatly reduced the cost of launching mass to Low Earth Orbit (LEO). Reusable launch vehicles may further reduce the launch cost per kilogram. The new low launch cost makes open loop life support much cheaper than before. Open loop systems resupply water and oxygen in tanks for crew use and provide disposable lithium hydroxide (LiOH) in canisters to remove carbon dioxide. Short human space missions such as Apollo and shuttle have used open loop life support, but the long duration International Space Station (ISS) recycles water and oxygen and removes carbon dioxide with a regenerative molecular sieve. These ISS regenerative and recycling life support systems have significantly reduced the total launch mass needed for life support. But, since the development cost of recycling systems is much higher than the cost of tanks and canisters, the relative cost savings have been much less than the launch mass savings. The Life Cycle Cost (LCC) includes development, launch, and operations. If another space station was built in LEO, resupply life support would be much cheaper than the current recycling systems. The mission most favorable to recycling would be a long term lunar base, since the resupply mass would be large, the proximity to Earth would reduce the need for recycling reliability and spares, and the launch cost would be much higher than for LEO due to the need for lunar transit and descent propulsion systems. For a ten-year lunar base, the new low launch costs make resupply cheaper than recycling systems similar to ISS life support.
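    The Life Cycle Cost comparison argued above can be made concrete with a break-even sketch. All dollar figures and masses below are hypothetical placeholders chosen only to show the structure of the trade; they are not the paper's numbers:

    ```python
    # Illustrative LCC trade: resupply (no development cost, large recurring
    # launch mass) vs. recycling (large development cost, small recurring mass).

    def lcc_resupply(launch_cost_per_kg, resupply_kg_per_year, years):
        return launch_cost_per_kg * resupply_kg_per_year * years

    def lcc_recycling(development_cost, hardware_kg, spares_kg_per_year,
                      launch_cost_per_kg, years):
        launched = hardware_kg + spares_kg_per_year * years
        return development_cost + launch_cost_per_kg * launched

    low_cost = 3_000    # $/kg to LEO, hypothetical new commercial pricing
    years = 10
    resupply = lcc_resupply(low_cost, resupply_kg_per_year=2_000, years=years)
    recycle = lcc_recycling(development_cost=200_000_000, hardware_kg=1_500,
                            spares_kg_per_year=300, launch_cost_per_kg=low_cost,
                            years=years)
    print(resupply < recycle)   # at low launch cost, resupply wins
    ```

    Raising `low_cost` by an order of magnitude (as for a lunar base with transit and descent propulsion) flips the inequality, which is the abstract's point about recycling remaining favorable there.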

  3. Texas Medication Algorithm Project: development and feasibility testing of a treatment algorithm for patients with bipolar disorder.

    Science.gov (United States)

    Suppes, T; Swann, A C; Dennehy, E B; Habermacher, E D; Mason, M; Crismon, M L; Toprac, M G; Rush, A J; Shon, S P; Altshuler, K Z

    2001-06-01

    Use of treatment guidelines for treatment of major psychiatric illnesses has increased in recent years. The Texas Medication Algorithm Project (TMAP) was developed to study the feasibility and process of developing and implementing guidelines for bipolar disorder, major depressive disorder, and schizophrenia in the public mental health system of Texas. This article describes the consensus process used to develop the first set of TMAP algorithms for the Bipolar Disorder Module (Phase 1) and the trial testing the feasibility of their implementation in inpatient and outpatient psychiatric settings across Texas (Phase 2). The feasibility trial answered core questions regarding implementation of treatment guidelines for bipolar disorder. A total of 69 patients were treated with the original algorithms for bipolar disorder developed in Phase 1 of TMAP. Results support that physicians accepted the guidelines, followed recommendations to see patients at certain intervals, and utilized sequenced treatment steps differentially over the course of treatment. While improvements in clinical symptoms (24-item Brief Psychiatric Rating Scale) were observed over the course of enrollment in the trial, these conclusions are limited by the fact that physician volunteers were utilized for both treatment and ratings, and there was no control group. Results from Phases 1 and 2 indicate that it is possible to develop and implement a treatment guideline for patients with a history of mania in public mental health clinics in Texas. TMAP Phase 3, a recently completed larger and controlled trial assessing the clinical and economic impact of treatment guidelines and patient and family education in the public mental health system of Texas, improves upon this methodology.

  4. Crowdsourcing seizure detection: algorithm development and validation on human implanted device recordings.

    Science.gov (United States)

    Baldassano, Steven N; Brinkmann, Benjamin H; Ung, Hoameng; Blevins, Tyler; Conrad, Erin C; Leyde, Kent; Cook, Mark J; Khambhati, Ankit N; Wagenaar, Joost B; Worrell, Gregory A; Litt, Brian

    2017-06-01

    There exist significant clinical and basic research needs for accurate, automated seizure detection algorithms. These algorithms have translational potential in responsive neurostimulation devices and in automatic parsing of continuous intracranial electroencephalography data. An important barrier to developing accurate, validated algorithms for seizure detection is limited access to high-quality, expertly annotated seizure data from prolonged recordings. To overcome this, we hosted a kaggle.com competition to crowdsource the development of seizure detection algorithms using intracranial electroencephalography from canines and humans with epilepsy. The top three performing algorithms from the contest were then validated on out-of-sample patient data including standard clinical data and continuous ambulatory human data obtained over several years using the implantable NeuroVista seizure advisory system. Two hundred teams of data scientists from all over the world participated in the kaggle.com competition. The top performing teams submitted highly accurate algorithms with consistent performance in the out-of-sample validation study. The performance of these seizure detection algorithms, achieved using freely available code and data, sets a new reproducible benchmark for personalized seizure detection. We have also shared a 'plug and play' pipeline to allow other researchers to easily use these algorithms on their own datasets. The success of this competition demonstrates how sharing code and high quality data results in the creation of powerful translational tools with significant potential to impact patient care. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
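    A minimal example of the kind of channel-feature detector the competition crowdsourced is a sliding-window "line length" threshold, a classic iEEG feature. The signal, window size, and threshold below are toy values; the winning algorithms were far more sophisticated:

    ```python
    # Sketch of a threshold detector on the line-length feature.

    def line_length(window):
        """Sum of absolute sample-to-sample differences within a window."""
        return sum(abs(b - a) for a, b in zip(window, window[1:]))

    def detect(signal, win=4, threshold=5.0):
        """Flag each window whose line length exceeds the threshold."""
        return [line_length(signal[i:i + win]) > threshold
                for i in range(len(signal) - win + 1)]

    quiet = [0.1, 0.0, 0.2, 0.1, 0.0, 0.1]
    burst = [0.0, 3.0, -3.0, 3.0, -2.0, 2.0]     # high-amplitude oscillation
    print(any(detect(quiet)), any(detect(burst)))   # → False True
    ```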

  5. ASTP (SA-210) Launch vehicle operational flight trajectory. Part 3: Final documentation

    Science.gov (United States)

    Carter, A. B.; Klug, G. W.; Williams, N. W.

    1975-01-01

    Trajectory data are presented for a nominal and two launch window trajectory simulations. These trajectories are designed to insert a manned Apollo spacecraft into a 150/167 km. (81/90 n. mi.) earth orbit inclined at 51.78 degrees for rendezvous with a Soyuz spacecraft, which will be orbiting at approximately 225 km. (121.5 n. mi.). The launch window allocation defined for this launch is 500 pounds of S-IVB stage propellant. The launch window opening trajectory simulation depicts the earliest launch time deviation from a planar flight launch which conforms to this constraint. The launch window closing trajectory simulation was developed for the more stringent Air Force Eastern Test Range (AFETR) flight azimuth restriction of 37.4 degrees east-of-north. These trajectories enclose a 12.09 minute launch window, pertinent features of which are provided in a tabulation. Planar flight data are included for mid-window reference.

  6. Structural Weight Estimation for Launch Vehicles

    Science.gov (United States)

    Cerro, Jeff; Martinovic, Zoran; Su, Philip; Eldred, Lloyd

    2002-01-01

    This paper describes some of the work in progress to develop automated structural weight estimation procedures within the Vehicle Analysis Branch (VAB) of the NASA Langley Research Center. One task of the VAB is to perform system studies at the conceptual and early preliminary design stages on launch vehicles and in-space transportation systems. Some examples of these studies for Earth to Orbit (ETO) systems are the Future Space Transportation System [1], Orbit On Demand Vehicle [2], Venture Star [3], and the Personnel Rescue Vehicle [4]. Structural weight calculation for launch vehicle studies can exist on several levels of fidelity. Typically, historically based weight equations are used in a vehicle sizing program. Many of the studies in the Vehicle Analysis Branch have been enhanced in terms of structural weight fraction prediction by utilizing some level of off-line structural analysis to incorporate material property, load intensity, and configuration effects which may not be captured by the historical weight equations. Modification of Mass Estimating Relationships (MERs) to assess design and technology impacts on vehicle performance is necessary to prioritize design and technology development decisions. Modern CAD/CAE software, ever-increasing computational power and platform-independent computer programming languages such as JAVA provide new means to create greater-depth analysis tools which can be included in the conceptual design phase of launch vehicle development. Commercial framework computing environments provide easy-to-program techniques which coordinate and implement the flow of data in a distributed heterogeneous computing environment. It is the intent of this paper to present a process in development at NASA LaRC for enhanced structural weight estimation using this state-of-the-art computational power.
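    A historically based MER of the kind the sizing programs use can be sketched as a power law with a technology factor for the material and configuration effects the abstract mentions. The coefficients below are invented for illustration, not NASA LaRC values:

    ```python
    # Sketch of a Mass Estimating Relationship (MER) for a structural panel.

    def panel_mass(area_m2, load_intensity, c=1.3, a=0.85, b=0.4, tech_factor=1.0):
        """MER: mass = tech_factor * c * area^a * load^b (hypothetical coefficients)."""
        return tech_factor * c * area_m2 ** a * load_intensity ** b

    baseline = panel_mass(50.0, 200.0)
    composite = panel_mass(50.0, 200.0, tech_factor=0.8)   # lighter material assumed
    print(composite < baseline)   # technology factor reduces estimated mass
    ```

    The off-line structural analysis described above effectively recalibrates `tech_factor` (and sometimes the exponents) so the historical equation reflects a specific material and load path.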

  7. Mars Science Laboratory Launch-Arrival Space Study: A Pork Chop Plot Analysis

    Science.gov (United States)

    Cianciolo, Alicia Dwyer; Powell, Richard; Lockwood, Mary Kae

    2006-01-01

    Launch-Arrival, or "pork chop", plot analysis can provide mission designers with valuable information and insight into a specific launch and arrival space selected for a mission. The study begins with the array of entry states for each pair of selected Earth launch and Mars arrival dates, and nominal entry, descent and landing trajectories are simulated for each pair. Parameters of interest, such as maximum heat rate, are plotted in launch-arrival space. The plots help to quickly identify launch and arrival regions that are not feasible under current constraints or technology and also provide information as to what technologies may need to be developed to reach a desired region. This paper provides a discussion of the development, application, and results of a pork chop plot analysis to the Mars Science Laboratory mission. This technique is easily applicable to other missions at Mars and other destinations.

  8. Applying Advances in GPM Radiometer Intercalibration and Algorithm Development to a Long-Term TRMM/GPM Global Precipitation Dataset

    Science.gov (United States)

    Berg, W. K.

    2016-12-01

    The Global Precipitation Measurement (GPM) mission Core Observatory, which was launched in February of 2014, provides a number of advances for satellite monitoring of precipitation including a dual-frequency radar, high frequency channels on the GPM Microwave Imager (GMI), and coverage over middle and high latitudes. The GPM concept, however, is about producing unified precipitation retrievals from a constellation of microwave radiometers to provide approximately 3-hourly global sampling. This involves intercalibration of the input brightness temperatures from the constellation radiometers, development of an a priori precipitation database using observations from the state-of-the-art GPM radiometer and radars, and accounting for sensor differences in the retrieval algorithm in a physically-consistent way. Efforts by the GPM inter-satellite calibration working group, or XCAL team, and the radiometer algorithm team to create unified precipitation retrievals from the GPM radiometer constellation were fully implemented into the current version 4 GPM precipitation products. These include precipitation estimates from a total of seven conical-scanning and six cross-track scanning radiometers as well as high spatial and temporal resolution global level 3 gridded products. Work is now underway to extend this unified constellation-based approach to the combined TRMM/GPM data record starting in late 1997. The goal is to create a long-term global precipitation dataset employing these state-of-the-art calibration and retrieval algorithm approaches. This new long-term global precipitation dataset will incorporate the physics provided by the combined GPM GMI and DPR sensors into the a priori database, extend prior TRMM constellation observations to high latitudes, and expand the available TRMM precipitation data to the full constellation of available conical and cross-track scanning radiometers. This combined TRMM/GPM precipitation data record will thus provide a high-quality high
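    The intercalibration step can be sketched as fitting collocated brightness temperatures from a constellation radiometer against the reference sensor and applying the resulting linear correction. The Tb values below are synthetic and the single linear fit is a simplification of the XCAL team's channel-by-channel procedures:

    ```python
    # Sketch: linear intercalibration of a partner radiometer against a reference.

    def fit_linear(x, y):
        """Ordinary least squares for y ≈ slope * x + offset."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
                sum((a - mx) ** 2 for a in x)
        return slope, my - slope * mx

    # collocated scenes: partner radiometer reads warm with a small gain error
    reference = [180.0, 220.0, 260.0, 300.0]   # reference Tb (K), synthetic
    partner = [182.0, 222.4, 262.8, 303.2]     # partner Tb (K), synthetic

    slope, offset = fit_linear(partner, reference)
    calibrated = [slope * t + offset for t in partner]
    print([round(t, 1) for t in calibrated])   # → [180.0, 220.0, 260.0, 300.0]
    ```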

  9. Iraq Radiosonde Launch Records

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Iraqi upper air records loaned to NCDC from the Air Force 14th Weather Squadron. Scanned notebooks containing upper air radiosonde launch records and data. Launches...

  10. Performance Analysis of ZigBee Wireless Networks for AAL through Hybrid Ray Launching and Collaborative Filtering

    Directory of Open Access Journals (Sweden)

    Peio Lopez-Iturri

    2016-01-01

    Full Text Available This paper presents a novel hybrid simulation method based on the combination of an in-house developed 3D ray launching algorithm and a collaborative filtering (CF) technique, which will be used to analyze the performance of ZigBee-based wireless sensor networks (WSNs) to enable ambient assisted living (AAL). The combination of Low Definition results obtained by means of a deterministic ray launching method and the application of a CF technique leads to a drastic reduction of the time and computational cost required to obtain accurate simulation results. The paper also reports that this kind of complex AAL indoor scenario with multiple wireless devices needs a thorough and personalized radio planning analysis, as radio propagation has a strong dependence on the network topology and the specific morphology of the scenario. The wireless channel analysis performed by our hybrid method provides valuable insight into network design phases of complex wireless systems, typical in AAL-oriented environments. Thus, it results in optimizing network deployment, reducing overall interference levels, and increasing the overall system performance in terms of cost reduction, transmission rates, and energy efficiency.
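    The hybrid idea can be sketched as follows: a Low Definition ray-launching pass leaves some grid cells unsimulated, and a collaborative-filtering-style step infers them from similar observed cells. The neighborhood-mean rule and power values below are illustrative placeholders, not the paper's 3D ray launching output or its actual CF model:

    ```python
    # Sketch: fill cells skipped by a coarse simulation pass from known neighbors.

    def fill_missing(grid):
        """Estimate each missing cell (None) from the mean of its known neighbors."""
        rows, cols = len(grid), len(grid[0])
        out = [row[:] for row in grid]
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] is None:
                    neighbors = [grid[nr][nc]
                                 for nr, nc in [(r-1, c), (r+1, c), (r, c-1), (r, c+1)]
                                 if 0 <= nr < rows and 0 <= nc < cols
                                 and grid[nr][nc] is not None]
                    out[r][c] = sum(neighbors) / len(neighbors)
        return out

    # received power (dBm) on a coarse grid; None = cell skipped by the LD pass
    coarse = [[-40.0, -45.0, -50.0],
              [-45.0, None, -55.0],
              [-50.0, -55.0, -60.0]]
    print(fill_missing(coarse)[1][1])   # → -50.0
    ```

    The computational saving comes from simulating only a fraction of the cells deterministically and inferring the rest, which is the source of the drastic cost reduction the abstract reports.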

  11. Study of the pitting effects during the pre-ignition plasma–propellant interaction process

    International Nuclear Information System (INIS)

    Hang, Yuhua; Li, Xingwen; Wu, Jian; Jia, Shenli; Zhao, Weiyu; Murphy, Anthony B

    2016-01-01

    The propellant ignition mechanism has become a central issue in the electrothermal chemical (ETC) launch technology, and the pre-ignition plasma–propellant interactions are critical in determining the ignition characteristics. In this work, both an open-air ablation test and an interrupted burning test are conducted for three different propellants. A fused silica window, which is transparent in all relevant wavelengths, is utilized to investigate the role of the plasma radiation. Surface pitting of the propellants after interaction with the plasma is analyzed using a scanning electron microscope (SEM). The effect of pits on the plasma ignition is then studied and a possible formation mechanism of pits is proposed. The input heat flux and the surface temperature of the propellants are obtained by solving a pre-ignition plasma–propellant interaction model. The results shed light on the pre-ignition plasma ignition mechanisms and will assist in the development of propellants for an ETC launcher. (paper)

  12. Modeling Powered Aerodynamics for the Orion Launch Abort Vehicle Aerodynamic Database

    Science.gov (United States)

    Chan, David T.; Walker, Eric L.; Robinson, Philip E.; Wilson, Thomas M.

    2011-01-01

    Modeling the aerodynamics of the Orion Launch Abort Vehicle (LAV) has presented many technical challenges to the developers of the Orion aerodynamic database. During a launch abort event, the aerodynamic environment around the LAV is very complex as multiple solid rocket plumes interact with each other and the vehicle. It is further complicated by vehicle separation events such as between the LAV and the launch vehicle stack or between the launch abort tower and the crew module. The aerodynamic database for the LAV was developed mainly from wind tunnel tests involving powered jet simulations of the rocket exhaust plumes, supported by computational fluid dynamic simulations. However, limitations in both methods have made it difficult to properly capture the aerodynamics of the LAV in experimental and numerical simulations. These limitations have also influenced decisions regarding the modeling and structure of the aerodynamic database for the LAV and led to compromises and creative solutions. Two database modeling approaches are presented in this paper (incremental aerodynamics and total aerodynamics), with examples showing strengths and weaknesses of each approach. In addition, the unique problems presented to the database developers by the large data space required for modeling a launch abort event illustrate the complexities of working with multi-dimensional data.

  13. Launch Services, a Proven Model

    Science.gov (United States)

    Trafton, W. C.; Simpson, J.

    2002-01-01

    - Ukrainian, Russian, American and Norwegian; Delta - U.S., Swedish and Japanese; Arianespace - European; RSC H2A - Japanese and U.S. This approach will continue, because the cost of new engine development, to name one factor, versus acquiring other new technology will continue to be evaluated from a business perspective. The commercial market will remain flat for the near and mid term unless broadband or some other "killer application" emerges. A fragmented market with multiple launch services players will serve customers for the near term. Some degree of consolidation or elimination of existing launch services alternatives is expected. We are already seeing some consolidation - Boeing Launch Services (BLS) marketing Sea Launch and Delta; International Launch Services (ILS) marketing Atlas and Proton; Arianespace/Starsem marketing Ariane and Soyuz. So what will be the key to space transportation success in the future? Focusing on the "whole product offering": providing a product that delivers not only the generic and expected services but also augmented services that differentiate it and raise its value. At the Boeing Company, we are continually evaluating the augmented product, focusing on high problem-solving value to provide a substantial, not incremental, improvement. Our focus is not just on our customer but also on our customer's customer, and on how we can effect a positive change in their current business plan. We evaluate the areas of space segment risk, price and finance, and performance. Through these three areas we are continuing to improve our product, become more integrated with the customer, and participate in ensuring the successful implementation of their business plans.
Our augmented offerings include risk management, financial performance, and performance assurance. We continue to build upon and extend these features to move beyond an augmented product and to prepare ourselves to offer "potential products" that recognize changes in the

  14. CubeSat Launch Initiative Overview and CubeSat 101

    Science.gov (United States)

    Higginbotham, Scott

    2017-01-01

    The National Aeronautics and Space Administration (NASA) recognizes the tremendous potential that CubeSats (very small satellites) have to inexpensively demonstrate advanced technologies, collect scientific data, and enhance student engagement in Science, Technology, Engineering, and Mathematics (STEM). The CubeSat Launch Initiative (CSLI) was created to provide launch opportunities for CubeSats developed by academic institutions, non-profit entities, and NASA centers. This presentation will provide an overview of the CSLI, its benefits, and its results. This presentation will also provide high level CubeSat 101 information for prospective CubeSat developers, describing the development process from concept through mission operations while highlighting key points that developers need to be mindful of.

  15. Using Facebook as an E-Portfolio in Enhancing Pre-Service Teachers' Professional Development

    Science.gov (United States)

    Kabilan, Muhammad Kamarul

    2016-01-01

    This study aims to determine if "Facebook," when used as an online teacher portfolio (OTP), could contribute meaningfully to pre-service teachers' professional development (PD) and in what ways the OTP can be meaningful. Pre-service teachers (n = 91) were asked to develop OTP using "Facebook" and engage in learning and…

  16. Development of a MELCOR self-initialization algorithm for boiling water reactors

    International Nuclear Information System (INIS)

    Chien, C.S.; Wang, S.J.; Cheng, S.K.

    1996-01-01

    The MELCOR code, developed by Sandia National Laboratories, is suitable for calculating source terms and simulating severe accident phenomena in nuclear power plants. Prior to simulating a severe accident transient with MELCOR, the initial steady-state conditions must be generated. The current MELCOR users' manuals do not provide a self-initialization procedure, so users have to adjust the initial conditions themselves through a trial-and-error approach. A MELCOR self-initialization algorithm for boiling water reactor plants has been developed, which eliminates the tedious trial-and-error procedure and improves simulation accuracy. The algorithm automatically adjusts important plant variables, such as the dome pressure, downcomer level, and core flow rate, to the desired conditions. It is implemented through input using the control functions provided in MELCOR. The reactor power and feedwater temperature are supplied as input data. The initialization of full-power conditions at the Kuosheng nuclear power station is cited as an example; these initial conditions are generated successfully with the developed algorithm. The generated initial conditions can be stored in a restart file and used for transient analysis. The methodology in this study improves the accuracy and consistency of transient calculations, and the algorithm provides all MELCOR users an easy and correct method for establishing initial conditions.
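
The control-function adjustment the authors describe can be pictured as a simple relaxation loop that drives each plant variable toward its target steady-state value. A toy sketch (the variable names, targets, and gains are illustrative, not MELCOR input syntax):

```python
# Desired steady-state conditions (illustrative values for a BWR)
targets = {"dome_pressure_MPa": 7.0, "downcomer_level_m": 13.5, "core_flow_kg_s": 9800.0}
state = {"dome_pressure_MPa": 6.0, "downcomer_level_m": 12.0, "core_flow_kg_s": 9000.0}
gains = {"dome_pressure_MPa": 0.5, "downcomer_level_m": 0.5, "core_flow_kg_s": 0.5}

def step(state):
    """One pass of the control functions: nudge each variable toward its target."""
    return {k: v + gains[k] * (targets[k] - v) for k, v in state.items()}

for _ in range(30):   # iterate until the pseudo steady state converges
    state = step(state)

print(state)
```

MELCOR's actual control functions operate on the plant model during a steady-state run; the loop above only illustrates the convergence behavior that replaces manual trial and error.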

  17. Advanced information processing system for advanced launch system: Avionics architecture synthesis

    Science.gov (United States)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    1991-01-01

    The Advanced Information Processing System (AIPS) is a fault-tolerant distributed computer system architecture that was developed to meet the real time computational needs of advanced aerospace vehicles. One such vehicle is the Advanced Launch System (ALS) being developed jointly by NASA and the Department of Defense to launch heavy payloads into low earth orbit at one tenth the cost (per pound of payload) of the current launch vehicles. An avionics architecture that utilizes the AIPS hardware and software building blocks was synthesized for ALS. The AIPS for ALS architecture synthesis process starting with the ALS mission requirements and ending with an analysis of the candidate ALS avionics architecture is described.

  18. Development of Non-Optimum Factors for Launch Vehicle Propellant Tank Bulkhead Weight Estimation

    Science.gov (United States)

    Wu, K. Chauncey; Wallace, Matthew L.; Cerro, Jeffrey A.

    2012-01-01

    Non-optimum factors are used during aerospace conceptual and preliminary design to account for the increased weights of as-built structures due to future manufacturing and design details. Use of higher-fidelity non-optimum factors in these early stages of vehicle design can result in more accurate predictions of a concept's actual weights and performance. To help achieve this objective, non-optimum factors are calculated for the aluminum-alloy gores that compose the ogive and ellipsoidal bulkheads of the Space Shuttle Super-Lightweight Tank propellant tanks. Minimum values for actual gore skin thicknesses and weld land dimensions are extracted from selected production drawings, and are used to predict reference gore weights. These actual skin thicknesses are also compared to skin thicknesses predicted using classical structural mechanics and tank proof-test pressures. Both coarse and refined weights models are developed for the gores. The coarse model is based on the proof pressure-sized skin thicknesses, and the refined model uses the actual gore skin thicknesses and design detail dimensions. To determine the gore non-optimum factors, these reference weights are then compared to flight hardware weights reported in a mass properties database. When manufacturing tolerance weight estimates are taken into account, the gore non-optimum factors computed using the coarse weights model range from 1.28 to 2.76, with an average non-optimum factor of 1.90. Application of the refined weights model yields non-optimum factors between 1.00 and 1.50, with an average non-optimum factor of 1.14. To demonstrate their use, these calculated non-optimum factors are used to predict heavier, more realistic gore weights for a proposed heavy-lift launch vehicle's propellant tank bulkheads. These results indicate that relatively simple models can be developed to better estimate the actual weights of large structures for future launch vehicles.
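
The non-optimum factor itself is a simple ratio, as-built weight over idealized reference weight, averaged over components; a refined reference (closer to the as-built part) yields factors closer to 1. A sketch with hypothetical gore weights (not the actual Super-Lightweight Tank values):

```python
def non_optimum_factor(actual_weight, reference_weight):
    """NOF = as-built flight hardware weight / idealized reference weight."""
    return actual_weight / reference_weight

# Hypothetical gore weights, kg (placeholders for illustration)
actual = [52.0, 48.0, 61.0]
reference_coarse = [30.0, 32.0, 33.0]    # proof-pressure-sized skins only
reference_refined = [47.0, 44.0, 55.0]   # actual skins plus design details

nof_coarse = [non_optimum_factor(a, r) for a, r in zip(actual, reference_coarse)]
nof_refined = [non_optimum_factor(a, r) for a, r in zip(actual, reference_refined)]
print(sum(nof_coarse) / len(nof_coarse), sum(nof_refined) / len(nof_refined))
```

The refined model's average factor is lower because more of the as-built weight is already captured in its reference, mirroring the 1.90 versus 1.14 averages reported above.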

  19. Diagram of the Saturn V Launch Vehicle in Metric

    Science.gov (United States)

    1971-01-01

    This is a good cutaway diagram of the Saturn V launch vehicle showing the three stages, the instrument unit, and the Apollo spacecraft. The chart on the right presents the basic technical data in clear metric detail. The Saturn V is the largest and most powerful launch vehicle in the United States. The towering, 111 meter, Saturn V was a multistage, multiengine launch vehicle standing taller than the Statue of Liberty. Altogether, the Saturn V engines produced as much power as 85 Hoover Dams. Development of the Saturn V was the responsibility of the Marshall Space Flight Center at Huntsville, Alabama, directed by Dr. Wernher von Braun.

  20. Development of a Common User Interface for the Launch Decision Support System

    Science.gov (United States)

    Scholtz, Jean C.

    1991-01-01

    The Launch Decision Support System (LDSS) is software to be used by the NASA Test Director (NTD) in the firing room during countdown. This software is designed to assist the NTD with time management, that is, when to resume from a hold condition. This software will assist the NTD in making and evaluating alternate plans and will keep him advised of the existing situation. As such, the interface to this software must be designed to provide the maximum amount of information in the clearest fashion and in a timely manner. This research involves applying user interface guidelines to a mature prototype of LDSS and developing displays that will enable the users to easily and efficiently obtain information from the LDSS displays. This research also extends previous work on organizing and prioritizing human-computer interaction knowledge.

  1. Developing a Vision of Pre-College Engineering Education

    Science.gov (United States)

    Marshall, Jill A.; Berland, Leema K.

    2012-01-01

    We report the results of a study focused on identifying and articulating an "epistemic foundation" underlying a pre-collegiate focus on engineering. We do so in the context of UTeach Engineering (UTE), a program supported in part by funding from the National Science Foundation and designed to develop a model approach to address the…

  2. Ares Launch Vehicles Overview: Space Access Society

    Science.gov (United States)

    Cook, Steve

    2007-01-01

    America is returning to the Moon in preparation for the first human footprint on Mars, guided by the U.S. Vision for Space Exploration. This presentation will discuss NASA's mission, the reasons for returning to the Moon and going to Mars, and how NASA will accomplish that mission in ways that promote leadership in space and economic expansion on the new frontier. The primary goals of the Vision for Space Exploration are to finish the International Space Station, retire the Space Shuttle, and build the new spacecraft needed to return people to the Moon and go to Mars. The Vision commits NASA and the nation to an agenda of exploration that also includes robotic exploration and technology development, while building on lessons learned over 50 years of hard-won experience. NASA is building on common hardware, shared knowledge, and unique experience derived from the Apollo Saturn, Space Shuttle, and contemporary commercial launch vehicle programs. The journeys to the Moon and Mars will require a variety of vehicles, including the Ares I Crew Launch Vehicle, which transports the Orion Crew Exploration Vehicle, and the Ares V Cargo Launch Vehicle, which transports the Lunar Surface Access Module. The architecture for the lunar missions will use one launch to ferry the crew into orbit, where it will rendezvous with the Lunar Module in the Earth Departure Stage, which will then propel the combination into lunar orbit. The imperative to explore space with the combination of astronauts and robots will be the impetus for inventions such as solar power and water and waste recycling. This next chapter in NASA's history promises to write the next chapter in American history, as well. It will require this nation to provide the talent to develop tools, machines, materials, processes, technologies, and capabilities that can benefit nearly all aspects of life on Earth. Roles and responsibilities are shared between a nationwide Government and industry team. The Exploration Launch

  3. Launch Pad Escape System Design (Human Spaceflight)

    Science.gov (United States)

    Maloney, Kelli

    2011-01-01

    A launch pad escape system for human spaceflight is one of those things that everyone hopes they will never need, but it is critical for every manned space program. Since men were first put into space in the early 1960s, the need for such an Emergency Escape System (EES) has become apparent. The National Aeronautics and Space Administration (NASA) has made use of various types of these EESs over the past 50 years. Early programs, like Mercury and Gemini, did not have an official launch pad escape system. Rather, they relied on a Launch Escape System (LES): a separate solid rocket motor attached to the manned capsule that could pull the astronauts to safety in the event of an emergency. This could only occur after hatch closure at the launch pad or during the first stage of flight. A version of a LES, now called a Launch Abort System (LAS), is still used today for all manned capsule-type launch vehicles. However, this system is very limited in that it can only be used after hatch closure, and it is for the flight crew only. In addition, the forces necessary for the LES/LAS to get the capsule away from a rocket during the first stage of flight are quite high and can cause injury to the crew. These shortcomings led to the development of a ground-based EES for the flight crew and ground support personnel as well. This way, a much less dangerous mode of egress is available for any flight or ground personnel up to a few seconds before launch. The early EESs were fairly simple, gravity-powered systems to use when things go bad. And things can go bad very quickly and catastrophically when dealing with a flight vehicle fueled with millions of pounds of hazardous propellant. With this in mind, early EES designers saw such a passive/unpowered system as a must for last-minute escapes. This and other design requirements had to be derived for an EES, and this section will take a look at the safety design requirements.

  4. Enhanced Deep Blue Aerosol Retrieval Algorithm: The Second Generation

    Science.gov (United States)

    Hsu, N. C.; Jeong, M.-J.; Bettenhausen, C.; Sayer, A. M.; Hansell, R.; Seftor, C. S.; Huang, J.; Tsay, S.-C.

    2013-01-01

    The aerosol products retrieved using the MODIS Collection 5.1 Deep Blue algorithm have provided useful information about aerosol properties over bright-reflecting land surfaces, such as desert, semi-arid, and urban regions. However, many components of the C5.1 retrieval algorithm needed to be improved; for example, the use of a static surface database to estimate surface reflectances. This is particularly important over regions of mixed vegetated and non-vegetated surfaces, which may undergo strong seasonal changes in land cover. In order to address this issue, we developed a hybrid approach that combines a pre-calculated surface reflectance database with the normalized difference vegetation index (NDVI) in determining the surface reflectance for aerosol retrievals. As a result, the spatial coverage of aerosol data generated by the enhanced Deep Blue algorithm has been extended from the arid and semi-arid regions to all land areas.
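
The hybrid selection described above can be pictured as a per-pixel choice of surface reflectance source driven by NDVI. A schematic sketch (the threshold and values are illustrative, not the operational Deep Blue logic):

```python
def surface_reflectance(ndvi, db_value, ndvi_value, veg_threshold=0.3):
    """Per-pixel hybrid choice of surface reflectance: over vegetated
    surfaces (high NDVI) use an NDVI-derived estimate; over bright or
    barren surfaces fall back to the pre-calculated database value.
    The 0.3 threshold is an illustrative placeholder."""
    return ndvi_value if ndvi >= veg_threshold else db_value

# Example pixels: (ndvi, database reflectance, ndvi-derived reflectance)
pixels = [(0.05, 0.31, 0.12), (0.45, 0.31, 0.08), (0.30, 0.25, 0.10)]
print([surface_reflectance(n, d, v) for n, d, v in pixels])
```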

  5. Maternal pre-pregnancy body mass index and pubertal development among sons

    DEFF Research Database (Denmark)

    Hounsgaard, M L; Håkonsen, L B; Vested, A

    2014-01-01

    Maternal overweight and obesity in pregnancy has been associated with earlier age of menarche in daughters as well as reduced semen quality in sons. We aimed at investigating pubertal development in sons born by mothers with a high body mass index (BMI). The study included 2522 sons of mothers … indicators of pubertal development, results also indicated earlier pubertal development among sons of obese mothers. After excluding sons of underweight mothers in a subanalysis, we observed an inverse trend between maternal pre-pregnancy BMI and age at regular shaving, acne and first nocturnal emission. … In conclusion, maternal pre-pregnant obesity may be related to earlier timing of pubertal milestones among sons. More research, preferably based on prospectively collected information about pubertal development, is needed to draw firm conclusions.

  6. LauncherOne Small Launch Vehicle Propulsion Advancement

    Data.gov (United States)

    National Aeronautics and Space Administration — Virgin Orbit, LLC (“Virgin Orbit”) is currently well into the development for our LauncherOne (L1) small satellite launch vehicle. LauncherOne is a dedicated small...

  7. Intelligent launch and range operations virtual testbed (ILRO-VTB)

    Science.gov (United States)

    Bardina, Jorge; Rajkumar, Thirumalainambi

    2003-09-01

    Intelligent Launch and Range Operations Virtual Test Bed (ILRO-VTB) is a real-time, web-based command-and-control, communication, and intelligent simulation environment for ground-vehicle, launch, and range operation activities. ILRO-VTB consists of a variety of simulation models combined with commercial and indigenous software developments (NASA Ames). It creates a hybrid software/hardware environment suitable for testing various integrated control system components of launch and range. The dynamic interactions of the integrated simulated control systems are not well understood, and insight into such systems can only be achieved through simulation/emulation. For that reason, NASA has established a VTB where we can learn the actual control and dynamics of designs for future space programs, including testing and performance evaluation. The current implementation of the VTB simulates the mission, control, ground-vehicle engineering, launch, and range operations of a sub-orbital vehicle. The present development of the test bed simulates the operations of the Space Shuttle Vehicle (SSV) at NASA Kennedy Space Center. The test bed supports a wide variety of shuttle missions with ancillary modeling capabilities such as weather forecasting, lightning tracking, toxic gas dispersion, debris dispersion, telemetry, trajectory modeling, ground operations, and payload models. To achieve the simulations, all models are linked using the Common Object Request Broker Architecture (CORBA). The test bed provides opportunities for government, universities, researchers, and industry to conduct a real-time shuttle launch in cyberspace.

  8. The Standard Deviation of Launch Vehicle Environments

    Science.gov (United States)

    Yunis, Isam

    2005-01-01

    Statistical analysis is used in the development of the launch vehicle environments of acoustics, vibrations, and shock. The standard deviation of these environments is critical to accurate statistical extrema. However, often very little data exists to define the standard deviation and it is better to use a typical standard deviation than one derived from a few measurements. This paper uses Space Shuttle and expendable launch vehicle flight data to define a typical standard deviation for acoustics and vibrations. The results suggest that 3dB is a conservative and reasonable standard deviation for the source environment and the payload environment.
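
The underlying reduction is the sample standard deviation of band levels expressed in decibels, computed across flights. A minimal sketch with synthetic levels (the values are illustrative, not Shuttle data):

```python
import math

def std_db(levels_db):
    """Sample standard deviation (n-1 denominator) of sound-pressure
    levels already expressed in dB, taken across flights."""
    n = len(levels_db)
    mean = sum(levels_db) / n
    var = sum((x - mean) ** 2 for x in levels_db) / (n - 1)
    return math.sqrt(var)

# Synthetic one-third-octave band levels from five flights (illustrative)
flight_levels = [128.1, 131.4, 126.9, 130.2, 129.5]
print(f"sigma = {std_db(flight_levels):.2f} dB")   # compare to the ~3 dB guideline
```

With only a handful of flights, such an estimate is itself noisy, which is the paper's argument for adopting a typical value like 3 dB instead of a few-sample estimate.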

  9. Development of CAD implementing the algorithm of boundary elements’ numerical analytical method

    Directory of Open Access Journals (Sweden)

    Yulia V. Korniyenko

    2015-03-01

    Full Text Available Until recently, the algorithms of the numerical-analytical boundary elements method had been implemented as programs written in the MATLAB language. Each program had a local character, i.e., it was used to solve a particular problem: calculation of a beam, frame, arch, etc. Constructing matrices in these programs was carried out "manually" and was therefore time-consuming. The research was aimed at a reasoned choice of programming language for developing a new CAD system that implements the algorithm of the numerical-analytical boundary elements method and provides visualization tools for the initial objects and calculation results. The research shows that among the wide variety of programming languages, the most efficient one for developing a CAD system employing the numerical-analytical boundary elements method is Java. This language provides tools not only for developing the calculating part of the CAD system, but also for building the graphical interface for constructing geometrical models and interpreting the calculated results.

  10. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    Science.gov (United States)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.
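
The model structure described, a point estimate of strike probability as a function of abort time and destruct delay time, can be illustrated with a toy Monte Carlo sketch; the debris catalog, velocity distribution, and geometry below are entirely synthetic:

```python
import random

def strike_probability(abort_time_s, delay_s, n_pieces=2000, seed=1):
    """Toy point estimate of debris strike probability: sample an imparted
    speed for each debris piece and test whether its (crude) reach exceeds
    the crew module's escape distance. All numbers are placeholders."""
    rng = random.Random(seed)
    cm_distance = 50.0 + 120.0 * delay_s          # crew-module escape distance, m
    hits = 0
    for _ in range(n_pieces):
        v = rng.gauss(80.0, 25.0)                 # imparted debris speed, m/s
        reach = max(v, 0.0) * (2.0 + 0.05 * abort_time_s)   # crude flight time
        if reach >= cm_distance:
            hits += 1
    return hits / n_pieces

for delay in (0.5, 1.0, 2.0):
    print(delay, strike_probability(abort_time_s=30.0, delay_s=delay))
```

A response surface would then be fit to a grid of such point estimates over abort time and delay time, which is the role the response surface model plays in the overall sensitivity analysis.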

  11. Efficient algorithms for flow simulation related to nuclear reactor safety

    International Nuclear Information System (INIS)

    Gornak, Tatiana

    2013-01-01

    Safety analysis is of ultimate importance for operating Nuclear Power Plants (NPP). The overall modeling and simulation of physical and chemical processes occurring in the course of an accident is an interdisciplinary problem with origins in fluid dynamics, numerical analysis, reactor technology and computer programming. The aim of the study is therefore to create the foundations of a multi-dimensional non-isothermal fluid model for an NPP containment and a software tool based on it. The numerical simulations allow one to analyze and predict the behavior of NPP systems under different working and accident conditions, and to develop proper action plans for minimizing the risks of accidents and/or minimizing the consequences of possible accidents. A very large number of scenarios have to be simulated, and at the same time acceptable accuracy for the critical parameters, such as radioactive pollution and temperature, has to be achieved. The existing software tools are either too slow or not accurate enough. This thesis deals with developing customized algorithms and software tools for simulation of isothermal and non-isothermal flows in the containment pool of an NPP. Requirements for such software are formulated, and proper algorithms are presented. The goal of the work is to achieve a balance between accuracy and speed of calculation, and to develop a customized algorithm for this special case. Different discretization and solution approaches are studied, and those which correspond best to the formulated goal are selected, adjusted, and, where possible, analysed. A fast directional-splitting algorithm for the Navier-Stokes equations in complicated geometries, in the presence of solid and porous obstacles, is at the core of the algorithm. Developing a suitable pre-processor and customized domain decomposition algorithms is an essential part of the overall algorithm and software. Results from numerical simulations in test geometries and in real geometries are presented and discussed.

  12. Using qualitative research to inform development of a diagnostic algorithm for UTI in children.

    Science.gov (United States)

    de Salis, Isabel; Whiting, Penny; Sterne, Jonathan A C; Hay, Alastair D

    2013-06-01

    Diagnostic and prognostic algorithms can help reduce clinical uncertainty. The selection of candidate symptoms and signs to be measured in case report forms (CRFs) for potential inclusion in diagnostic algorithms needs to be comprehensive, clearly formulated and relevant for end users. The aim was to investigate whether qualitative methods could assist in designing CRFs in research developing diagnostic algorithms. Specifically, the study sought to establish whether qualitative methods could have assisted in designing the CRF for the Health Technology Assessment funded Diagnosis of Urinary Tract infection in Young children (DUTY) study, which will develop a diagnostic algorithm to improve recognition of urinary tract infection (UTI) in young children presenting in primary care and a Children's Emergency Department. We elicited features that clinicians believed useful in diagnosing UTI and compared these, for presence or absence and terminology, with the DUTY CRF. Despite much agreement between clinicians' accounts and the DUTY CRFs, we identified a small number of potentially important symptoms and signs not included in the CRF, and some included items that could have been reworded to improve understanding and final data analysis. This study uniquely demonstrates the role of qualitative methods in the design and content of CRFs used for developing diagnostic (and prognostic) algorithms. Research groups developing such algorithms should consider using qualitative methods to inform the selection and wording of candidate symptoms and signs.

  13. A risk assessment approach to support the launching of new products, services or processes

    OpenAIRE

    Steen, Riana

    2015-01-01

    This is the accepted, refereed and final manuscript of the published article. This research paper aims to develop a practical method to highlight certain key risk factors involved in the product development process. A new definition of the term "launch risk" is introduced in this work: the uncertainty about, and severity of, the consequences of a failed launch. The launch could be the further development of existing products or the introduction of new products/services or...

  14. Balloon launching station, Mildura, Victoria

    International Nuclear Information System (INIS)

    The Mildura Balloon Launching Station was established in 1960 by the Department of Supply (now the Department of Manufacturing Industry) on behalf of the United States Atomic Energy Commission (USAEC) to determine the content of radioactive material in the upper atmosphere over Australia. The Station location and layout, staffing, balloon launching equipment, launching, tracking and recovery are described. (R.L.)

  15. Development of antibiotic regimens using graph based evolutionary algorithms.

    Science.gov (United States)

    Corns, Steven M; Ashlock, Daniel A; Bryden, Kenneth M

    2013-12-01

    This paper examines the use of evolutionary algorithms in the development of antibiotic regimens given to production animals. A model is constructed that combines the lifespan of the animal and the bacteria living in the animal's gastro-intestinal tract from the early finishing stage until the animal reaches market weight. This model is used as the fitness evaluation for a set of graph based evolutionary algorithms to assess the impact of diversity control on the evolving antibiotic regimens. The graph based evolutionary algorithms have two objectives: to find an antibiotic treatment regimen that maintains the weight gain and health benefits of antibiotic use and to reduce the risk of spreading antibiotic resistant bacteria. This study examines different regimens of tylosin phosphate use on bacteria populations divided into Gram positive and Gram negative types, with a focus on Campylobacter spp. Treatment regimens were found that provided decreased antibiotic resistance relative to conventional methods while providing nearly the same benefits as conventional antibiotic regimes. By using a graph to control the information flow in the evolutionary algorithm, a variety of solutions along the Pareto front can be found automatically for this and other multi-objective problems. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
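
The defining feature of a graph-based evolutionary algorithm, restricting recombination to graph neighbors so that good solutions spread slowly and diversity is preserved, can be sketched independently of the antibiotic model. A minimal version on a ring graph with a placeholder fitness function (count of 1-bits, not the animal/bacteria model):

```python
import random

def graph_ea(n=20, bits=12, generations=300, seed=3):
    """Steady-state evolutionary algorithm on a ring graph: each individual
    may only recombine with its two ring neighbors, slowing takeover and
    preserving diversity. Fitness here is a stand-in (number of 1-bits)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(n)]
    fit = lambda ind: sum(ind)
    for _ in range(generations):
        i = rng.randrange(n)
        j = (i + rng.choice((-1, 1))) % n          # pick a ring neighbor
        cut = rng.randrange(1, bits)
        child = pop[i][:cut] + pop[j][cut:]        # one-point crossover
        m = rng.randrange(bits)                    # single-bit mutation
        child[m] ^= 1
        # replace the worse parent only if the child is at least as fit
        worse = i if fit(pop[i]) <= fit(pop[j]) else j
        if fit(child) >= fit(pop[worse]):
            pop[worse] = child
    return max(fit(ind) for ind in pop)

print(graph_ea())
```

Denser graphs mix information faster; the choice of graph is the diversity-control knob that the multi-objective regimen search above exploits.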

  16. Serial assessment of cardiovascular control shows early signs of developing pre-eclampsia

    NARCIS (Netherlands)

    Rang, Sasika; Wolf, H.; van Montfrans, G. A.; Karemaker, J. M.

    2004-01-01

    Purpose To evaluate whether differences in autonomic cardiovascular control between normal pregnant women and women who develop pre-eclampsia later in pregnancy can be detected even before or early in pregnancy. Design We studied 42 women, 21 multigravid with a history of pre-eclampsia and 21

  17. SPECIAL LIBRARIES OF FRAGMENTS OF ALGORITHMIC NETWORKS TO AUTOMATE THE DEVELOPMENT OF ALGORITHMIC MODELS

    Directory of Open Access Journals (Sweden)

    V. E. Marley

    2015-01-01

    Full Text Available Summary. The concept of algorithmic models arose from the algorithmic approach, in which the simulated object or phenomenon is represented as a process governed by the strict rules of an algorithm. An algorithmic model is a formalized description of a subject specialist's scenario for the simulated process, whose structure matches the structure of the causal and temporal relationships between events of the process being modeled, together with all information necessary for its software implementation. Algorithmic networks are used to represent the structure of algorithmic models. They are normally defined as loaded finite directed graphs whose vertices are mapped to operators and whose arcs are the variables bound by those operators. The language of algorithmic networks is highly expressive: the class of algorithms it can represent is broad. In existing simulation automation systems based on algorithmic networks, mainly operators working with real numbers are used. Although this reduces their power, it is sufficient for modeling a wide class of problems related to economics, the environment, transport, and technical processes. The task of modeling the execution of schedules and network diagrams is relevant and useful. There are many systems for computing network graphs; however, monitoring in them is based on analysis of gaps and deadlines, with no predictive analysis of schedule execution. The library described here is designed to build such predictive models: specifying the source data yields a set of projections, from which one is chosen and taken as the new plan.
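
An algorithmic network in this sense is essentially a dataflow graph: vertices carry operators, arcs carry the variables binding them, and evaluation follows the causal order. A minimal sketch of evaluating such a network (the operator set and variable names are illustrative):

```python
# Each node: (operator, [names of input variables]); arcs are the name references.
network = {
    "a": (lambda: 3.0, []),              # source node (constant)
    "b": (lambda: 4.0, []),
    "c": (lambda x, y: x + y, ["a", "b"]),
    "d": (lambda x: x * x, ["c"]),
}

def evaluate(network, out):
    """Evaluate node `out` by resolving its inputs first
    (recursive topological evaluation with memoization)."""
    cache = {}
    def value(name):
        if name not in cache:
            op, deps = network[name]
            cache[name] = op(*[value(d) for d in deps])
        return cache[name]
    return value(out)

print(evaluate(network, "d"))
```

A library of network fragments, as described above, would amount to reusable subgraphs of this form that are spliced into larger models.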

  18. Development of computed tomography system and image reconstruction algorithm

    International Nuclear Information System (INIS)

    Khairiah Yazid; Mohd Ashhar Khalid; Azaman Ahmad; Khairul Anuar Mohd Salleh; Ab Razak Hamzah

    2006-01-01

    Computed tomography is one of the most advanced and powerful nondestructive inspection techniques and is currently used in many different industries. In several CT systems, detection has been performed by a combination of an X-ray image intensifier and a charge-coupled device (CCD) camera, or by using a line array detector. The recent development of X-ray flat panel detectors has made fast CT imaging feasible and practical. This paper therefore explains the arrangement of a new detection system, which uses the existing high-resolution (127 μm pixel size) flat panel detector in MINT, and the image reconstruction technique developed. The aim of the project is to develop a prototype flat-panel-detector-based CT imaging system for NDE. The prototype consists of an X-ray tube, a flat panel detector system, a rotation table and a computer system to control the sample motion and image acquisition. Hence this project is divided into two major tasks: first, to develop the image reconstruction algorithm, and second, to integrate the X-ray imaging components into one CT system. An image reconstruction algorithm using the filtered back-projection method is developed and compared to other techniques. MATLAB is the tool used for the simulations and computations in this project. (Author)
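
The filtering half of filtered back-projection can be illustrated on a single projection: transform to the frequency domain, weight by the magnitude of the frequency (the ramp filter), and transform back. A minimal sketch using a direct DFT for clarity (not the project's MATLAB implementation):

```python
import cmath

def ramp_filter(projection):
    """Apply the |f| (ramp) filter to one projection row in the frequency
    domain -- the 'filtered' step of filtered back-projection.
    Uses a direct O(N^2) DFT for clarity rather than an FFT."""
    n = len(projection)
    # Forward DFT
    F = [sum(projection[k] * cmath.exp(-2j * cmath.pi * j * k / n)
             for k in range(n)) for j in range(n)]
    # Ramp weights: |frequency index|, symmetric about n/2
    w = [min(j, n - j) for j in range(n)]
    F = [Fj * wj for Fj, wj in zip(F, w)]
    # Inverse DFT (keep the real part)
    return [sum(F[j] * cmath.exp(2j * cmath.pi * j * k / n)
                for j in range(n)).real / n for k in range(n)]

# A constant projection carries only the zero frequency -> filtered to ~0
print(ramp_filter([1.0] * 8))
```

Reconstruction then back-projects each filtered row across the image at its acquisition angle and sums the contributions.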

  19. Orientation estimation algorithm applied to high-spin projectiles

    International Nuclear Information System (INIS)

    Long, D F; Lin, J; Zhang, X M; Li, J

    2014-01-01

    High-spin projectiles are low-cost military weapons. Accurate orientation information is critical to the performance of the high-spin projectile's control system. However, orientation estimators have not been well translated from flight vehicles, since they are too expensive, lack launch robustness, do not fit within the allotted space, or are too application specific. This paper presents an orientation estimation algorithm specific to these projectiles. The orientation estimator uses an integrated filter to combine feedback from a three-axis magnetometer, two single-axis gyros and a GPS receiver. As a new feature of this algorithm, the magnetometer feedback estimates the roll angular rate of the projectile. The algorithm also incorporates online sensor error parameter estimation performed simultaneously with the projectile attitude estimation. The second part of the paper deals with the verification of the proposed orientation algorithm through numerical simulation and experimental tests. Simulations and experiments demonstrate that the orientation estimator can effectively estimate the attitude of high-spin projectiles. Moreover, online sensor calibration significantly enhances the estimation performance of the algorithm. (paper)

  20. Orientation estimation algorithm applied to high-spin projectiles

    Science.gov (United States)

    Long, D. F.; Lin, J.; Zhang, X. M.; Li, J.

    2014-06-01

    High-spin projectiles are low-cost military weapons. Accurate orientation information is critical to the performance of the high-spin projectile's control system. However, orientation estimators have not been well translated from flight vehicles, since they are too expensive, lack launch robustness, do not fit within the allotted space, or are too application specific. This paper presents an orientation estimation algorithm specific to these projectiles. The orientation estimator uses an integrated filter to combine feedback from a three-axis magnetometer, two single-axis gyros and a GPS receiver. As a new feature of this algorithm, the magnetometer feedback estimates the roll angular rate of the projectile. The algorithm also incorporates online sensor error parameter estimation performed simultaneously with the projectile attitude estimation. The second part of the paper deals with the verification of the proposed orientation algorithm through numerical simulation and experimental tests. Simulations and experiments demonstrate that the orientation estimator can effectively estimate the attitude of high-spin projectiles. Moreover, online sensor calibration significantly enhances the estimation performance of the algorithm.
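
    The record's key feature, estimating roll rate from magnetometer feedback, rests on the fact that for a spinning body in a roughly constant external field the two transverse magnetometer components rotate at the roll rate. A minimal sketch of that idea follows; it is not the authors' integrated filter, which also fuses gyros and GPS and estimates sensor errors online.

```python
import math

def roll_rate_from_mag(mag_y, mag_z, dt):
    """Estimate roll rate (rad/s) from two transverse magnetometer channels.
    The phase of (mag_y, mag_z) advances by roll_rate * dt each sample."""
    rates = []
    prev = math.atan2(mag_z[0], mag_y[0])
    for y, z in zip(mag_y[1:], mag_z[1:]):
        ang = math.atan2(z, y)
        d = ang - prev
        # unwrap the phase jump across +/- pi
        d = (d + math.pi) % (2 * math.pi) - math.pi
        rates.append(d / dt)
        prev = ang
    return rates
```

    In practice the per-sample estimates would be smoothed inside a filter rather than used raw, since magnetometer noise enters directly through the phase differences.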

  1. Developing a Science and Technology Centre for Supporting the Launching of a Nuclear Power Programme

    International Nuclear Information System (INIS)

    Badawy, I.

    2013-01-01

    The present investigation aims at developing a science and technology centre for supporting the launching of a nuclear power [NP] programme in a developing country with a relatively high economic growth rate. The development approach is based on enhancing the roles and functions of the proposed centre with respect to the main pillars that would affect the safe, secure and peaceful uses of nuclear energy, particularly in the fields of electricity generation and sea-water desalination. The study underlines the importance of incorporating the advanced research and development work, concepts and services provided by the proposed centre into the NP programme, the regulatory systems of the concerned State and the national nuclear industry in the fields of nuclear safety, radiation safety, nuclear safeguards, nuclear security and other related scientific and technical fields, including human resources and nuclear knowledge management.

  2. Launching a world-class joint venture.

    Science.gov (United States)

    Bamford, James; Ernst, David; Fubini, David G

    2004-02-01

    More than 5,000 joint ventures, and many more contractual alliances, have been launched worldwide in the past five years. Companies are realizing that JVs and alliances can be lucrative vehicles for developing new products, moving into new markets, and increasing revenues. The problem is, the success rate for JVs and alliances is on a par with that for mergers and acquisitions--which is to say not very good. The authors, all McKinsey consultants, argue that JV success remains elusive for most companies because they don't pay enough attention to launch planning and execution. Most companies are highly disciplined about integrating the companies they target through M&A, but they rarely commit sufficient resources to launching similarly sized joint ventures or alliances. As a result, the parent companies experience strategic conflicts, governance gridlock, and missed operational synergies. Often, they walk away from the deal. The launch phase begins with the parent companies' signing of a memorandum of understanding and continues through the first 100 days of the JV or alliance's operation. During this period, it's critical for the parents to convene a team dedicated to exposing inherent tensions early. Specifically, the launch team must tackle four basic challenges. First, build and maintain strategic alignment across the separate corporate entities, each of which has its own goals, market pressures, and shareholders. Second, create a shared governance system for the two parent companies. Third, manage the economic interdependencies between the corporate parents and the JV. And fourth, build a cohesive, high-performing organization (the JV or alliance)--not a simple task, since most managers come from, will want to return to, and may even hold simultaneous positions in the parent companies. Using real-world examples, the authors offer their suggestions for meeting these challenges.

  3. Risk Perception and Communication in Commercial Reusable Launch Vehicle Operations

    Science.gov (United States)

    Hardy, Terry L.

    2005-12-01

    A number of inventors and entrepreneurs are currently attempting to develop and commercially operate reusable launch vehicles (RLVs) to carry voluntary participants into space. The operation of these launch vehicles, however, produces safety risks to the crew, to the space flight participants, and to the uninvolved public. Risk communication therefore becomes increasingly important to assure that those involved in the flight understand the risk and that those who are not directly involved understand the personal impact of RLV operations on their lives. Those involved in the launch vehicle flight may perceive risk differently from non-participants, and these differences in perception must be understood to communicate this risk effectively. This paper summarizes existing research in risk perception and communication and applies that research to commercial reusable launch vehicle operations. Risk communication is discussed in the context of the requirements of United States law for informed consent from any space flight participants on reusable suborbital launch vehicles.

  4. Edge enhancement algorithm for low-dose X-ray fluoroscopic imaging.

    Science.gov (United States)

    Lee, Min Seok; Park, Chul Hee; Kang, Moon Gi

    2017-12-01

    Low-dose X-ray fluoroscopy has continually evolved to reduce radiation risk to patients during clinical diagnosis and surgery. However, the reduction in dose exposure degrades the quality of the acquired images. In general, an X-ray device has a time-average pre-processor to remove the generated quantum noise. However, this pre-processor causes blurring and artifacts within moving edge regions, and noise remains in the image. During high-pass filtering (HPF) to enhance edge detail, this noise is amplified. In this study, a 2D edge enhancement algorithm comprising region-adaptive HPF with a transient improvement (TI) method, as well as artifact and noise reduction (ANR), was developed for degraded X-ray fluoroscopic images. The proposed method was applied to a static scene pre-processed by a low-dose X-ray fluoroscopy device. First, the sharpness of the X-ray image was improved using region-adaptive HPF with the TI method, which sharpens edge details without overshoot problems. Then, an ANR filter that uses an edge-directional kernel was developed to remove the artifacts and noise that can occur during sharpening, while preserving edge details. The developed method was applied to low-dose X-ray fluoroscopic images, and the resulting images were compared visually and numerically with images improved using conventional edge enhancement techniques. The results indicate that the proposed method outperforms existing edge enhancement methods in terms of both objective criteria and subjective visual perception of the actual X-ray fluoroscopic image. The developed algorithm performed well when applied to actual low-dose X-ray fluoroscopic images, not only improving sharpness but also removing artifacts and noise, including overshoot. Copyright © 2017 Elsevier B.V. All rights reserved.
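
    A crude stand-in for the region-adaptive idea can be written as an unsharp mask that boosts only pixels whose high-pass response exceeds a threshold, so noise in flat regions is not amplified. The box kernel, threshold and gain below are illustrative assumptions, not the paper's TI/ANR design.

```python
import numpy as np

def edge_enhance(img, amount=1.0, thresh=2.0):
    """Unsharp-mask sharpening that skips low-gradient (noise-dominated)
    pixels -- a crude stand-in for region-adaptive HPF with noise control."""
    # 3x3 box blur built from padded shifts (no SciPy dependency)
    p = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    blur = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    highpass = img - blur
    mask = np.abs(highpass) > thresh      # boost only genuine edges
    return img + amount * highpass * mask
```

    The paper's method additionally shapes the transient (TI) to avoid overshoot and applies an edge-directional kernel for artifact removal, both omitted here.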

  5. High Altitude Launch for a Practical SSTO

    Science.gov (United States)

    Landis, Geoffrey A.; Denis, Vincent

    2003-01-01

    Existing engineering materials allow the construction of towers to heights of many kilometers. Orbital launch from high altitude has significant advantages over sea-level launch due to the reduced atmospheric pressure, resulting in lower atmospheric drag on the vehicle and allowing higher rocket engine performance. High-altitude launch sites are particularly advantageous for single-stage-to-orbit (SSTO) vehicles, where the payload is typically 2% of the initial launch mass. An earlier paper enumerated some of the advantages of high-altitude launch of SSTO vehicles. In this paper, we calculate launch trajectories for a candidate SSTO vehicle and quantify the advantage of launching at altitudes of 5 to 25 kilometers above sea level. The performance increase can be directly translated into increased payload capability to orbit, ranging from a 5 to 20% increase in the mass to orbit. For a candidate vehicle with an initial payload fraction of 2% of gross lift-off weight, this corresponds to a 31% increase in payload (for a 5-km launch altitude) to 122% additional payload (for a 25-km launch altitude).
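
    The payload leverage described here follows directly from the ideal rocket equation: when payload is a small residual between delivered mass fraction and structural fraction, a modest delta-v saving produces a disproportionate payload gain. The numbers below (Isp, structural fraction, delta-v saved by a high-altitude site) are illustrative assumptions, not the paper's trajectory results.

```python
import math

def payload_fraction(delta_v, isp=450.0, structural=0.08):
    """Ideal-rocket-equation payload fraction for a notional SSTO.
    payload = final mass fraction - structural fraction (all assumptions)."""
    g0 = 9.80665
    mass_ratio = math.exp(delta_v / (isp * g0))   # m0 / m_final
    return 1.0 / mass_ratio - structural

sea_level = payload_fraction(9200.0)   # assumed delta-v to LEO from sea level
high_alt = payload_fraction(8900.0)    # assume ~300 m/s saved at altitude
gain = (high_alt - sea_level) / sea_level   # relative payload gain
```

    With these assumed numbers the relative payload gain is on the order of 20%, showing how a few-percent delta-v saving is leveraged by a thin payload margin.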

  6. A filtered backprojection algorithm with characteristics of the iterative Landweber algorithm

    OpenAIRE

    L. Zeng, Gengsheng

    2012-01-01

    Purpose: In order to eventually develop an analytical algorithm with noise characteristics of an iterative algorithm, this technical note develops a window function for the filtered backprojection (FBP) algorithm in tomography that behaves as an iterative Landweber algorithm.
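
    For reference, the Landweber algorithm whose noise behavior the proposed FBP window mimics is the simple gradient iteration below; the iteration count acts as a regularization knob. This is the generic method, not the note's window-function construction.

```python
import numpy as np

def landweber(A, b, iters=200, step=None):
    """Landweber iteration: x_{k+1} = x_k + step * A^T (b - A x_k).
    Early stopping regularizes the solution, which is the noise
    characteristic an FBP window function can be designed to imitate."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # guarantees convergence
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x + step * A.T @ (b - A @ x)
    return x
```

    Run long enough on a well-posed system it converges to the least-squares solution; stopped early on noisy data it suppresses high-frequency components, much like a smoother FBP window.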

  7. Learning Other People's History: Pre-Service Teachers' Developing African American Historical Knowledge

    Science.gov (United States)

    King, LaGarrett Jarriel

    2014-01-01

    Drawing from the historical lens of cultural memory, I examined the development of three social studies pre-service teachers' African American history knowledge. The participants were engaged in a rigorous summer reading program dedicated to learning African American history. This qualitative case study examined both pre- and post-interpretations…

  8. Comparison of switching control algorithms effective in restricting the switching in the neighborhood of the origin

    International Nuclear Information System (INIS)

    Joung, JinWook; Chung, Lan; Smyth, Andrew W

    2010-01-01

    The active interaction control (AIC) system, consisting of a primary structure, an auxiliary structure and an interaction element, was proposed to protect the primary structure against earthquakes and winds. The objective of the AIC system in reducing the responses of the primary structure is fulfilled by activating or deactivating the switching between the engagement and the disengagement of the primary and auxiliary structures through the interaction element. The status of the interaction element is controlled by switching control algorithms. The previously developed switching control algorithms require an excessive amount of switching, which is inefficient. In this paper, the excessive amount of switching is restricted by imposing an appropriately designed switching boundary region, where switching is prohibited, on pre-designed engagement–disengagement conditions. Two different approaches are used in designing the newly proposed AID-off and AID-off 2 algorithms. The AID-off 2 algorithm is designed to affect deactivated switching regions explicitly, unlike the AID-off algorithm, which follows the same procedure for designing the engagement–disengagement conditions as the previously developed algorithms, by using the current status of the AIC system. Both algorithms are shown to be effective in reducing the number of switching events triggered by the previously developed AID algorithm under an appropriately selected control sampling period for different earthquakes, but the AID-off 2 algorithm outperforms the AID-off algorithm in reducing the number of switching events.
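
    The idea of prohibiting switching inside a boundary region around the origin can be sketched as a hold rule: outside the region the engagement condition is re-evaluated, inside it the previous state is kept. The engagement condition and threshold below are hypothetical simplifications, not the actual AID-off design.

```python
def switch_states(response, velocity, boundary=0.05):
    """Engagement decision with a switching-prohibited boundary region.
    Hypothetical base rule: engage when response and velocity share a sign.
    Inside the boundary region around the origin the state is held, so
    chattering near zero response is suppressed."""
    states = []
    engaged = False
    for x, v in zip(response, velocity):
        if abs(x) > boundary:          # outside the no-switch region
            engaged = x * v > 0        # re-evaluate the base condition
        states.append(engaged)         # inside: hold the previous state
    return states
```

    Comparing runs with and without the boundary region on the same response history shows the reduction in switching events that motivates the AID-off algorithms.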

  9. Development of a Thermal Equilibrium Prediction Algorithm

    International Nuclear Information System (INIS)

    Aviles-Ramos, Cuauhtemoc

    2002-01-01

    A thermal equilibrium prediction algorithm is developed and tested using a heat conduction model and data sets from calorimetric measurements. The physical model used in this study is the exact solution of a system of two partial differential equations that govern the heat conduction in the calorimeter. A multi-parameter estimation technique is developed and implemented to estimate the effective volumetric heat generation and thermal diffusivity in the calorimeter measurement chamber, and the effective thermal diffusivity of the heat flux sensor. These effective properties and the exact solution are used to predict the heat flux sensor voltage readings at thermal equilibrium. Thermal equilibrium predictions are carried out considering only 20% of the total measurement time required for thermal equilibrium. A comparison of the predicted and experimental thermal equilibrium voltages shows that the average percentage error from 330 data sets is only 0.1%. The data sets used in this study come from calorimeters of different sizes that use different kinds of heat flux sensors. Furthermore, different nuclear material matrices were assayed in the process of generating these data sets. This study shows that the integration of this algorithm into the calorimeter data acquisition software will result in an 80% reduction of measurement time. This reduction results in a significant cutback in operational costs for the calorimetric assay of nuclear materials. (authors)
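
    A toy version of equilibrium prediction from early data: if the sensor voltage approaches equilibrium exponentially, three equally spaced samples determine the asymptote in closed form. The record's estimator is a multi-parameter fit to an exact PDE solution; this sketch only illustrates why a 20% observation window can suffice.

```python
def predict_equilibrium(v):
    """Predict the equilibrium value v_inf of an exponential transient
    v(t) = v_inf + (v0 - v_inf) * exp(-t / tau), given three equally
    spaced samples v = [v0, v1, v2]. The ratio of successive differences
    equals the per-interval decay factor r = exp(-dt / tau)."""
    r = (v[2] - v[1]) / (v[1] - v[0])
    # remaining rise is the geometric tail d2*r + d2*r^2 + ... = d2*r/(1-r)
    return v[2] + (v[2] - v[1]) * r / (1.0 - r)
```

    Real calorimeter data are noisy and not single-exponential, which is why the paper fits effective thermal properties to a physical conduction model instead of using a three-point formula.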

  10. Developed algorithm for the application of the British method of concrete mix design

    African Journals Online (AJOL)

    t-iyke

    Most of the methods of concrete mix design developed over the years were geared towards manual approach. ... Key words: Concrete mix design; British method; Manual Approach; Algorithm. ..... Statistics for Science and Engineering.

  11. Utilization of Ancillary Data Sets for Conceptual SMAP Mission Algorithm Development and Product Generation

    Science.gov (United States)

    O'Neill, P.; Podest, E.

    2011-01-01

    The planned Soil Moisture Active Passive (SMAP) mission is one of the first Earth observation satellites being developed by NASA in response to the National Research Council's Decadal Survey, Earth Science and Applications from Space: National Imperatives for the Next Decade and Beyond [1]. Scheduled to launch late in 2014, the proposed SMAP mission would provide high resolution and frequent revisit global mapping of soil moisture and freeze/thaw state, utilizing enhanced Radio Frequency Interference (RFI) mitigation approaches to collect new measurements of the hydrological condition of the Earth's surface. The SMAP instrument design incorporates an L-band radar (3 km) and an L-band radiometer (40 km) sharing a single 6-meter rotating mesh antenna to provide measurements of soil moisture and landscape freeze/thaw state [2]. These observations would (1) improve our understanding of linkages between the Earth's water, energy, and carbon cycles, (2) benefit many application areas including numerical weather and climate prediction, flood and drought monitoring, agricultural productivity, human health, and national security, (3) help to address priority questions on climate change, and (4) potentially provide continuity with brightness temperature and soil moisture measurements from ESA's SMOS (Soil Moisture Ocean Salinity) and NASA's Aquarius missions. In the planned SMAP mission prelaunch time frame, baseline algorithms are being developed for generating (1) soil moisture products both from radiometer measurements on a 36 km grid and from combined radar/radiometer measurements on a 9 km grid, and (2) freeze/thaw products from radar measurements on a 3 km grid. These retrieval algorithms need a variety of global ancillary data, both static and dynamic, to run the retrieval models, constrain the retrievals, and provide flags for indicating retrieval quality. The choice of which ancillary dataset to use for a particular SMAP product would be based on a number of factors

  12. Pre-School Attendance and Child Development

    DEFF Research Database (Denmark)

    Bauchmüller, Robert; Gørtz, Mette; Rasmussen, Astrid Würtz

    Earlier research suggests that children's development is shaped in their early years of life. This paper examines whether differences in day-care experiences during pre-school age are important for children's cognitive and language development at the age of 15. The analysis is based on class...... performance at the end of elementary schooling. We assess the effects of attended types and qualities of day-care institutions on various child outcomes as measured by school grades in mathematics, science, English and Danish for the whole Danish population as well as outcomes from the 2006 PISA Denmark...... survey and a 2007 PISA Copenhagen survey. We use administrative registries to generate indicators such as child-staff ratios, child-pedagogues ratios, and the share of male staff and of staff with non-Danish origins. Furthermore, we use information on the average levels of educational attainments...

  13. Low-Cost Launch Systems for the Dual-Launch Concept

    National Research Council Canada - National Science Library

    Pearson, Jerone; Zukauskas, Wally; Weeks, Thomas; Cass, Stein; Stytz, Martin

    2000-01-01

    .... Performing fewer engine tests, designing structures with lower structural margins, parallel processing, eliminating payload clean room requirements and extensive testing before launch, horizontal...

  14. Towards an integrated blueprint for climate and development

    Energy Technology Data Exchange (ETDEWEB)

    Hourcade, J.C.

    2002-05-01

    This paper aims at launching our exchange about the climate–development issue. I venture to encompass the many facets of this debate in order not to preclude discussion by pre-organizing these facets into too strict a hierarchy. I also raise questions about the perception of the issues by developed and developing countries, while providing elements of a dispassionate analysis, which may be an oxymoron in these matters. (author)

  15. The Texas Medication Algorithm Project (TMAP) schizophrenia algorithms.

    Science.gov (United States)

    Miller, A L; Chiles, J A; Chiles, J K; Crismon, M L; Rush, A J; Shon, S P

    1999-10-01

    In the Texas Medication Algorithm Project (TMAP), detailed guidelines for medication management of schizophrenia and related disorders, bipolar disorders, and major depressive disorders have been developed and implemented. This article describes the algorithms developed for medication treatment of schizophrenia and related disorders. The guidelines recommend a sequence of medications and discuss dosing, duration, and switch-over tactics. They also specify response criteria at each stage of the algorithm for both positive and negative symptoms. The rationale and evidence for each aspect of the algorithms are presented.

  16. Dynamic airspace configuration by genetic algorithm

    Directory of Open Access Journals (Sweden)

    Marina Sergeeva

    2017-06-01

    Full Text Available With continuous air traffic growth and limited resources, there is a need to reduce congestion in airspace systems. Several projects have been launched that aim at modernizing the global air transportation system and air traffic management. In recent years, special interest has been paid to the solution of the dynamic airspace configuration problem. Airspace sector configurations need to be dynamically adjusted to provide maximum efficiency and flexibility in response to changing weather and traffic conditions. The main objective of this work is to automatically adapt airspace configurations according to the evolution of traffic. To reach this objective, the airspace is considered to be divided into predefined 3D airspace blocks which have to be grouped or ungrouped depending on the traffic situation. The airspace structure is represented as a graph, and each airspace configuration is created using a graph partitioning technique. We optimize airspace configurations using a genetic algorithm. The developed algorithm generates a sequence of sector configurations for one day of operation with minimized controller workload. The overall methodology is implemented and successfully tested with air traffic data taken for one day and for several different airspace control areas of Europe.
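
    A toy genetic algorithm for the partitioning step might look as follows: blocks carry controller workloads, a chromosome assigns each block to a sector, and elitist selection plus point mutation minimize the heaviest sector's load. Connectivity constraints, crossover and the real workload model are omitted; all names and parameters are illustrative.

```python
import random

def ga_partition(workloads, k=2, pop=30, gens=200, seed=1):
    """Toy GA: assign airspace blocks (with given controller workloads)
    to k sectors so the heaviest sector is as light as possible."""
    rng = random.Random(seed)
    n = len(workloads)

    def cost(assign):                    # max sector workload (to minimize)
        totals = [0.0] * k
        for block, sector in enumerate(assign):
            totals[sector] += workloads[block]
        return max(totals)

    population = [[rng.randrange(k) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=cost)            # elitist selection
        survivors = population[:pop // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(n)] = rng.randrange(k)   # point mutation
            children.append(child)
        population = survivors + children
    return min(population, key=cost)

best = ga_partition([5, 1, 4, 2, 3, 3], k=2)
```

    The real problem additionally requires each sector to be a connected group of blocks, which is where the graph representation and partitioning technique enter.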

  17. Invariance algorithms for processing NDE signals

    Science.gov (United States)

    Mandayam, Shreekanth; Udpa, Lalita; Udpa, Satish S.; Lord, William

    1996-11-01

    Signals that are obtained in a variety of nondestructive evaluation (NDE) processes capture information not only about the characteristics of the flaw, but also reflect variations in the specimen's material properties. Such signal changes may be viewed as anomalies that could obscure defect related information. An example of this situation occurs during in-line inspection of gas transmission pipelines. The magnetic flux leakage (MFL) method is used to conduct noninvasive measurements of the integrity of the pipe-wall. The MFL signals contain information both about the permeability of the pipe-wall and the dimensions of the flaw. Similar operational effects can be found in other NDE processes. This paper presents algorithms to render NDE signals invariant to selected test parameters, while retaining defect related information. Wavelet transform based neural network techniques are employed to develop the invariance algorithms. The invariance transformation is shown to be a necessary pre-processing step for subsequent defect characterization and visualization schemes. Results demonstrating the successful application of the method are presented.

  18. Materials in NASA's Space Launch System: The Stuff Dreams are Made of

    Science.gov (United States)

    May, Todd A.

    2012-01-01

    Mr. Todd May, Program Manager for NASA's Space Launch System, will showcase plans and progress on the nation's new super-heavy-lift launch vehicle, which is on track for a first flight to launch an Orion Multi-Purpose Crew Vehicle around the Moon in 2017. Mr. May's keynote address will share NASA's vision for future human and scientific space exploration and how SLS will advance those plans. Using new, in-development, and existing assets from the Space Shuttle and other programs, SLS will provide safe, affordable, and sustainable space launch capabilities for exploration payloads starting at 70 metric tons (t) and evolving through 130 t for entirely new deep-space missions. Mr. May will also highlight the impact of material selection, development, and manufacturing as they contribute to reducing risk and cost while simultaneously supporting the nation's exploration goals.

  19. Pre-Service Physics Teachers' Views on Designing and Developing Physics Digital Stories

    OpenAIRE

    Kocakaya, Serhat; Karakoyun, Ferit; Kotluk, Nihat

    2016-01-01

    The aim of this study is to determine pre-service physics teachers' views on the effect of designing and developing physics digital stories (DST) on improving their 21st-century skills. The study is qualitative research carried out with 13 pre-service physics teachers who participated in a six-week course on designing and developing DST at Yuzuncu Yil University, Turkey, in the spring term of the 2013-2014 academic year. Data were collected using semi-structured interview…

  20. Development of information preserving data compression algorithm for CT images

    International Nuclear Information System (INIS)

    Kobayashi, Yoshio

    1989-01-01

    Although digital imaging techniques in radiology are developing rapidly, problems arise in the archival storage and communication of image data. This paper reports on a new information-preserving data compression algorithm for computed tomographic (CT) images. The algorithm consists of the following five processes: 1. Pixels surrounding the human body showing CT values smaller than -900 H.U. are eliminated. 2. Each pixel is encoded by its numerical difference from its neighboring pixel along a matrix line. 3. Difference values are encoded by a newly designed code rather than the natural binary code. 4. Image data obtained with the above process are decomposed into bit planes. 5. The bit-state transitions in each bit plane are encoded by run-length coding. Using this new algorithm, the compression ratios of brain, chest, and abdomen CT images are 4.49, 4.34, and 4.40, respectively. (author)
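
    Steps 2 and 5 of the scheme (difference encoding along a matrix line, then run-length coding) can be sketched as below. The CT-value thresholding, custom variable-length code and bit-plane decomposition of steps 1, 3 and 4 are omitted, so this is lossless but far simpler than the paper's pipeline.

```python
def compress(row):
    """Delta-encode one scan line, then run-length encode the deltas.
    CT rows vary slowly, so the deltas cluster into long equal runs."""
    deltas = [row[0]] + [b - a for a, b in zip(row, row[1:])]
    runs, prev, count = [], deltas[0], 1
    for d in deltas[1:]:
        if d == prev:
            count += 1
        else:
            runs.append((prev, count))
            prev, count = d, 1
    runs.append((prev, count))
    return runs

def decompress(runs):
    """Invert run-length coding, then cumulatively sum the deltas."""
    deltas = [v for v, n in runs for _ in range(n)]
    row, acc = [], 0
    for d in deltas:
        acc += d
        row.append(acc)
    return row
```

    Because every step is exactly invertible, the scheme is information preserving in the sense the record requires: `decompress(compress(row))` reproduces the input bit for bit.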

  1. Launch vehicle tracking enhancement through Global Positioning System Metric Tracking

    Science.gov (United States)

    Moore, T. C.; Li, Hanchu; Gray, T.; Doran, A.

    United Launch Alliance (ULA) initiated operational flights of both the Atlas V and Delta IV launch vehicle families in 2002. The Atlas V and Delta IV launch vehicles were developed jointly with the US Air Force (USAF) as part of the Evolved Expendable Launch Vehicle (EELV) program. Both Launch Vehicle (LV) families have provided 100% mission success since their respective inaugural launches and demonstrated launch capability from both Vandenberg Air Force Base (VAFB) on the Western Test Range and Cape Canaveral Air Force Station (CCAFS) on the Eastern Test Range. However, the current EELV fleet communications, tracking and control architecture and technology, which date back to the origins of the space launch business, require support by a large and high-cost ground footprint. The USAF has embarked on an initiative known as the Future Flight Safety System (FFSS) that will significantly reduce Test Range Operations and Maintenance (O&M) cost by closing facilities and decommissioning ground assets. In support of the FFSS, a Global Positioning System Metric Tracking (GPS MT) System based on the Global Positioning System (GPS) satellite constellation has been developed for EELV, which will allow both Ranges to divest some of their radar assets. The Air Force, ULA and Space Vector have flown the first 2 Atlas certification vehicles, demonstrating the successful operation of the GPS MT System. The first Atlas V certification flight was completed in February 2012 from CCAFS, the second Atlas V certification flight from VAFB was completed in September 2012, and the third certification flight, on a Delta IV, was completed in October 2012 from CCAFS. The GPS MT System will provide precise LV position, velocity and timing information that can replace ground radar tracking resource functionality. The GPS MT system will provide an independent position/velocity S-Band telemetry downlink to support the current man-in-the-loop ground-based commanded destruct of an anomalous flight. The system

  2. Sentinel-1A - Launching the first satellite and launching the operational Copernicus programme

    Science.gov (United States)

    Aschbacher, Josef; Milagro Perez, Maria Pilar

    2014-05-01

    The first Copernicus satellite, Sentinel-1A, is being prepared for launch in April 2014. It will provide continuous, systematic and highly reliable radar images of the Earth. Sentinel-1B will follow around 18 months later to increase observation frequency and establish an operational system. Sentinel-1 is designed to work in a pre-programmed, conflict-free operation mode, ensuring the reliability required by operational services and creating a consistent long-term data archive for applications based on long time series. This mission will ensure the continuation and improvement of SAR operational services and applications, addressing primarily medium- to high-resolution applications through a main mode of operation that features both a wide swath (250 km) and high geometric (5 × 20 m) and radiometric resolution, allowing imaging of global landmasses, coastal zones, sea ice, polar areas, and shipping routes at high resolution. The Sentinel-1 main operational mode (Interferometric Wide Swath) will provide complete coverage of the Earth every 6 days in the operational configuration, once the two Sentinel-1 spacecraft are in orbit simultaneously. High-priority areas like Europe, Canada and some shipping routes will be covered almost daily. This high global observation frequency is unprecedented and cannot be reached with any other current radar mission. Envisat, for example, which was the 'workhorse' in this domain up to April 2012, reached global coverage every 35 days. Sentinel-1 data products will be made available systematically and free of charge to all users, including institutional users, the general public, and scientific and commercial users. The transition of the Copernicus programme from the development to the operational phase will take place at about the same time as the launch of the first Sentinel-1 satellite. During the operational phase, funding of the programme will come from the European Union Multiannual Financial Framework (MFF) for the years 2014

  3. Primarily Statistics: Developing an Introductory Statistics Course for Pre-Service Elementary Teachers

    Science.gov (United States)

    Green, Jennifer L.; Blankenship, Erin E.

    2013-01-01

    We developed an introductory statistics course for pre-service elementary teachers. In this paper, we describe the goals and structure of the course, as well as the assessments we implemented. Additionally, we use example course work to demonstrate pre-service teachers' progress both in learning statistics and as novice teachers. Overall, the…

  4. Algorithms for the process management of sealed source brachytherapy

    International Nuclear Information System (INIS)

    Engler, M.J.; Ulin, K.; Sternick, E.S.

    1996-01-01

    Incidents and misadministrations suggest that brachytherapy may benefit from clarification of the quality management program and other mandates of the US Nuclear Regulatory Commission. To that end, flowcharts of step-by-step subprocesses were developed and formatted with dedicated software. The overall process was similarly organized in a complex flowchart termed a general process map. Procedural and structural indicators associated with each flowchart and map were critiqued, and pre-existing documentation was revised. "Step-regulation tables" were created to refer steps and subprocesses to Nuclear Regulatory Commission rules and recommendations in their sequences of applicability. Brachytherapy algorithms were specified as programmable, recursive processes, including therapeutic dose determination and monitoring of doses to the public. These algorithms are embodied in flowcharts and step-regulation tables. A general algorithm is suggested as a template from which other facilities may derive tools to facilitate process management of sealed source brachytherapy. 11 refs., 9 figs., 2 tabs

  5. Reusable Military Launch Systems (RMLS)

    Science.gov (United States)

    2008-02-01

    shown in Figure 11. The second configuration is an axisymmetric, rocket-based combined cycle (RBCC) powered, SSTO vehicle, similar to the GTX... McCormick, D., and Sorensen, K., "Hyperion: An SSTO Vision Vehicle Concept Utilizing Rocket-Based Combined Cycle Propulsion", AIAA paper 99-4944... there have been several failed attempts at the development of reusable rocket or air-breathing launch vehicle systems. Single-stage-to-orbit (SSTO

  6. Reusable launch vehicle facts and fantasies

    Science.gov (United States)

    Kaplan, Marshall H.

    2002-01-01

    Many people refuse to address many of the realities of reusable launch vehicle systems, technologies, operations and economics. Basic principles of physics, space flight operations, and business limitations are applied to the creation of a practical vision of future expectations. While reusable launcher concepts have been proposed for several decades, serious review of potential designs began in the mid-1990s, when NASA decided that a Space Shuttle replacement had to be pursued. A great deal of excitement and interest was quickly generated by the prospect of "orders-of-magnitude" reduction in launch costs. The potential for a vastly expanded space program motivated the entire space community. By the late 1990s, and after over one billion dollars had been spent on technology development and privately funded concepts, it had become clear that there would be no new, near-term operational reusable vehicle. Many factors contributed to a very expensive and disappointing effort to create a new generation of launch vehicles. It began with overly optimistic projections of technology advancements and the belief that a greatly increased demand for satellite launches would be realized early in the 21st century. Contractors contributed to the perception of quickly reachable technology and business goals, thus accelerating the enthusiasm and helping to create a "gold rush" euphoria. Cost, schedule and performance margins were all highly optimistic. Several entrepreneurs launched start-up companies to take advantage of the excitement and the availability of investor capital. Millions were raised from private investors and venture capitalists, based on little more than flashy presentations and animations. Well over $500 million were raised by little-known start-up groups to create reusable systems that might compete for the coming market in launch services. By 1999, it was clear that market projections, made just two years earlier, were not going to be realized.
Investors

  7. The development of controller and navigation algorithm for underwater wall crawler

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Hyung Suck; Kim, Kyung Hoon; Kim, Min Young [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    1999-01-01

    In this project, the control system of an underwater robotic vehicle (URV) for underwater wall inspection in nuclear reactor pools and related facilities has been developed. The following four sub-projects were studied: (1) development of the controller and motor driver for the URV; (2) development of the control algorithm for tracking control of the URV; (3) development of the localization system; (4) underwater experiments with the developed system. First, the dynamic characteristics of the thruster with the DC servo-motor were analyzed experimentally. Second, the controller board using the INTEL 80C196 was designed and constructed, and the software for communication and motor control was developed. Third, the PWM motor driver was developed. Fourth, the localization system using the laser scanner and inclinometer was developed and tested in the pool. Fifth, the dynamics of the URV were studied and proper control algorithms for the URV were proposed. Lastly, validation of the integrated system was performed experimentally. (author). 27 refs., 51 figs., 8 tabs.
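    The abstract does not spell out the tracking-control law; a discrete PID loop, a common baseline for thruster-driven tracking control, gives a feel for what such an algorithm involves (the gains and the toy depth plant below are illustrative, not the authors' values):

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        # no derivative kick on the very first sample
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Toy plant: depth rate proportional to thruster command (a pure integrator).
pid = PID(kp=2.0, ki=0.1, kd=0.05, dt=0.05)
depth = 0.0
for _ in range(400):
    u = pid.step(1.0, depth)   # track a 1 m depth setpoint
    depth += u * pid.dt
```

    In the real vehicle the plant would be the thruster-URV dynamics identified in the first sub-project, and the gains would be tuned accordingly.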

  8. Peer Review of Launch Environments

    Science.gov (United States)

    Wilson, Timmy R.

    2011-01-01

    Catastrophic failures of launch vehicles during launch and ascent are currently modeled using equivalent trinitrotoluene (TNT) estimates. This approach tends to over-predict the blast effect with subsequent impact to launch vehicle and crew escape requirements. Bangham Engineering, located in Huntsville, Alabama, assembled a less-conservative model based on historical failure and test data coupled with physical models and estimates. This white paper summarizes NESC's peer review of the Bangham analytical work completed to date.

  9. An ultrafast line-by-line algorithm for calculating spectral transmittance and radiance

    International Nuclear Information System (INIS)

    Tan, X.

    2013-01-01

    An ultrafast line-by-line algorithm for calculating spectral transmittance and radiance of gases is presented. The algorithm is based on fast convolution of the Voigt line profile using Fourier transforms and a binning technique. The algorithm breaks a radiative transfer calculation into two steps: a one-time pre-computation step, in which a set of pressure-independent coefficients is computed from the spectral line information; and a normal calculation step, in which the Fourier transform coefficients of the optical depth are calculated from the line-of-sight information and the coefficients pre-computed in the first step, the optical depth is then obtained by an inverse Fourier transform, and the spectral transmittance and radiance are calculated. The algorithm is significantly faster than line-by-line algorithms that do not employ special speedup techniques, by a factor of 10³–10⁶. A case study of the 2.7 μm band of H₂O vapor is presented. -- Highlights: •An ultrafast line-by-line model based on FFT and a binning technique is presented. •Computationally expensive calculations are factored out into a pre-computation step. •It is 10³–10⁸ times faster than LBL algorithms that do not employ speedup techniques. •Good agreement with experimental data for the 2.7 μm band of H₂O
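    The two-step structure described above can be sketched in NumPy (a Lorentzian stands in for the Voigt profile, and the band, line, and grid values are hypothetical):

```python
import numpy as np

def optical_depth_fft(line_centers, line_strengths, grid, hwhm):
    """Bin line strengths onto a uniform grid, then convolve with a
    normalized line-shape kernel via FFT (circular convolution)."""
    dv = grid[1] - grid[0]
    n = grid.size
    # binning step: deposit each line's strength into its grid cell
    binned = np.zeros(n)
    idx = np.clip(np.searchsorted(grid, line_centers), 0, n - 1)
    np.add.at(binned, idx, line_strengths)
    # Lorentzian kernel centered on the grid midpoint, ~unit discrete area
    offset = grid - grid[n // 2]
    kernel = (hwhm / np.pi) / (offset**2 + hwhm**2) * dv
    # FFT convolution; roll the kernel so its peak acts as lag zero
    tau = np.fft.ifft(np.fft.fft(binned / dv) * np.fft.fft(np.roll(kernel, -(n // 2)))).real
    return tau

grid = np.linspace(3600.0, 3700.0, 4001)   # wavenumber grid (cm^-1), hypothetical
tau = optical_depth_fft(np.array([3650.0]), np.array([1.0]), grid, hwhm=0.5)
transmittance = np.exp(-tau)
```

    In the full algorithm the pressure-independent pieces of this computation would be pre-computed once and reused across lines of sight.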

  10. Application of statistical distribution theory to launch-on-time for space construction logistic support

    Science.gov (United States)

    Morgenthaler, George W.

    1989-01-01

    The ability to launch-on-time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space programs, particularly the need for highly reliable logistic support of space construction and the subsequent planned operation of space stations, large unmanned space structures, lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches, have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center for Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis will include development of a better understanding of launch-on-time capability and simulation of required support systems for vehicle assembly and launch which are necessary to support national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles. The data have been supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models which describe these historical data and that can be used for several purposes, such as: providing inputs to broader simulations of launch vehicle logistic support of space construction, and determining which launch operations sources cause the majority of the unscheduled 'holds', thereby suggesting changes which might improve launch-on-time. In particular, the paper investigates the ability of a compound distribution probability model to fit actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
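    The kind of model comparison the paper describes can be illustrated with SciPy: fit candidate distributions to hold durations and compare their log-likelihoods (the data below are synthetic stand-ins, not the historical launch records):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical unscheduled-hold durations (minutes): many short holds plus a
# heavy tail of long ones, mimicking a compound (mixture) process.
holds = np.concatenate([rng.exponential(5.0, 400), rng.exponential(60.0, 100)])

# Candidate models: a single exponential vs. a heavier-tailed gamma.
expon_params = stats.expon.fit(holds, floc=0)
gamma_params = stats.gamma.fit(holds, floc=0)

# Compare by log-likelihood (higher is better); an information criterion such
# as AIC would add a penalty for the gamma's extra shape parameter.
ll_expon = stats.expon.logpdf(holds, *expon_params).sum()
ll_gamma = stats.gamma.logpdf(holds, *gamma_params).sum()
```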

  11. Launch Pad Coatings for Smart Corrosion Control

    Science.gov (United States)

    Calle, Luz M.; Hintze, Paul E.; Bucherl, Cori N.; Li, Wenyan; Buhrow, Jerry W.; Curran, Jerome P.; Whitten, Mary C.

    2010-01-01

    Corrosion is the degradation of a material as a result of its interaction with the environment. The environment at the KSC launch pads has been documented by ASM International (formerly the American Society for Metals) as the most corrosive in the US. The 70 tons of highly corrosive hydrochloric acid generated by the solid rocket boosters during a launch exacerbate the corrosiveness of the environment at the pads. Numerous failures at the pads are caused by the pitting of stainless steels, rebar corrosion, and the degradation of concrete. Corrosion control of launch pad structures relies on the use of coatings selected from the qualified products list (QPL) of NASA Standard 5008A for Protective Coating of Carbon Steel, Stainless Steel, and Aluminum on Launch Structures, Facilities, and Ground Support Equipment. This standard was developed to establish uniform engineering practices and methods and to ensure the inclusion of essential criteria in the coating of ground support equipment (GSE) and facilities used by or for NASA. It is applicable to GSE and facilities that support space vehicle or payload programs or projects and to critical facilities at all NASA locations worldwide. Environmental regulation changes have dramatically reduced the production, handling, use, and availability of conventional protective coatings for application to KSC launch structures and ground support equipment. The current attrition rate of qualified KSC coatings will drastically limit the number of commercial off-the-shelf (COTS) products available for Constellation Program (CxP) ground operations (GO). CxP GO identified corrosion detection and control technologies as a critical initial capability technology need for ground processing of Ares I and Ares V to meet Constellation Architecture Requirements Document (CARD) CxP 70000 operability requirements for reduced ground processing complexity, streamlined integrated testing, and operations-phase affordability

  12. Correlation signatures of wet soils and snows. [algorithm development and computer programming

    Science.gov (United States)

    Phillips, M. R.

    1972-01-01

    Interpretation, analysis, and algorithm development have provided the necessary computational programming tools for soil data processing, data handling, and analysis. The algorithms developed thus far are adequate and have proven successful in several preliminary and fundamental applications, such as software interfacing, probability distributions, grey-level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration and ground scene classification. A description of an Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control, is provided.

  13. NASA Space Technology Draft Roadmap Area 13: Ground and Launch Systems Processing

    Science.gov (United States)

    Clements, Greg

    2011-01-01

    This slide presentation reviews the technology development roadmap for the area of ground and launch systems processing. The scope of this technology area includes: (1) Assembly, integration, and processing of the launch vehicle, spacecraft, and payload hardware (2) Supply chain management (3) Transportation of hardware to the launch site (4) Transportation to and operations at the launch pad (5) Launch processing infrastructure and its ability to support future operations (6) Range, personnel, and facility safety capabilities (7) Launch and landing weather (8) Environmental impact mitigations for ground and launch operations (9) Launch control center operations and infrastructure (10) Mission integration and planning (11) Mission training for both ground and flight crew personnel (12) Mission control center operations and infrastructure (13) Telemetry and command processing and archiving (14) Recovery operations for flight crews, flight hardware, and returned samples. This technology roadmap also identifies ground, launch and mission technologies that will: (1) Dramatically transform future space operations, with significant improvement in life-cycle costs (2) Improve the quality of life on earth, while exploring in co-existence with the environment (3) Increase reliability and mission availability using low/zero maintenance materials and systems, comprehensive capabilities to ascertain and forecast system health/configuration, data integration, and the use of advanced/expert software systems (4) Enhance methods to assess safety and mission risk posture, which would allow for timely and better decision making. Several key technologies are identified, with a couple of slides devoted to one of these technologies (i.e., corrosion detection and prevention). 
Development of these technologies can enhance life on Earth and have a major impact on how we access space, eventually making routine commercial space access possible and improving building, manufacturing, and weather

  14. Development of Pre-Service Physics Teachers' Pedagogical Content Knowledge (PCK) throughout Their Initial Training

    Science.gov (United States)

    Karal, Isik Saliha; Alev, Nedim

    2016-01-01

    The purpose of this study was to investigate the development of pre-service physics teachers' pedagogical content knowledge (PCK) on the subject of electricity and magnetism after their completion of physics and mathematics courses. A descriptive longitudinal developmental study was carried out with 13 pre-service teachers (PTs) who completed…

  15. The Next Great Ship: NASA's Space Launch System

    Science.gov (United States)

    May, Todd A.

    2013-01-01

    Topics covered include: Most Capable U.S. Launch Vehicle; Liquid engines Progress; Boosters Progress; Stages and Avionics Progress; Systems Engineering and Integration Progress; Spacecraft and Payload Integration Progress; Advanced Development Progress.

  16. WinePeer - A Pre-Launch Strategic Analysis

    OpenAIRE

    Lee, Larry; McLeod, Kevin; Renke, Martin

    2010-01-01

    WinePeer is a mobile application that enables wine consumers to rate wines in 60 seconds for the purposes of developing an evolving taste profile with the potential to be leveraged in many different ways. This work determines the viability of WinePeer as a business venture through providing a comprehensive analysis of the external environment including the wine industry supply chain, regulatory influences and global wine industry trends. Drawing on the work of Kim and Mauborgne, this analysis...

  17. Pre-service teachers’ blog reflections: Illuminating their growth and development

    Directory of Open Access Journals (Sweden)

    Rubén Garza

    2015-12-01

    Blogging, a mode of electronic journaling, has been identified as an effective means of helping pre-service teachers construct meaning from their experiences. The purpose of this study was to examine pre-service teachers' reflections on their praxis through blogging and to describe the nature of their growth and development. The use of reflective writing through blogging helped to identify three themes that emerged from the data: (1) validation, (2) prescriptive, and (3) self-assessment. Our findings suggest that blogging facilitated a community of learners that provided support and encouragement. Pre-service teachers' reflections revealed a focus on the mechanistic aspects of teaching without critical examination of what was observed. However, our findings also suggest that structuring reflective thinking through blogging has the potential to foster a nascent understanding of teaching and learning.

  18. Enhancement and evaluation of an algorithm for atmospheric profiling continuity from Aqua to Suomi-NPP

    Science.gov (United States)

    Lipton, A.; Moncet, J. L.; Payne, V.; Lynch, R.; Polonsky, I. N.

    2017-12-01

    We will present recent results from an algorithm for producing climate-quality atmospheric profiling earth system data records (ESDRs) for application to data from hyperspectral sounding instruments, including the Atmospheric InfraRed Sounder (AIRS) on EOS Aqua and the Cross-track Infrared Sounder (CrIS) on Suomi-NPP, along with their companion microwave sounders, AMSU and ATMS, respectively. The ESDR algorithm uses an optimal estimation approach and the implementation has a flexible, modular software structure to support experimentation and collaboration. Data record continuity benefits from the fact that the same algorithm can be applied to different sensors, simply by providing suitable configuration and data files. Developments to be presented include the impact of a radiance-based pre-classification method for the atmospheric background. In addition to improving retrieval performance, pre-classification has the potential to reduce the sensitivity of the retrievals to the climatological data from which the background estimate and its error covariance are derived. We will also discuss evaluation of a method for mitigating the effect of clouds on the radiances, and enhancements of the radiative transfer forward model.
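    The optimal-estimation core of such a retrieval can be sketched as a single linear update (all matrices below are illustrative toy values, not the actual AIRS/CrIS operators):

```python
import numpy as np

def oe_step(y, K, x_a, S_a, S_e):
    """Linear optimal-estimation (MAP) update:
    x_hat = x_a + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - K x_a)."""
    Se_inv = np.linalg.inv(S_e)
    S_hat = np.linalg.inv(K.T @ Se_inv @ K + np.linalg.inv(S_a))   # posterior covariance
    x_hat = x_a + S_hat @ K.T @ Se_inv @ (y - K @ x_a)
    return x_hat, S_hat

# Toy retrieval: a 3-level "profile" from 5 "channels".
rng = np.random.default_rng(0)
K = rng.standard_normal((5, 3))          # Jacobian of the forward model
x_true = np.array([250.0, 260.0, 270.0])
x_a = np.array([240.0, 255.0, 275.0])    # background (a priori) profile
y = K @ x_true                           # noise-free synthetic radiances
x_hat, S_hat = oe_step(y, K, x_a, np.eye(3) * 100.0, np.eye(5) * 1e-6)
```

    A radiance-based pre-classification, as described above, would amount to choosing the background x_a and its covariance S_a per scene class before this update.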

  19. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    Science.gov (United States)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience only ran on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to access. The current code works in the Unity game engine, which has cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project could not be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.
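    The abstract does not say which procedural technique was chosen; fractal Brownian motion over value noise is a common way to generate cloud textures and conveys the idea (pure NumPy sketch; the actual implementation would live in an AGEA plug-in or a Unity script):

```python
import numpy as np

def _bilinear(coarse, size):
    """Bilinearly upsample an (n+1)x(n+1) lattice to a size x size image."""
    n = coarse.shape[0] - 1
    t = np.linspace(0.0, n, size)
    i = np.minimum(t.astype(int), n - 1)
    f = t - i
    top = coarse[np.ix_(i, i)] * (1 - f) + coarse[np.ix_(i, i + 1)] * f
    bot = coarse[np.ix_(i + 1, i)] * (1 - f) + coarse[np.ix_(i + 1, i + 1)] * f
    return top * (1 - f)[:, None] + bot * f[:, None]

def fbm_clouds(size=128, octaves=5, seed=0):
    """Sum octaves of smoothed noise: frequency doubles, amplitude halves."""
    rng = np.random.default_rng(seed)
    img = np.zeros((size, size))
    amp, freq = 1.0, 4
    for _ in range(octaves):
        img += amp * _bilinear(rng.random((freq + 1, freq + 1)), size)
        amp *= 0.5
        freq *= 2
    return (img - img.min()) / (img.max() - img.min())   # normalize to [0, 1]

clouds = fbm_clouds(128)   # grayscale cloud-density texture
```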

  20. Protein Expression Landscape of Mouse Embryos during Pre-implantation Development

    Directory of Open Access Journals (Sweden)

    Yawei Gao

    2017-12-01

    Pre-implantation embryo development is an intricate and precisely regulated process orchestrated by maternally inherited proteins and newly synthesized proteins following zygotic genome activation. Although genomic and transcriptomic studies have enriched our understanding of the genetic programs underlying this process, the protein expression landscape remains unexplored. Using quantitative mass spectrometry, we identified nearly 5,000 proteins from 8,000 mouse embryos at each stage (zygote, 2-cell, 4-cell, 8-cell, morula, and blastocyst). We found that protein expression in zygotes, morulas, and blastocysts is distinct from that in 2- to 8-cell embryos. Analysis of protein phosphorylation identified critical kinases and signal transduction pathways. We highlight key factors and their important roles in embryo development. Combined analysis of transcriptomic and proteomic data reveals coordinated control of RNA degradation, transcription, and translation and identifies previously undefined exon-junction-derived peptides. Our study provides an invaluable resource for further mechanistic studies and suggests core factors regulating pre-implantation embryo development.

  1. Development of transmission dose estimation algorithm for in vivo dosimetry in high energy radiation treatment

    International Nuclear Information System (INIS)

    Yun, Hyong Geun; Shin, Kyo Chul; Hun, Soon Nyung; Woo, Hong Gyun; Ha, Sung Whan; Lee, Hyoung Koo

    2004-01-01

    In vivo dosimetry is very important for quality assurance in high-energy radiation treatment. Measurement of transmission dose is a new method of in vivo dosimetry that is noninvasive and easy to perform daily. The aim of this study was to develop a tumor dose estimation algorithm using measured transmission dose for open radiation fields. For basic beam data, transmission dose was measured for various field sizes (FS) of square radiation fields, phantom thicknesses (Tp), and phantom-chamber distances (PCD) with an acrylic phantom for 6 MV and 10 MV X-rays. Source-to-chamber distance (SCD) was set to 150 cm. Measurement was conducted with a 0.6 cc Farmer-type ion chamber. By regression analysis of the measured basic beam data, a transmission dose estimation algorithm was developed. The accuracy of the algorithm was tested with a flat solid phantom of various thicknesses in various settings of rectangular fields and various PCDs. In the developed algorithm, transmission dose is expressed as a quadratic function of log(A/P) (where A/P is the area-perimeter ratio), and the coefficients of the quadratic function are in turn expressed as third-degree functions of PCD. The developed algorithm could estimate the radiation dose with errors within ±0.5% for open square fields and within ±1.0% for open elongated radiation fields. The developed algorithm can accurately estimate transmission dose in open radiation fields for various treatment settings in high-energy radiation treatment. (author)
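    The two-stage regression described above can be sketched with NumPy; the dose values below are synthetic stand-ins for the measured beam data, chosen only to show the fitting structure:

```python
import numpy as np

log_ap = np.log(np.linspace(1.0, 10.0, 20))       # log(area/perimeter)
pcd_values = np.array([10.0, 20.0, 30.0, 40.0])   # phantom-chamber distances, cm

# Stage 1: for each PCD, fit transmission dose as a quadratic in log(A/P).
coeffs = []
for pcd in pcd_values:
    dose = 1.0 + 0.05 * pcd + 0.2 * log_ap + 0.03 * log_ap**2   # synthetic data
    coeffs.append(np.polyfit(log_ap, dose, deg=2))              # [a2, a1, a0]
coeffs = np.array(coeffs)                                       # shape (n_pcd, 3)

# Stage 2: model each quadratic coefficient as a cubic polynomial in PCD.
coeff_models = [np.polyfit(pcd_values, coeffs[:, k], deg=3) for k in range(3)]

def estimated_dose(log_ap_value, pcd):
    a2, a1, a0 = (np.polyval(m, pcd) for m in coeff_models)
    return a2 * log_ap_value**2 + a1 * log_ap_value + a0
```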

  2. jClustering, an open framework for the development of 4D clustering algorithms.

    Directory of Open Access Journals (Sweden)

    José María Mateos-Pérez

    We present jClustering, an open framework for the design of clustering algorithms in dynamic medical imaging. We developed this tool because of the difficulty involved in manually segmenting dynamic PET images and the lack of availability of source code for published segmentation algorithms. Providing an easily extensible open tool encourages publication of source code to facilitate the process of comparing algorithms and provides interested third parties with the opportunity to review code. The internal structure of the framework allows an external developer to implement new algorithms easily and quickly, focusing only on the particulars of the method being implemented and not on image data handling and preprocessing. This tool has been coded in Java and is presented as an ImageJ plugin in order to take advantage of all the functionalities offered by this imaging analysis platform. Both binary packages and source code have been published, the latter under a free software license (GNU General Public License) to allow modification if necessary.
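    The kind of 4D algorithm jClustering hosts can be shown in miniature: k-means over voxel time-activity curves of a dynamic (3D + time) image. The framework itself is Java/ImageJ; this NumPy sketch only illustrates the idea:

```python
import numpy as np

def kmeans_tacs(image4d, k=2, iters=20):
    """Cluster voxels of an (x, y, z, t) image by their time-activity curves
    using k-means with deterministic farthest-point initialization."""
    x, y, z, t = image4d.shape
    tacs = image4d.reshape(-1, t).astype(float)
    # farthest-point initialization: start from voxel 0, add farthest voxels
    centers = [tacs[0]]
    for _ in range(1, k):
        d = np.min([((tacs - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(tacs[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        d = ((tacs[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = tacs[labels == j].mean(0)
    return labels.reshape(x, y, z)

# Tiny dynamic image with two distinct kinetic patterns.
img = np.zeros((2, 2, 1, 3))
img[0, :, 0, :] = [0.0, 1.0, 2.0]   # rising time-activity curve
img[1, :, 0, :] = [9.0, 9.0, 9.0]   # flat, high-uptake curve
labels = kmeans_tacs(img, k=2)
```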

  3. KSC facilities status and planned management operations. [for Shuttle launches

    Science.gov (United States)

    Gray, R. H.; Omalley, T. J.

    1979-01-01

    A status report is presented on facilities and planned operations at the Kennedy Space Center with reference to Space Shuttle launch activities. The facilities are essentially complete, with all new construction and modifications to existing buildings almost finished. Some activity is still in progress at Pad A and on the Mobile Launcher due to changes in requirements, but it is not expected to affect the launch schedule. The ground checkout equipment that will be used to test the flight hardware has been installed and is now in operation. The Launch Processing System is currently supporting the development of the applications software that will perform the testing of this flight hardware.

  4. Magnetic Launch Assist System Demonstration Test

    Science.gov (United States)

    2001-01-01

    Engineers at the Marshall Space Flight Center (MSFC) have been testing Magnetic Launch Assist Systems, formerly known as Magnetic Levitation (MagLev) technologies. To launch spacecraft into orbit, a Magnetic Launch Assist system would use magnetic fields to levitate and accelerate a vehicle along a track at a very high speed. Similar to high-speed trains and roller coasters that use high-strength magnets to lift and propel a vehicle a couple of inches above a guideway, the launch-assist system would electromagnetically drive a space vehicle along the track. A full-scale, operational track would be about 1.5 miles long and capable of accelerating a vehicle to 600 mph in 9.5 seconds. This photograph shows a subscale model of an airplane running on the experimental track at MSFC during the demonstration test. This track is an advanced linear induction motor. Induction motors are common in fans, power drills, and sewing machines. Instead of spinning in a circular motion to turn a shaft or gears, a linear induction motor produces thrust in a straight line. Mounted on concrete pedestals, the track is 100 feet long, about 2 feet wide, and about 1.5 feet high. The major advantages of launch assist for NASA launch vehicles are reduced takeoff weight, smaller landing gear and wings, and less propellant, resulting in significant cost savings. The US Navy and the British MOD (Ministry of Defense) are planning to use magnetic launch assist for their next-generation aircraft carriers as the aircraft launch system. The US Army is considering using this technology for launching target drones for anti-aircraft training.

  5. Performance and development for the Inner Detector Trigger algorithms at ATLAS

    CERN Document Server

    Penc, O; The ATLAS collaboration

    2014-01-01

    The performance of the ATLAS Inner Detector (ID) Trigger algorithms being developed for running on the ATLAS High Level Trigger (HLT) processor farm during Run 2 of the LHC is presented. During the 2013-14 LHC long shutdown, modifications are being carried out to the LHC accelerator to increase both the beam energy and luminosity. These modifications will pose significant challenges for the ID Trigger algorithms, both in terms of execution time and physics performance. To meet these challenges, the ATLAS HLT software is being restructured to run as a more flexible single-stage HLT, instead of two separate stages (Level 2 and Event Filter) as in Run 1. This will reduce the overall data volume that needs to be requested by the HLT system, since data will no longer need to be requested for each of the two separate processing stages. Development of the ID Trigger algorithms for Run 2, currently expected to be ready for detector commissioning near the end of 2014, is progressing well and the current efforts towards op...

  6. A similarity based agglomerative clustering algorithm in networks

    Science.gov (United States)

    Liu, Zhiyuan; Wang, Xiujuan; Ma, Yinghong

    2018-04-01

    The detection of clusters is beneficial for understanding the organization and function of networks. Clusters, or communities, are usually groups of nodes densely interconnected but sparsely linked with any other clusters. To identify communities, an efficient and effective agglomerative community detection algorithm based on node similarity is proposed. The proposed method initially calculates similarities between each pair of nodes and forms pre-partitions according to the principle that each node is in the same community as its most similar neighbor. After that, each pre-partition is checked against the community criterion. Pre-partitions that do not satisfy it are merged with the partitions to which they have the biggest attraction, until no further changes occur. To measure the attraction ability of a partition, we propose an attraction index based on the linked nodes' importance in the network. The proposed method can therefore better exploit the nodes' properties and the network's structure. To test the performance of our algorithm, both synthetic and empirical networks of different scales are tested. Simulation results show that the proposed algorithm obtains superior clustering results compared with six other widely used community detection algorithms.
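    The pre-partitioning step described above can be sketched in Python; Jaccard similarity of neighborhoods stands in for the paper's similarity measure (the abstract does not define it), and the attraction-based merging stage is omitted:

```python
def _jaccard(adj, u, v):
    """Similarity of two nodes: Jaccard index of their closed neighborhoods."""
    nu, nv = adj[u] | {u}, adj[v] | {v}
    return len(nu & nv) / len(nu | nv)

def pre_partition(edges):
    """Each node joins the community of its most similar neighbor."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    parent = {u: u for u in adj}
    def find(u):                       # union-find with path halving
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    for u in adj:
        best = max(adj[u], key=lambda v: _jaccard(adj, u, v))
        parent[find(u)] = find(best)   # union with the most similar neighbor
    communities = {}
    for u in adj:
        communities.setdefault(find(u), set()).add(u)
    return list(communities.values())

# Two triangles joined by a single bridge edge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
communities = pre_partition(edges)
```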

  7. Graph 500 on OpenSHMEM: Using a Practical Survey of Past Work to Motivate Novel Algorithmic Developments

    Energy Technology Data Exchange (ETDEWEB)

    Grossman, Max [Rice Univ., Houston, TX (United States); Pritchard Jr., Howard Porter [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Budimlic, Zoran [Rice Univ., Houston, TX (United States); Sarkar, Vivek [Rice Univ., Houston, TX (United States)

    2016-12-22

    Graph500 [14] is an effort to offer a standardized benchmark across large-scale distributed platforms that captures the behavior of common communication-bound graph algorithms. Graph500 differs from other large-scale benchmarking efforts (such as HPL [6] or HPGMG [7]) primarily in the irregularity of its computation and data-access patterns. The core computational kernel of Graph500 is a breadth-first search (BFS) implemented on an undirected graph. The output of Graph500 is a spanning tree of the input graph, usually represented by a predecessor mapping for every node in the graph. The Graph500 benchmark defines several pre-defined input sizes for implementers to test against. This report summarizes an investigation into implementing the Graph500 benchmark on OpenSHMEM, and focuses on first building a strong and practical understanding of the strengths and limitations of past work before proposing and developing novel extensions.
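    The Graph500 kernel in miniature, a BFS that returns the predecessor map the benchmark validates, can be written as follows (serial Python sketch; the report's OpenSHMEM versions distribute this loop across processing elements):

```python
from collections import deque

def bfs_predecessors(adj, root):
    """Level-synchronous BFS returning the parent of each reached vertex."""
    pred = {root: root}            # by convention, the root is its own parent
    frontier = deque([root])
    while frontier:
        u = frontier.popleft()
        for v in adj.get(u, ()):
            if v not in pred:      # first visit claims the vertex
                pred[v] = u
                frontier.append(v)
    return pred

# Small undirected graph; vertices 4 and 5 are unreachable from 0.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2], 4: [5], 5: [4]}
pred = bfs_predecessors(adj, 0)
```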

  8. Launch team training system

    Science.gov (United States)

    Webb, J. T.

    1988-01-01

    A new approach to the training, certification, recertification, and proficiency maintenance of the Shuttle launch team is proposed. Previous training approaches are first reviewed. Short-term program goals include expanding current training methods, improving the existing simulation capability, and scheduling training exercises with the same priority as hardware tests. Long-term goals include developing user requirements that would take advantage of state-of-the-art tools and techniques. Training requirements for the different groups of people to be trained are identified, and future goals are outlined.

  9. MODIS Science Algorithms and Data Systems Lessons Learned

    Science.gov (United States)

    Wolfe, Robert E.; Ridgway, Bill L.; Patt, Fred S.; Masuoka, Edward J.

    2009-01-01

    For almost 10 years, standard global products from NASA's Earth Observing System's (EOS) two Moderate Resolution Imaging Spectroradiometer (MODIS) sensors have been used worldwide for earth science research and applications. This paper discusses the lessons learned in developing the science algorithms and the data systems needed to produce these high-quality data products for the earth sciences community. Strong science team leadership and communication, an evolvable and scalable data system, and central coordination of QA and validation activities enabled the data system to grow by two orders of magnitude from the initial at-launch system to the current system, which is able to reprocess data from both the Terra and Aqua missions in less than a year. Many of the lessons learned from MODIS are already being applied to follow-on missions.

  10. Evidence-based algorithm for heparin dosing before cardiopulmonary bypass. Part 1: Development of the algorithm.

    Science.gov (United States)

    McKinney, Mark C; Riley, Jeffrey B

    2007-12-01

    The incidence of heparin resistance during adult cardiac surgery with cardiopulmonary bypass has been reported at 15%-20%. The consistent use of a clinical decision-making algorithm may increase the consistency of patient care and likely reduce the total required heparin dose and other problems associated with heparin dosing. After a directed survey of practicing perfusionists regarding treatment of heparin resistance and a literature search for high-level evidence regarding the diagnosis and treatment of heparin resistance, an evidence-based decision-making algorithm was constructed. The face validity of the algorithm's decisive steps and logic was confirmed by a second survey of practicing perfusionists. The algorithm begins with a review of the patient history to identify predictors for heparin resistance. The definition of heparin resistance contained in the algorithm is an activated clotting time that remains below target after a 450 IU/kg heparin loading dose. Based on the literature, the treatment for heparin resistance used in the algorithm is anti-thrombin III supplementation. The algorithm appears to be valid and is supported by high-level evidence and clinician opinion. The next step is a human randomized clinical trial to test the clinical procedure guideline algorithm vs. current standard clinical practice.

  11. Evaluation of a new preconditioning algorithm based on the 3-D even-parity simplified SN equations for discrete ordinates in parallel environments

    International Nuclear Information System (INIS)

    Longoni, G.; Haghighat, A.; Sjoden, G.

    2005-01-01

    This paper discusses a new preconditioned Sn algorithm referred to as FAST (Flux Acceleration Sn Transport). This algorithm uses the PENSSn code as the preconditioner and the PENTRAN-SSn code system as the transport solver. PENSSn is developed based on the even-parity simplified Sn formulation in a parallel environment, and PENTRAN-SSn is a version of PENTRAN that uses PENSSn as the preconditioner within the FAST system. The paper briefly discusses the EP-SSn formulation and important numerical features of PENSSn. The FAST algorithm is discussed and tested for the C5G7 MOX eigenvalue benchmark problem. It is demonstrated that FAST leads to significant speedups (∼7) over the standard PENTRAN code. Moreover, FAST shows closer agreement with a reference Monte Carlo simulation. (authors)
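
    The FAST idea of wrapping a cheap approximate solver around an expensive transport solver follows the standard preconditioned-iteration pattern. As a generic illustration only (the small matrices below are hypothetical, not transport operators), a Jacobi-preconditioned Richardson iteration looks like:

```python
def richardson_preconditioned(A, M_inv, b, x0, iters=50):
    """Generic preconditioned Richardson iteration:
        x_{k+1} = x_k + M^{-1} (b - A x_k).
    A cheap approximate inverse M^{-1} accelerates convergence, which is
    the role a simplified-Sn solve plays in preconditioning full Sn."""
    n = len(b)
    x = list(x0)
    for _ in range(iters):
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]  # residual
        x = [x[i] + sum(M_inv[i][j] * r[j] for j in range(n)) for i in range(n)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
M_inv = [[0.25, 0.0], [0.0, 1.0 / 3.0]]   # inverse of diag(A): Jacobi preconditioner
b = [1.0, 2.0]
x = richardson_preconditioned(A, M_inv, b, [0.0, 0.0])
```

    The better M^{-1} approximates A^{-1}, the fewer outer iterations the expensive operator A must participate in.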

  12. An Adaptive Pruning Algorithm for the Discrete L-Curve Criterion

    DEFF Research Database (Denmark)

    Hansen, Per Christian; Jensen, Toke Koldborg; Rodriguez, Giuseppe

    2004-01-01

    SVD or regularizing CG iterations). Our algorithm needs no pre-defined parameters, and in order to capture the global features of the curve in an adaptive fashion, we use a sequence of pruned L-curves that correspond to considering the curves at different scales. We compare our new algorithm...

  13. Development and testing of incident detection algorithms. Vol. 2, research methodology and detailed results.

    Science.gov (United States)

    1976-04-01

    The development and testing of incident detection algorithms was based on Los Angeles and Minneapolis freeway surveillance data. Algorithms considered were based on times series and pattern recognition techniques. Attention was given to the effects o...

  14. A prediction algorithm for first onset of major depression in the general population: development and validation.

    Science.gov (United States)

    Wang, JianLi; Sareen, Jitender; Patten, Scott; Bolton, James; Schmitz, Norbert; Birney, Arden

    2014-05-01

    Prediction algorithms are useful for making clinical decisions and for population health planning. However, such prediction algorithms for first onset of major depression do not exist. The objective of this study was to develop and validate a prediction algorithm for first onset of major depression in the general population. Longitudinal study design with approximately 3-year follow-up. The study was based on data from a nationally representative sample of the US general population. A total of 28 059 individuals who participated in Waves 1 and 2 of the US National Epidemiologic Survey on Alcohol and Related Conditions and who had not had major depression at Wave 1 were included. The prediction algorithm was developed using logistic regression modelling in 21 813 participants from three census regions. The algorithm was validated in participants from the 4th census region (n=6246). The outcome was major depression occurring since Wave 1 of the National Epidemiologic Survey on Alcohol and Related Conditions, assessed by the Alcohol Use Disorder and Associated Disabilities Interview Schedule (diagnostic and statistical manual for mental disorders, 4th edition). A prediction algorithm containing 17 unique risk factors was developed. The algorithm had good discriminative power (C statistic=0.7538, 95% CI 0.7378 to 0.7699) and excellent calibration (F-adjusted test=1.00, p=0.448) with the weighted data. In the validation sample, the algorithm had a C statistic of 0.7259 and excellent calibration (Hosmer-Lemeshow χ(2)=3.41, p=0.906). The developed prediction algorithm has good discrimination and calibration capacity. It can be used by clinicians, mental health policy-makers, service planners and the general public to predict future risk of having major depression. The application of the algorithm may lead to increased personalisation of treatment, better clinical decisions and more optimal mental health service planning.
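
    The C statistic reported above has a simple rank interpretation: it is the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case. A minimal sketch with hypothetical risks and outcomes (not the study's data):

```python
def c_statistic(scores, labels):
    """C statistic (area under the ROC curve): the fraction of
    case/non-case pairs in which the case is ranked higher,
    counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted risks of first onset and observed outcomes.
risks = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
onset = [1,   1,   0,   1,   0,   0]
auc = c_statistic(risks, onset)
```

    A value of 0.5 corresponds to chance-level discrimination; the 0.75 reported for the developed algorithm means three out of four such pairs are ranked correctly.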

  15. GPS Attitude Determination for Launch Vehicles, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Toyon Research Corporation proposes to develop a family of compact, low-cost GPS-based attitude (GPS/A) sensors for launch vehicles. In order to obtain 3-D attitude...

  16. Leadership development in the age of the algorithm.

    Science.gov (United States)

    Buckingham, Marcus

    2012-06-01

    By now we expect personalized content--it's routinely served up by online retailers and news services, for example. But the typical leadership development program still takes a formulaic, one-size-fits-all approach. And it rarely happens that an excellent technique can be effectively transferred from one leader to all others. Someone trying to adopt a practice from a leader with a different style usually seems stilted and off--a Franken-leader. Breakthrough work at Hilton Hotels and other organizations shows how companies can use an algorithmic model to deliver training tips uniquely suited to each individual's style. It's a five-step process: First, a company must choose a tool with which to identify each person's leadership type. Second, it should assess its best leaders, and third, it should interview them about their techniques. Fourth, it should use its algorithmic model to feed tips drawn from those techniques to developing leaders of the same type. And fifth, it should make the system dynamically intelligent, with user reactions sharpening the content and targeting of tips. The power of this kind of system--highly customized, based on peer-to-peer sharing, and continually evolving--will soon overturn the generic model of leadership development. And such systems will inevitably break through any one organization, until somewhere in the cloud the best leadership tips from all over are gathered, sorted, and distributed according to which ones suit which people best.

  17. DOOCS environment for FPGA-based cavity control system and control algorithms development

    International Nuclear Information System (INIS)

    Pucyk, P.; Koprek, W.; Kaleta, P.; Szewinski, J.; Pozniak, K.T.; Czarski, T.; Romaniuk, R.S.

    2005-01-01

    The paper describes the concept and realization of the DOOCS control software for the FPGA-based TESLA cavity controller and simulator (SIMCON). It is based on universal software components created for laboratory purposes and used in a MATLAB-based control environment. These modules have recently been adapted to the DOOCS environment to ensure a unified software-to-hardware communication model. The presented solution can also be used as a general platform for control algorithm development. The proposed interfaces between the MATLAB and DOOCS modules allow the developed algorithm to be checked in the operational environment before implementation in the FPGA. Two example systems are presented. (orig.)

  18. Algorithm improvement program nuclide identification algorithm scoring criteria and scoring application.

    Energy Technology Data Exchange (ETDEWEB)

    Enghauser, Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-02-01

    The goal of the Domestic Nuclear Detection Office (DNDO) Algorithm Improvement Program (AIP) is to facilitate gamma-radiation detector nuclide identification algorithm development, improvement, and validation. Accordingly, scoring criteria have been developed to objectively assess the performance of nuclide identification algorithms. In addition, a Microsoft Excel spreadsheet application for automated nuclide identification scoring has been developed. This report provides an overview of the equations, nuclide weighting factors, nuclide equivalencies, and configuration weighting factors used by the application for scoring nuclide identification algorithm performance. Furthermore, this report presents a general overview of the nuclide identification algorithm scoring application including illustrative examples.

  19. Advanced defect detection algorithm using clustering in ultrasonic NDE

    Science.gov (United States)

    Gongzhang, Rui; Gachagan, Anthony

    2016-02-01

    A range of materials used in industry exhibit scattering properties which limit ultrasonic NDE. Many algorithms have been proposed to enhance defect detection ability, such as the well-known Split Spectrum Processing (SSP) technique. Scattering noise usually cannot be fully removed, and the remaining noise can easily be confused with real feature signals, hence becoming artefacts during the image interpretation stage. This paper presents an advanced algorithm to further reduce the influence of artefacts remaining in A-scan data after processing using a conventional defect detection algorithm. The raw A-scan data can be acquired from either traditional single-transducer or phased array configurations. The proposed algorithm uses the concept of unsupervised machine learning to cluster segmental defect signals from pre-processed A-scans into different classes. The distinction and similarity between each class and an ensemble of randomly selected noise segments can be observed by applying a classification algorithm. Each class is then labelled as 'legitimate reflector' or 'artefact' based on this observation, and the expected probability of detection (PoD) and probability of false alarm (PFA) determined. To facilitate data collection and validate the proposed algorithm, a 5 MHz linear array transducer is used to collect A-scans from both austenitic steel and Inconel samples. Each pulse-echo A-scan is pre-processed using SSP, and the subsequent application of the proposed clustering algorithm provides an additional reduction in PFA while maintaining PoD for both samples compared with SSP alone.
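
    The clustering step can be illustrated with a toy two-cluster 1-D k-means on a single segment feature. The feature values below are hypothetical stand-ins for post-SSP segment amplitudes, not measured A-scan data, and the actual paper clusters richer segment descriptors.

```python
def kmeans_1d(values, iters=20):
    """Two-cluster 1-D k-means, a stand-in for the unsupervised
    clustering step: split segment features into two classes.
    Assumes at least two distinct values so neither cluster is empty."""
    c0, c1 = min(values), max(values)            # initial centroids
    for _ in range(iters):
        g0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        c0 = sum(g0) / len(g0)                   # update centroids
        c1 = sum(g1) / len(g1)
    return g0, g1

# Hypothetical post-SSP segment amplitudes: artefacts cluster low,
# legitimate reflectors cluster high.
amps = [0.11, 0.09, 0.13, 0.10, 0.82, 0.78, 0.85]
artefacts, reflectors = kmeans_1d(amps)
```

    In the paper's scheme, each resulting class is then compared against an ensemble of known noise segments before being labelled.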

  20. Parallel optimization of IDW interpolation algorithm on multicore platform

    Science.gov (United States)

    Guan, Xuefeng; Wu, Huayi

    2009-10-01

    Due to increasing power consumption, heat dissipation, and other physical issues, central processing unit (CPU) architectures have been turning rapidly to multicore in recent years. A multicore processor packages multiple processor cores in the same chip, which not only offers increased performance but also presents significant challenges to application developers. In the GIS field, most current algorithms were implemented serially and cannot fully exploit the parallelism potential of such multicore platforms. In this paper, we choose the Inverse Distance Weighted spatial interpolation algorithm (IDW) as an example to study how to optimize current serial GIS algorithms on multicore platforms in order to maximize performance speedup. With the help of OpenMP, a threading methodology is introduced to split and share the whole interpolation workload among processor cores. After parallel optimization, the execution time of the interpolation algorithm is greatly reduced and good performance speedup is achieved. For example, the performance speedup on an Intel Xeon 5310 is 1.943 with 2 execution threads and 3.695 with 4 execution threads, respectively. An additional output comparison between pre-optimization and post-optimization shows that parallel optimization does not affect the final interpolation result.
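
    The IDW estimator and the loop-splitting idea can be sketched as follows. This is a Python illustration of the decomposition only: real speedups of the kind reported require OpenMP in C/C++ (or process-based parallelism), since CPython threads share one interpreter lock. The sample points and power parameter are hypothetical.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def idw(samples, x, y, power=2.0):
    """Inverse Distance Weighted estimate at (x, y):
    z = sum(w_i * z_i) / sum(w_i), with w_i = 1 / d_i**power."""
    num = den = 0.0
    for sx, sy, sz in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:
            return sz                       # exact hit on a sample point
        w = d ** -power
        num += w * sz
        den += w
    return num / den

def idw_grid(samples, targets, workers=4):
    """Split target points among workers, mirroring the OpenMP strategy
    of sharing the interpolation loop across cores."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda p: idw(samples, *p), targets))

samples = [(0.0, 0.0, 1.0), (1.0, 0.0, 3.0), (0.0, 1.0, 5.0)]
vals = idw_grid(samples, [(0.0, 0.0), (0.5, 0.5)])
```

    Because every target point is independent, the loop is embarrassingly parallel, which is why the paper sees near-linear speedup with thread count.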

  1. Quality Control Algorithms for the Kennedy Space Center 50-Megahertz Doppler Radar Wind Profiler Winds Database

    Science.gov (United States)

    Barbre, Robert E., Jr.

    2012-01-01

    This paper presents the process used by the Marshall Space Flight Center Natural Environments Branch (EV44) to quality control (QC) data from the Kennedy Space Center's 50-MHz Doppler Radar Wind Profiler (DRWP) for use in vehicle wind loads and steering commands. The database has been built to mitigate limitations of the currently archived databases from weather balloons. The DRWP database contains wind measurements from approximately 2.7-18.6 km altitude at roughly five-minute intervals for the August 1997 to December 2009 period of record, and the extensive QC process was designed to remove spurious data from various forms of atmospheric and non-atmospheric artifacts. The QC process is largely based on DRWP literature, but two new algorithms have been developed to remove data contaminated by convection and by excessive first-guess propagations from the Median Filter First Guess Algorithm. In addition to describing the automated and manual QC process in detail, this paper describes the extent of the data retained. Roughly 58% of all possible wind observations exist in the database, with approximately 100 times as many complete profile sets existing relative to the EV44 balloon databases. This increased sample of near-continuous wind profile measurements may help increase launch availability by reducing the uncertainty of wind changes during launch countdown.
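
    A median-filter first guess can be sketched as a running-median outlier check: flag any wind sample that deviates from the local median by more than a threshold. The window size, threshold, and data below are illustrative only, not the EV44 settings.

```python
from statistics import median

def qc_flags(series, half_window=2, max_dev=10.0):
    """Flag samples deviating from the local median first guess by more
    than max_dev (same units as the series, e.g. m/s). Window and
    threshold values here are hypothetical."""
    flags = []
    for i, v in enumerate(series):
        lo = max(0, i - half_window)
        hi = min(len(series), i + half_window + 1)
        guess = median(series[lo:hi])        # robust local first guess
        flags.append(abs(v - guess) > max_dev)
    return flags

winds = [5.0, 6.0, 5.5, 42.0, 6.5, 7.0]      # one convection-like spike
flags = qc_flags(winds)
```

    The median is preferred over the mean here because a single contaminated gate would drag a mean-based first guess toward the outlier it is meant to catch.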

  2. Response of Launch Pad Structures to Random Acoustic Excitation

    Directory of Open Access Journals (Sweden)

    Ravi N. Margasahayam

    1994-01-01

    The design of launch pad structures, particularly those having a large area-to-mass ratio, is governed by launch-induced acoustics, a relatively short transient with random pressure amplitudes having a non-Gaussian distribution. The factors influencing the acoustic excitation and resulting structural responses are numerous and cannot be predicted precisely. Two solutions (probabilistic and deterministic) for the random vibration problem are presented in this article from the standpoint of their applicability to predict the response of ground structures exposed to rocket noise. Deficiencies of the probabilistic method, especially in predicting response in the low-frequency range of launch transients (below 20 Hz), prompted the development of the deterministic analysis. The relationship between the two solutions is clarified for future implementation in a finite element method (FEM) code.

  3. DISCOVERY OF A PSEUDOBULGE GALAXY LAUNCHING POWERFUL RELATIVISTIC JETS

    Energy Technology Data Exchange (ETDEWEB)

    Kotilainen, Jari K.; Olguín-Iglesias, Alejandro [Finnish Centre for Astronomy with ESO (FINCA), University of Turku, Väisäläntie 20, FI-21500 Piikkiö (Finland); León-Tavares, Jonathan; Baes, Maarten [Sterrenkundig Observatorium, Universiteit Gent, Krijgslaan 281-S9, B-9000 Gent (Belgium); Anórve, Christopher [Facultad de Ciencias de la Tierra y del Espacio de la Universidad Autónoma de Sinaloa, Blvd. de la Americas y Av. Universitarios S/N, Ciudad Universitaria, C.P. 80010, Culiacán Sinaloa, México (Mexico); Chavushyan, Vahram; Carrasco, Luis, E-mail: jarkot@utu.fi [Instituto Nacional de Astrofísica Óptica y Electrónica (INAOE), Apartado Postal 51 y 216, 72000 Puebla (Mexico)

    2016-12-01

    Supermassive black holes launching plasma jets at close to the speed of light, producing gamma-rays, have ubiquitously been found to be hosted by massive elliptical galaxies. Since elliptical galaxies are generally believed to be built through galaxy mergers, active galactic nuclei (AGN) launching relativistic jets are associated with the latest stages of galaxy evolution. We have discovered a pseudobulge morphology in the host galaxy of the gamma-ray AGN PKS 2004-447. This is the first gamma-ray-emitting radio-loud AGN found to have its jets launched from a system where both the black hole and the host galaxy have been actively growing via secular processes. This is evidence of an alternative black hole–galaxy co-evolutionary path that develops powerful relativistic jets and is not merger driven.

  4. An adaptive deep-coupled GNSS/INS navigation system with hybrid pre-filter processing

    Science.gov (United States)

    Wu, Mouyan; Ding, Jicheng; Zhao, Lin; Kang, Yingyao; Luo, Zhibin

    2018-02-01

    The deep coupling of a global navigation satellite system (GNSS) with an inertial navigation system (INS) can provide accurate and reliable navigation information. There are several kinds of deeply-coupled structures. These can be divided mainly into coherent and non-coherent pre-filter based structures, each with its own strong advantages and disadvantages, especially in accuracy and robustness. In this paper, the existing pre-filters of the deeply-coupled structures are first analyzed and modified to improve them. Then, an adaptive GNSS/INS deeply-coupled algorithm with hybrid pre-filter processing is proposed to combine the advantages of the coherent and non-coherent structures. An adaptive hysteresis controller is designed to implement the hybrid pre-filter processing strategy. The simulation and vehicle test results show that the adaptive deeply-coupled algorithm with hybrid pre-filter processing can effectively improve navigation accuracy and robustness, especially in a GNSS-challenged environment.
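
    A hysteresis controller of the kind described, switching between the two pre-filter branches on a signal-quality metric, can be sketched as follows. The C/N0 thresholds and mode names are hypothetical, not the paper's values; the point is the dual-threshold logic that prevents chattering near a single switch point.

```python
class HysteresisSelector:
    """Switch between coherent and non-coherent pre-filter branches based
    on a carrier-to-noise-density (C/N0) quality metric, with hysteresis
    so the mode does not chatter around one threshold."""
    def __init__(self, low=28.0, high=32.0):
        self.low, self.high = low, high   # dB-Hz: switch-down / switch-up
        self.mode = "coherent"
    def update(self, cn0):
        if self.mode == "coherent" and cn0 < self.low:
            self.mode = "noncoherent"     # weak signal: use robust branch
        elif self.mode == "noncoherent" and cn0 > self.high:
            self.mode = "coherent"        # strong signal: use accurate branch
        return self.mode

sel = HysteresisSelector()
trace = [sel.update(c) for c in (35, 30, 27, 30, 33)]
```

    Note that C/N0 = 30 dB-Hz keeps whichever mode is already active, which is exactly the hysteresis band doing its job.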

  5. The Profile Envision and Splicing Tool (PRESTO): Developing an Atmospheric Wind Analysis Tool for Space Launch Vehicles Using Python

    Science.gov (United States)

    Orcutt, John M.; Barbre, Robert E., Jr.; Brenton, James C.; Decker, Ryan K.

    2017-01-01

    Launch vehicle programs require vertically complete atmospheric profiles. Many systems at the ER make the necessary measurements, but all have different EVR, vertical coverage, and temporal coverage. The MSFC Natural Environments Branch developed a tool to create a vertically complete profile from multiple inputs using Python. Forward work: finish formal testing (acceptance testing, end-to-end testing) and formal release.
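
    The splicing concept, taking the best available source in each altitude bin, can be sketched as below. The source names, priorities, and values are hypothetical illustrations, not PRESTO's actual inputs or merge rules.

```python
def splice_profiles(sources, altitudes):
    """Build a vertically complete profile by taking, at each altitude,
    the first source (in priority order) that reports a value there.
    A bin with no source at all is left as None (a remaining gap)."""
    spliced = []
    for alt in altitudes:
        value = next((prof[alt] for prof in sources if alt in prof), None)
        spliced.append((alt, value))
    return spliced

# Hypothetical wind-speed profiles (altitude in km -> speed in m/s) with gaps.
profiler = {2: 8.0, 4: 9.5}                       # higher priority, limited coverage
balloon  = {2: 7.5, 4: 9.0, 8: 12.0, 16: 25.0}    # lower priority, deeper coverage
profile = splice_profiles([profiler, balloon], [2, 4, 8, 16])
```

    A production tool would also have to reconcile the differing effective vertical resolutions and measurement times of the sources, which is where most of the real complexity lies.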

  6. Heavy Lift Launch Capability with a New Hydrocarbon Engine (NHE)

    Science.gov (United States)

    Threet, Grady E., Jr.; Holt, James B.; Philips, Alan D.; Garcia, Jessica A.

    2011-01-01

    The Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center has analyzed over 2000 Ares V and other heavy lift concepts in the last 3 years. These concepts were analyzed for Lunar Exploration Missions, heavy lift capability to Low Earth Orbit (LEO), as well as exploratory missions to other near-Earth objects in our solar system. With the pending retirement of the Shuttle fleet, our nation will be without a civil heavy lift launch capability, so the future development of a new heavy lift capability is imperative for the exploration and large science missions our Agency has been tasked to deliver. The majority of the heavy lift concepts analyzed by ACO during the last 3 years have been based on liquid oxygen / liquid hydrogen (LOX/LH2) core stage and solid booster stage propulsion technologies (Ares V / Shuttle Derived and their variants). These concepts were driven by the decisions made from the results of the Exploration Systems Architecture Study (ESAS), which in turn led to the Ares V launch vehicle that was baselined in the Constellation Program. Now that the decision has been made at the Agency level to cancel Constellation, other propulsion options, such as liquid hydrocarbon fuels, are back in the exploration trade space. NASA is still planning exploration missions with the eventual destination of Mars, and a new heavy lift launch vehicle is still required and will serve as the centerpiece of the infrastructure of our nation's next exploration architecture. With an extensive launch vehicle database already developed for LOX/LH2-based heavy lift launch vehicles, ACO initiated a study to look at using a new high-thrust (> 1.0 Mlb vacuum thrust) hydrocarbon engine as the primary main stage propulsion in such a launch vehicle.

  7. A Comparison between Fixed Priority and EDF Scheduling accounting for Cache Related Pre-emption Delays

    Directory of Open Access Journals (Sweden)

    Will Lunniss

    2014-04-01

    In multitasking real-time systems, the choice of scheduling algorithm is an important factor in ensuring that response time requirements are met while maximising limited system resources. Two popular scheduling algorithms are fixed priority (FP) and earliest deadline first (EDF). While they have been studied in great detail before, they have not been compared when taking into account cache related pre-emption delays (CRPD). Memory and cache are split into a number of blocks containing instructions and data. During a pre-emption, cache blocks from the pre-empting task can evict those of the pre-empted task. When the pre-empted task is resumed, if it then has to re-load the evicted blocks, CRPD are introduced which then affect the schedulability of the task. In this paper we compare FP and EDF scheduling algorithms in the presence of CRPD using state-of-the-art CRPD analysis. We find that when CRPD is accounted for, the performance gains offered by EDF over FP, while still notable, are diminished. Furthermore, we find that under scenarios that cause relatively high CRPD, task layout optimisation techniques can be applied to allow FP to schedule tasksets at a similar processor utilisation to EDF. This makes the choice of the task layout in memory as important as the choice of scheduling algorithm. This is very relevant for industry, as it is much cheaper and simpler to adjust the task layout through the linker than it is to switch the scheduling algorithm.
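
    The effect of CRPD on schedulability can be illustrated with the classic fixed-priority response-time recurrence, extended with a simplified model that charges a fixed cache-reload cost per pre-emption. This is a didactic stand-in for the state-of-the-art CRPD analyses the paper compares, and the task set is hypothetical.

```python
import math

def response_time(tasks, i, crpd):
    """Fixed-priority response-time analysis with a simplified CRPD model:
    each pre-emption by higher-priority task j adds a fixed reload cost
    crpd[j]. tasks = [(C, T), ...] sorted highest priority first, with
    deadline equal to period. Returns None if the deadline is missed."""
    C_i, T_i = tasks[i]
    R = C_i
    while True:
        interference = sum(
            math.ceil(R / T_j) * (C_j + crpd[j])     # jobs of j within R
            for j, (C_j, T_j) in enumerate(tasks[:i]))
        R_new = C_i + interference
        if R_new > T_i:
            return None                              # unschedulable
        if R_new == R:
            return R                                 # fixed point reached
        R = R_new

tasks = [(1, 4), (2, 8), (3, 16)]   # (C, T) pairs, highest priority first
crpd = [0, 1, 1]                    # reload cost charged per pre-emption
R2 = response_time(tasks, 2, crpd)
```

    Comparing against the CRPD-free case shows how cache reloads inflate the lowest-priority task's response time even when the task set remains schedulable.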

  8. Inverted light-sheet microscope for imaging mouse pre-implantation development.

    Science.gov (United States)

    Strnad, Petr; Gunther, Stefan; Reichmann, Judith; Krzic, Uros; Balazs, Balint; de Medeiros, Gustavo; Norlin, Nils; Hiiragi, Takashi; Hufnagel, Lars; Ellenberg, Jan

    2016-02-01

    Despite its importance for understanding human infertility and congenital diseases, early mammalian development has remained inaccessible to in toto imaging. We developed an inverted light-sheet microscope that enabled us to image mouse embryos from zygote to blastocyst, computationally track all cells and reconstruct a complete lineage tree of mouse pre-implantation development. We used this unique data set to show that the first cell fate specification occurs at the 16-cell stage.

  9. NASA Space Launch System Operations Outlook

    Science.gov (United States)

    Hefner, William Keith; Matisak, Brian P.; McElyea, Mark; Kunz, Jennifer; Weber, Philip; Cummings, Nicholas; Parsons, Jeremy

    2014-01-01

    The National Aeronautics and Space Administration's (NASA) Space Launch System (SLS) Program, managed at the Marshall Space Flight Center (MSFC), is working with the Ground Systems Development and Operations (GSDO) Program, based at the Kennedy Space Center (KSC), to deliver a new safe, affordable, and sustainable capability for human and scientific exploration beyond Earth's orbit (BEO). Larger than the Saturn V Moon rocket, SLS will provide 10 percent more thrust at liftoff in its initial 70 metric ton (t) configuration and 20 percent more in its evolved 130-t configuration. The primary mission of the SLS rocket will be to launch astronauts to deep space destinations in the Orion Multi-Purpose Crew Vehicle (MPCV), also in development and managed by the Johnson Space Center. Several high-priority science missions also may benefit from the increased payload volume and reduced trip times offered by this powerful, versatile rocket. Reducing the lifecycle costs for NASA's space transportation flagship will maximize the exploration and scientific discovery returned from the taxpayer's investment. To that end, decisions made during development of SLS and associated systems will impact the nation's space exploration capabilities for decades. This paper will provide an update to the operations strategy presented at SpaceOps 2012. It will focus on: 1) Preparations to streamline the processing flow and infrastructure needed to produce and launch the world's largest rocket (i.e., through incorporation and modification of proven, heritage systems into the vehicle and ground systems); 2) Implementation of a lean approach to reach-back support of hardware manufacturing, green-run testing, and launch site processing and activities; and 3) Partnering between the vehicle design and operations communities on state-of-the-art predictive operations analysis techniques. An example of innovation is testing the integrated vehicle at the processing facility in parallel, rather than

  10. Wikis: Developing pre-service teachers’ leadership skills and knowledge of content standards

    Directory of Open Access Journals (Sweden)

    Angelia Reid-Griffin

    2016-03-01

    In this initial phase of our multi-year research study we set out to explore the development of leadership skills in our pre-service secondary teachers after using an online wiki, Wikispaces. This paper presents our methods for preparing a group of 13 mathematics and 3 science secondary pre-service teachers to demonstrate the essential knowledge, skills and dispositions of beginning teacher leaders. Our findings indicate the pre-service teachers' overall satisfaction with demonstrating leadership through collaborative practices. They were successful in these new roles as teacher/collaborator within the context of communication about content standards. Though the candidates participated in other collaborative tasks, this effort was noted for bringing together technology, content standards and leadership qualities that are critical for beginning teachers. Implications for addressing the pre-service teachers' development of leadership skills as they become professional teachers will be shared.

  11. Comparison between iterative wavefront control algorithm and direct gradient wavefront control algorithm for adaptive optics system

    International Nuclear Information System (INIS)

    Cheng Sheng-Yi; Liu Wen-Jin; Chen Shan-Qiu; Dong Li-Zhi; Yang Ping; Xu Bing

    2015-01-01

    Among all kinds of wavefront control algorithms in adaptive optics systems, the direct gradient wavefront control algorithm is the most widespread and common method. This control algorithm obtains the actuator voltages directly from wavefront slopes by pre-measuring the relational matrix between the deformable mirror actuators and the Hartmann wavefront sensor, with perfect real-time characteristics and stability. However, as the number of wavefront sensor sub-apertures and deformable mirror actuators in adaptive optics systems increases, the matrix operation in the direct gradient algorithm takes too much time, which becomes a major factor limiting the control effect of adaptive optics systems. In this paper we apply an iterative wavefront control algorithm to high-resolution adaptive optics systems, in which the voltages of each actuator are obtained through iteration, which gains a great advantage in calculation and storage. For an AO system with thousands of actuators, the computational complexity is about O(n^2) ∼ O(n^3) for the direct gradient wavefront control algorithm, while it is about O(n) ∼ O(n^(3/2)) for the iterative wavefront control algorithm, where n is the number of actuators of the AO system. The more sub-apertures and deformable mirror actuators there are, the more significant the advantage the iterative wavefront control algorithm exhibits. (paper)
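
    The contrast is between applying a pre-computed dense reconstruction matrix (a full matrix-vector product per frame) and solving the slope-to-voltage system iteratively, where each sweep costs only on the order of the number of nonzeros when the interaction matrix is sparse. A generic Gauss–Seidel sketch on a small hypothetical interaction matrix illustrates the iterative approach; this is not the specific iteration used in the paper.

```python
def gauss_seidel(A, s, iters=100):
    """Iteratively solve A v = s for actuator voltages v instead of
    applying a pre-computed dense reconstruction matrix. Assumes A is
    symmetric positive definite (guaranteeing convergence)."""
    n = len(s)
    v = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            sigma = sum(A[i][j] * v[j] for j in range(n) if j != i)
            v[i] = (s[i] - sigma) / A[i][i]       # in-place sweep update
    return v

A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]     # hypothetical sparse-banded interaction matrix
s = [1.0, 2.0, 3.0]       # measured wavefront slopes (hypothetical)
v = gauss_seidel(A, s)
```

    Because actuator coupling is local on a deformable mirror, A is sparse and banded in practice, so each sweep touches only a few neighbors per actuator, which is where the near-O(n) per-iteration cost comes from.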

  12. DEVELOPING AND INSTRUCTING PRE-PERFORMANCE ROUTINES FOR TENPIN BOWLING COMPETITIONS (1).

    Science.gov (United States)

    Lee, Seungmin; Lee, Keunchul; Kwon, Sungho

    2015-06-01

    This preliminary study developed pre-performance routines for tenpin bowlers and instructed them. To develop the routine, the situations before throwing the ball were divided into four phases; participants were examined through interviews and observations. This study used an A-B design; the A stage included the development of the routines for 3 wk., while the B stage included the instruction and two evaluations of the routine consistency. Practice was implemented for 4 hr. per day for 9 wk. The participants noted they understood the developed routine easily and experienced an atmosphere similar to that of a competition during training through the routines. They found it difficult to practice the relaxation phase, but emphasized that the relaxation phase was helpful. Consistent routines were associated with an improved mental state and performance in a competition. This study suggests that pre-performance routines stabilize the mental state of the athletes, apparently giving them a competitive advantage.

  13. Next Generation Launch Technology Program Lessons Learned

    Science.gov (United States)

    Cook, Stephen; Tyson, Richard

    2005-01-01

    In November 2002, NASA revised its Integrated Space Transportation Plan (ISTP) to evolve the Space Launch Initiative (SLI) to serve as a theme for two emerging programs. The first of these, the Orbital Space Plane (OSP), was intended to provide crew-escape and crew-transfer functions for the ISS. The second, the NGLT Program, developed technologies needed for safe, routine space access for scientific exploration, commerce, and national defense. The NGLT Program was comprised of 12 projects, ranging from fundamental high-temperature materials research to full-scale engine system developments (turbine and rocket) to scramjet flight test. The Program included technology advancement activities with a broad range of objectives, ultimate applications/timeframes, and technology maturity levels. An over-arching Systems Engineering and Analysis (SE&A) approach was employed to focus technology advancements according to a common set of requirements. Investments were categorized into three segments of technology maturation: propulsion technologies, launch systems technologies, and SE&A.

  14. Improved core protection calculator system algorithm

    International Nuclear Information System (INIS)

    Yoon, Tae Young; Park, Young Ho; In, Wang Kee; Bae, Jong Sik; Baeg, Seung Yeob

    2009-01-01

    Core Protection Calculator System (CPCS) is a digitized core protection system which provides core protection functions based on two reactor core operation parameters, Departure from Nucleate Boiling Ratio (DNBR) and Local Power Density (LPD). It generates a reactor trip signal when the core condition exceeds the DNBR or LPD design limit. It consists of four independent channels which adopt a two-out-of-four trip logic. This paper describes the CPCS algorithm improvements for the newly designed core protection calculator system, RCOPS (Reactor COre Protection System). New features include an improved DNBR algorithm for thermal margin, the addition of pre-trip alarm generation for the auxiliary trip function, VOPT (Variable Over Power Trip) prevention during RPCS (Reactor Power Cutback System) actuation, and an improved CEA (Control Element Assembly) signal checking algorithm. To verify the improved CPCS algorithm, verification tests ('Module Test' and 'Unit Test') will be performed on the RCOPS single-channel facility. It is expected that the improved CPCS algorithm will increase the DNBR margin and enhance plant availability by reducing unnecessary reactor trips.
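
    The two-out-of-four coincidence logic mentioned above is simple to state precisely; a minimal sketch (channel inputs are hypothetical booleans, one per independent CPCS channel):

```python
def two_out_of_four(channel_trips):
    """Two-out-of-four coincidence logic: a reactor trip signal is
    generated when at least two of the four independent channels
    assert a trip (DNBR or LPD limit exceeded)."""
    return sum(bool(t) for t in channel_trips) >= 2

# One faulted channel alone does not trip; two coincident channels do.
single = two_out_of_four([True, False, False, False])
double = two_out_of_four([True, False, True, False])
```

    This voting scheme tolerates a single spurious channel (no unnecessary trip) and a single failed-silent channel (the remaining three can still reach two votes), which is why it is a common choice for protection systems.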

  15. Arianespace Launch Service Operator Policy for Space Safety (Regulations and Standards for Safety)

    Science.gov (United States)

    Jourdainne, Laurent

    2013-09-01

    Since December 10, 2010, the French Space Act has been in force. This French law, referenced as LOS N°2008-518 ("Loi relative aux Opérations Spatiales"), is compliant with international rules. The French Space Act (LOS) is now applicable to any French private company whose business involves rocket launches or in-orbit satellite operations. Under CNES leadership, Arianespace contributed to the consolidation of the technical regulation applicable to launch service operators. Now, for each launch operation, the operator Arianespace has to apply for an authorization to proceed from the French ministry in charge of space activities. In the files issued for this purpose, the operator is able to justify a high level of guarantees in the management of risks through robust processes covering qualification maintenance, configuration management, the treatment of technical facts and relevant conclusions, and the implementation of risk reduction when needed. Thanks to the historic success of Ariane launch systems over more than 30 years of operational experience (54 successes in a row for the latest Ariane 5 launches), Arianespace as well as European public and industrial partners developed key experience, knowledge, and competences in space security and safety. The Soyuz-ST and Vega launch systems are now in operation from the Guiana Space Center with identical and proven risk management processes. Already existing processes have been slightly adapted to cope with the new roles and responsibilities of each actor contributing to launch preparation, and with additional requirements like potential collision avoidance with inhabited space objects. Up to now, more than 12 Ariane 5 launches and 4 Soyuz-ST launches have been authorized under the French Space Act regulations. Ariane 5 and Soyuz-ST generic demonstrations of conformity have been issued, including exhaustive danger and impact studies for each launch system. This article will detail how Arianespace

  16. Performance of the Falling Snow Retrieval Algorithms for the Global Precipitation Measurement (GPM) Mission

    Science.gov (United States)

    Skofronick-Jackson, Gail; Munchak, Stephen J.; Ringerud, Sarah

    2016-01-01

    Retrievals of falling snow from space represent an important data set for understanding the Earth's atmospheric, hydrological, and energy cycles, especially during climate change. Estimates of falling snow must be captured to obtain the true global precipitation water cycle, snowfall accumulations are required for hydrological studies, and without knowledge of the frozen particles in clouds one cannot adequately understand the energy and radiation budgets. While satellite-based remote sensing provides global coverage of falling snow events, the science is relatively new and retrievals are still undergoing development, with challenges remaining. This work reports on the development and testing of retrieval algorithms for the Global Precipitation Measurement (GPM) mission Core Satellite, launched in February 2014.

  17. Recommended Screening Practices for Launch Collision Avoidance

    Science.gov (United States)

    Beaver, Brian A.; Hametz, Mark E.; Ollivierre, Jarmaine C.; Newman, Lauri K.; Hejduk, Matthew D.

    2015-01-01

    The objective of this document is to assess the value of launch collision avoidance (COLA) practices and provide recommendations regarding its implementation for NASA robotic missions. The scope of this effort is limited to launch COLA screens against catalog objects that are either spacecraft or debris. No modifications to manned safety COLA practices are considered in this effort. An assessment of the value of launch COLA can be broken down into two fundamental questions: 1) Does collision during launch represent a significant risk to either the payload being launched or the space environment? 2) Can launch collision mitigation be performed in a manner that provides meaningful risk reduction at an acceptable level of operational impact? While it has been possible to piece together partial answers to these questions for some time, the first attempt to comprehensively address them is documented in reference (a), Launch COLA Operations: an Examination of Data Products, Procedures, and Thresholds, Revision A. This report is the product of an extensive study that addressed fundamental technical questions surrounding launch collision avoidance analysis and practice. The results provided in reference (a) will be cited throughout this document as these two questions are addressed. The premise of this assessment is that in order to conclude that launch COLA is a value-added activity, the answer to both of these questions must be affirmative. A "no" answer to either of these questions points toward the conclusion that launch COLA provides little or no risk mitigation benefit. The remainder of this assessment will focus on addressing these two questions.

  18. Building and Leading the Next Generation of Exploration Launch Vehicles

    Science.gov (United States)

    Cook, Stephen A.; Vanhooser, Teresa

    2010-01-01

    NASA's Constellation Program is depending on the Ares Projects to deliver the crew and cargo launch capabilities needed to send human explorers to the Moon and beyond. Ares I and V will provide the core space launch capabilities needed to continue providing crew and cargo access to the International Space Station (ISS), and to build upon the U.S. history of human spaceflight to the Moon and beyond. Since 2005, Ares has made substantial progress on designing, developing, and testing the Ares I crew launch vehicle and has continued its in-depth studies of the Ares V cargo launch vehicle. In 2009, the Ares Projects planned to conduct the first flight test of Ares I, test-fire the Ares I first-stage solid rocket motor, build the first integrated Ares I upper stage, continue testing hardware for the J-2X upper stage engine, and continue refining the design of the Ares V cargo launch vehicle. These efforts come with serious challenges for the project leadership team as it continues to foster a culture of ownership and accountability, operate with limited funding, and maintain effective internal and external communications under intense external scrutiny.

  19. Pre-validation methods for developing a patient reported outcome instrument

    Directory of Open Access Journals (Sweden)

    Castillo Mayret M

    2011-08-01

    Full Text Available Abstract Background Measures that reflect patients' assessment of their health are of increasing importance as outcome measures in randomised controlled trials. The methodological approach used in the pre-validation development of new instruments (item generation, item reduction and question formatting) should be robust and transparent. The totality of the content of existing PRO instruments for a specific condition provides a valuable resource (a pool of items) that can be utilised to develop new instruments. Such 'top down' approaches are common, but the explicit pre-validation methods are often poorly reported. This paper presents a systematic and generalisable five-step pre-validation PRO instrument methodology. Methods The method is illustrated using the example of the Aberdeen Glaucoma Questionnaire (AGQ). The five steps are: (1) generation of a pool of items; (2) item de-duplication (three phases); (3) item reduction (two phases); (4) assessment of the remaining items' content coverage against a pre-existing theoretical framework appropriate to the objectives of the instrument and the target population (e.g. the ICF); and (5) qualitative exploration of the target population's views of the new instrument and the items it contains. Results The AGQ item pool contained 725 items. The three de-duplication phases removed 91, 225 and 48 items respectively. The two item-reduction phases discarded 70 and 208 items respectively. The draft AGQ contained 83 items with good content coverage. The qualitative exploration (a 'think aloud' study) resulted in the removal of a further 15 items and refinement of the wording of others. The resultant draft AGQ contained 68 items. Conclusions This study presents a novel methodology for developing a PRO instrument, based on three sources: literature reporting what is important to patients; a theoretically coherent framework; and patients' experience of completing the instrument. By systematically accounting for all items dropped

  20. Launch Environmental Test for KITSAT-3 FM

    Directory of Open Access Journals (Sweden)

    Sang-Hyun Lee

    1999-06-01

    Full Text Available The satellite experiences severe launch environments such as vibration, acceleration, shock, and acoustics induced by the rocket. Therefore, the satellite should be designed and manufactured to endure such severe launch environments. In this paper, we describe the structure of the KITSAT-3 FM (Flight Model) and the processes and results of the launch environmental test performed to ensure reliability during the launch period.

  1. Low-Cost, Scalable, Hybrid Launch Propulsion Technology, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Physical Sciences Inc. (PSI), in collaboration with Purdue University, proposes to develop a novel launch propulsion technology for rapid insertion of nano/micro...

  2. Expendable launch vehicles technology: A report to the US Senate and the US House of Representatives

    Science.gov (United States)

    1990-01-01

    As directed in Public Law 100-657, Commercial Space Launch Act Amendments of 1988, and consistent with National Space Policy, NASA has prepared a report on a potential program of research on technologies to reduce the initial and recurring costs, increase reliability, and improve performance of expendable launch vehicles for the launch of commercial and government spacecraft into orbit. The report was developed in consultation with industry and in recognition of relevant ongoing and planned NASA and DoD technology programs which will provide much of the required launch systems technology for U.S. Government needs. Additional efforts which could be undertaken to strengthen the technology base are identified. To this end, focus is on needs for launch vehicle technology development and, in selected areas, includes verification to permit private-sector new technology application at reduced risk. If such a program were to be implemented, it would entail both government and private-sector effort and resources. The additional efforts identified would augment the existing launch vehicle technology programs. The additional efforts identified have not been funded, based upon agency assessments of relative priority vis-a-vis the existing programs. Throughout the consultation and review process, the industry representatives stressed the overriding importance of continuing the DoD/NASA Advanced Launch Development activity and other government technology programs as a primary source of essential launch vehicle technology.

  4. The development of an algebraic multigrid algorithm for symmetric positive definite linear systems

    Energy Technology Data Exchange (ETDEWEB)

    Vanek, P.; Mandel, J.; Brezina, M. [Univ. of Colorado, Denver, CO (United States)

    1996-12-31

    An algebraic multigrid algorithm for symmetric, positive definite linear systems is developed based on the concept of prolongation by smoothed aggregation. Coarse levels are generated automatically. We present a set of requirements motivated heuristically by a convergence theory. The algorithm then attempts to satisfy the requirements. Input to the method are the coefficient matrix and the zero-energy modes, which are determined from nodal coordinates and knowledge of the differential equation. Efficiency of the resulting algorithm is demonstrated by computational results on real-world problems from solid elasticity, plate bending, and shells.
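
    A toy sketch of prolongation by smoothed aggregation, under the assumptions that aggregates are formed greedily from the sparsity pattern and that a single damped-Jacobi step smooths the tentative prolongation (an illustration of the idea, not the authors' implementation):

```python
import numpy as np

def aggregate(A):
    """Greedily group each unaggregated node with its unaggregated neighbours."""
    n = A.shape[0]
    agg = -np.ones(n, dtype=int)
    k = 0
    for i in range(n):
        if agg[i] == -1:
            for j in range(n):
                if A[i, j] != 0 and agg[j] == -1:
                    agg[j] = k
            k += 1
    return agg, k

def smoothed_prolongation(A, nullspace, omega=2.0 / 3.0):
    """Tentative prolongation built from the zero-energy mode, then one
    damped-Jacobi smoothing step (the 'smoothed aggregation' idea, in toy form)."""
    agg, k = aggregate(A)
    P0 = np.zeros((A.shape[0], k))
    for i, a in enumerate(agg):
        P0[i, a] = nullspace[i]        # inject the mode into each aggregate
    Dinv = 1.0 / np.diag(A)
    return P0 - omega * (Dinv[:, None] * (A @ P0))

# 1D Laplacian; a constant vector plays the role of the zero-energy mode.
n = 8
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
P = smoothed_prolongation(A, np.ones(n))
Ac = P.T @ A @ P   # Galerkin coarse-level operator
```

    The Galerkin product keeps the coarse operator symmetric, so the construction can be repeated recursively to generate the automatic coarse-level hierarchy the abstract describes.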

  5. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of the network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique size.
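
    The triangulation step itself can be illustrated with the classic greedy min-fill heuristic (one common baseline such pre-processing aims to improve; the function and graph below are illustrative):

```python
import itertools

def min_fill_triangulation(graph):
    """Greedy min-fill: repeatedly eliminate the vertex whose neighbourhood
    needs the fewest fill-in edges; return the maximum clique size of the
    triangulated graph (an upper bound on treewidth + 1)."""
    g = {v: set(nbrs) for v, nbrs in graph.items()}

    def fill_in(v):
        return sum(1 for a, b in itertools.combinations(g[v], 2)
                   if b not in g[a])

    max_clique = 0
    while g:
        v = min(g, key=fill_in)
        nbrs = g[v]
        max_clique = max(max_clique, len(nbrs) + 1)
        for a, b in itertools.combinations(nbrs, 2):  # make neighbourhood a clique
            g[a].add(b)
            g[b].add(a)
        for u in nbrs:
            g[u].discard(v)
        del g[v]
    return max_clique

# A 4-cycle needs one fill edge; its triangulation has maximum clique size 3.
four_cycle = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
```

    Smaller maximum cliques translate directly into cheaper junction-tree inference, which is why the quality of the triangulation matters.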

  6. CryoSat: ready to launch (again)

    Science.gov (United States)

    Francis, R.; Wingham, D.; Cullen, R.

    2009-12-01

    topography which, since the radar ranging is performed to the closest reflector rather than the point directly below, introduces uncertainty into the exactitude of repeat measurements. CryoSat's radar includes a second antenna and receiver chain so that interferometry may be used to determine the arrival angle of the echo and so improve localisation of the reflection. The new satellite was approved in late February 2006, less than 6 months after the failure, and development started almost immediately. In September 2009 the development was completed and the satellite placed into storage awaiting a launch vehicle: the launch, using a Dnepr vehicle (a converted SS-18 ICBM) is anticipated in late February 2010.

  7. A Review of Algorithms for Retinal Vessel Segmentation

    Directory of Open Access Journals (Sweden)

    Monserrate Intriago Pazmiño

    2014-10-01

    Full Text Available This paper presents a review of algorithms for extracting the blood vessel network from retinal images. Since the retina is a complex and delicate ocular structure, a huge effort in computer vision is devoted to studying the blood vessel network to help the diagnosis of pathologies like diabetic retinopathy, hypertensive retinopathy, retinopathy of prematurity, and glaucoma. To carry out this process, many works for normal and abnormal images have been proposed recently. These methods include combinations of algorithms such as Gaussian and Gabor filters, histogram equalization, clustering, binarization, motion contrast, matched filters, combined corner/edge detectors, multi-scale line operators, neural networks, ant colony algorithms, genetic algorithms, and morphological operators. To apply these algorithms, pre-processing tasks are needed. Most of these algorithms have been tested on publicly available retinal databases. We include a table summarizing the algorithms and the results of their assessment.
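
    As one example from the list above, a Gaussian matched filter models a vessel cross-section as an inverted Gaussian ridge; a sketch of building such a kernel bank, with assumed parameter values:

```python
import numpy as np

def matched_filter_kernel(sigma=2.0, length=9.0, angle=0.0, size=15):
    """One kernel of a Gaussian matched-filter bank: the vessel cross-section
    is modelled as an inverted Gaussian, truncated to `length` along the vessel
    direction and made zero-mean so flat background gives no response.
    Parameter values are illustrative assumptions."""
    half = size // 2
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    u = xs * np.cos(angle) + ys * np.sin(angle)    # along the vessel
    v = -xs * np.sin(angle) + ys * np.cos(angle)   # across the vessel
    k = -np.exp(-v**2 / (2.0 * sigma**2))
    k[np.abs(u) > length / 2.0] = 0.0              # limit the kernel support
    k[k != 0] -= k[k != 0].mean()                  # zero-mean over the support
    return k

# A bank covering 12 orientations; in a full pipeline the maximum response
# over the bank at each pixel would be thresholded to segment vessels.
bank = [matched_filter_kernel(angle=a)
        for a in np.linspace(0.0, np.pi, 12, endpoint=False)]
```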

  8. Pre-launch simulation experiment of microwave-ionosphere nonlinear interaction rocket experiment in the space plasma chamber

    Energy Technology Data Exchange (ETDEWEB)

    Kaya, N. (Kobe University, Kobe, Japan); Tsutsui, M. (Kyoto University, Uji, Japan); Matsumoto, H. (Kyoto University, Kyoto, Japan)

    1980-09-01

    A pre-flight test experiment of the microwave-ionosphere nonlinear interaction rocket experiment (MINIX) has been carried out in a space plasma simulation chamber. Though the first rocket experiment ended in failure because of a high-voltage fault, interesting results were observed in the pre-flight experiment. Significant microwave heating of the plasma, with up to a 300% temperature increase, was observed. Strong excitation of plasma waves by the transmitted microwaves in the VLF and HF ranges was observed as well. These microwave effects may have to be taken into account in future solar power satellite projects.

  9. Development of GPT-based optimization algorithm

    International Nuclear Information System (INIS)

    White, J.R.; Chapman, D.M.; Biswas, D.

    1985-01-01

    The University of Lowell and Westinghouse Electric Corporation are involved in a joint effort to evaluate the potential benefits of generalized/depletion perturbation theory (GPT/DPT) methods for a variety of light water reactor (LWR) physics applications. One part of that work has focused on the development of a GPT-based optimization algorithm for the overall design, analysis, and optimization of LWR reload cores. The use of GPT sensitivity data in formulating the fuel management optimization problem is conceptually straightforward; it is the actual execution of the concept that is challenging. Thus, the purpose of this paper is to address some of the major difficulties, to outline our approach to these problems, and to present some illustrative examples of an efficient GPT-based optimization scheme.

  10. The development of a scalable parallel 3-D CFD algorithm for turbomachinery. M.S. Thesis Final Report

    Science.gov (United States)

    Luke, Edward Allen

    1993-01-01

    Two algorithms capable of computing a transonic 3-D inviscid flow field about rotating machines are considered for parallel implementation. During the study of these algorithms, a significant new method of measuring the performance of parallel algorithms is developed. The theory that supports this new method creates an empirical definition of scalable parallel algorithms that is used to produce quantifiable evidence that a scalable parallel application was developed. The implementation of the parallel application and an automated domain decomposition tool are also discussed.

  11. Algorithms evaluation for fundus images enhancement

    International Nuclear Information System (INIS)

    Braem, V; Marcos, M; Bizai, G; Drozdowicz, B; Salvatelli, A

    2011-01-01

    Color images of the retina inherently involve noise and illumination artifacts. In order to improve the diagnostic quality of the images, it is desirable to homogenize the non-uniform illumination and increase contrast while preserving color characteristics. The visual results of different pre-processing techniques can be very dissimilar, and it is necessary to make an objective assessment of the techniques in order to select the most suitable one. In this article, the performance of eight algorithms for correcting non-uniform illumination, modifying contrast, and preserving color was evaluated. In order to choose the most suitable, a general score was proposed. The results made a good impression on the experts, although some differences suggest that the image with the best statistical quality is not necessarily the one of best diagnostic quality to the trained doctor's eye. This means that the best pre-processing algorithm for automatic classification may differ from the most suitable one for visual diagnosis. However, both should result in the same final diagnosis.
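
    The article's general score is not reproduced here; a hypothetical weighted-sum score over normalized quality metrics illustrates the idea (the metric names, values, and weights below are invented for illustration):

```python
# Hypothetical composite score for ranking enhancement algorithms; metric
# names, values, and weights are invented, not those proposed in the article.

def general_score(metrics, weights):
    """Weighted average of normalised quality metrics (higher is better)."""
    total = sum(weights.values())
    return sum(weights[m] * metrics[m] for m in weights) / total

candidates = {
    "CLAHE":      {"uniform_illumination": 0.8, "contrast": 0.9, "color_fidelity": 0.6},
    "gray-world": {"uniform_illumination": 0.6, "contrast": 0.5, "color_fidelity": 0.9},
}
weights = {"uniform_illumination": 2.0, "contrast": 2.0, "color_fidelity": 1.0}
best = max(candidates, key=lambda name: general_score(candidates[name], weights))
```

    Re-weighting the metrics is exactly where the automatic-classification and visual-diagnosis use cases could diverge, as the abstract notes.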

  12. A Hybrid CPU/GPU Pattern-Matching Algorithm for Deep Packet Inspection.

    Directory of Open Access Journals (Sweden)

    Chun-Liang Lee

    Full Text Available The large quantities of data now being transferred via high-speed networks have made deep packet inspection indispensable for security purposes. Scalable and low-cost signature-based network intrusion detection systems have been developed for deep packet inspection on various software platforms. Traditional approaches that only involve central processing units (CPUs) are now considered inadequate in terms of inspection speed. Graphics processing units (GPUs) have superior parallel processing power, but transmission bottlenecks can reduce optimal GPU efficiency. In this paper we describe our proposal for a hybrid CPU/GPU pattern-matching algorithm (HPMA) that divides and distributes the packet-inspecting workload between a CPU and a GPU. All packets are initially inspected by the CPU and filtered using a simple pre-filtering algorithm, and packets that might contain malicious content are sent to the GPU for further inspection. Test results indicate that, for random payload traffic, the matching speed of our proposed algorithm was 3.4 times and 2.7 times faster than those of the AC-CPU and AC-GPU algorithms, respectively. Further, HPMA achieved higher energy efficiency than the other tested algorithms.
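
    The two-stage filtering idea can be sketched as follows; here both stages run on the CPU for clarity, the exact matcher stands in for the GPU stage, and the signatures and pre-filter rule are invented for illustration:

```python
# Illustrative two-stage inspection in the spirit of HPMA: a cheap pre-filter
# admits only suspicious packets to the exact matcher (which HPMA runs on the
# GPU; here both stages run on the CPU and the signatures are invented).

SIGNATURES = [b"evil-payload", b"exploit-kit", b"drop table"]
# Pre-filter on 4-byte signature prefixes: it can raise false alarms but can
# never miss a packet that contains a full signature.
PREFIXES = {sig[:4] for sig in SIGNATURES}

def pre_filter(packet: bytes) -> bool:
    """Fast first stage: does any signature prefix occur in the packet?"""
    return any(p in packet for p in PREFIXES)

def full_match(packet: bytes) -> bool:
    """Exact second stage (stand-in for the GPU pattern matcher)."""
    return any(sig in packet for sig in SIGNATURES)

def inspect(packet: bytes) -> bool:
    return pre_filter(packet) and full_match(packet)
```

    Because the first stage never produces false negatives, the split changes only where the work happens, not what is detected, which is what makes the CPU/GPU division safe.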

  13. Design requirements and development of an airborne descent path definition algorithm for time navigation

    Science.gov (United States)

    Izumi, K. H.; Thompson, J. L.; Groce, J. L.; Schwab, R. W.

    1986-01-01

    The design requirements for a 4D path definition algorithm are described. These requirements were developed for the NASA ATOPS program as an extension of the Local Flow Management/Profile Descent algorithm. They specify the processing flow, functional and data architectures, and system input requirements, and recommend the addition of a broad path revision (reinitialization) capability. The document also summarizes algorithm design enhancements and the implementation status of the algorithm on an in-house PDP-11/70 computer. Finally, the requirements for the pilot-computer interfaces, the lateral path processor, and the guidance and steering functions are described.

  14. Autonomous spacecraft landing through human pre-attentive vision

    International Nuclear Information System (INIS)

    Schiavone, Giuseppina; Izzo, Dario; Simões, Luís F; De Croon, Guido C H E

    2012-01-01

    In this work, we exploit a computational model of human pre-attentive vision to guide the descent of a spacecraft on extraterrestrial bodies. Providing the spacecraft with high degrees of autonomy is a challenge for future space missions. Up to present, major effort in this research field has been concentrated in hazard avoidance algorithms and landmark detection, often by reference to a priori maps, ranked by scientists according to specific scientific criteria. Here, we present a bio-inspired approach based on the human ability to quickly select intrinsically salient targets in the visual scene; this ability is fundamental for fast decision-making processes in unpredictable and unknown circumstances. The proposed system integrates a simple model of the spacecraft and optimality principles which guarantee minimum fuel consumption during the landing procedure; detected salient sites are used for retargeting the spacecraft trajectory, under safety and reachability conditions. We compare the decisions taken by the proposed algorithm with that of a number of human subjects tested under the same conditions. Our results show how the developed algorithm is indistinguishable from the human subjects with respect to areas, occurrence and timing of the retargeting. (paper)

  15. Genetic algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts and applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
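
    A minimal generational genetic algorithm, assuming tournament selection, one-point crossover, and bit-flip mutation (one common variant, not the specific software tool described above):

```python
import random

def genetic_algorithm(fitness, n_bits=16, pop_size=30, generations=60,
                      p_mut=0.02, seed=0):
    """Minimal generational GA: tournament selection, one-point crossover,
    bit-flip mutation. `fitness` maps a bit list to a score to maximise."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)                       # crossover point
            child = p1[:cut] + p2[cut:]
            child = [g ^ (rng.random() < p_mut) for g in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# OneMax: the optimum is the all-ones string, fitness 16.
best = genetic_algorithm(sum)
```

    The three operators (selection, crossover, mutation) are the "natural genetics" loop the abstract refers to; everything else is bookkeeping.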

  16. Practical Secure Computation with Pre-Processing

    DEFF Research Database (Denmark)

    Zakarias, Rasmus Winther

    Secure Multiparty Computation has been divided between protocols best suited for binary circuits and protocols best suited for arithmetic circuits. With their MiniMac protocol in [DZ13], Damgård and Zakarias take an important step towards bridging these worlds with an arithmetic protocol tuned... space for pre-processing material than computing the non-linear parts online (depending on the quality of the circuit, of course). Surprisingly, even for our optimized AES circuit this is not the case. We further improve the design of the pre-processing material and end up with only 10 megabytes of pre... a protocol for small-field arithmetic to do fast large-integer multiplications. This is achieved by devising pre-processing material that allows the Toom-Cook multiplication algorithm to run between the parties with linear communication complexity. With this result, computation on the CPU by the parties...

  17. Elementary pre-service teachers' conceptual understanding of dissolving: a Vygotskian concept development perspective

    Science.gov (United States)

    Harrell, Pamela; Subramaniam, Karthigeyan

    2015-09-01

    Background and purpose: The purpose of this study was to investigate and identify the nature and the interrelatedness of pre-service teachers' misconceptions and scientific concepts for explaining dissolving before, during, and after a 5E learning cycle lesson on dissolving, the intervention. Sample, design, and methods: Guided by Vygotsky's theory of concept development, the study focused specifically on the spontaneous concepts and spontaneous pseudo-concepts held by the 61 elementary pre-service teachers during a 15-week science methods course. Data included concept maps, interview transcripts, written artifacts, drawings, and narratives, and were thematically analyzed to classify concepts and their interrelatedness. Results: Results of the study showed that spontaneous pseudo-concepts (1) dominated pre-service teachers' understandings about dissolving throughout the study, and (2) were simply associated with scientific concepts during and after the intervention. Conclusion: Collectively, the results indicated that the pre-service teachers did not acquire a unified system of knowledge about dissolving that could be characterized as abstract, generalizable, and hierarchical. Implications include the need for (1) familiarity with pre-service teachers' prior knowledge about science content; (2) a variety of formative assessments to assess their misconceptions; (3) emphasis on the importance of the dialectical method for concept development during instruction; and (4) skillful content instructors.

  18. Development of real-time plasma analysis and control algorithms for the TCV tokamak using SIMULINK

    International Nuclear Information System (INIS)

    Felici, F.; Le, H.B.; Paley, J.I.; Duval, B.P.; Coda, S.; Moret, J.-M.; Bortolon, A.; Federspiel, L.; Goodman, T.P.; Hommen, G.; Karpushov, A.; Piras, F.; Pitzschke, A.; Romero, J.; Sevillano, G.; Sauter, O.; Vijvers, W.

    2014-01-01

    Highlights: • A new digital control system for the TCV tokamak has been commissioned. • The system is entirely programmable by SIMULINK, allowing rapid algorithm development. • Different control system nodes can run different algorithms at varying sampling times. • The previous control system functions have been emulated and improved. • New capabilities include MHD control, profile control, equilibrium reconstruction. - Abstract: One of the key features of the new digital plasma control system installed on the TCV tokamak is the possibility to rapidly design, test and deploy real-time algorithms. With this flexibility the new control system has been used for a large number of new experiments which exploit TCV's powerful actuators consisting of 16 individually controllable poloidal field coils and 7 real-time steerable electron cyclotron (EC) launchers. The system has been used for various applications, ranging from event-based real-time MHD control to real-time current diffusion simulations. These advances have propelled real-time control to one of the cornerstones of the TCV experimental program. Use of the SIMULINK graphical programming language to directly program the control system has greatly facilitated algorithm development and allowed a multitude of different algorithms to be deployed in a short time. This paper will give an overview of the developed algorithms and their application in physics experiments

  19. Development of a large scale Chimera grid system for the Space Shuttle Launch Vehicle

    Science.gov (United States)

    Pearce, Daniel G.; Stanley, Scott A.; Martin, Fred W., Jr.; Gomez, Ray J.; Le Beau, Gerald J.; Buning, Pieter G.; Chan, William M.; Chiu, Ing-Tsau; Wulf, Armin; Akdag, Vedat

    1993-01-01

    The application of CFD techniques to large problems has dictated the need for large team efforts. This paper offers an opportunity to examine the motivations, goals, needs, and problems, as well as the methods, tools, and constraints, that defined NASA's development of a 111-grid, 16-million-point grid system model for the Space Shuttle Launch Vehicle. The Chimera approach used for domain decomposition encouraged separation of the complex geometry into several major components, each of which was modeled by an autonomous team. ICEM-CFD, a CAD-based grid generation package, simplified the geometry and grid topology definition by providing mature CAD tools and patch-independent meshing. The resulting grid system has, on average, a four-inch resolution along the surface.

  20. Consequences of pre-natal radiation exposure for post-natal development

    International Nuclear Information System (INIS)

    Mole, R.H.

    1982-01-01

    A review of revised observations on Japanese bomb survivors suggests that 10-18 weeks of pregnancy is the period of greatest sensitivity for foetal brain damage leading to severe mental retardation. Severe food deficiencies suggest a cause for the apparently high frequency of severe mental retardation in the unexposed control population and may also have contributed to the dose-dependent increase in those irradiated in utero. The author concludes that there is no confirmed evidence to suggest that the pre-implantation stage of mammalian development is unusually radiosensitive. In the human, the succeeding period of major organogenesis seems to be less sensitive and important than the following 10-18 week period of pregnancy. It is suggested that malformation (teratogenesis) should be distinguished from maldevelopment. Malformations are the result of failure of embryonic organization, and ionizing radiation is not an efficient teratogen in this sense. Maldevelopment after exposure to radiation is the consequence of cell depletion of sufficient degree randomly distributed throughout an irradiated tissue. It is concluded that dose thresholds for maldevelopment are to be expected after irradiation in both pre-implantation and post-implantation stages, and that somatic mutation is a possible mechanism without threshold for developmental damage by pre-natal irradiation, though not likely to be of practical significance. (U.K.)

  1. Online Planning Algorithm

    Science.gov (United States)

    Rabideau, Gregg R.; Chien, Steve A.

    2010-01-01

    AVA v2 software selects goals for execution from a set of goals that oversubscribe shared resources. The term goal refers to a science or engineering request to execute a possibly complex command sequence, such as image targets or ground-station downlinks. Developed as an extension to the Virtual Machine Language (VML) execution system, the software enables onboard and remote goal triggering through the use of an embedded, dynamic goal set that can oversubscribe resources. From the set of conflicting goals, a subset must be chosen that maximizes a given quality metric, which in this case is strict priority selection. A goal can never be pre-empted by a lower priority goal, and high-level goals can be added, removed, or updated at any time, with the "best" goals selected for execution. The software addresses the issue of re-planning that must be performed in a short time frame by the embedded system, where computational resources are constrained. In particular, the algorithm addresses problems with well-defined goal requests, without temporal flexibility, that oversubscribe available resources. By using a fast, incremental algorithm, goal selection can be postponed in a "just-in-time" fashion, allowing requests to be changed or added at the last minute, thereby enabling shorter response times and greater autonomy for the system under control.
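
    The strict-priority selection described above can be illustrated with a minimal greedy sketch: goals are visited in priority order and admitted while their resource demand still fits, so a lower-priority goal can never displace a higher-priority one. This is not the AVA v2 implementation; the goal record layout, the single scalar resource, and the function name are all illustrative assumptions.

```python
def select_goals(goals, capacity):
    """Greedily admit goals in strict priority order (lower number =
    higher priority); a goal is skipped when its resource demand no
    longer fits within the remaining capacity."""
    selected = []
    remaining = capacity
    for goal in sorted(goals, key=lambda g: g["priority"]):
        if goal["demand"] <= remaining:
            selected.append(goal["name"])
            remaining -= goal["demand"]
    return selected

# Hypothetical oversubscribed goal set: total demand 160 vs. capacity 100.
goals = [
    {"name": "downlink", "priority": 1, "demand": 40},
    {"name": "image_A", "priority": 2, "demand": 70},
    {"name": "image_B", "priority": 3, "demand": 50},
]
print(select_goals(goals, capacity=100))
```

    Because the scan is a single pass over a priority-sorted list, re-running it after a late goal addition or update is cheap, which is the property the abstract's "just-in-time" selection relies on.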

  2. Impacts of Launch Vehicle Fairing Size on Human Exploration Architectures

    Science.gov (United States)

    Jefferies, Sharon; Collins, Tim; Dwyer Cianciolo, Alicia; Polsgrove, Tara

    2017-01-01

    Human missions to Mars, particularly to the Martian surface, are grand endeavors that place extensive demands on ground infrastructure, launch capabilities, and mission systems. The interplay of capabilities and limitations among these areas can have significant impacts on the costs and ability to conduct Mars missions and campaigns. From a mission and campaign perspective, decisions that affect element designs, including those based on launch vehicle and ground considerations, can create effects that ripple through all phases of the mission and have significant impact on the overall campaign. These effects result in impacts to element designs and performance, launch and surface manifesting, and mission operations. In current Evolvable Mars Campaign concepts, the NASA Space Launch System (SLS) is the primary launch vehicle for delivering crew and payloads to cis-lunar space. SLS is currently developing an 8.4m diameter cargo fairing, with a planned upgrade to a 10m diameter fairing in the future. Fairing diameter is a driving factor that impacts many aspects of system design, vehicle performance, and operational concepts. It creates a ripple effect that influences all aspects of a Mars mission, including: element designs, grounds operations, launch vehicle design, payload packaging on the lander, launch vehicle adapter design to meet structural launch requirements, control and thermal protection during entry and descent at Mars, landing stability, and surface operations. Analyses have been performed in each of these areas to assess and, where possible, quantify the impacts of fairing diameter selection on all aspects of a Mars mission. Several potential impacts of launch fairing diameter selection are identified in each of these areas, along with changes to system designs that result. Solutions for addressing these impacts generally result in increased systems mass and propellant needs, which can further exacerbate packaging and flight challenges. This paper

  3. Hypervelocity Launching and Frozen Fuels as a Major Contribution to Spaceflight

    Science.gov (United States)

    Cocks, F. H.; Harman, C. M.; Klenk, P. A.; Simmons, W. N.

    Acting as a virtual first stage, a hypervelocity launch together with the use of frozen hydrogen/frozen oxygen propellant, offers a Single-Stage-To-Orbit (SSTO) system that promises an enormous increase in SSTO mass-ratio. Ram acceleration provides hypervelocity (2 km/sec) to the orbital vehicle with a gas gun supplying the initial velocity required for ram operation. The vehicle itself acts as the center body of a ramjet inside a launch tube, filled with gaseous fuel and oxidizer, acting as an engine cowling. The high acceleration needed to achieve hypervelocity precludes a crew, and it would require greatly increased liquid fuel tank structural mass if a liquid propellant is used for post-launch vehicle propulsion. Solid propellants do not require as much fuel- chamber strengthening to withstand a hypervelocity launch as do liquid propellants, but traditional solid fuels have lower exhaust velocities than liquid hydrogen/liquid oxygen. The shock-stability of frozen hydrogen/frozen oxygen propellant has been experimentally demonstrated. A hypervelocity launch system using frozen hydrogen/frozen oxygen propellant would be a revolutionary new development in spaceflight.

  4. The Day-1 GPM Combined Precipitation Algorithm: IMERG

    Science.gov (United States)

    Huffman, G. J.; Bolvin, D. T.; Braithwaite, D.; Hsu, K.; Joyce, R.; Kidd, C.; Sorooshian, S.; Xie, P.

    2012-12-01

    The Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (GPM) mission (IMERG) algorithm will provide the at-launch combined-sensor precipitation dataset being produced by the U.S. GPM Science Team. IMERG is being developed as a unified U.S. algorithm that takes advantage of strengths in three current U.S. algorithms: - the TRMM Multi-satellite Precipitation Analysis (TMPA), which addresses inter-satellite calibration of precipitation estimates and monthly scale combination of satellite and gauge analyses; - the CPC Morphing algorithm with Kalman Filtering (KF-CMORPH), which provides quality-weighted time interpolation of precipitation patterns following storm motion; and - the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks using a Cloud Classification System (PERSIANN-CCS), which provides a neural-network-based scheme for generating microwave-calibrated precipitation estimates from geosynchronous infrared brightness temperatures, and filters out some non-raining cold clouds. The goal is to provide a long-term, fine-scale record of global precipitation from the entire constellation of precipitation-relevant satellite sensors, with input from surface precipitation gauges. The record will begin January 1998 at the start of the Tropical Rainfall Measuring Mission (TRMM) and extend as GPM records additional data. Although homogeneity is considered desirable, the use of diverse and evolving data sources works against the strict long-term homogeneity that characterizes a Climate Data Record (CDR). 
This talk will briefly review the design requirements for IMERG, including multiple runs at different latencies (most likely around 4 hours, 12 hours, and 2 months after observation time), various intermediate data fields as part of the IMERG data file, and the plans to bring up IMERG with calibration by TRMM initially, transitioning to GPM when its individual-sensor precipitation algorithms are fully functional
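
    The quality-weighted time interpolation credited above to KF-CMORPH combines forward- and backward-propagated microwave precipitation estimates, weighting each inversely by its error variance. The scalar sketch below illustrates only that weighting idea; the function name, scalar simplification, and variance inputs are assumptions, not the operational algorithm.

```python
def kalman_weighted_interpolation(forward_est, forward_var,
                                  backward_est, backward_var):
    """Combine a forward-propagated and a backward-propagated estimate,
    weighting each by the inverse of its error variance, so the less
    uncertain estimate dominates the interpolated value."""
    w_f = 1.0 / forward_var
    w_b = 1.0 / backward_var
    return (w_f * forward_est + w_b * backward_est) / (w_f + w_b)

# Equal uncertainty: plain average. Unequal: the better estimate dominates.
print(kalman_weighted_interpolation(2.0, 1.0, 4.0, 1.0))
print(kalman_weighted_interpolation(2.0, 1.0, 4.0, 3.0))
```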

  5. Unified C/VHDL Model Generation of FPGA-based LHCb VELO algorithms

    CERN Document Server

    Muecke, Manfred

    2007-01-01

    We show an alternative design approach for signal processing algorithms implemented on FPGAs. Instead of writing VHDL code for implementation and maintaining a C-model for algorithm simulation, we derive both models from one common source, allowing generation of synthesizable VHDL and cycle- and bit-accurate C code. We have tested our approach on the LHCb VELO pre-processing algorithms and report on experiences gained during the course of our work.

  6. Progressive geometric algorithms

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Bagautdinov, T.M.; de Berg, M.T.; Bouts, Q.W.; ten Brink, Alex P.; Buchin, K.A.; Westenberg, M.A.

    2015-01-01

    Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms
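
    A minimal sketch of the progressive idea: the algorithm emits intermediate solutions that approximate the complete answer increasingly well as more input is consumed. Here the axis-aligned bounding box of a point set is refined batch by batch; the function name and batching scheme are illustrative assumptions, not taken from the paper.

```python
def progressive_bbox(points, batch=2):
    """Yield successively refined bounding boxes (xmin, ymin, xmax, ymax)
    after each batch of points; the last yield is the exact answer."""
    xmin = ymin = float("inf")
    xmax = ymax = float("-inf")
    for i, (x, y) in enumerate(points, 1):
        xmin, xmax = min(xmin, x), max(xmax, x)
        ymin, ymax = min(ymin, y), max(ymax, y)
        if i % batch == 0 or i == len(points):
            yield (xmin, ymin, xmax, ymax)

for box in progressive_bbox([(0, 0), (2, 1), (1, 3), (5, 4)], batch=2):
    print(box)  # each intermediate box is contained in the final one
```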

  7. Progressive geometric algorithms

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Bagautdinov, T.M.; Berg, de M.T.; Bouts, Q.W.; Brink, ten A.P.; Buchin, K.; Westenberg, M.A.

    2014-01-01

    Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms

  8. Advanced Oil Spill Detection Algorithms For Satellite Based Maritime Environment Monitoring

    Science.gov (United States)

    Radius, Andrea; Azevedo, Rui; Sapage, Tania; Carmo, Paulo

    2013-12-01

    In recent years, the increasing occurrence of pollution and the alarming deterioration of the environmental health of the sea have led to the need for global monitoring capabilities, namely for marine environment management in terms of oil spill detection and indication of the suspected polluter. The sensitivity of Synthetic Aperture Radar (SAR) to different phenomena on the sea, especially oil spills and vessels, makes it a key instrument for global pollution monitoring. The SAR performance in maritime pollution monitoring is being operationally explored by a set of service providers on behalf of the European Maritime Safety Agency (EMSA), which launched the CleanSeaNet (CSN) project, a pan-European satellite-based oil monitoring service, in 2007. EDISOFT, a CSN service provider from the beginning, is continuously investing in R&D activities that will ultimately lead to better algorithms and better performance in oil spill detection from SAR imagery. This strategy is being pursued through EDISOFT's participation in the FP7 EC Sea-U project and in the Automatic Oil Spill Detection (AOSD) ESA project. The Sea-U project aims to improve the current state of oil spill detection algorithms through maximization of informative content obtained with data fusion, the exploitation of different types of data/sensors, and the development of advanced image processing, segmentation and classification techniques. The AOSD project is closely related to the operational segment, because it is focused on the automation of the oil spill detection processing chain, integrating auxiliary data, such as wind information, together with image and geometry analysis techniques. 
The synergy between these different objectives (R&D versus operational) allowed EDISOFT to develop oil spill detection software, that combines the operational automatic aspect, obtained through dedicated integration of the processing chain in the existing open source NEST
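
    Dark-spot screening, the usual first step in SAR-based oil spill detection (slicks damp capillary waves, so they appear dark against the surrounding backscatter), can be sketched with a crude global threshold. This toy detector is an assumption for illustration only, not EDISOFT's algorithm; operational chains use adaptive, locally varying thresholds followed by segmentation and classification.

```python
def dark_spot_mask(image, k=1.5):
    """Flag pixels whose backscatter falls more than k standard
    deviations below the scene mean (candidate dark spots)."""
    pixels = [p for row in image for p in row]
    n = len(pixels)
    mean = sum(pixels) / n
    std = (sum((p - mean) ** 2 for p in pixels) / n) ** 0.5
    thresh = mean - k * std
    return [[p < thresh for p in row] for row in image]

# Synthetic 3x3 "scene": bright sea clutter with one dark pixel.
scene = [[100, 100, 100], [100, 10, 100], [100, 100, 100]]
print(dark_spot_mask(scene))
```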

  9. Stimulation of development of notion about syntax in pre-school children

    Directory of Open Access Journals (Sweden)

    Nikolić Mirjana

    2009-01-01

    Full Text Available This paper presents a part of the research whose goal was to study the notion about syntax as one of the meta-linguistic abilities that contributes to the adoption of reading. The research comprised two hundred children of pre-school age, divided into two groups balanced according to gender, intelligence and socioeconomic status. The research was conducted by an experimental test-retest method. In the initial measuring, the experimental and control groups were given a list comprising three kinds of experimental tasks for determining the level of development of the notion about syntax, constructed by the author of the research. The experimental program consisted of tasks for stimulation of the development of the notion about syntax, which children practiced over the course of ten days (up to 30 minutes a day), with the help of previously trained pre-school teachers. After the ten-day training, final measuring was performed in both groups of respondents, by a parallel form of tasks. The goal of the research was to determine whether it is possible to encourage the development of the notion about syntax in children of pre-school age by systematic practice. The results of the final measuring indicate that in both the experimental and the control group there were significant improvements with respect to the development of the notion about syntax, and that the number of answers in which judgement was based on the semantic criterion (experience and meaning) was significantly reduced. In making judgements based on consequences (the content of the sentence points to something which is a good or not a good thing to do, moral or immoral) there were no significant differences in the final compared to the initial measuring in both groups. Significant differences in the retest were found in making judgements based on meaning. Mere experience with the test material at pre-school age brings about improvement of the notion about language, and practice contributes considerably to shifting the

  10. Performance Efficient Launch Vehicle Recovery and Reuse

    Science.gov (United States)

    Reed, John G.; Ragab, Mohamed M.; Cheatwood, F. McNeil; Hughes, Stephen J.; Dinonno, J.; Bodkin, R.; Lowry, Allen; Brierly, Gregory T.; Kelly, John W.

    2016-01-01

    For decades, economic reuse of launch vehicles has been an elusive goal. Recent attempts at demonstrating elements of launch vehicle recovery for reuse have invigorated a debate over the merits of different approaches. The parameter most often used to assess the cost of access to space is dollars-per-kilogram to orbit. When comparing reusable vs. expendable launch vehicles, that ratio has been shown to be most sensitive to the performance lost as a result of enabling the reusability. This paper will briefly review the historical background and results of recent attempts to recover launch vehicle assets for reuse. The business case for reuse will be reviewed, with emphasis on the performance expended to recover those assets, and the practicality of the most ambitious reuse concept, namely propulsive return to the launch site. In 2015, United Launch Alliance (ULA) announced its Sensible, Modular, Autonomous Return Technology (SMART) reuse plan for recovery of the booster module for its new Vulcan launch vehicle. That plan employs a non-propulsive approach where atmospheric entry, descent and landing (EDL) technologies are utilized. Elements of such a system have a wide variety of applications, from recovery of launch vehicle elements in suborbital trajectories all the way to human space exploration. This paper will include an update on ULA's booster module recovery approach, which relies on Hypersonic Inflatable Aerodynamic Decelerator (HIAD) and Mid-Air Retrieval (MAR) technologies, including its concept of operations (ConOps). The HIAD design, as well as parafoil staging and MAR concepts, will be discussed. Recent HIAD development activities and near term plans including scalability, next generation materials for the inflatable structure and heat shield, and gas generator inflation systems will be provided. 
MAR topics will include the ConOps for recovery, helicopter selection and staging, and the state of the art of parachute recovery systems using large parafoils

  11. Wikis: Developing Pre-Service Teachers' Leadership Skills and Knowledge of Content Standards

    Science.gov (United States)

    Reid-Griffin, Angelia; Slaten, Kelli M.

    2016-01-01

    In this initial phase of our multi-year research study we set out to explore the development of leadership skills in our pre-service secondary teachers after using an online wiki, Wikispaces. This paper presents our methods for preparing a group of 13 mathematics and 3 science secondary pre-service teachers to demonstrate the essential knowledge,…

  12. Safety Culture in Pre-operational Phases of Nuclear Power Plant Projects

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-09-15

    An abundance of information exists on safety culture related to the operational phases of nuclear power plants; however, pre-operational phases present unique challenges. This publication focuses on safety culture during pre-operational phases that span the interval from before a decision to launch a nuclear power programme to first fuel load. It provides safety culture insights and focuses on eight generic issues: safety culture understanding; multicultural aspects; leadership; competencies and resource competition; management systems; learning and feedback; cultural assessments; and communication. Each issue is discussed in terms of: specific challenges; desired state; approaches and methods; and examples and resources. This publication will be of interest to newcomers and experienced individuals faced with the opportunities and challenges inherent in safety culture programmes aimed at pre-operational activities.

  13. Safety Culture in Pre-operational Phases of Nuclear Power Plant Projects

    International Nuclear Information System (INIS)

    2012-01-01

    An abundance of information exists on safety culture related to the operational phases of nuclear power plants; however, pre-operational phases present unique challenges. This publication focuses on safety culture during pre-operational phases that span the interval from before a decision to launch a nuclear power programme to first fuel load. It provides safety culture insights and focuses on eight generic issues: safety culture understanding; multicultural aspects; leadership; competencies and resource competition; management systems; learning and feedback; cultural assessments; and communication. Each issue is discussed in terms of: specific challenges; desired state; approaches and methods; and examples and resources. This publication will be of interest to newcomers and experienced individuals faced with the opportunities and challenges inherent in safety culture programmes aimed at pre-operational activities.

  14. New Product Launching Ideas

    Science.gov (United States)

    Kiruthika, E.

    2012-09-01

    Launching a new product can be a tense time for a small or large business. There are those moments when you wonder if all of the work done to develop the product will pay off in revenue, but there are many things one can do to help increase the likelihood of a successful product launch. An open-minded, consumer-oriented approach is imperative in today's diverse global marketplace so a firm can identify and serve its target market, minimize dissatisfaction, and stay ahead of competitors. Final consumers purchase for personal, family, or household use. Finally, there is the kind of information that the marketing team needs to provide customers in different buying situations. In high-involvement decisions, the marketer needs to provide a good deal of information about the positive consequences of buying. The sales force may need to stress the important attributes of the product and the advantages compared with the competition, and maybe even encourage "trial" or "sampling" of the product in the hope of securing the sale. The final stage is the post-purchase evaluation of the decision. It is common for customers to experience concerns after making a purchase decision. This arises from a concept that is known as "cognitive dissonance".

  15. Prosthetic joint infection development of an evidence-based diagnostic algorithm.

    Science.gov (United States)

    Mühlhofer, Heinrich M L; Pohlig, Florian; Kanz, Karl-Georg; Lenze, Ulrich; Lenze, Florian; Toepfer, Andreas; Kelch, Sarah; Harrasser, Norbert; von Eisenhart-Rothe, Rüdiger; Schauwecker, Johannes

    2017-03-09

    Increasing rates of prosthetic joint infection (PJI) have presented challenges for general practitioners, orthopedic surgeons and the health care system in recent years. The diagnosis of PJI is complex; multiple diagnostic tools are used in the attempt to correctly diagnose PJI. Evidence-based algorithms can help to identify PJI using standardized diagnostic steps. We reviewed relevant publications between 1990 and 2015 using a systematic literature search in MEDLINE and PUBMED. The selected search results were then classified into levels of evidence. The keywords were prosthetic joint infection, biofilm, diagnosis, sonication, antibiotic treatment, implant-associated infection, Staph. aureus, rifampicin, implant retention, pcr, maldi-tof, serology, synovial fluid, c-reactive protein level, total hip arthroplasty (THA), total knee arthroplasty (TKA) and combinations of these terms. From an initial 768 publications, 156 publications were stringently reviewed. Publications with class I-III recommendations (EAST) were considered. We developed an algorithm for the diagnostic approach to display the complex diagnosis of PJI in a clear and logically structured process according to ISO 5807. The evidence-based standardized algorithm combines modern clinical requirements and evidence-based treatment principles. The algorithm provides a detailed transparent standard operating procedure (SOP) for diagnosing PJI. Thus, consistently high, examiner-independent process quality is assured to meet the demands of modern quality management in PJI diagnosis.

  16. Development of a Refined Space Vehicle Rollout Forcing Function

    Science.gov (United States)

    James, George; Tucker, Jon-Michael; Valle, Gerard; Grady, Robert; Schliesing, John; Fahling, James; Emory, Benjamin; Armand, Sasan

    2016-01-01

    For several decades, American manned spaceflight vehicles and the associated launch platforms have been transported from final assembly to the launch pad via a pre-launch phase called rollout. The rollout environment is rich with forced harmonics and higher-order effects, which can be used for extracting structural dynamics information. To enable this utilization, processing tools are needed to move from measured and analytical data to dynamic metrics such as transfer functions, mode shapes, modal frequencies, and damping. This paper covers the range of systems and tests that are available to estimate rollout forcing functions for the Space Launch System (SLS). The specific information covered in this paper includes: the different definitions of rollout forcing functions; the operational and developmental data sets that are available; the suite of analytical processes that are currently in place or in development; and the plans and future work underway to solve two immediate problems related to rollout forcing functions. Problem 1 involves estimating enforced accelerations to drive finite element models for developing design requirements for the SLS class of launch vehicles. Problem 2 involves processing rollout measured data in near real time to understand the structural dynamics properties of a specific vehicle and the class to which it belongs.

  17. Fast neutron radiography testing for components of launch vehicles by a baby-cyclotron

    International Nuclear Information System (INIS)

    Ikeda, Y.; Ohkubo, K.; Matsumoto, G.; Nakamura, T.; Nozaki, Y.; Wakasa, S.; Toda, Y.; Kato, T.

    1990-01-01

    Recently, neutron radiography (NR) has become an important means of nondestructive testing (NDT) in Japan. In particular, thermal neutron radiography testing (NRT) has been used for the NDT of various explosive devices of launch vehicles, which are developed under the H-series program by the National Space Development Agency (NASDA) of Japan. The NRT for launch vehicles has been carried out at the NR facility of a baby-cyclotron. In the NRT a conventional film method based on silver-halide emulsion has been exclusively employed to inspect various testing objects including components, and many valuable results have so far been obtained successfully. Recently, however, the launch vehicles have become much larger, and with them the parts used have also become larger and thicker. One main disadvantage of NRT with thermal neutrons is their relatively weak penetration of objects, because their energy is low. With conventional thermal neutron radiography (TNR), steel objects thicker than 40 to 50 mm are difficult to test because scattered neutrons obscure the real image of the object. Consequently, a new method of NRT should be developed in place of TNR and applied to the new components of H-2 launch vehicles. To cope with this requirement, fast neutron radiography (FNR) has been studied for testing the new components of H-2, such as large separation bolts

  18. Algorithms for Cytoplasm Segmentation of Fluorescence Labelled Cells

    OpenAIRE

    Carolina Wählby; Joakim Lindblad; Mikael Vondrus; Ewert Bengtsson; Lennart Björkesten

    2002-01-01

    Automatic cell segmentation has various applications in cytometry, and while the nucleus is often very distinct and easy to identify, the cytoplasm provides a lot more challenge. A new combination of image analysis algorithms for segmentation of cells imaged by fluorescence microscopy is presented. The algorithm consists of an image pre-processing step, a general segmentation and merging step followed by a segmentation quality measurement. The quality measurement consists of a statistical ana...

  19. Vandenberg Air Force Base Upper Level Wind Launch Weather Constraints

    Science.gov (United States)

    Shafer, Jaclyn A.; Wheeler, Mark M.

    2012-01-01

    The 30th Operational Support Squadron Weather Flight (30 OSSWF) provides comprehensive weather services to the space program at Vandenberg Air Force Base (VAFB) in California. One of their responsibilities is to monitor upper-level winds to ensure safe launch operations of the Minuteman III ballistic missile. The 30 OSSWF tasked the Applied Meteorology Unit (AMU) to analyze VAFB sounding data with the goal of determining the probability of violating (PoV) their upper-level thresholds for wind speed and shear constraints specific to this launch vehicle, and to develop a tool that will calculate the PoV of each constraint on the day of launch. In order to calculate the probability of exceeding each constraint, the AMU collected and analyzed historical data from VAFB. The historical sounding data were retrieved from the National Oceanic and Atmospheric Administration Earth System Research Laboratory archive for the years 1994-2011 and then stratified into four sub-seasons: January-March, April-June, July-September, and October-December. The maximum wind speed and 1000-ft shear values for each sounding in each subseason were determined. To accurately calculate the PoV, the AMU determined the theoretical distributions that best fit the maximum wind speed and maximum shear datasets. Ultimately it was discovered that the maximum wind speeds follow a Gaussian distribution while the maximum shear values follow a lognormal distribution. These results were applied when calculating the averages and standard deviations needed for the historical and real-time PoV calculations. In addition to the requirements outlined in the original task plan, the AMU also included forecast sounding data from the Rapid Refresh model. This information provides further insight for the launch weather officers (LWOs) when determining if a wind constraint violation will occur over the next few hours on day of launch. The interactive graphical user interface (GUI) for this project was developed in
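
    Given the fitted distributions, the probability of violating a threshold is one minus the corresponding cumulative distribution function. The sketch below follows the stated assumptions (Gaussian maximum wind speed, lognormal maximum shear) using only the standard library; all parameter values are illustrative, not VAFB climatology.

```python
import math

def gaussian_exceedance(threshold, mu, sigma):
    """P(X > threshold) for X ~ Normal(mu, sigma), via the
    complementary error function."""
    return 0.5 * math.erfc((threshold - mu) / (sigma * math.sqrt(2.0)))

def lognormal_exceedance(threshold, mu_log, sigma_log):
    """P(X > threshold) for X lognormal, where ln(X) ~ Normal(mu_log,
    sigma_log)."""
    z = (math.log(threshold) - mu_log) / (sigma_log * math.sqrt(2.0))
    return 0.5 * math.erfc(z)

# Illustrative numbers only: PoV for a 40-kt wind constraint given a
# fitted mean of 30 kt and standard deviation of 5 kt.
print(gaussian_exceedance(40.0, mu=30.0, sigma=5.0))
```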

  20. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    Science.gov (United States)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard DRMs will be used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
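
    The histogram-based signal finding described above can be sketched as follows: photon event times are binned, and bins whose counts exceed the expected Poisson background by several standard deviations are flagged as signal. The bin width, background rate, and z-score cutoff below are illustrative assumptions, not the flight algorithm's parameters.

```python
import math
from collections import Counter

def significant_bins(event_times, bin_width, noise_rate, z=3.0):
    """Histogram event times and return bins whose counts exceed the
    expected background by z standard deviations (for Poisson noise,
    sigma = sqrt(mean))."""
    expected = noise_rate * bin_width
    threshold = expected + z * math.sqrt(expected)
    counts = Counter(int(t // bin_width) for t in event_times)
    return sorted(b for b, c in counts.items() if c > threshold)

# Sparse background events plus a correlated cluster near t = 5.
times = [0.1, 1.2, 2.3, 3.4, 5.1, 5.2, 5.3, 5.4, 5.5, 5.6]
print(significant_bins(times, bin_width=1.0, noise_rate=1.0))
```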

  1. Development of algorithm for continuous generation of a computer game in terms of usability and optimization of developed code in computer science

    Directory of Open Access Journals (Sweden)

    Tibor Skala

    2018-03-01

    Full Text Available As both hardware and software have become increasingly available and are constantly developed, they contribute globally to improvements in every field of technology and the arts. Digital tools for the creation and processing of graphical content are highly developed and are designed to shorten the time required for content creation, which is, in this case, animation. Since contemporary animation has experienced a surge in various visual styles and visualization methods, programming is built into everything that is currently in use. There is no doubt that a variety of algorithms and software are the brain and the moving force behind any idea created for a specific purpose and applicability in society. Art and technology combined make a direct and oriented medium for publishing and marketing in every industry, including those not necessarily closely related to those that rely heavily on the visual aspect of work. Additionally, the quality and consistency of an algorithm will depend on its proper integration into the system it powers as well as on the way the algorithm is designed. The development of an endless algorithm and its effective use are demonstrated through the use of a computer game. In order to present the effect of various parameters, in the final phase of the computer game development the endless algorithm was tested with a varying number of key input parameters (time achieved, score reached, pace of the game).

  2. A comparison of three self-tuning control algorithms developed for the Bristol-Babcock controller

    International Nuclear Information System (INIS)

    Tapp, P.A.

    1992-04-01

    A brief overview of adaptive control methods relating to the design of self-tuning proportional-integral-derivative (PID) controllers is given. The methods discussed include gain scheduling, self-tuning, auto-tuning, and model-reference adaptive control systems. Several process identification and parameter adjustment methods are discussed. Characteristics of the two most common types of self-tuning controllers implemented by industry (i.e., pattern recognition and process identification) are summarized. The substance of the work is a comparison of three self-tuning proportional-plus-integral (STPI) control algorithms developed to work in conjunction with the Bristol-Babcock PID control module. The STPI control algorithms are based on closed-loop cycling theory, pattern recognition theory, and model-based theory. A brief theory of operation of these three STPI control algorithms is given. Details of the process simulations developed to test the STPI algorithms are given, including an integrating process, a first-order system, a second-order system, a system with initial inverse response, and a system with variable time constant and delay. The STPI algorithms' performance with regard to both setpoint changes and load disturbances is evaluated, and their robustness is compared. The dynamic effects of process deadtime and noise are also considered. Finally, the limitations of each of the STPI algorithms are discussed, some conclusions are drawn from the performance comparisons, and a few recommendations are made. 6 refs
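
    As a hedged illustration of the closed-loop cycling family mentioned above, the classic Ziegler-Nichols rules convert a measured ultimate gain and ultimate period into PI settings, which then drive an ordinary PI control law. The code below is a textbook sketch under those assumptions, not any of the three Bristol-Babcock STPI algorithms from the report.

```python
def zn_pi_gains(Ku, Tu):
    """Classic Ziegler-Nichols PI tuning from the ultimate gain Ku and
    ultimate oscillation period Tu found by closed-loop cycling."""
    Kp = 0.45 * Ku
    Ti = Tu / 1.2  # integral (reset) time
    return Kp, Ti

class PIController:
    """Minimal discrete PI controller in standard (series) form."""
    def __init__(self, Kp, Ti, dt):
        self.Kp, self.Ti, self.dt = Kp, Ti, dt
        self.integral = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        return self.Kp * (error + self.integral / self.Ti)

Kp, Ti = zn_pi_gains(Ku=2.0, Tu=10.0)  # illustrative plant values
controller = PIController(Kp, Ti, dt=0.1)
print(controller.update(setpoint=1.0, measurement=0.0))
```

    A self-tuner wraps this loop: it re-estimates Ku and Tu (or a process model) online and recomputes the gains, which is the step the three compared STPI algorithms perform in different ways.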

  3. Three-dimensional development of tensile pre-strained annulus fibrosus cells for tissue regeneration: An in-vitro study

    Energy Technology Data Exchange (ETDEWEB)

    Chuah, Yon Jin [School of Chemical and Biomedical Engineering, Nanyang Technological University, 62 Nanyang Drive, Singapore 637459 (Singapore); Lee, Wu Chean [University Hospital Conventry & Warwickshire NHS Trust, Clifford Bridge Road, West Midlands CV2, 2DX (United Kingdom); Wong, Hee Kit [Department of Orthopedic Surgery, National University Health System, NUHS Tower Block Level 11, 1E Kent Ridge Road, Singapore 119228 (Singapore); Kang, Yuejun, E-mail: yuejun.kang@ntu.edu.sg [School of Chemical and Biomedical Engineering, Nanyang Technological University, 62 Nanyang Drive, Singapore 637459 (Singapore); Hee, Hwan Tak, E-mail: HTHee@ntu.edu.sg [School of Chemical and Biomedical Engineering, Nanyang Technological University, 62 Nanyang Drive, Singapore 637459 (Singapore); Pinnacle Spine & Scoliosis Centre, 3 Mount Elizabeth, Mount Elizabeth Medical Centre, #04-07, Singapore 228510 (Singapore); School of Physical and Mathematical Sciences, Nanyang Technological University, 21 Nanyang Link, Singapore 637459 (Singapore)

    2015-02-01

    For the past decade, research has investigated the immediate response of annulus fibrosus (AF) cells to applied tensile strain. Although mechanical strain can produce either catabolic or anabolic consequences in the cell monolayer, little is known about how to translate these findings into further tissue engineering applications. To date, the application and effect of tensile pre-strained cells in constructing a three-dimensional (3D) AF tissue remain unknown. This study aims to investigate the effect of tensile pre-strain exposure of 1 to 24 h on the development of AF pellet culture over 3 weeks. Equibiaxial cyclic tensile strain was applied to AF monolayer cells over a period of 24 h, which were subsequently developed into a cell pellet. Investigation of cellular proliferation, phenotypic gene expression, and histological changes revealed that tensile pre-strain for 24 h had a significant and lasting effect on AF tissue development, with enhanced cell proliferation and up-regulation of collagen type I, II, and aggrecan expression. Our results demonstrated the regenerative ability of AF cell pellets subjected to 24 h of tensile pre-straining. Knowledge of the effects of tensile pre-strain exposure is necessary to optimize AF development for tissue reconstruction. Moreover, the tensile pre-strained cells may further be utilized either in cell therapy to treat mild disc degeneration disease, or in the development of a disc construct for total disc replacement. - Highlights: • Establishment of tensile pre-strained cell line population for annulus development. • Tensile strain limits the decline of collagen gene expression in monolayer culture. • Tensile pre-strained cells up-regulate their matrix protein in 3D pellet culture.

  4. A Developed ESPRIT Algorithm for DOA Estimation

    Science.gov (United States)

    Fayad, Youssef; Wang, Caiyun; Cao, Qunsheng; Hafez, Alaa El-Din Sayed

    2015-05-01

    A novel algorithm for estimating the direction of arrival (DOAE) of a target has been developed, aiming to increase the accuracy of the estimation process and decrease its calculation cost. It introduces time and space multiresolution into the Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT) method (TS-ESPRIT) to realize a subspace approach that decreases the errors caused by the model's nonlinearity. The efficacy of the proposed algorithm is verified using Monte Carlo simulation; the DOAE accuracy is evaluated against the closed-form Cramér-Rao bound (CRB), which reveals that the proposed algorithm's estimates are better than those of the standard ESPRIT methods, enhancing estimator performance.
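
The standard ESPRIT step that TS-ESPRIT builds on can be sketched for a uniform linear array. This is the textbook single-resolution version, not the paper's TS-ESPRIT:

```python
import numpy as np

def esprit_doa(X, n_sources, d=0.5):
    """Textbook ESPRIT for a uniform linear array with element spacing d
    (in wavelengths). X holds snapshots, shape (n_elements, n_samples)."""
    R = X @ X.conj().T / X.shape[1]            # sample covariance matrix
    _, eigvecs = np.linalg.eigh(R)             # eigenvalues in ascending order
    Es = eigvecs[:, -n_sources:]               # signal subspace
    # Rotational invariance between the two overlapping subarrays
    Phi = np.linalg.pinv(Es[:-1, :]) @ Es[1:, :]
    phases = np.angle(np.linalg.eigvals(Phi))
    return np.arcsin(phases / (2 * np.pi * d))  # DOAs in radians
```

For a noiseless single source the recovered angle matches the steering-vector phase increment exactly; the multiresolution variant in the paper refines this estimate across time and space scales.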

  5. Hail Disrometer Array for Launch Systems Support

    Science.gov (United States)

    Lane, John E.; Sharp, David W.; Kasparis, Takis C.; Doesken, Nolan J.

    2008-01-01

    Prior to launch, the space shuttle might be described as a very large thermos bottle containing substantial quantities of cryogenic fuels. Because thermal insulation is a critical design requirement, the external wall of the launch vehicle fuel tank is covered with an insulating foam layer. This foam is fragile and can be damaged by very minor impacts, such as that from small- to medium-size hail, which may go unnoticed. In May 1999, hail damage to the top of the External Tank (ET) of STS-96 required a rollback from the launch pad to the Vehicle Assembly Building (VAB) for repair of the insulating foam. Because of the potential for hail damage to the ET while exposed to the weather, a vigilant hail sentry system using impact transducers was developed as a hail damage warning system and to record and quantify hail events. The Kennedy Space Center (KSC) Hail Monitor System, a joint effort of the NASA and University Affiliated Spaceport Technology Development Contract (USTDC) Physics Labs, was first deployed for operational testing in the fall of 2006. Volunteers from the Community Collaborative Rain, Hail, and Snow Network (CoCoRaHS) in conjunction with Colorado State University were and continue to be active in testing duplicate hail monitor systems at sites in the hail-prone high plains of Colorado. The KSC Hail Monitor System (HMS), consisting of three stations positioned approximately 500 ft from the launch pad and forming an approximate equilateral triangle (see Figure 1), was deployed to Pad 39B for support of STS-115. Two months later, the HMS was deployed to Pad 39A for support of STS-116. During support of STS-117 in late February 2007, an unusual hail event occurred in the immediate vicinity of the exposed space shuttle and launch pad. Hail data of this event was collected by the HMS and analyzed. Support of STS-118 revealed another important application of the hail monitor system. Ground Instrumentation personnel check the hail monitors daily when a

  6. The international trade in launch services : the effects of U.S. laws, policies and practices on its development

    NARCIS (Netherlands)

    Fenema, van H.P.

    1999-01-01

    Rockets or launch vehicles, though sharing the same technology, have both military and civil applications: they can be used as missiles or as 'ordinary' transportation vehicles. As a consequence, national security and foreign policy considerations stand in the way of the international launch

  7. Development and comparisons of wind retrieval algorithms for small unmanned aerial systems

    Science.gov (United States)

    Bonin, T. A.; Chilson, P. B.; Zielke, B. S.; Klein, P. M.; Leeman, J. R.

    2012-12-01

    Recently, there has been an increase in use of Unmanned Aerial Systems (UASs) as platforms for conducting fundamental and applied research in the lower atmosphere due to their relatively low cost and ability to collect samples with high spatial and temporal resolution. Concurrent with this development comes the need for accurate instrumentation and measurement methods suitable for small meteorological UASs. Moreover, the instrumentation to be integrated into such platforms must be small and lightweight. Whereas thermodynamic variables can be easily measured using well aspirated sensors onboard, it is much more challenging to accurately measure the wind with a UAS. Several algorithms have been developed that incorporate GPS observations as a means of estimating the horizontal wind vector, with each algorithm exhibiting its own particular strengths and weaknesses. In the present study, the performance of three such GPS-based wind-retrieval algorithms has been investigated and compared with wind estimates from rawinsonde and sodar observations. Each of the algorithms considered agreed well with the wind measurements from sounding and sodar data. Through the integration of UAS-retrieved profiles of thermodynamic and kinematic parameters, one can investigate the static and dynamic stability of the atmosphere and relate them to the state of the boundary layer across a variety of times and locations, which might be difficult to access using conventional instrumentation.
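
One common GPS-only wind retrieval (not necessarily one of the three algorithms compared in this study) assumes the aircraft flies a full turn at constant airspeed, so its GPS ground-velocity samples trace a circle displaced by the wind; a least-squares (Kasa) circle fit then recovers the wind vector and the airspeed:

```python
import numpy as np

def wind_from_gps(vx, vy):
    """Kasa circle fit to GPS ground-velocity samples (vx, vy): the fitted
    centre is the horizontal wind vector and the radius the true airspeed."""
    A = np.column_stack([2 * vx, 2 * vy, np.ones_like(vx)])
    b = vx**2 + vy**2
    (wx, wy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    airspeed = np.sqrt(c + wx**2 + wy**2)
    return wx, wy, airspeed
```

The fit is linear, so it needs no initial guess, but it degrades if the turn covers only a small arc or if airspeed drifts during the manoeuvre.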

  8. Launch Opportunities for Jupiter Missions Using the Gravity Assist

    Directory of Open Access Journals (Sweden)

    Young-Joo Song

    2004-06-01

    Interplanetary trajectories using gravity assists are studied for future Korean interplanetary missions. The developed software was verified by comparing its results with data from ESA's Mars Express mission and with previous results. Among the Jupiter exploration mission scenarios, the multi-planet gravity assist mission to Jupiter (Earth-Mars-Earth-Jupiter Gravity Assist, EMEJGA trajectory) requires the minimum launch energy (C3) of 29.231 km^2/s^2 with a 4.6-year flight time. The direct mission and the single-planet (Mars) gravity assist mission require launch energies of 75.656 km^2/s^2 with a 2.98-year flight time and 63.590 km^2/s^2 with a 2.33-year flight time, respectively. These results show that planetary gravity assists can reduce launch energy, although the EMEJGA trajectory requires a longer flight time than the other missions.
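
The quoted C3 values can be related to the size of the departure burn from a parking orbit via the vis-viva equation; a small sketch (the 300 km LEO radius and Earth's gravitational parameter are illustrative assumptions, not values from the paper):

```python
import math

def leo_departure_dv(c3_km2_s2, r_leo_km=6678.0, mu=398600.4418):
    """Delta-v (km/s) to leave a circular LEO onto an escape hyperbola with
    characteristic energy C3 = v_inf**2, from the vis-viva equation."""
    v_circ = math.sqrt(mu / r_leo_km)                  # circular orbit speed
    v_peri = math.sqrt(c3_km2_s2 + 2 * mu / r_leo_km)  # speed at burn point
    return v_peri - v_circ
```

Lower C3 translates directly into a smaller injection burn, which is the practical payoff of the EMEJGA trajectory.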

  9. Development of pattern recognition algorithms for the central drift chamber of the Belle II detector

    Energy Technology Data Exchange (ETDEWEB)

    Trusov, Viktor

    2016-11-04

    In this thesis, the development of one of the pattern recognition algorithms for the Belle II experiment, based on conformal and Legendre transformations, is presented. In order to optimize the performance of the algorithm (CPU time and efficiency), specialized processing steps have been introduced. To demonstrate the achieved results, Monte Carlo-based efficiency measurements of the tracking algorithms in the Central Drift Chamber (CDC) have been performed.
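
The conformal transformation used in such track finders maps circular tracks that pass through the origin (the interaction point) into straight lines, which a subsequent Legendre or Hough line search can then find; a minimal sketch of the mapping:

```python
import numpy as np

def conformal_map(x, y):
    """u = x/(x^2+y^2), v = y/(x^2+y^2): a circle through the origin with
    centre (a, b) becomes the straight line 2*a*u + 2*b*v = 1."""
    r2 = x**2 + y**2
    return x / r2, y / r2
```

After the mapping, each drift-chamber hit votes for candidate lines, and peaks in the Legendre parameter space correspond to track circles.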

  10. Post launch calibration and testing of the Geostationary Lightning Mapper on GOES-R satellite

    Science.gov (United States)

    Rafal, Marc; Clarke, Jared T.; Cholvibul, Ruth W.

    2016-05-01

    The Geostationary Operational Environmental Satellite R (GOES-R) series is the planned next generation of operational weather satellites for the United States National Oceanic and Atmospheric Administration (NOAA). The National Aeronautics and Space Administration (NASA) is procuring the GOES-R spacecraft and instruments with the first launch of the GOES-R series planned for October 2016. Included in the GOES-R Instrument suite is the Geostationary Lightning Mapper (GLM). GLM is a single-channel, near-infrared optical detector that can sense extremely brief (800 μs) transient changes in the atmosphere, indicating the presence of lightning. GLM will measure total lightning activity continuously over the Americas and adjacent ocean regions with near-uniform spatial resolution of approximately 10 km. Due to its large CCD (1372x1300 pixels), high frame rate, sensitivity and onboard event filtering, GLM will require extensive post launch characterization and calibration. Daytime and nighttime images will be used to characterize both image quality criteria inherent to GLM as a space-based optic system (focus, stray light, crosstalk, solar glint) and programmable image processing criteria (dark offsets, gain, noise, linearity, dynamic range). In addition ground data filtering will be adjusted based on lightning-specific phenomenology (coherence) to isolate real from false transients with their own characteristics. These parameters will be updated, as needed, on orbit in an iterative process guided by pre-launch testing. This paper discusses the planned tests to be performed on GLM over the six-month Post Launch Test period to optimize and demonstrate GLM performance.

  11. A novel washing algorithm for underarm stain removal

    Science.gov (United States)

    Acikgoz Tufan, H.; Gocek, I.; Sahin, U. K.; Erdem, I.

    2017-10-01

    After contact with human sweat, which comprises around 27% sebum, anti-perspirants containing aluminium chloride or its compounds form a gel-like structure whose solubility in water is very poor. In daily use, this gel-like structure closes sweat pores and hinders wetting of the skin by sweat. On contact with garments, however, it forms yellowish stains at the underarms which are very hard to remove by regular machine washing. In this study, we first focused on understanding and simulating such stain formation on garments. Two alternative procedures are offered to form the gel-like structures: in both, commercially available spray or deo-stick anti-perspirants, standard acidic and basic sweat solutions, and artificial sebum are used to form gel-like structures, which are applied to fabric in order to produce hard stains. Secondly, after simulating the stain on the fabric, we focused on developing a washing algorithm specifically designed for the removal of underarm stains. Eight alternative washing algorithms are offered, varying in washing temperature, amount of detergent, and pre-stain-removal procedures. The best algorithm is selected by comparing Tristimulus Y values after washing.

  12. Transcriptomic changes in the pre-implantation uterus highlight histotrophic nutrition of the developing marsupial embryo.

    Science.gov (United States)

    Whittington, Camilla M; O'Meally, Denis; Laird, Melanie K; Belov, Katherine; Thompson, Michael B; McAllan, Bronwyn M

    2018-02-05

    Early pregnancy is a critical time for successful reproduction; up to half of human pregnancies fail before the development of the definitive chorioallantoic placenta. Unlike the situation in eutherian mammals, marsupial pregnancy is characterised by a long pre-implantation period prior to the development of the short-lived placenta, making them ideal models for study of the uterine environment promoting embryonic survival pre-implantation. Here we present a transcriptomic study of pre-implantation marsupial pregnancy, and identify differentially expressed genes in the Sminthopsis crassicaudata uterus involved in metabolism and biosynthesis, transport, immunity, tissue remodelling, and uterine receptivity. Interestingly, almost one quarter of the top 50 genes that are differentially upregulated in early pregnancy are putatively involved in histotrophy, highlighting the importance of nutrient transport to the conceptus prior to the development of the placenta. This work furthers our understanding of the mechanisms underlying survival of pre-implantation embryos in the earliest live bearing ancestors of mammals.

  13. A Numerical Method for Blast Shock Wave Analysis of Missile Launch from Aircraft

    Directory of Open Access Journals (Sweden)

    Sebastian Heimbs

    2015-01-01

    An efficient empirical approach was developed to accurately represent the blast shock wave loading resulting from the launch of a missile from a military aircraft, to be used in numerical analyses. Based on experimental test series of missile launches in a laboratory environment and from a helicopter, equations were derived to predict the time- and position-dependent overpressure. The method was finally applied and validated in a structural analysis of a helicopter tail boom under missile launch shock wave loading.
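
The paper derives its own empirical overpressure equations; as a generic stand-in, the classical Friedlander waveform illustrates the kind of time-dependent overpressure model involved (the peak value, positive-phase duration and decay parameter below are placeholders, not the paper's fitted coefficients):

```python
import math

def friedlander(t, p_peak, t_pos, b=1.0):
    """Classical Friedlander waveform: overpressure history after shock
    arrival, peaking at p_peak and crossing zero at t = t_pos."""
    return p_peak * (1.0 - t / t_pos) * math.exp(-b * t / t_pos)
```

A position-dependent model like the paper's would additionally scale `p_peak` and `t_pos` with distance and angle from the launch rail.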

  14. Techniques for quality operations in computational and experimental research on launch vehicles at the drawing-board stage

    Science.gov (United States)

    Rozhaeva, K.

    2018-01-01

    The aim of the research is to improve the quality of the design process at the research stage of developing an active on-board system for the descent of spent launch vehicle stages with liquid-propellant rocket engines, by simulating the gasification of unused fuel residues in the tanks. A design technique for the gasification process of liquid rocket propellant residues in the tank is proposed, which finds and fixes errors in the calculation algorithm to increase the accuracy of the calculated results. Experimental modelling of liquid evaporation in a limited reservoir on an experimental stand makes it possible, through the rejection of false measurements based on given criteria and detected faults, to enhance the reliability of the experimental results and to reduce the cost of the experiments.

  15. An efficient algorithm for incompressible N-phase flows

    International Nuclear Information System (INIS)

    Dong, S.

    2014-01-01

    We present an efficient algorithm within the phase field framework for simulating the motion of a mixture of N (N⩾2) immiscible incompressible fluids, with possibly very different physical properties such as densities, viscosities, and pairwise surface tensions. The algorithm employs a physical formulation for the N-phase system that honors the conservations of mass and momentum and the second law of thermodynamics. We present a method for uniquely determining the mixing energy density coefficients involved in the N-phase model based on the pairwise surface tensions among the N fluids. Our numerical algorithm has several attractive properties that make it computationally very efficient: (i) it has completely de-coupled the computations for different flow variables, and has also completely de-coupled the computations for the (N−1) phase field functions; (ii) the algorithm only requires the solution of linear algebraic systems after discretization, and no nonlinear algebraic solve is needed; (iii) for each flow variable the linear algebraic system involves only constant and time-independent coefficient matrices, which can be pre-computed during pre-processing, despite the variable density and variable viscosity of the N-phase mixture; (iv) within a time step the semi-discretized system involves only individual de-coupled Helmholtz-type (including Poisson) equations, despite the strongly-coupled phase–field system of fourth spatial order at the continuum level; (v) the algorithm is suitable for large density contrasts and large viscosity contrasts among the N fluids. Extensive numerical experiments have been presented for several problems involving multiple fluid phases, large density contrasts and large viscosity contrasts. In particular, we compare our simulations with the de Gennes theory, and demonstrate that our method produces physically accurate results for multiple fluid phases. We also demonstrate the significant and sometimes dramatic effects of the
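
Property (iii), constant time-independent coefficient matrices, is what allows a factor-once/solve-many strategy; a small 1-D sketch of the idea (a dense Cholesky factorisation of a Helmholtz-type operator, standing in for the paper's discretization):

```python
import numpy as np

n, h = 64, 1.0 / 65
# Helmholtz-type operator -u'' + u on n interior points (second differences)
main = np.full(n, 2.0) / h**2 + 1.0
off = np.full(n - 1, -1.0) / h**2
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

# The matrix never changes between time steps, so factorise it once
L = np.linalg.cholesky(A)

def solve_step(rhs):
    """Per time step: two triangular solves reusing the cached factor."""
    y = np.linalg.solve(L, rhs)       # forward substitution
    return np.linalg.solve(L.T, y)    # back substitution
```

Production codes store a sparse or spectral factorisation instead of a dense one, but the cost structure is the same: the expensive factorisation is paid once in pre-processing.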

  16. Cooperated Bayesian algorithm for distributed scheduling problem

    Institute of Scientific and Technical Information of China (English)

    QIANG Lei; XIAO Tian-yuan

    2006-01-01

    This paper presents a new distributed Bayesian optimization algorithm (BOA) to overcome the efficiency problem when solving NP-hard scheduling problems. The proposed approach integrates BOA into a co-evolutionary schema, which builds up a concurrent computing environment. A new search strategy is also introduced for the local optimization process. It integrates a reinforcement learning (RL) mechanism into the BOA search process, and then uses the mixed probability information from BOA (post-probability) and RL (pre-probability) to enhance the cooperation between different local controllers, which improves the optimization ability of the algorithm. Experiments show that the new algorithm does better in both optimization (2.2%) and convergence (11.7%), compared with classic BOA.
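
BOA proper learns a Bayesian network over promising solutions each generation; the simpler univariate cousin below (UMDA, on a toy OneMax problem) sketches the estimate-and-resample loop that distinguishes this family of algorithms from ordinary genetic algorithms:

```python
import random

def umda_onemax(n_bits=20, pop=50, elite=25, gens=40, seed=1):
    """Univariate EDA sketch: fit per-bit probabilities to the elite set and
    resample. (BOA learns a full Bayesian network instead of independent bits.)"""
    rng = random.Random(seed)
    p = [0.5] * n_bits                         # initial model: uniform bits
    for _ in range(gens):
        popn = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
                for _ in range(pop)]
        popn.sort(key=sum, reverse=True)       # OneMax fitness = number of 1s
        sel = popn[:elite]
        # Re-estimate the distribution from the elite, with clamping
        p = [min(0.95, max(0.05, sum(ind[i] for ind in sel) / elite))
             for i in range(n_bits)]
    return max(popn, key=sum)
```

The distributed variant in the paper runs several such model-building loops concurrently and lets them exchange probability information.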

  17. A Business Analysis of a SKYLON-based European Launch Service Operator

    Science.gov (United States)

    Hempsell, Mark; Aprea, Julio; Gallagher, Ben; Sadlier, Greg

    2016-04-01

    Between 2012 and 2014 an industrial consortium led by Reaction Engines conducted a feasibility study for the European Space Agency with the objective to explore the feasibility of SKYLON as the basis for a launcher that meets the requirements established for the Next Generation European Launcher. SKYLON is a fully reusable single stage to orbit launch system that is enabled by the unique performance characteristic of the Synergetic Air-Breathing Rocket Engine and is under active development. The purpose of the study, which was called "SKYLON-based European Launch Service Operator (S-ELSO)", was to support ESA decision making on launch service strategy by exploring the potential implications of this new launch system on future European launch capability and the European industry that supports it. The study explored both a SKYLON operator (S-ELSO) and SKYLON manufacturer as separate business ventures. In keeping with previous studies, the only strategy that was found that kept the purchase price of the SKYLON low enough for a viable operator business was to follow an "airline" business model where the manufacturer sells SKYLONs to other operators in addition to S-ELSO. With the assumptions made in the study it was found that the SKYLON manufacturer with a total production run of between 30 and 100 SKYLONs could expect an Internal Rate of Return of around 10%. This was judged too low for all the funding to come from commercial funding sources, but is sufficiently high for a Public Private Partnership. The S-ELSO business model showed that the Internal Rate of Return would be high enough to consider operating without public support (i.e. commercial in operation, irrespective of any public funding of development), even when the average launch price is lowered to match the lowest currently quoted price for expendable systems.
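
The Internal Rate of Return figures quoted are the discount rate at which the venture's net present value crosses zero; a generic bisection sketch (the two-period cash-flow example is invented, not the study's financial model):

```python
def npv(rate, cashflows):
    """Net present value of a cash-flow series, cashflows[t] at year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal Rate of Return by bisection on the sign change of NPV,
    assuming the usual invest-then-earn pattern (NPV decreasing in rate)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

With a multi-decade production run like S-ELSO's, the cash-flow vector simply grows to one entry per year of the programme.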

  18. Lockheed Martin approach to a Reusable Launch Vehicle (RLV)

    Science.gov (United States)

    Elvin, John D.

    1996-03-01

    This paper discusses Lockheed Martin's perspective on the development of a cost effective Reusable Launch Vehicle (RLV). Critical to a successful Single Stage To Orbit (SSTO) program are: an economic development plan sensitive to fiscal constraints; a vehicle concept satisfying present and future US launch needs; and an operations concept commensurate with a market-driven program. Participation in the economic plan by government, industry, and the commercial sector is a key element of integrating our development plan and funding profile. The RLV baseline concept design, development evolution and several critical trade studies illustrate the superior performance achieved by our innovative approach to the problem of SSTO. Findings from initial aerodynamic and aerothermodynamic wind tunnel tests and trajectory analyses on this concept confirm the superior characteristics of the lifting body shape combined with the Linear Aerospike rocket engine. This Aero Ballistic Rocket (ABR) concept captures the essence of The Skunk Works approach to SSTO RLV technology integration and system engineering. These programmatic and concept development topics chronicle the key elements of implementing an innovative market-driven next-generation RLV.

  19. Development of a pre-ignition submodel for hydrogen engines

    Energy Technology Data Exchange (ETDEWEB)

    Al-Baghdadi, Sadiq [University of Babylon (Iraq). Dept. of Mechanical Engineering

    2005-10-15

    In hydrogen-fuelled spark ignition engine applications, the onset of pre-ignition remains one of the prime limitations that needs to be addressed to avoid its incidence and achieve superior performance. This paper describes a new pre-ignition submodel for engine modelling codes. The effects of changes in key operating variables, such as compression ratio, spark timing, intake pressure, and temperature on pre-ignition limiting equivalence ratios are established both analytically and experimentally. With the established pre-ignition model, it is possible not only to investigate whether pre-ignition is observed with changing operating and design parameters, but also to evaluate those parameters' effects on the maximum possible pre-ignition intensity. (author)

  20. EDIN0613P weight estimating program. [for launch vehicles

    Science.gov (United States)

    Hirsch, G. N.

    1976-01-01

    The weight estimating relationships and program developed for space power system simulation are described. The program was developed to size a two-stage launch vehicle for the space power system. The program is actually part of an overall simulation technique called EDIN (Engineering Design and Integration) system. The program sizes the overall vehicle, generates major component weights and derives a large amount of overall vehicle geometry. The program is written in FORTRAN V and is designed for use on the Univac Exec 8 (1110). By utilizing the flexibility of this program while remaining cognizant of the limits imposed upon output depth and accuracy by utilization of generalized input, this program concept can be a useful tool for estimating purposes at the conceptual design stage of a launch vehicle.

  1. Prototype Implementation of Two Efficient Low-Complexity Digital Predistortion Algorithms

    Directory of Open Access Journals (Sweden)

    Timo I. Laakso

    2008-01-01

    Predistortion (PD) linearisation of microwave power amplifiers (PAs) is an important topic of research. With the larger and larger bandwidths appearing today in modern WiMax standards as well as in multichannel base stations for 3GPP standards, the relatively simple nonlinear effect of a PA becomes a complex memory-including function, severely distorting the output signal. In this contribution, two digital PD algorithms are investigated for the linearisation of microwave PAs in mobile communications. The first one is an efficient and low-complexity algorithm based on a memoryless model, called the simplicial canonical piecewise linear (SCPWL) function, that describes the static nonlinear characteristic of the PA. The second algorithm is more general, approximating the pre-inverse filter of a nonlinear PA iteratively using a Volterra model. The first, simpler algorithm is suitable for compensation of amplitude compression and amplitude-to-phase conversion, for example, in mobile units with relatively small bandwidths. The second algorithm can be used to linearise PAs operating with larger bandwidths, thus exhibiting memory effects, for example, in multichannel base stations. A measurement testbed which includes a transmitter-receiver chain with a microwave PA is built for testing and prototyping of the proposed PD algorithms. In the testing phase, the PD algorithms are implemented using MATLAB (floating-point representation) and tested in record-and-playback mode. The iterative PD algorithm is then implemented on a Field Programmable Gate Array (FPGA) using fixed-point representation. The FPGA implementation allows the pre-inverse filter to be tested in real-time mode. Measurement results show excellent linearisation capabilities of both proposed algorithms in terms of adjacent channel power suppression. It is also shown that the fixed-point FPGA implementation of the iterative algorithm performs as well as the floating-point implementation.
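
The memoryless idea behind the first algorithm, inverting the PA's static AM/AM curve, can be sketched with a simple interpolated lookup table (the cubic PA model below is a toy for illustration, not the SCPWL basis functions of the paper):

```python
import numpy as np

def build_pd_lut(pa_in, pa_out):
    """Invert a measured, monotone, memoryless AM/AM curve by interpolation:
    given a desired output amplitude, return the drive level producing it."""
    return lambda target: np.interp(target, pa_out, pa_in)

# Toy compressive PA (invented): out = in - 0.1*in**3, monotone on [0, 1.8]
x = np.linspace(0.0, 1.8, 400)
predistort = build_pd_lut(x, x - 0.1 * x**3)
```

Driving the PA with `predistort(desired)` instead of `desired` cancels the static compression; memory effects, handled by the Volterra-based algorithm, need a filter rather than a pointwise map.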

  2. Development Modules for Specification of Requirements for a System of Verification of Parallel Algorithms

    Directory of Open Access Journals (Sweden)

    Vasiliy Yu. Meltsov

    2012-05-01

    This paper presents the results of developing one of the modules of a system for the verification of parallel algorithms, used to verify an inference engine. The module is designed to build the specification of requirements whose feasibility on the algorithm must be proved (tested).

  3. Algorithmic mathematics

    CERN Document Server

    Hougardy, Stefan

    2016-01-01

    Algorithms play an increasingly important role in nearly all fields of mathematics. This book allows readers to develop basic mathematical abilities, in particular those concerning the design and analysis of algorithms as well as their implementation. It presents not only fundamental algorithms like the sieve of Eratosthenes, the Euclidean algorithm, sorting algorithms, algorithms on graphs, and Gaussian elimination, but also discusses elementary data structures, basic graph theory, and numerical questions. In addition, it provides an introduction to programming and demonstrates in detail how to implement algorithms in C++. This textbook is suitable for students who are new to the subject and covers a basic mathematical lecture course, complementing traditional courses on analysis and linear algebra. Both authors have given this "Algorithmic Mathematics" course at the University of Bonn several times in recent years.
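
Two of the fundamental algorithms the book names, the sieve of Eratosthenes and the Euclidean algorithm, each fit in a few lines (shown here in Python rather than the book's C++):

```python
def sieve(n):
    """Sieve of Eratosthenes: all primes up to and including n."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for m in range(p * p, n + 1, p):   # cross off multiples of p
                is_prime[m] = False
    return [i for i, ok in enumerate(is_prime) if ok]

def gcd(a, b):
    """Euclidean algorithm: greatest common divisor by repeated remainders."""
    while b:
        a, b = b, a % b
    return a
```

As the book emphasises, analysing such snippets, e.g. the O(n log log n) cost of the sieve, is as much a part of the course as implementing them.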

  4. Launch Vehicle Abort Analysis for Failures Leading to Loss of Control

    Science.gov (United States)

    Hanson, John M.; Hill, Ashley D.; Beard, Bernard B.

    2013-01-01

    Launch vehicle ascent is a time of high risk for an onboard crew. There is a large fraction of possible failures for which time is of the essence and a successful abort is possible if the detection and action happens quickly enough. This paper focuses on abort determination based on data already available from the Guidance, Navigation, and Control system. This work is the result of failure analysis efforts performed during the Ares I launch vehicle development program. The two primary areas of focus are the derivation of abort triggers to ensure that abort occurs as quickly as possible when needed, but that false aborts are avoided, and evaluation of success in aborting off the failing launch vehicle.

  5. Superior Generalization Capability of Hardware-Learning Algorithm Developed for Self-Learning Neuron-MOS Neural Networks

    Science.gov (United States)

    Kondo, Shuhei; Shibata, Tadashi; Ohmi, Tadahiro

    1995-02-01

    We have investigated the learning performance of the hardware backpropagation (HBP) algorithm, a hardware-oriented learning algorithm developed for the self-learning architecture of neural networks constructed using neuron MOS (metal-oxide-semiconductor) transistors. The solution to finding a mirror symmetry axis in a 4×4 binary pixel array was tested by computer simulation based on the HBP algorithm. Despite the inherent restrictions imposed on the hardware-learning algorithm, HBP exhibits equivalent learning performance to that of the original backpropagation (BP) algorithm when all the pertinent parameters are optimized. Very importantly, we have found that HBP has a superior generalization capability over BP; namely, HBP exhibits higher performance in solving problems that the network has not yet learnt.
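
The BP baseline that HBP is compared against can be sketched on a tiny symmetry task; this is plain floating-point batch backpropagation on 4-bit mirror patterns (a scaled-down stand-in for the paper's 4×4 pixel array, not the hardware-constrained HBP rule):

```python
import numpy as np

rng = np.random.default_rng(0)

# All 4-bit patterns; target 1 when the pattern reads the same reversed
X = np.array([[(i >> b) & 1 for b in range(4)] for i in range(16)], float)
y = np.array([[1.0 if (r[0] == r[3] and r[1] == r[2]) else 0.0] for r in X])

W1 = rng.normal(0.0, 0.5, (4, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def mse():
    return float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2))

loss_before = mse()
lr = 0.3
for _ in range(3000):                       # plain batch backpropagation
    H = sigmoid(X @ W1 + b1)
    O = sigmoid(H @ W2 + b2)
    dO = (O - y) * O * (1 - O)              # output-layer delta
    dH = (dO @ W2.T) * H * (1 - H)          # hidden-layer delta
    W2 -= lr * H.T @ dO;  b2 -= lr * dO.sum(0)
    W1 -= lr * X.T @ dH;  b1 -= lr * dH.sum(0)
loss_after = mse()
```

HBP replaces the real-valued updates above with ones realisable in neuron-MOS hardware, which is where the paper's generalization comparison starts.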

  6. Competitive Supply Chain Network Design Considering Marketing Strategies: A Hybrid Metaheuristic Algorithm

    Directory of Open Access Journals (Sweden)

    Ali Akbar Hasani

    2016-11-01

    In this paper, a comprehensive model is proposed to design a network for a multi-period, multi-echelon, multi-product, inventory-controlled supply chain. Various marketing strategies and guerrilla marketing approaches are considered in the design process under static competition. The goal of the proposed model is to respond efficiently to the customers' demands in the presence of pre-existing competitors and price inelasticity of demand. The proposed optimization model considers multiple objectives that incorporate both market share and total profit of the considered supply chain network simultaneously. To tackle the proposed multi-objective mixed-integer nonlinear programming model, an efficient hybrid metaheuristic algorithm is developed that incorporates a Taguchi-based non-dominated sorting genetic algorithm-II and a particle swarm optimization. A variable neighborhood decomposition search is applied to enhance the local search process of the proposed hybrid solution algorithm. Computational results illustrate that the proposed model and solution algorithm are notably efficient in dealing with competitive pressure by adopting proper marketing strategies.

  7. Developing an Attitude Scale for Discussion Ability of Pre-service Teachers

    Directory of Open Access Journals (Sweden)

    Gürbüz OCAK

    2015-01-01

    Full Text Available This study aimed to develop a scale to measure attitudes towards the discussion ability of pre-service teachers. The scale is a Likert-type instrument comprising 57 items. The validity and reliability analyses were carried out on data obtained from 200 sophomore students selected by the method of coincidental sampling. As a result of the factorial validity analysis, item factor loadings range between 0.59 and 0.76, the Kaiser-Meyer-Olkin (KMO) value is 0.867, and the Cronbach alpha value calculated for the reliability study is 0.865. Findings related to the validity and reliability studies show that the scale is a valid and reliable instrument.
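    The reliability figure reported above is Cronbach's alpha, computed as α = k/(k−1)·(1 − Σ item variances / total-score variance). A minimal sketch of that computation (the toy responses below are invented, not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy data: 5 respondents x 3 Likert items.
data = [[4, 5, 4],
        [3, 3, 4],
        [5, 5, 5],
        [2, 3, 2],
        [4, 4, 5]]
print(round(cronbach_alpha(data), 3))  # -> 0.918
```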

  8. New reference trajectory optimization algorithm for a flight management system inspired in beam search

    Directory of Open Access Journals (Sweden)

    Alejandro MURRIETA-MENDOZA

    2017-08-01

    Full Text Available With the objective of reducing the flight cost and the amount of polluting emissions released into the atmosphere, a new optimization algorithm considering the climb, cruise and descent phases is presented for the reference vertical flight trajectory. The selection of the reference vertical navigation speeds and altitudes was solved as a discrete combinatorial problem by means of a graph-tree passing through nodes using the beam search optimization technique. To achieve a compromise between the execution time and the algorithm’s ability to find the global optimal solution, a heuristic methodology introducing a parameter called the “optimism coefficient” was used in order to estimate the trajectory’s flight cost at every node. The optimal trajectory cost obtained with the developed algorithm was compared with the cost of the optimal trajectory provided by a commercial flight management system (FMS). The global optimal solution was validated against an exhaustive search algorithm (ESA), other than the proposed algorithm. The developed algorithm takes into account weather effects, step climbs during cruise and air traffic management constraints such as constant altitude segments, constant cruise Mach, and a pre-defined reference lateral navigation route. The aircraft fuel burn was computed using a numerical performance model which was created and validated using flight test experimental data.
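    Beam search over a layered graph of discrete altitude choices can be sketched as below; the stages, altitudes and cost function are invented placeholders, not the paper's aircraft performance model:

```python
import itertools

# Layered graph: at each stage the aircraft picks one altitude (flight level).
stage_altitudes = [[300], [320, 340, 360], [320, 340, 360], [100]]

def edge_cost(a, b):
    """Toy cost: cruising higher is cheaper, altitude changes cost extra."""
    return 100 - 0.1 * b + 0.5 * abs(a - b)

def beam_search(stages, width):
    beams = [(0.0, [stages[0][0]])]          # (cost so far, path)
    for layer in stages[1:]:
        candidates = [(c + edge_cost(p[-1], alt), p + [alt])
                      for c, p in beams for alt in layer]
        beams = sorted(candidates)[:width]   # keep only the `width` best paths
    return beams[0]

def exhaustive(stages):
    """Brute-force reference, analogous to the ESA validation in the paper."""
    best = None
    for combo in itertools.product(*stages):
        c = sum(edge_cost(a, b) for a, b in zip(combo, combo[1:]))
        if best is None or c < best[0]:
            best = (c, list(combo))
    return best

print(beam_search(stage_altitudes, width=3))
```

    With the beam width equal to the number of candidates per layer, the search degenerates to exhaustive enumeration, which is one way to validate the global optimum on small instances.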

  9. Multimodal Instruction in Pre-Kindergarten: An Introduction to an Inclusive Early Language Program

    Science.gov (United States)

    Regalla, Michele; Peker, Hilal

    2016-01-01

    During the 2013-2014 school year, a charter school in Central Florida (which will be given the pseudonym "The Unity School") known for its practice of full inclusion launched an unconventional project. The Unity School, which serves children from preschool through grade five, began offering foreign language to all pre-kindergarten…

  10. Development of Data Processing Algorithms for the Upgraded LHCb Vertex Locator

    CERN Document Server

    AUTHOR|(CDS)2101352

    The LHCb detector will see a major upgrade during LHC Long Shutdown II, which is planned for 2019/20. The silicon Vertex Locator subdetector will be upgraded for operation under the new run conditions. The detector will be read out using a data acquisition board based on an FPGA. The work presented in this thesis is concerned with the development of the data processing algorithms to be used in this data acquisition board. In particular, work in three different areas of the FPGA is covered: the data processing block, the low level interface, and the post router block. The algorithms produced have been simulated and tested, and shown to provide the required performance. Errors in the initial implementation of the Gigabit Wireline Transmitter serialized data in the low level interface were discovered and corrected. The data scrambling algorithm and the post router block have been incorporated in the front end readout chip.

  11. An efficient algorithm for the detection of exposed and hidden wormhole attack

    International Nuclear Information System (INIS)

    Khan, Z.A.; Rehman, S.U.; Islam, M.H.

    2016-01-01

    MANETs (Mobile Ad Hoc Networks) are slowly integrating into our everyday lives; their most prominent uses are visible in disaster and war-struck areas where physical infrastructure is almost impossible or very hard to build. MANETs, like other networks, are facing the threat of malicious users and their activities. A number of attacks have been identified, but the most severe of them is the wormhole attack, which has the ability to succeed even in the case of encrypted traffic and secure networks. Once a wormhole is launched successfully, the severity is increased by the fact that attackers can launch other attacks too. This paper presents a comprehensive algorithm for the detection of exposed as well as hidden wormhole attacks while keeping the detection rate at a maximum and at the same time reducing false alarms. The algorithm does not require any extra hardware, time synchronization or any special type of nodes. The architecture consists of the combination of the Routing Table, RTT (Round Trip Time) and RSSI (Received Signal Strength Indicator) for comprehensive detection of the wormhole attack. The proposed technique is robust, lightweight, has low resource requirements and provides real-time detection against the wormhole attack. Simulation results show that the algorithm is able to provide a higher detection rate, a higher packet delivery ratio and negligible false alarms, and is also better in terms of ease of implementation, detection accuracy/speed and processing overhead. (author)
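    One ingredient of such a detector, the RTT check, can be sketched as follows: a wormhole tunnel makes a distant node appear as a one-hop neighbour with an anomalously large round-trip time. The threshold and measurements below are illustrative assumptions; the paper combines this evidence with routing-table and RSSI checks.

```python
def detect_suspect_links(neighbour_rtts_ms, max_one_hop_rtt_ms=2.0):
    """Flag 1-hop neighbours whose RTT exceeds a plausible single-hop bound.

    `max_one_hop_rtt_ms` is a hypothetical calibration value; a real
    deployment would derive it from the radio range and queuing delays.
    """
    return sorted(node for node, rtt in neighbour_rtts_ms.items()
                  if rtt > max_one_hop_rtt_ms)

# Toy measurements: C is actually far away, tunnelled through a wormhole.
rtts = {"A": 0.8, "B": 1.2, "C": 9.5}
print(detect_suspect_links(rtts))  # -> ['C']
```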

  12. Big Bang launch

    CERN Multimedia

    2008-01-01

    Physicists from the University, along with scientists and engineers around the world, watched with fevered anticipation as the world's biggest scientific experiment was launched in September. (1/1 page)

  13. Development of a meta-algorithm for guiding primary care encounters for patients with multimorbidity using evidence-based and case-based guideline development methodology.

    Science.gov (United States)

    Muche-Borowski, Cathleen; Lühmann, Dagmar; Schäfer, Ingmar; Mundt, Rebekka; Wagner, Hans-Otto; Scherer, Martin

    2017-06-22

    The study aimed to develop a comprehensive algorithm (meta-algorithm) for primary care encounters of patients with multimorbidity. We used a novel, case-based and evidence-based procedure to overcome methodological difficulties in guideline development for patients with complex care needs. Systematic guideline development methodology including systematic evidence retrieval (guideline synopses), expert opinions and informal and formal consensus procedures. Primary care. The meta-algorithm was developed in six steps: 1. Designing 10 case vignettes of patients with multimorbidity (common, epidemiologically confirmed disease patterns and/or particularly challenging health care needs) in a multidisciplinary workshop. 2. Based on the main diagnoses, a systematic guideline synopsis of evidence-based and consensus-based clinical practice guidelines was prepared. The recommendations were prioritised according to the clinical and psychosocial characteristics of the case vignettes. 3. Case vignettes along with the respective guideline recommendations were validated and specifically commented on by an external panel of practicing general practitioners (GPs). 4. Guideline recommendations and experts' opinions were summarised as case-specific management recommendations (N-of-one guidelines). 5. Healthcare preferences of patients with multimorbidity were elicited from a systematic literature review and supplemented with information from qualitative interviews. 6. All N-of-one guidelines were analysed using pattern recognition to identify common decision nodes and care elements. These elements were put together to form a generic meta-algorithm. The resulting meta-algorithm reflects the logic of a GP's encounter of a patient with multimorbidity regarding decision-making situations, communication needs and priorities. It can be filled with the complex problems of individual patients and hereby offer guidance to the practitioner.
Contrary to simple, symptom-oriented algorithms, the meta-algorithm

  14. The Max Launch Abort System - Concept, Flight Test, and Evolution

    Science.gov (United States)

    Gilbert, Michael G.

    2014-01-01

    The NASA Engineering and Safety Center (NESC) is an independent engineering analysis and test organization providing support across the range of NASA programs. In 2007 NASA was developing the launch escape system for the Orion spacecraft that was evolved from the traditional tower-configuration escape systems used for the historic Mercury and Apollo spacecraft. The NESC was tasked, as a programmatic risk-reduction effort to develop and flight test an alternative to the Orion baseline escape system concept. This project became known as the Max Launch Abort System (MLAS), named in honor of Maxime Faget, the developer of the original Mercury escape system. Over the course of approximately two years the NESC performed conceptual and tradeoff analyses, designed and built full-scale flight test hardware, and conducted a flight test demonstration in July 2009. Since the flight test, the NESC has continued to further develop and refine the MLAS concept.

  15. Characteristics of Place Identity as Part of Professional Identity Development among Pre-Service Teachers

    Science.gov (United States)

    Gross, Michal; Hochberg, Nurit

    2016-01-01

    How do pre-service teachers perceive place identity, and is there a connection between their formative place identity and the development of their professional teaching identity? These questions are probed among pre-service teachers who participated in a course titled "Integrating Nature into Preschool." The design of the course was…

  16. Pre-seismic anomalies from optical satellite observations: a review

    Science.gov (United States)

    Jiao, Zhong-Hu; Zhao, Jing; Shan, Xinjian

    2018-04-01

    Detecting various anomalies using optical satellite data prior to strong earthquakes is key to understanding and forecasting earthquake activities because of its recognition of thermal-radiation-related phenomena in seismic preparation phases. Data from satellite observations serve as a powerful tool in monitoring earthquake preparation areas at a global scale and in a nearly real-time manner. Over the past several decades, many new different data sources have been utilized in this field, and progressive anomaly detection approaches have been developed. This paper reviews the progress and development of pre-seismic anomaly detection technology in this decade. First, precursor parameters, including parameters from the top of the atmosphere, in the atmosphere, and on the Earth's surface, are stated and discussed. Second, different anomaly detection methods, which are used to extract anomalous signals that probably indicate future seismic events, are presented. Finally, certain critical problems with the current research are highlighted, and new developing trends and perspectives for future work are discussed. The development of Earth observation satellites and anomaly detection algorithms can enrich available information sources, provide advanced tools for multilevel earthquake monitoring, and improve short- and medium-term forecasting, which play a large and growing role in pre-seismic anomaly detection research.

  17. Development of MODIS data-based algorithm for retrieving sea surface temperature in coastal waters.

    Science.gov (United States)

    Wang, Jiao; Deng, Zhiqiang

    2017-06-01

    A new algorithm was developed for retrieving sea surface temperature (SST) in coastal waters using satellite remote sensing data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua platform. The new SST algorithm was trained using the Artificial Neural Network (ANN) method and tested using 8 years of remote sensing data from the MODIS Aqua sensor and in situ sensing data from the US coastal waters in Louisiana, Texas, Florida, California, and New Jersey. The ANN algorithm could be utilized to map SST in both deep offshore and particularly shallow nearshore waters at the high spatial resolution of 1 km, greatly expanding the coverage of remote sensing-based SST data from offshore waters to nearshore waters. Applications of the ANN algorithm require only the remotely sensed values from the two MODIS Aqua thermal bands 31 and 32 as input data. Application results indicated that the ANN algorithm was able to explain 82-90% of the variations in observed SST in US coastal waters. While the algorithm is generally applicable to the retrieval of SST, it works best for nearshore waters where important coastal resources are located and existing algorithms are either not applicable or do not work well, making the new ANN-based SST algorithm unique and particularly useful for coastal resource management.
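    As an illustration of the two-band retrieval idea, the sketch below substitutes a simple linear split-window fit for the paper's ANN, on synthetic brightness temperatures; the coefficients and the "true" relation are invented for demonstration only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic brightness temperatures (K) for MODIS bands 31 and 32; the
# relation below is invented for illustration, not a calibrated model.
t31 = rng.uniform(280, 300, 200)
t32 = t31 - rng.uniform(0.5, 2.0, 200)           # band 32 slightly colder
sst_true = 1.02 * t31 + 1.8 * (t31 - t32) - 6.0  # hypothetical ground truth

# Fit SST = a*T31 + b*(T31 - T32) + c by ordinary least squares.
A = np.column_stack([t31, t31 - t32, np.ones_like(t31)])
coef, *_ = np.linalg.lstsq(A, sst_true, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((sst_true - pred) ** 2) / np.sum((sst_true - sst_true.mean()) ** 2)
print(coef, r2)
```

    An ANN replaces the fixed linear form with a learned nonlinear mapping of the same two inputs, which is what lets the published algorithm adapt to optically complex nearshore waters.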

  18. New Opportunities for Small Satellite Programs Provided by the Falcon Family of Launch Vehicles

    Science.gov (United States)

    Dinardi, A.; Bjelde, B.; Insprucker, J.

    2008-08-01

    The Falcon family of launch vehicles, developed by Space Exploration Technologies Corporation (SpaceX), is designed to provide the world's lowest cost access to orbit. Highly reliable, low-cost launch services offer considerable opportunities for risk reduction throughout the life cycle of satellite programs. The significantly lower costs of Falcon 1 and Falcon 9 as compared with other similar-class launch vehicles result in a number of new business case opportunities, which in turn presents the possibility of a paradigm shift in how the satellite industry thinks about launch services.

  19. On developing B-spline registration algorithms for multi-core processors

    International Nuclear Information System (INIS)

    Shackleford, J A; Kandasamy, N; Sharp, G C

    2010-01-01

    Spline-based deformable registration methods are quite popular within the medical-imaging community due to their flexibility and robustness. However, they require a large amount of computing time to obtain adequate results. This paper makes two contributions towards accelerating B-spline-based registration. First, we propose a grid-alignment scheme and associated data structures that greatly reduce the complexity of the registration algorithm. Based on this grid-alignment scheme, we then develop highly data parallel designs for B-spline registration within the stream-processing model, suitable for implementation on multi-core processors such as graphics processing units (GPUs). Particular attention is focused on an optimal method for performing analytic gradient computations in a data parallel fashion. CPU and GPU versions are validated for execution time and registration quality. Performance results on large images show that our GPU algorithm achieves a speedup of 15 times over the single-threaded CPU implementation whereas our multi-core CPU algorithm achieves a speedup of 8 times over the single-threaded implementation. The CPU and GPU versions achieve near-identical registration quality in terms of RMS differences between the generated vector fields.
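    The grid-alignment idea rests on the fact that, when the control-point spacing divides the image dimensions, every voxel inside a tile reuses one of a small set of precomputed cubic B-spline basis vectors. A 1-D sketch (the spacing and coefficients are arbitrary toy values, not the paper's data):

```python
import numpy as np

def bspline_basis(u):
    """The four cubic uniform B-spline basis values at parameter u in [0, 1)."""
    return np.array([
        (1 - u) ** 3 / 6.0,
        (3 * u ** 3 - 6 * u ** 2 + 4) / 6.0,
        (-3 * u ** 3 + 3 * u ** 2 + 3 * u + 1) / 6.0,
        u ** 3 / 6.0,
    ])

# Grid alignment: with a fixed control-point spacing, every voxel offset
# within a tile maps to one of `spacing` precomputed basis vectors.
spacing = 8                                   # voxels per control-point cell
lut = np.array([bspline_basis(i / spacing) for i in range(spacing)])

def deform_1d(coeffs, tile, offset):
    """Deformation at voxel `offset` of `tile`, from its 4 control points."""
    return float(lut[offset] @ coeffs[tile:tile + 4])

coeffs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # toy 1-D control coefficients
print(deform_1d(coeffs, tile=0, offset=0))
```

    The basis values at any parameter sum to one (partition of unity), a useful sanity check for the lookup table; the data-parallel GPU designs in the paper evaluate many such dot products independently per voxel.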

  20. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lightly-damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.

  1. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast.

    Science.gov (United States)

    Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng

    2015-01-01

    Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of lacking sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put a lot of effort to construct it first. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses. 
Allowing users to select eight existing performance indices and 15

  2. Development of computational algorithms for quantification of pulmonary structures

    International Nuclear Information System (INIS)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A.; Pina, Diana R.

    2012-01-01

    High-resolution computed tomography (HRCT) has become the imaging diagnostic exam most commonly used for the evaluation of the sequelae of Paracoccidioidomycosis (PCM). Subjective evaluations of the radiological abnormalities found on HRCT images do not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. Thus, this research proposes the development of algorithms in the MATLAB® computing environment that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists in selecting a region of interest (ROI) and, by the use of masks, density filters and morphological operators, obtaining a quantification of the injured area relative to the area of the healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, reaching an agreement of 80% for emphysema and 58% for fibrosis. (author)
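    The core of such a density-mask quantification can be sketched as follows. The −950 HU emphysema threshold is a common density-mask convention, not necessarily the value used in the cited work, and the "slice" below is a toy array rather than real DICOM data:

```python
import numpy as np

# Toy 2-D "HRCT slice" in Hounsfield units (HU).
roi = np.array([[-980, -960, -700],
                [-500, -955, -820],
                [-300, -940, -990]])

lung_mask = roi < -400        # crude lung segmentation by density
emphysema_mask = roi < -950   # low-attenuation (emphysematous) voxels

# Injured area relative to the segmented lung area within the ROI.
fraction = emphysema_mask.sum() / lung_mask.sum()
print(f"emphysema fraction of lung area: {fraction:.2f}")  # -> 0.50
```

    A full pipeline would add the morphological operators mentioned in the abstract (e.g. opening/closing) to clean both masks before taking the ratio.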

  3. Development of algorithm for depreciation costs allocation in dynamic input-output industrial enterprise model

    Directory of Open Access Journals (Sweden)

    Keller Alevtina

    2017-01-01

    Full Text Available The article considers the issue of allocation of depreciation costs in the dynamic input-output model of an industrial enterprise. Accounting for depreciation costs in such a model improves the policy of fixed assets management. It is particularly relevant to develop the algorithm for the allocation of depreciation costs in the construction of a dynamic input-output model of an industrial enterprise, since such enterprises have a significant amount of fixed assets. Satisfying the adequacy conditions of such an algorithm allows evaluating the appropriateness of investments in fixed assets and studying the final financial results of an industrial enterprise depending on management decisions in the depreciation policy. It is necessary to note that the model in question is always degenerate for the enterprise. This is caused by the presence of zero rows in the matrix of capital expenditures in the lines of structural elements unable to generate fixed assets (part of the service units, households, corporate consumers). The paper presents the algorithm for the allocation of depreciation costs for the model. This algorithm was developed by the authors and served as the basis for further development of the flowchart for subsequent implementation in software. The construction of such an algorithm and its use for dynamic input-output models of industrial enterprises is motivated by the internationally accepted effectiveness of input-output models for national and regional economic systems. This is what allows us to consider that the solutions discussed in the article are of interest to economists of various industrial enterprises.

  4. Autonomous Star Tracker Algorithms

    DEFF Research Database (Denmark)

    Betto, Maurizio; Jørgensen, John Leif; Kilsgaard, Søren

    1998-01-01

    Proposal, in response to an ESA R.f.P., to design algorithms for autonomous star tracker operations. The proposal also included the development of a star tracker breadboard to test the algorithms' performances.

  5. Motivation for Air-Launch: Past, Present, and Future

    Science.gov (United States)

    Kelly, John W.; Rogers, Charles E.; Brierly, Gregory T.; Martin, J Campbell; Murphy, Marshall G.

    2017-01-01

    Air-launch is defined as two or more air-vehicles joined and working together, that eventually separate in flight, and that have a combined performance greater than the sum of the individual parts. The use of the air-launch concept has taken many forms across civil, commercial, and military contexts throughout the history of aviation. Air-launch techniques have been applied for entertainment, movement of materiel and personnel, efficient execution of aeronautical research, increasing aircraft range, and enabling flexible and efficient launch of space vehicles. For each air-launch application identified in the paper, the motivation for that application is discussed.

  6. Launch Processing System. [for Space Shuttle

    Science.gov (United States)

    Byrne, F.; Doolittle, G. V.; Hockenberger, R. W.

    1976-01-01

    This paper presents a functional description of the Launch Processing System, which provides automatic ground checkout and control of the Space Shuttle launch site and airborne systems, with emphasis placed on the Checkout, Control, and Monitor Subsystem. Hardware and software modular design concepts for the distributed computer system are reviewed relative to performing system tests, launch operations control, and status monitoring during ground operations. The communication network design, which uses a Common Data Buffer interface to all computers to allow computer-to-computer communication, is discussed in detail.

  7. Development of algorithms for building inventory compilation through remote sensing and statistical inferencing

    Science.gov (United States)

    Sarabandi, Pooya

    Building inventories are one of the core components of disaster vulnerability and loss estimation models, and as such, play a key role in providing decision support for risk assessment, disaster management and emergency response efforts. In many parts of the world, inclusive building inventories suitable for use in catastrophe models cannot be found. Furthermore, there are serious shortcomings in the existing building inventories that include incomplete or out-dated information on critical attributes as well as missing or erroneous values for attributes. In this dissertation a set of methodologies for updating spatial and geometric information of buildings from single and multiple high-resolution optical satellite images are presented. Basic concepts, terminologies and fundamentals of 3-D terrain modeling from satellite images are first introduced. Different sensor projection models are then presented and sources of optical noise such as lens distortions are discussed. An algorithm for extracting height and creating 3-D building models from a single high-resolution satellite image is formulated. The proposed algorithm is a semi-automated supervised method capable of extracting attributes such as longitude, latitude, height, square footage, perimeter, irregularity index, etc. The associated errors due to the interactive nature of the algorithm are quantified and solutions for minimizing the human-induced errors are proposed. The height extraction algorithm is validated against independent survey data and results are presented. The validation results show that an average height modeling accuracy of 1.5% can be achieved using this algorithm. Furthermore, the concept of cross-sensor data fusion for the purpose of 3-D scene reconstruction using quasi-stereo images is developed in this dissertation. The developed algorithm utilizes two or more single satellite images acquired from different sensors and provides the means to construct 3-D building models in a more

  8. PRE_X Programme: Aerothermodynamic Objectives and Aeroshape Definition for in Flight Experiments

    Science.gov (United States)

    Lambert, O.; Tribot, J.-P.; Saint-Cloud, F.

    2002-01-01

    As the expendable launch vehicles (ELV) are limited in their trend to lower costs, the reusability (Reusable Launch Vehicle, RLV) could be the way to make drastic step. By the year 2001, CNES proposed through the ANGEL phase 1 programme to preprare the required technical maturity before that RLV's become alternatives to ELV's. In such way, system ,propulsion, ground based demonstrations, aero-thermo-dynamics as well as in flight experimentation are planned. This paper is focused on the aero-thermo-dynamics (ATD) and in flight demonstration activities with emphasis on the better understanding of ATD problems emerging from past programmes among them shock wave transitionnal boundary layer interaction on surface control, boundary layer transition, local aerothermodynamic effects, gas- surface interaction, catalycity, base flow prediction,...In order to minimize as small as possible the management risk a first generation of vehicle dubbed Pre_X is designed to validate technological choices and to have as soon as possible re-entry data to calibrate the various tools involved in the future RLV definition. In addition, the main requirement for PRE_X aeroshape definition and the two different design approaches considered by Dassault Aviation and EADS-LV are discussed. Then, the more promising concept for the PRE_X application is presented. Finally, the current status of the ATD activities is given as well as the perspectives.

  9. System driven technology selection for future European launch systems

    Science.gov (United States)

    Baiocco, P.; Ramusat, G.; Sirbi, A.; Bouilly, Th.; Lavelle, F.; Cardone, T.; Fischer, H.; Appel, S.

    2015-02-01

    In the framework of the next generation launcher activity at ESA, a top-down approach and a bottom-up approach have been performed for the identification of promising technologies and alternative conception of future European launch vehicles. The top-down approach consists in looking for system-driven design solutions and the bottom-up approach features design solutions leading to substantial advantages for the system. The main investigations have been focused on the future launch vehicle technologies. Preliminary specifications have been used in order to permit sub-system design to find the major benefit for the overall launch system. The development cost, non-recurring and recurring cost, industrialization and operational aspects have been considered as competitiveness factors for the identification and down-selection of the most interesting technologies. The recurring cost per unit payload mass has been evaluated. The TRL/IRL has been assessed and a preliminary development plan has been traced for the most promising technologies. The potentially applicable launch systems are Ariane and VEGA evolution. The main FLPP technologies aim at reducing overall structural mass, increasing structural margins for robustness, metallic and composite containment of cryogenic hydrogen and oxygen propellants, propellant management subsystems, elements significantly reducing fabrication and operational costs, avionics, pyrotechnics, etc. to derive performing upper and booster stages. Application of the system driven approach allows creating performing technology demonstrators in terms of need, demonstration objective, size and cost. This paper outlines the process of technology down selection using a system driven approach, the accomplishments already achieved in the various technology fields up to now, as well as the potential associated benefit in terms of competitiveness factors.

  10. Paroxysmal atrial fibrillation prediction based on HRV analysis and non-dominated sorting genetic algorithm III.

    Science.gov (United States)

    Boon, K H; Khalil-Hani, M; Malarvili, M B

    2018-01-01

    This paper presents a method that is able to predict paroxysmal atrial fibrillation (PAF). The method uses shorter heart rate variability (HRV) signals compared to existing methods, and achieves good prediction accuracy. PAF is a common cardiac arrhythmia that increases the health risk of a patient, and the development of an accurate predictor of the onset of PAF is clinically important because it increases the possibility of electrically stabilizing and preventing the onset of atrial arrhythmias with different pacing techniques. We propose a multi-objective optimization algorithm based on the non-dominated sorting genetic algorithm III for optimizing the baseline PAF prediction system, which consists of the stages of pre-processing, HRV feature extraction, and a support vector machine (SVM) model. The pre-processing stage comprises heart rate correction, interpolation, and signal detrending. After that, time-domain, frequency-domain and non-linear HRV features are extracted from the pre-processed data in the feature extraction stage. Then, these features are used as input to the SVM for predicting the PAF event. The proposed optimization algorithm is used to optimize the parameters and settings of the various HRV feature extraction algorithms, select the best feature subsets, and tune the SVM parameters simultaneously for maximum prediction performance. The proposed method achieves an accuracy rate of 87.7%, which significantly outperforms most of the previous works. This accuracy rate is achieved even with the HRV signal length being reduced from the typical 30 min to just 5 min (a reduction of 83%). Furthermore, another significant result is that the sensitivity rate, which is considered more important than other performance metrics in this paper, can be improved with the trade-off of lower specificity. Copyright © 2017 Elsevier B.V. All rights reserved.
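    The time-domain part of the HRV feature-extraction stage typically computes standard measures such as SDNN, RMSSD and pNN50 from the RR-interval series. A minimal sketch with standard formulas on a synthetic RR series (the data and feature subset are illustrative, not the paper's):

```python
import numpy as np

def hrv_time_features(rr_ms):
    """Standard time-domain HRV features from RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)                               # successive differences
    return {
        "mean_rr": rr.mean(),
        "sdnn": rr.std(ddof=1),                      # overall variability
        "rmssd": np.sqrt(np.mean(diff ** 2)),        # beat-to-beat variability
        "pnn50": np.mean(np.abs(diff) > 50) * 100,   # % successive diffs > 50 ms
    }

rr = [800, 810, 790, 860, 805, 795]   # synthetic RR series (ms)
feats = hrv_time_features(rr)
print(feats)
```

    In the published system, such features (together with frequency-domain and non-linear ones) form the candidate pool that the NSGA-III-based optimizer selects from before feeding the SVM.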

  11. The Road from the NASA Access to Space Study to a Reusable Launch Vehicle

    Science.gov (United States)

    Powell, Richard W.; Cook, Stephen A.; Lockwood, Mary Kae

    1998-01-01

    NASA is cooperating with the aerospace industry to develop a space transportation system that provides reliable access-to-space at a much lower cost than is possible with today's launch vehicles. While this quest has been ongoing for many years, it received a major impetus when the U.S. Congress mandated as part of the 1993 NASA appropriations bill that: "In view of budget difficulties, present and future..., the National Aeronautics and Space Administration shall ... recommend improvements in space transportation." NASA, working with other organizations, including the Department of Transportation and the Department of Defense, identified three major transportation architecture options that were to be evaluated in the areas of reliability, operability and cost. These architectural options were: (1) retain and upgrade the Space Shuttle and the current expendable launch vehicles; (2) develop new expendable launch vehicles using conventional technologies and transition to these new vehicles beginning in 2005; and (3) develop new reusable vehicles using advanced technology, and transition to these vehicles beginning in 2008. The launch needs mission model was based on 1993 projections of civil, defense, and commercial payload requirements. This "Access to Space" study concluded that the option that provided the greatest potential for meeting the cost, operability, and reliability goals was a rocket-powered single-stage-to-orbit fully reusable launch vehicle (RLV) fleet designed with advanced technologies.

  12. Research of the pre-launch powered lubrication device of major parts of the engine D-240

    Science.gov (United States)

    Korchuganova, M.; Syrbakov, A.; Tkachev, A.; Zorina, T.

    2015-09-01

    This publication considers start-up wear of the combustion engines of mobile machines stored outdoors at low ambient temperatures. Based on an analysis of existing methods and designs of powered lubrication devices for engine contact surfaces, a combined device is proposed that unites the functions of a hydraulic accumulator and a heat accumulator. Preparatory tests of the resulting design evaluated the effectiveness of pre-start oil circulation in the D-240 engine, as well as the effectiveness of the thermal insulation and the heating device of the hydraulic accumulator. The findings show that the pre-start powered lubrication device for the major parts of the engine is effective.

  13. DEVELOPMENT OF THE ALGORITHM FOR CHOOSING THE OPTIMAL SCENARIO FOR THE DEVELOPMENT OF THE REGION'S ECONOMY

    Directory of Open Access Journals (Sweden)

    I. S. Borisova

    2018-01-01

    Full Text Available Purpose: the article develops an algorithm for choosing the optimal scenario for the development of a regional economy. Since the "Strategy for socio-economic development of the Lipetsk region for the period until 2020" does not contain scenarios for the development of the region, the algorithm for choosing the optimal scenario is formalized. Scenarios for the development of the Lipetsk region's economy are calculated according to the indicators of the Program of social and economic development: "Quality of life index", "Average monthly nominal wage", "Level of registered unemployment", "Growth rate of gross regional product", "The share of innovative products in the total volume of goods shipped, works performed and services rendered by industrial organizations", "Total volume of atmospheric pollution per unit GRP", and "Satisfaction of the population with the activity of executive bodies of state power of the region". Based on these scenario calculations, the dynamics of the indicator values over 2016–2020 were derived for each development scenario. The discounted financial costs to economic participants of implementing the development scenarios for the Lipetsk region's economy are estimated. It is shown that the current situation in the economy of the Russian Federation calls for a paradigm of innovative development of territories and requires all participants in economic relations at the regional level to concentrate their resources on the creation of new science-intensive products. An assessment of the effects of implementing the proposed scenarios was carried out. It is shown that the most acceptable is the "base" scenario, which assumes a consistent change in the main indicators. The specific economic
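
    Scenario comparison by discounted cost can be illustrated with a minimal sketch. All scenario names, cost streams, and the discount rate below are hypothetical, not the article's figures, and the article also weighs the program indicators, which this sketch omits:

```python
# Illustrative sketch (hypothetical numbers): ranking development scenarios
# by the present value of their yearly implementation costs.

def discounted_cost(costs_by_year, rate):
    """Present value of a stream of yearly costs at a given discount rate."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(costs_by_year))

scenarios = {            # hypothetical cost streams per year
    "inertial": [100, 100, 100, 100, 100],
    "base":     [120, 110, 100, 90, 80],
    "target":   [200, 150, 120, 100, 90],
}
rate = 0.10
pv = {name: round(discounted_cost(c, rate), 1) for name, c in scenarios.items()}
cheapest = min(pv, key=pv.get)
print(pv, "->", cheapest)
```

    On cost alone the cheapest stream wins; the article's choice of the "base" scenario additionally reflects the indicator dynamics, so cost ranking is only one input to the decision.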

  14. Launching to the Moon, Mars, and Beyond

    Science.gov (United States)

    Sumrall, John P.

    2007-01-01

    America is returning to the Moon in preparation for the first human footprint on Mars, guided by the U.S. Vision for Space Exploration. This presentation will discuss NASA's mission today, the reasons for returning to the Moon and going to Mars, and how NASA will accomplish that mission. The primary goals of the Vision for Space Exploration are to finish the International Space Station, retire the Space Shuttle, and build the new spacecraft needed to return people to the Moon and go to Mars. Unlike the Apollo program of the 1960s, this phase of exploration will be a journey, not a race. In 1966, NASA's budget was 4 percent of federal spending. Today, with 6/10 of 1 percent of the budget, NASA must incrementally develop the vehicles, infrastructure, technology, and organization to accomplish this goal. Fortunately, our knowledge and experience are greater than they were 40 years ago. NASA's goal is a return to the Moon by 2020. The Moon is the first step to America's exploration of Mars. Many questions about the Moon's history and how its history is linked to that of Earth remain even after the brief Apollo explorations of the 1960s and 1970s. This new venture will carry more explorers to more diverse landing sites with more capable tools and equipment. The Moon also will serve as a training ground in several respects before embarking on the longer, more perilous trip to Mars. The journeys to the Moon and Mars will require a variety of vehicles, including the Ares I Crew Launch Vehicle, the Ares V Cargo Launch Vehicle, the Orion Crew Exploration Vehicle, and the Lunar Surface Access Module. The architecture for the lunar missions will use one launch to ferry the crew into orbit on the Ares I and a second launch to orbit the lunar lander and the Earth Departure Stage to send the lander and crew vehicle to the Moon. In order to reach the Moon and Mars within a lifetime and within budget, NASA is building on proven hardware and decades of experience derived from

  15. SU-G-201-09: Evaluation of a Novel Machine-Learning Algorithm for Permanent Prostate Brachytherapy Treatment Planning

    International Nuclear Information System (INIS)

    Nicolae, A; Lu, L; Morton, G; Chung, H; Helou, J; Al Hanaqta, M; Loblaw, A; Ravi, A; Heath, E

    2016-01-01

    Purpose: A novel, automated algorithm for permanent prostate brachytherapy (PPB) treatment planning has been developed. The novel approach uses machine-learning (ML), a form of artificial intelligence, to substantially decrease planning time while simultaneously retaining the clinical intuition of plans created by radiation oncologists. This study seeks to compare the ML algorithm against expert-planned PPB plans to evaluate the equivalency of dosimetric and clinical plan quality. Methods: Plan features were computed from historical high-quality PPB treatments (N = 100) and stored in a relational database (RDB). The ML algorithm matched new PPB features to a highly similar case in the RDB; this initial plan configuration was then further optimized using a stochastic search algorithm. PPB pre-plans (N = 30) generated using the ML algorithm were compared to plan variants created by an expert dosimetrist (RT) and a radiation oncologist (MD). Planning time and pre-plan dosimetry were evaluated using a one-way Student’s t-test and ANOVA, respectively (significance level = 0.05). Clinical implant quality was evaluated by expert PPB radiation oncologists as part of a qualitative study. Results: Average planning time was 0.44 ± 0.42 min compared to 17.88 ± 8.76 min for the ML algorithm and RT, respectively, a significant advantage [t(9), p = 0.01]. A post-hoc ANOVA [F(2,87) = 6.59, p = 0.002] using Tukey-Kramer criteria showed a significantly lower mean prostate V150% for the ML plans (52.9%) compared to the RT (57.3%) and MD (56.2%) plans. Preliminary qualitative study results indicate comparable clinical implant quality between RT and ML plans, with a trend towards preference for ML plans. Conclusion: PPB pre-treatment plans highly comparable to those of an expert radiation oncologist can be created using a novel ML planning model. The use of an ML-based planning approach is expected to translate into improved PPB accessibility and plan uniformity.
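
    The two-stage approach described (retrieving the most similar prior case from a database, then refining it with stochastic search) can be sketched as follows. The feature vectors, configuration parameters, and quality score are invented placeholders, not the study's actual PPB plan features:

```python
# Hedged sketch of case retrieval plus stochastic refinement:
# (1) match a new case to the nearest stored feature vector,
# (2) improve the retrieved configuration by accept-if-better perturbation.
import math
import random

def nearest_case(query, database):
    """Return the stored case whose feature vector is closest (Euclidean)."""
    return min(database, key=lambda case: math.dist(query, case["features"]))

def stochastic_refine(x, score, steps=200, sigma=0.05, seed=1):
    """Simple hill climbing: keep Gaussian perturbations that improve score."""
    rng = random.Random(seed)
    best, best_s = list(x), score(x)
    for _ in range(steps):
        cand = [v + rng.gauss(0, sigma) for v in best]
        s = score(cand)
        if s > best_s:
            best, best_s = cand, s
    return best, best_s

db = [{"features": [0.2, 0.5], "config": [1.0, 2.0]},
      {"features": [0.8, 0.1], "config": [3.0, 1.5]}]
query = [0.7, 0.2]
start = nearest_case(query, db)["config"]
# toy "plan quality" score: peaks at configuration (3.2, 1.4)
plan, quality = stochastic_refine(start, lambda v: -((v[0] - 3.2) ** 2 + (v[1] - 1.4) ** 2))
```

    Retrieval supplies a clinically plausible starting point so the stochastic search only has to polish it, which is what makes the reported sub-minute planning times plausible.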

  16. SU-G-201-09: Evaluation of a Novel Machine-Learning Algorithm for Permanent Prostate Brachytherapy Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Nicolae, A [Odette Cancer Centre, Sunnybrook Health Sciences Centre, Toronto, ON (Canada); Department of Physics, Ryerson University, Toronto, ON (Canada); Lu, L; Morton, G; Chung, H; Helou, J; Al Hanaqta, M; Loblaw, A; Ravi, A [Odette Cancer Centre, Sunnybrook Health Sciences Centre, Toronto, ON (Canada); Heath, E [Carleton Laboratory for Radiotherapy Physics, Carleton University, Ottawa, ON, CA (Canada)

    2016-06-15

    Purpose: A novel, automated algorithm for permanent prostate brachytherapy (PPB) treatment planning has been developed. The novel approach uses machine-learning (ML), a form of artificial intelligence, to substantially decrease planning time while simultaneously retaining the clinical intuition of plans created by radiation oncologists. This study seeks to compare the ML algorithm against expert-planned PPB plans to evaluate the equivalency of dosimetric and clinical plan quality. Methods: Plan features were computed from historical high-quality PPB treatments (N = 100) and stored in a relational database (RDB). The ML algorithm matched new PPB features to a highly similar case in the RDB; this initial plan configuration was then further optimized using a stochastic search algorithm. PPB pre-plans (N = 30) generated using the ML algorithm were compared to plan variants created by an expert dosimetrist (RT) and a radiation oncologist (MD). Planning time and pre-plan dosimetry were evaluated using a one-way Student’s t-test and ANOVA, respectively (significance level = 0.05). Clinical implant quality was evaluated by expert PPB radiation oncologists as part of a qualitative study. Results: Average planning time was 0.44 ± 0.42 min compared to 17.88 ± 8.76 min for the ML algorithm and RT, respectively, a significant advantage [t(9), p = 0.01]. A post-hoc ANOVA [F(2,87) = 6.59, p = 0.002] using Tukey-Kramer criteria showed a significantly lower mean prostate V150% for the ML plans (52.9%) compared to the RT (57.3%) and MD (56.2%) plans. Preliminary qualitative study results indicate comparable clinical implant quality between RT and ML plans, with a trend towards preference for ML plans. Conclusion: PPB pre-treatment plans highly comparable to those of an expert radiation oncologist can be created using a novel ML planning model. The use of an ML-based planning approach is expected to translate into improved PPB accessibility and plan uniformity.

  17. Development of Speckle Interferometry Algorithm and System

    International Nuclear Information System (INIS)

    Shamsir, A. A. M.; Jafri, M. Z. M.; Lim, H. S.

    2011-01-01

    Electronic speckle pattern interferometry (ESPI) is a whole-field, non-destructive measurement method widely used in industry, for example to detect defects on metal bodies and in integrated circuits in digital electronic components, and in the preservation of priceless artwork. In this work, the method is used to develop algorithms and a new laboratory setup for implementing speckle pattern interferometry. In speckle interferometry, an optically rough test surface is illuminated with an expanded laser beam, creating a laser speckle pattern in the space surrounding the illuminated region. The speckle pattern is optically mixed with a second coherent light field that is either another speckle pattern or a smooth light field. This produces an interferometric speckle pattern that is detected by a sensor to track changes in the speckle pattern caused by an applied force. In this project, an experimental ESPI setup is proposed to analyze a stainless steel plate using the 632.8 nm (red) wavelength of light.
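
    The two-beam interference relation that speckle interferometry exploits can be illustrated with a short sketch. The displacement value is illustrative, and the phase sensitivity 4πd/λ assumes the common out-of-plane geometry with near-normal illumination and viewing:

```python
# Minimal sketch of two-beam interference: the recorded intensity depends
# on the phase difference between the object and reference fields.
import math

def fringe_intensity(i1, i2, dphi):
    """I = I1 + I2 + 2*sqrt(I1*I2)*cos(dphi) for two coherent beams."""
    return i1 + i2 + 2 * math.sqrt(i1 * i2) * math.cos(dphi)

# An out-of-plane surface displacement d shifts the phase by about
# dphi = 4*pi*d/lambda for near-normal illumination and viewing.
wavelength = 632.8e-9      # He-Ne laser line, as in the setup above
d = wavelength / 8         # example displacement
dphi = 4 * math.pi * d / wavelength
print(fringe_intensity(1.0, 1.0, dphi))
```

    Counting how pixel intensities move through this cosine as the load changes is what lets ESPI recover sub-wavelength deformation maps.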

  18. Development of a Framework for Genetic Algorithms

    OpenAIRE

    Wååg, Håkan

    2009-01-01

    Genetic algorithms are a method of optimization that can be used to solve many different kinds of problems. This thesis focuses on developing a framework for genetic algorithms that is capable of solving at least the two problems explored in the work. Other problems are supported by allowing user-made extensions. The purpose of this thesis is to explore the possibilities of genetic algorithms for optimization problems and artificial intelligence applications. To test the framework two applications are...
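
    A minimal genetic algorithm of the kind such a framework would generalize might look like the following sketch. The operators, rates, and the OneMax test fitness are illustrative choices, not the thesis's design:

```python
# Sketch of a basic genetic algorithm: bit-string individuals, tournament
# selection, one-point crossover, bit-flip mutation, elitist best tracking.
import random

def genetic_algorithm(fitness, n_bits=16, pop_size=30, generations=60,
                      p_cross=0.9, p_mut=0.02, seed=42):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # tournament selection of two parents
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            c1, c2 = p1[:], p2[:]
            if rng.random() < p_cross:             # one-point crossover
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (c1, c2):                 # bit-flip mutation
                for i in range(n_bits):
                    if rng.random() < p_mut:
                        child[i] ^= 1
                nxt.append(child)
        pop = nxt[:pop_size]
        best = max(pop + [best], key=fitness)
    return best

# OneMax toy problem: fitness is the number of 1-bits; optimum is all ones.
solution = genetic_algorithm(fitness=sum)
print(sum(solution))
```

    A framework like the one described would expose the fitness function, encoding, and operators as the user-replaceable extension points.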

  19. Ceremony celebrates 50 years of rocket launches

    Science.gov (United States)

    2000-01-01

    At the 50th anniversary ceremony celebrating the first rocket launch from pad 3 on what is now Cape Canaveral Air Force Station, Norris Gray waves to the audience (photo PL00C-10364.12). Gray was part of the team who successfully launched the first rocket, known as Bumper 8. The ceremony was hosted by the Air Force Space & Missile Museum Foundation, Inc., and included the launch of a Bumper 8 model rocket, the presentation of a Bumper Award to Florida Sen. George Kirkpatrick by the National Space Club, and remarks by Sen. Kirkpatrick, KSC's Center Director Roy Bridges, and the Commander of the 45th Space Wing, Brig. Gen. Donald Pettit. Also attending the ceremony were other members of the original Bumper 8 team. A reception followed at Hangar C. Since 1950 there have been a total of 3,245 launches from Cape Canaveral.

  20. 14 CFR 417.111 - Launch plans.

    Science.gov (United States)

    2010-01-01

    ... classification and compatibility group as defined by part 420 of this chapter. (3) A graphic depiction of the... authorities, including the Federal Communications Commission. (g) Flight termination system electronic piece... for launch personnel control, handling of intruders, communications and coordination with launch...

  1. VISUALIZATION OF PAGERANK ALGORITHM

    OpenAIRE

    Perhaj, Ervin

    2013-01-01

    The goal of the thesis is to develop a web application that helps users understand the functioning of the PageRank algorithm. The thesis consists of two parts. First we develop an algorithm to calculate the PageRank values of web pages. The input of the algorithm is a list of web pages and the links between them. The user enters the list through the web interface. From these data the algorithm calculates a PageRank value for each page. The algorithm repeats the process until the difference of PageRank va...
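
    The iterative computation described can be sketched as a damped power iteration. The toy link graph below is invented; the damping factor 0.85 is the conventional choice:

```python
# Sketch of iterative PageRank: repeatedly redistribute rank along links
# until the per-page values stop changing.

def pagerank(links, d=0.85, tol=1e-10, max_iter=1000):
    """links: {page: [pages it links to]}. Returns {page: rank}, summing to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(max_iter):
        new = {}
        # rank held by dangling pages (no out-links) is spread over all pages
        dangling = sum(rank[p] for p in pages if not links[p])
        for p in pages:
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * (incoming + dangling / n)
        if max(abs(new[p] - rank[p]) for p in pages) < tol:
            return new
        rank = new
    return rank

web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(web)
print(sorted(ranks, key=ranks.get, reverse=True))  # most important page first
```

    The stopping test on the maximum per-page change is the "difference of PageRank values" criterion the abstract's truncated sentence refers to.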

  2. Soft-Fault Detection Technologies Developed for Electrical Power Systems

    Science.gov (United States)

    Button, Robert M.

    2004-01-01

    The NASA Glenn Research Center, partner universities, and defense contractors are working to develop intelligent power management and distribution (PMAD) technologies for future spacecraft and launch vehicles. The goals are to provide higher performance (efficiency, transient response, and stability), higher fault tolerance, and higher reliability through the application of digital control and communication technologies. It is also expected that these technologies will eventually reduce the design, development, manufacturing, and integration costs for large, electrical power systems for space vehicles. The main focus of this research has been to incorporate digital control, communications, and intelligent algorithms into power electronic devices such as direct-current to direct-current (dc-dc) converters and protective switchgear. These technologies, in turn, will enable revolutionary changes in the way electrical power systems are designed, developed, configured, and integrated in aerospace vehicles and satellites. Initial successes in integrating modern, digital controllers have proven that transient response performance can be improved using advanced nonlinear control algorithms. One technology being developed includes the detection of "soft faults," those not typically covered by current systems in use today. Soft faults include arcing faults, corona discharge faults, and undetected leakage currents. Using digital control and advanced signal analysis algorithms, we have shown that it is possible to reliably detect arcing faults in high-voltage dc power distribution systems (see the preceding photograph). Another research effort has shown that low-level leakage faults and cable degradation can be detected by analyzing power system parameters over time. This additional fault detection capability will result in higher reliability for long-lived power systems such as reusable launch vehicles and space exploration missions.

  3. Larval, pre-juvenile and juvenile development of Diapterus peruvianus (Perciformes: Gerreidae

    Directory of Open Access Journals (Sweden)

    Sylvia Patricia Adelheid Jiménez Rosenberg

    2003-06-01

    Full Text Available The development of Diapterus peruvianus (Sauvage 1879) is described based on 60 larvae collected in surface tows in Bahía Concepción, and on 16 pre-juvenile and juvenile specimens collected in Bahía de La Paz, B. C. S., México, using a standard plankton net and a rectangular epibenthic net, respectively. From the preflexion stage, larvae of D. peruvianus show three elongate blotches on the dorsum of the gut that can fuse together and give the appearance of one large continuous blotch. There are two to three pre-anal pigments and up to 16 post-anal pigments in the ventral midline; cephalic pigments are present from the postflexion stage, as is the serrated preoperculum characteristic of the genus. The pre-juvenile and juvenile specimens are distinguished by their body depth, the anal-fin formula, the finely serrated preoperculum, and the pigments at the base of the dorsal and anal fins.

  4. Development of Pre-Service Chemistry Teachers' Technological Pedagogical Content Knowledge

    Science.gov (United States)

    Cetin-Dindar, Ayla; Boz, Yezdan; Sonmez, Demet Yildiran; Celep, Nilgun Demirci

    2018-01-01

    In this study, a mixed-method design was employed to investigate pre-service chemistry teachers' Technological Pedagogical Content Knowledge (TPACK) development. For effective technology integration in instruction, knowledge about technology is not enough; teachers should have different knowledge types which are content, pedagogical, and…

  5. The Impacts of a Scalable Intervention on the Language and Literacy Development of Rural Pre-Kindergartners

    Science.gov (United States)

    Mashburn, Andrew; Justice, Laura M.; McGinty, Anita; Slocum, Laura

    2016-01-01

    Read It Again (RIA) is a curriculum for pre-kindergarten (pre-K) classrooms that targets children's development of language and literacy skills. A cluster randomized trial was conducted in which 104 pre-K classrooms in the Appalachian region of the United States were randomly assigned to one of three study conditions: Control (n = 30), RIA only…

  6. The Algorithmic Imaginary

    DEFF Research Database (Denmark)

    Bucher, Taina

    2017-01-01

    ... of algorithms affect people's use of these platforms, if at all? To help answer these questions, this article examines people's personal stories about the Facebook algorithm through tweets and interviews with 25 ordinary users. To understand the spaces where people and algorithms meet, this article develops the notion of the algorithmic imaginary. It is argued that the algorithmic imaginary – ways of thinking about what algorithms are, what they should be and how they function – is not just productive of different moods and sensations but plays a generative role in moulding the Facebook algorithm itself...

  7. Implementation on Landsat Data of a Simple Cloud Mask Algorithm Developed for MODIS Land Bands

    Science.gov (United States)

    Oreopoulos, Lazaros; Wilson, Michael J.; Varnai, Tamas

    2010-01-01

    This letter assesses the performance on Landsat-7 images of a modified version of a cloud masking algorithm originally developed for clear-sky compositing of Moderate Resolution Imaging Spectroradiometer (MODIS) images at northern mid-latitudes. While data from recent Landsat missions include measurements at thermal wavelengths, and such measurements are also planned for the next mission, thermal tests are not included in the suggested algorithm in its present form to maintain greater versatility and ease of use. To evaluate the masking algorithm we take advantage of the availability of manual (visual) cloud masks developed at USGS for the collection of Landsat scenes used here. As part of our evaluation we also include the Automated Cloud Cover Assessment (ACCA) algorithm that includes thermal tests and is used operationally by the Landsat-7 mission to provide scene cloud fractions, but no cloud masks. We show that the suggested algorithm can perform about as well as ACCA both in terms of scene cloud fraction and pixel-level cloud identification. Specifically, we find that the algorithm gives an error of 1.3% for the scene cloud fraction of 156 scenes, and a root mean square error of 7.2%, while it agrees with the manual mask for 93% of the pixels, figures very similar to those from ACCA (1.2%, 7.1%, 93.7%).
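
    Cloud masks of this family combine simple per-pixel spectral tests. The sketch below shows a generic brightness-plus-band-similarity test on invented reflectances; the band combinations and thresholds are illustrative, not the paper's actual tests:

```python
# Hedged sketch of a threshold-based cloud test: clouds tend to be bright
# in the visible and to have similar red and near-infrared reflectance,
# unlike dark water or strongly NIR-reflective vegetation.

def cloudy(pixel, bright_thresh=0.3, ratio_thresh=0.8):
    """pixel: dict of top-of-atmosphere reflectances for a few solar bands."""
    brightness = (pixel["red"] + pixel["green"] + pixel["blue"]) / 3.0
    # red/NIR similarity: near 1.0 for spectrally flat ("white") clouds
    similarity = min(pixel["red"], pixel["nir"]) / max(pixel["red"], pixel["nir"])
    return brightness > bright_thresh and similarity > ratio_thresh

scene = [
    {"blue": 0.55, "green": 0.52, "red": 0.50, "nir": 0.48},  # bright, flat: cloud
    {"blue": 0.05, "green": 0.08, "red": 0.07, "nir": 0.35},  # dark, vegetated
]
mask = [cloudy(p) for p in scene]
cloud_fraction = 100.0 * sum(mask) / len(mask)
print(mask, cloud_fraction)
```

    Aggregating the boolean mask over a scene yields the scene cloud fraction against which such algorithms are scored.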

  8. Implementation of the LandTrendr Algorithm on Google Earth Engine

    Directory of Open Access Journals (Sweden)

    Robert E Kennedy

    2018-05-01

    Full Text Available The LandTrendr (LT) algorithm has been used widely for analysis of change in Landsat spectral time series data, but requires significant pre-processing, data management, and computational resources, and is only accessible to the community in a proprietary programming language (IDL). Here, we introduce LT for the Google Earth Engine (GEE) platform. The GEE platform simplifies pre-processing steps, allowing focus on the translation of the core temporal segmentation algorithm. Temporal segmentation involves a series of repeated random access calls to each pixel’s time series, resulting in a set of breakpoints (“vertices”) that bound straight-line segments. The translation of the algorithm into GEE included both transliteration and code analysis, resulting in improvements and logic error fixes. At six study areas representing diverse land cover types across the U.S., we conducted a direct comparison of the new LT-GEE code against the heritage code (LT-IDL). The algorithms agreed in most cases, and where disagreements occurred, they were largely attributable to logic error fixes in the code translation process. The practical impact of these changes is minimal, as shown by an example of forest disturbance mapping. We conclude that the LT-GEE algorithm represents a faithful translation of the LT code into a platform easily accessible by the broader user community.
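
    The temporal segmentation idea (fitting a pixel's spectral time series with straight-line segments bounded by vertices) can be sketched with a simple recursive-splitting approximation. This illustrates the concept only; it is not the LT or LT-GEE implementation, and all data values are synthetic:

```python
# Simplified temporal segmentation sketch: split a time series into
# straight-line segments by repeatedly adding a vertex at the sample
# with the largest residual from the current segment fits.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx if sxx else 0.0
    return slope, my - slope * mx

def segment(xs, ys, max_vertices=4, min_residual=0.05):
    """Return indices of segment breakpoints ('vertices'), endpoints included."""
    vertices = [0, len(xs) - 1]
    while len(vertices) < max_vertices:
        worst_i, worst_r = None, min_residual
        for a, b in zip(vertices, vertices[1:]):
            s, c = fit_line(xs[a:b + 1], ys[a:b + 1])
            for i in range(a + 1, b):
                r = abs(ys[i] - (s * xs[i] + c))
                if r > worst_r:
                    worst_i, worst_r = i, r
        if worst_i is None:
            break
        vertices = sorted(vertices + [worst_i])
    return vertices

# Synthetic NBR-like trajectory: stable forest, abrupt disturbance, recovery.
years = list(range(2000, 2011))
nbr = [0.7, 0.7, 0.7, 0.7, 0.2, 0.3, 0.4, 0.5, 0.6, 0.65, 0.7]
print(segment(years, nbr))
```

    The recovered vertices bracket the disturbance year, which is exactly the kind of breakpoint information downstream disturbance-mapping products consume.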

  9. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Science.gov (United States)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design versus a design of well understood heritage equipment would be greater. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty are rendered obsolete, since sensitivity to uncertainty changes is not reflected in propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.
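
    One common way to keep the two kinds of uncertainty separate is nested ("two-loop") Monte Carlo: the outer loop draws an epistemic state of knowledge, the inner loop draws aleatory outcomes given that state. A hedged sketch with an invented lognormal state-of-knowledge distribution (not an actual NASA PRA model or its numbers):

```python
# Sketch of nested Monte Carlo: outer loop samples epistemic uncertainty
# (what we believe the failure probability is), inner loop samples aleatory
# outcomes (whether failures actually occur given that probability).
import random
import statistics

def nested_mc(outer=500, inner=1000, seed=7):
    rng = random.Random(seed)
    loss_freq_estimates = []
    for _ in range(outer):
        # epistemic draw: one plausible value of the failure probability
        p_fail = rng.lognormvariate(mu=-6.0, sigma=0.8)
        # aleatory draws: simulated missions under that probability
        losses = sum(1 for _ in range(inner) if rng.random() < p_fail)
        loss_freq_estimates.append(losses / inner)
    return loss_freq_estimates

est = nested_mc()
print(statistics.mean(est), statistics.quantiles(est, n=20)[18])  # mean, ~95th pct
```

    Reporting the spread of the outer-loop estimates (rather than a single pooled mean) is what preserves the epistemic uncertainty that a single-loop simulation washes out.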

  10. Fast geometric algorithms

    International Nuclear Information System (INIS)

    Noga, M.T.

    1984-01-01

    This thesis addresses a number of important problems that fall within the framework of the new discipline of Computational Geometry. The list of topics covered includes sorting and selection, convex hull algorithms, the L1 hull, determination of the minimum encasing rectangle of a set of points, the Euclidean and L1 diameter of a set of points, the metric traveling salesman problem, and finding the superrange of star-shaped and monotone polygons. The main theme of all the work was to develop a set of very fast state-of-the-art algorithms that supersede any rivals in terms of speed and ease of implementation. In some cases existing algorithms were refined; for others new techniques were developed that add to the present database of fast adaptive geometric algorithms. What emerges is a collection of techniques that is successful at merging modern tools developed in analysis of algorithms with those of classical geometry.
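
    As a flavor of one of the problems listed, here is a standard O(n log n) convex hull (Andrew's monotone chain), shown as a generic textbook method rather than the thesis's own refined algorithm:

```python
# Sketch of the monotone-chain convex hull: sort points, then build the
# lower and upper hulls with a cross-product turn test.

def cross(o, a, b):
    """z-component of (a - o) x (b - o); > 0 means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:                      # lower hull, left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # upper hull, right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]     # endpoints shared, drop duplicates

pts = [(0, 0), (2, 0), (1, 1), (2, 2), (0, 2), (1, 3), (1, 0.5)]
print(convex_hull(pts))  # hull vertices in counterclockwise order
```

    The sort dominates the cost; each point is pushed and popped at most once, so the two scans are linear.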

  11. Development of new port in Minahasa Utara: A-pre feasibility study

    Science.gov (United States)

    Hamzah, Suharman; Abdurahman, Asad; Saputra, Reza; Aprianti, Evi

    2017-11-01

    In order to support the Indonesian Government's priority of building a maritime toll road connecting the "nusantara" archipelago, and also to develop the frontier areas of Indonesia, the existence of ports deserves more attention as the gateway to Indonesia. A port plays a significant role in the change of transport mode and, ultimately, acts as a catalyst of economic growth. An important outcome of a pre-feasibility study is that the priority factor clearly shows whether a site is good or merely fair. To support the movement of passengers, containers, general cargo, and bulk goods, a port needs excellent design and planning. A pre-feasibility study is required to obtain a scientific basis reflecting the value of interest and the needs of the region. The pre-feasibility study in this paper aims to identify the potential support of the region and to prioritize locations for the development of a new port in Minahasa Utara, considering spatial planning, government issues, transportation, regional economics, environmental aspects, and technical aspects. The methods used are qualitative and quantitative, drawing on the data obtained and supported by on-site interviews as well as questionnaire surveys. The results show that five locations meet the requirements. By level of priority, they are: high (Kema, Likupang), moderate (Linuhu, Kahuku), and low (Gangga 1).

  12. National Security Space Launch Report

    Science.gov (United States)

    2006-01-01

    Company Clayton Mowry, President, Arianespace Inc., North American—“Launch Solutions” Elon Musk, CEO and CTO, Space Exploration Technologies (SpaceX...technologies to the NASA Exploration Initiative (“…Moon, Mars and Beyond.”).1 EELV Technology Needs The Atlas V and Delta IV vehicles incorporate current... Mars and other destinations.” Figure 6.1 U.S. Government Liquid Propulsion Rocket Investment, 1991–2005

  13. NanoLaunch

    Science.gov (United States)

    Jones, Jonathan; Harris, Lawanna

    2015-01-01

    NASA's NanoLaunch effort will provide the framework to mature both Earth-to-orbit and on-orbit propulsion and avionics technologies while also providing affordable, dedicated access to low-Earth orbit for CubeSat-class payloads. The project will also serve as an early career personnel training opportunity with mentors to gain hands-on project experience.

  14. Management Challenges of Launching Multiple Payloads for Multiple Customers

    OpenAIRE

    Callen, Dave

    1999-01-01

    Orbital has provided launch services for multiple satellites as a means to provide greater economy for access to space. These include satellites from NASA, DoD, commercial companies, universities, and foreign governments. While satellite customers view shared launches as a means to achieve reduced launch costs, this approach adds many complexities that a traditional launch service provider does not have to address for a dedicated launch. This paper will discuss some of the challenges associat...

  15. Dynamic modeling and ascent flight control of Ares-I Crew Launch Vehicle

    Science.gov (United States)

    Du, Wei

    This research focuses on dynamic modeling and ascent flight control of large flexible launch vehicles such as the Ares-I Crew Launch Vehicle (CLV). A complete set of six-degrees-of-freedom dynamic models of the Ares-I, incorporating its propulsion, aerodynamics, guidance and control, and structural flexibility, is developed. NASA's Ares-I reference model and the SAVANT Simulink-based program are utilized to develop a Matlab-based simulation and linearization tool for an independent validation of the performance and stability of the ascent flight control system of large flexible launch vehicles. A linearized state-space model as well as a non-minimum-phase transfer function model (which is typical for flexible vehicles with non-collocated actuators and sensors) are validated for ascent flight control design and analysis. This research also investigates fundamental principles of flight control analysis and design for launch vehicles, in particular the classical "drift-minimum" and "load-minimum" control principles. It is shown that an additional feedback of angle-of-attack can significantly improve overall performance and stability, especially in the presence of unexpected large wind disturbances. For a typical "non-collocated actuator and sensor" control problem for large flexible launch vehicles, non-minimum-phase filtering of "unstably interacting" bending modes is also shown to be effective. The uncertainty model of a flexible launch vehicle is derived. The robust stability of an ascent flight control system design, which directly controls the inertial attitude-error quaternion and also employs the non-minimum-phase filters, is verified by the framework of structured singular value (mu) analysis. Furthermore, nonlinear coupled dynamic simulation results are presented for a reference model of the Ares-I CLV as another validation of the feasibility of the ascent flight control system design. Another important issue for a single main engine launch vehicle is

  16. Dataset exploited for the development and validation of automated cyanobacteria quantification algorithm, ACQUA

    Directory of Open Access Journals (Sweden)

    Emanuele Gandola

    2016-09-01

    Full Text Available The estimation and quantification of potentially toxic cyanobacteria in lakes and reservoirs are often used as a proxy of risk for water intended for human consumption and recreational activities. Here, we present data sets collected from three volcanic Italian lakes (Albano, Vico, Nemi) that host filamentous cyanobacteria strains in different environments. The data sets were used to estimate the abundance and morphometric characteristics of potentially toxic cyanobacteria, comparing manual vs. automated estimation performed by ACQUA ("ACQUA: Automated Cyanobacterial Quantification Algorithm for toxic filamentous genera using spline curves, pattern recognition and machine learning", Gandola et al., 2016 [1]). This strategy was used to assess the algorithm's performance and to set up the denoising algorithm. Abundance and total-length estimates were used for software development; to this aim we evaluated the efficiency of the statistical tools and mathematical algorithms described here. Convolution with the Sobel filter was chosen to denoise input images from background signals; spline curves and the least-squares method were then used to parameterize detected filaments and to recombine crossing and interrupted sections, yielding precise abundance estimates and morphometric measurements. Keywords: Comparing data, Filamentous cyanobacteria, Algorithm, Denoising, Natural sample
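The Sobel pre-processing step described above can be sketched as follows. This is a minimal illustration on a synthetic image, not the ACQUA implementation; the threshold rule is an assumption:

```python
# Minimal sketch of Sobel-based filament enhancement on a synthetic
# grayscale "micrograph": a bright horizontal filament on a noisy background.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.05, size=(64, 64))  # noisy background
img[30:33, 8:56] += 1.0                      # synthetic filament

# Gradient magnitude from Sobel derivatives highlights filament edges
# while suppressing the low-contrast background signal.
sx = ndimage.sobel(img, axis=0)
sy = ndimage.sobel(img, axis=1)
edges = np.hypot(sx, sy)

# Illustrative threshold: keep pixels well above the background response.
mask = edges > edges.mean() + 2 * edges.std()
print("edge pixels detected:", int(mask.sum()))
```

In the full pipeline described by the abstract, the detected edge pixels would then be parameterized with spline curves fitted by least squares to recover filament length.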

  17. Algorithm development and verification of UASCM for multi-dimension and multi-group neutron kinetics model

    International Nuclear Information System (INIS)

    Si, S.

    2012-01-01

    The Universal Algorithm of Stiffness Confinement Method (UASCM) for neutron kinetics model of multi-dimensional and multi-group transport equations or diffusion equations has been developed. The numerical experiments based on transport theory code MGSNM and diffusion theory code MGNEM have demonstrated that the algorithm has sufficient accuracy and stability. (authors)

  18. Development of an algorithm for quantifying extremity biological tissue

    International Nuclear Information System (INIS)

    Pavan, Ana L.M.; Miranda, Jose R.A.; Pina, Diana R. de

    2013-01-01

    Computed radiography (CR) has become the most widely used technology for image acquisition and production since its introduction in the 1980s. Early detection and diagnosis obtained via CR are important for the successful treatment of diseases such as arthritis, metabolic bone diseases, tumors, infections and fractures. However, the standards used to optimize these images are based on international protocols. It is therefore necessary to compose radiographic techniques for the CR system that provide a secure medical diagnosis with doses as low as reasonably achievable. To this end, the aim of this work is to develop a tissue-quantifying algorithm, allowing the construction of a homogeneous phantom used to compose such techniques. A database of computed tomography images of the hand and wrist of adult patients was assembled. Using Matlab® software, a computational algorithm was developed that quantifies the average thickness of the soft tissue and bone present in the anatomical region under study, as well as the corresponding thicknesses in simulator materials (aluminum and lucite). This was achieved by applying masks and a Gaussian-removal technique to the histograms. As a result, an average soft-tissue thickness of 18.97 mm and a bone-tissue thickness of 6.15 mm were obtained, with equivalents in simulator materials of 23.87 mm of acrylic and 1.07 mm of aluminum. The results agree with the average thickness of the biological tissues of a standard patient's hand, enabling the construction of a homogeneous phantom.
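The tissue-quantification idea can be sketched with simple intensity thresholds on a synthetic CT profile. This is an illustration under assumed Hounsfield-unit thresholds and pixel spacing, not the authors' Matlab algorithm:

```python
# Hedged sketch: segmenting soft tissue and bone from a synthetic 1-D CT
# profile in Hounsfield units (HU) and converting pixel counts to thickness.
# Thresholds and pixel size are illustrative assumptions.
import numpy as np

pixel_mm = 0.5  # assumed pixel spacing
profile = np.concatenate([
    np.full(20, -1000.0),  # air
    np.full(30, 40.0),     # soft tissue (~40 HU)
    np.full(12, 700.0),    # bone (~700 HU)
    np.full(30, 40.0),     # soft tissue
    np.full(20, -1000.0),  # air
])

soft = (profile > -200) & (profile < 200)  # soft-tissue mask
bone = profile >= 200                       # bone mask

print("soft tissue thickness: %.1f mm" % (soft.sum() * pixel_mm))  # 30.0 mm
print("bone thickness: %.1f mm" % (bone.sum() * pixel_mm))         # 6.0 mm
```

The paper's approach additionally removes Gaussian background components from the histograms before masking; the fixed thresholds here stand in for that step.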

  19. Development of Mitsubishi--Lurgi fluidized bed incinerator with pre-drying hearths

    Energy Technology Data Exchange (ETDEWEB)

    Hori, Y; Senshu, A; Mishima, K; Sato, T; Honda, H

    1979-02-01

    For better disposal of a steadily increasing volume of sludges with energy conservation, it is essential to develop an effective and energy-saving incinerator. The fluidized bed incinerator now widely used for sludge disposal has many features superior to those of the conventional vertical multiple-hearth incinerator, but it has one defect: large fuel consumption. This is because the fluidized bed incinerator generally has low drying efficiency, notwithstanding its excellent burning characteristics with minimum excess air. The feasibility of fuel saving by additionally installing sludge pre-drying hearths and an exhaust-gas recirculation system on the conventional fluidized bed incinerator was examined through incineration tests on various kinds of sludges, using a 1500 kg/h pilot plant equipped with the incinerator. As a result, the Mitsubishi--Lurgi fluidized bed incinerator with high-efficiency multiple pre-drying hearths, which consumes less fuel, was developed. Part of the incineration test results are presented.

  20. Distributed Web-Based Expert System for Launch Operations

    Science.gov (United States)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar

    2005-01-01

    The simulation and modeling of launch operations is based on a representation of the organization of the operations suitable for experimenting with the physical, procedural, software, hardware and psychological aspects of space flight operations. The virtual test bed includes a weather expert system to advise on the effect of weather on launch operations. It also simulates a toxic gas dispersion model and the resulting risk to human health. Since all modeling and simulation is web-based, it could reduce the cost of launch and range safety operations by enabling extensive research before a particular launch. Each model has an independent decision-making module that derives the best decision for launch.
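A decision-making module of the kind described above can be sketched as a small rule-based function. This is an illustration with hypothetical rule names and limits, not the NASA system's actual criteria:

```python
# Illustrative sketch of a rule-based weather advisory module: weather
# inputs in, launch advice out. All limits below are hypothetical.
def launch_advice(wind_kts, lightning_within_nm, cloud_ceiling_ft):
    """Return 'GO' only if every illustrative weather rule passes."""
    rules = [
        wind_kts <= 30,            # assumed surface-wind limit
        lightning_within_nm > 10,  # assumed lightning standoff distance
        cloud_ceiling_ft >= 6000,  # assumed cloud-ceiling minimum
    ]
    return "GO" if all(rules) else "NO-GO"

print(launch_advice(wind_kts=18, lightning_within_nm=25, cloud_ceiling_ft=9000))  # GO
print(launch_advice(wind_kts=35, lightning_within_nm=25, cloud_ceiling_ft=9000))  # NO-GO
```

In the distributed test bed, each model (weather, gas dispersion, health risk) would expose such a module, and the results would be combined into an overall launch recommendation.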