WorldWideScience

Sample records for models enable onboard

  1. Weather-enabled future onboard surveillance and navigation systems

    Science.gov (United States)

    Mutuel, L.; Baillon, B.; Barnetche, B.; Delpy, P.

    2009-09-01

    With the increasing traffic and the development of business trajectories, there is a widespread need to anticipate any adverse weather conditions that could impact the performance of the flight, and to use atmospheric parameters to optimize trajectories. Current sensors onboard air transport aircraft are challenged to provide the required service, while new products for business jets and general aviation open the door to innovative assimilation of weather information in onboard surveillance and navigation. The paper surveys the technology currently available to air transport aircraft and points out its shortcomings in view of the modernization proposed in the SESAR and NextGen implementation plans. Foreseen innovations are then illustrated via results of ongoing research such as FLYSAFE and via standardization efforts, in particular meteorological datalink services and their impact on the Human-Machine Interface. The paper covers the operational need to avoid adverse weather such as thunderstorms, icing, turbulence, windshear and volcanic ash, but also the requirement to control the trajectory in 4D through the integration of wind and temperature grids in the flight management system. The former will lead to enhanced surveillance systems onboard the aircraft with new displays and new alerting schemes, ranging from targeted information supporting better re-planning to auto-escape strategies. The latter will be standard in next-generation flight management systems. Finally, both will rely on ATM products that also assimilate weather information, so that situational awareness is shared and decision-making is collaborative.

  2. A new model for understanding teamwork onboard: the shipmate model.

    Science.gov (United States)

    Espevik, Roar; Olsen, Olav Kjellevold

    2013-01-01

    The increasing complexity onboard a ship underlines the importance of crews that are able to coordinate and cooperate with each other to accomplish task objectives through a shared understanding of resources (e.g. team members' knowledge, skills and experience), the crew's goals, and the constraints under which they work. Rotation of personnel through 24/7 shift-work schedules and replacements often puts crews in a position of having little or no previous history as a team. Findings from three studies indicated that unfamiliar teams used less efficient coordination strategies, which reduced efficiency and increased levels of stress in situations where team members were experts on the task, distributed, or unfamiliar with the task and environment. Implications for staffing, safety and training are discussed.

  3. Enabling model customization and integration

    Science.gov (United States)

    Park, Minho; Fishwick, Paul A.

    2003-09-01

    Until fairly recently, dynamic model content and presentation were treated synonymously. For example, if one were to take a data flow network, which captures the dynamics of a target system in terms of the flow of data through nodal operators, then one would often standardize on rectangles and arrows for the model display. The increasing web emphasis on XML, however, suggests that the network model can have its content specified in an XML language, and then the model can be represented in a number of ways depending on the chosen style. We have developed a formal method, based on styles, that permits a model to be specified in XML and presented in 1D (text), 2D, and 3D. This method allows customization and personalization to exert their benefits beyond e-commerce, in the area of model structures used in computer simulation. This customization leads naturally to solving the bigger problem of model integration - the act of taking models of a scene and integrating them with that scene so that there is only one unified modeling interface. This work focuses mostly on customization, but we address the integration issue in the future work section.

  4. Development of fast scattering model of complex shape target for seminatural tests of onboard proximity radars in real time mode

    Directory of Open Access Journals (Sweden)

    Likhoedenko Andrei K.

    2016-01-01

    Full Text Available Problems of creating real-time models of complex-shape targets on the basis of their polygonal models are considered. Formulas for the radar cross section of a multipoint target model and for the power of the input signal of an onboard radar are described. A technique for semi-natural testing of an onboard radar detector based on the multipoint target model is proposed. Results of digital simulation of the input signals of the onboard radar detector from an aerodynamic target, based on its multipoint model, are given.
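
    A rough sense of the multipoint scattering idea can be given by a coherent point-scatterer sum. The sketch below is a generic textbook approximation in Python, with made-up per-point cross sections, ranges and wavelength; it is not the paper's formulas or data.

      import numpy as np

      def multipoint_rcs(sigmas, ranges, wavelength):
          # Coherent radar cross section of a multipoint target model:
          # sum the complex returns of the point scatterers, then take |.|^2.
          k = 2.0 * np.pi / wavelength
          field = np.sqrt(sigmas) * np.exp(-1j * 2.0 * k * ranges)   # two-way phase
          return np.abs(field.sum()) ** 2

      # Toy example: three scattering centres of a target near 100 m range,
      # 3 cm wavelength (all values assumed for illustration only).
      print(multipoint_rcs(np.array([1.0, 0.5, 0.2]),
                           np.array([100.00, 100.31, 100.68]),
                           wavelength=0.03))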

  5. Semantic modeling and structural synthesis of onboard electronics protection means as open information system

    Science.gov (United States)

    Zhevnerchuk, D. V.; Surkova, A. S.; Lomakina, L. S.; Golubev, A. S.

    2018-05-01

    The article describes the component representation approach and semantic models for protecting on-board electronics from ionizing radiation of various natures. Semantic models are constructed whose distinguishing feature is the representation of electronic elements, protection modules and sources of impact in the form of blocks with interfaces. The rules of logical inference and algorithms for synthesizing the object properties of the semantic network, imitating the interfaces between the components of the protection system and the sources of radiation, are developed. The results of the algorithm are illustrated using the example of the radiation-resistant microcircuits 1645RU5U and 1645RT2U and a combined calculation and experimental method for estimating the durability of on-board electronics.

  6. Modeling the reaction kinetics of a hydrogen generator onboard a fuel cell -- Electric hybrid motorcycle

    Science.gov (United States)

    Ganesh, Karthik

    Owing to the perceived decline of the world's fossil fuel reserves and environmental issues like pollution, conventional fuels may be replaced by cleaner alternative fuels. The potential of hydrogen as a fuel in vehicular applications is being explored. Hydrogen as an energy carrier potentially finds applications in internal combustion engines and fuel cells because it is considered a clean fuel and has high specific energy. However, at $6 to $8 per kilogram, not only is hydrogen produced from conventional methods like steam reforming expensive, but there are also storage and handling issues, safety concerns and a lack of hydrogen refilling stations across the country. The purpose of this research is to suggest a cheap and viable system that generates hydrogen on demand through a chemical reaction between an aluminum-water slurry and an aqueous sodium hydroxide solution to power a 2 kW fuel cell on a fuel cell hybrid motorcycle. This reaction is essentially an aluminum-water reaction in which sodium hydroxide acts as a reaction promoter or catalyst. The Horizon 2000 fuel cell used for this purpose has a maximum hydrogen intake rate of 28 lpm. The study focuses on the exothermic reaction between the reactants and proposes a rate law that best describes the rate of generation of hydrogen as a function of the aluminum surface area available for the reaction and the concentration of the sodium hydroxide solution. Further, the proposed rate law is used in a simulation model of the chemical reactor onboard the hybrid motorcycle to determine the hydrogen flow rate to the fuel cell over time. Based on the simulated rate of production of hydrogen from the chemical system, its feasibility of use on different drive cycles is analyzed. The rate of production of hydrogen with a higher concentration of sodium hydroxide and smaller aluminum powder size was found to enable the installation of the chemical reactor on urban cycles with frequent stops and starts
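
    As an illustration of the kind of rate law and reactor simulation described, the sketch below integrates a generic surface-area- and concentration-dependent rate with a shrinking-particle assumption. The rate constant, reaction order, particle size, particle count and NaOH concentration are invented placeholders, not the values identified in the thesis.

      import numpy as np

      # Generic rate law sketch: r_H2 = k * A_Al * [NaOH]^n, integrated with a
      # shrinking-sphere model of the aluminum powder. All values are assumed.
      k, n = 3.0e-5, 1.0              # rate constant (mol m^-2 s^-1 (mol/L)^-n), order
      rho, M = 2700.0, 0.027          # aluminum density (kg/m^3), molar mass (kg/mol)
      N, r = 1.0e12, 5.0e-6           # particle count and initial particle radius (m)
      c_naoh = 2.0                    # NaOH concentration (mol/L)
      dt = 0.1
      rate_h2 = 0.0
      for _ in range(int(600 / dt)):  # simulate 10 minutes of operation
          area = N * 4.0 * np.pi * r ** 2           # exposed Al surface (m^2)
          rate_h2 = k * area * c_naoh ** n          # mol H2 per second
          if r > 0.0:
              mol_al = rate_h2 * 2.0 / 3.0          # 2 Al : 3 H2 stoichiometry
              r = max(r - mol_al * M / rho / area * dt, 0.0)
      lpm = rate_h2 * 22.4 * 60.0                   # mol/s -> standard litres/minute
      print(f"H2 flow after 10 min: {lpm:.1f} lpm (fuel cell limit: 28 lpm)")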

  7. Fault diagnostics for turbo-shaft engine sensors based on a simplified on-board model.

    Science.gov (United States)

    Lu, Feng; Huang, Jinquan; Xing, Yaodong

    2012-01-01

    Combining a simplified on-board turbo-shaft engine model with sensor fault diagnostic logic, a model-based sensor fault diagnosis method is proposed. The existing fault diagnosis method for key turbo-shaft engine sensors is mainly based on a dual-redundancy technique, which is not sufficient in some situations because a disagreement between only two channels cannot identify which one is faulty, while adding further hardware redundancy would increase structural complexity and weight. The simplified on-board model instead provides an analytical third channel against which the dual-channel measurements are compared. The simplified turbo-shaft model contains the gas generator model and the power turbine model with loads, and is built up via a dynamic parameters method. Sensor fault detection and diagnosis (FDD) logic is designed, and two types of sensor failures, step faults and drift faults, are simulated. When the discrepancy among the triplex channels exceeds a tolerance level, the fault diagnosis logic determines the cause of the difference. Through this approach, the sensor fault diagnosis system achieves the objectives of anomaly detection, sensor fault diagnosis and redundancy recovery. Finally, experiments on this method are carried out on a turbo-shaft engine, and two types of faults under different channel combinations are presented. The experimental results show that the proposed method for sensor fault diagnostics is efficient.
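
    The triplex comparison can be illustrated with a small voting sketch: two hardware channels plus the analytical channel from the on-board model, with a tolerance on their disagreement. This is a minimal stand-in for the FDD logic, not the authors' implementation.

      def diagnose(ch_a, ch_b, model, tol):
          # Compare dual hardware channels against the analytical (model) channel
          # and vote; a stand-in for the paper's FDD logic, not its implementation.
          d_ab, d_am, d_bm = abs(ch_a - ch_b), abs(ch_a - model), abs(ch_b - model)
          if d_ab <= tol:
              return "healthy", 0.5 * (ch_a + ch_b)     # channels agree: average them
          if d_am <= tol < d_bm:
              return "fault in channel B", ch_a         # recover with channel A
          if d_bm <= tol < d_am:
              return "fault in channel A", ch_b         # recover with channel B
          return "undetermined (suspect model error)", model

      # Step fault injected on channel B of a (made-up) gas generator speed signal
      print(diagnose(ch_a=101.0, ch_b=95.0, model=100.5, tol=2.0))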

  8. Fault Diagnostics for Turbo-Shaft Engine Sensors Based on a Simplified On-Board Model

    Directory of Open Access Journals (Sweden)

    Yaodong Xing

    2012-08-01

    Full Text Available Combining a simplified on-board turbo-shaft engine model with sensor fault diagnostic logic, a model-based sensor fault diagnosis method is proposed. The existing fault diagnosis method for key turbo-shaft engine sensors is mainly based on a dual-redundancy technique, which is not sufficient in some situations because a disagreement between only two channels cannot identify which one is faulty, while adding further hardware redundancy would increase structural complexity and weight. The simplified on-board model instead provides an analytical third channel against which the dual-channel measurements are compared. The simplified turbo-shaft model contains the gas generator model and the power turbine model with loads, and is built up via a dynamic parameters method. Sensor fault detection and diagnosis (FDD) logic is designed, and two types of sensor failures, step faults and drift faults, are simulated. When the discrepancy among the triplex channels exceeds a tolerance level, the fault diagnosis logic determines the cause of the difference. Through this approach, the sensor fault diagnosis system achieves the objectives of anomaly detection, sensor fault diagnosis and redundancy recovery. Finally, experiments on this method are carried out on a turbo-shaft engine, and two types of faults under different channel combinations are presented. The experimental results show that the proposed method for sensor fault diagnostics is efficient.

  9. Modeling and Simulation of Truck Engine Cooling System for Onboard Diagnosis

    Institute of Scientific and Technical Information of China (English)

    朱正礼; 张建武; 包继华

    2004-01-01

    A cooling system model of a selected internal combustion engine has been built for onboard diagnosis. The model uses driving cycle data available within the production Engine Control Module (ECM): vehicle speed, engine speed, and fuel flow rate for the given ambient temperature and pressure, etc. Based on the conservation laws for heat transfer and mass flow processes, the mathematical descriptions for the components involved in the cooling circuit are obtained and all the components are integrated into a model on the Matlab/Simulink platform. The model can simulate the characteristics of the thermostat (e.g. time-lag, hysteresis effect). The changes of coolant temperature, heat transfer flow rate, and pressure at individual component sites are also shown.
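
    A minimal lumped-parameter sketch of the thermostat behaviour such a model captures is given below: a single coolant thermal mass heated by the engine and cooled through the radiator only when a hysteretic thermostat is open. All parameter values are assumptions for illustration, not those of the paper's Simulink model.

      # Lumped coolant thermal mass with a hysteretic thermostat (illustrative
      # values only; not the parameters of the paper's model).
      m_cp = 20e3                      # coolant thermal capacity, J/K
      q_engine = 30e3                  # heat rejected to coolant, W
      ua_rad = 800.0                   # radiator UA when thermostat is open, W/K
      t_amb, t_open, t_close = 25.0, 90.0, 83.0   # deg C

      temp, is_open, dt = 60.0, False, 1.0
      for _ in range(1200):            # 20 minutes, 1 s steps
          if temp >= t_open:
              is_open = True           # thermostat opens above 90 degC
          elif temp <= t_close:
              is_open = False          # ...and closes again below 83 degC (hysteresis)
          q_out = ua_rad * (temp - t_amb) if is_open else 0.0
          temp += (q_engine - q_out) * dt / m_cp
      print(f"coolant temperature after 20 min: {temp:.1f} degC (thermostat open: {is_open})")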

  10. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address

  11. Environmental Models as a Service: Enabling Interoperability ...

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
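
    The verb-noun pattern can be illustrated with a tiny self-contained dispatcher: the HTTP method selects the action and the URL path names the resource. The endpoint names, payload fields and toy "model" below are hypothetical, not part of any specific service.

      import json

      runs = {}   # in-memory store standing in for the hosted model service

      def handle(method, path, body=None):
          # Verb (HTTP method) picks the action; noun (URL path) names the resource.
          if method == "POST" and path == "/runs":
              run_id = str(len(runs) + 1)
              runs[run_id] = {"inputs": body, "result": sum(body.values())}  # toy "model"
              return 201, {"id": run_id}
          if method == "GET" and path.startswith("/runs/"):
              return 200, runs[path.split("/")[-1]]
          return 404, {"error": "no such resource"}

      status, created = handle("POST", "/runs", {"precip_mm": 12.4, "area_km2": 3.2})
      print(status, created)
      print(handle("GET", "/runs/" + created["id"]))
      print(json.dumps(runs, indent=2))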

  12. Adapting Modeling & Simulation for Network Enabled Operations

    Science.gov (United States)

    2011-03-01

    Only fragments of the report text were indexed for this record: a citation of Situational Awareness in Aerospace Operations (AGARD-CP-478, pp. 5/1-5/8, Neuilly-sur-Seine, France: NATO-AGARD), the chapter titles "Shaping UK Defence Policy" and "Increasing the Maturity of Command to Deal with Complex, Information Age Environments", and a note that the fitted model still explains 73% of the variability, down from 82%, for the first four serials.

  13. Modeling-Enabled Systems Nutritional Immunology

    Science.gov (United States)

    Verma, Meghna; Hontecillas, Raquel; Abedi, Vida; Leber, Andrew; Tubau-Juni, Nuria; Philipson, Casandra; Carbo, Adria; Bassaganya-Riera, Josep

    2016-01-01

    This review highlights the fundamental role of nutrition in the maintenance of health, the immune response, and disease prevention. Emerging global mechanistic insights in the field of nutritional immunology cannot be gained through reductionist methods alone or by analyzing a single nutrient at a time. We propose to investigate nutritional immunology as a massively interacting system of interconnected multistage and multiscale networks that encompass hidden mechanisms by which nutrition, microbiome, metabolism, genetic predisposition, and the immune system interact to delineate health and disease. The review sets an unconventional path to apply complex science methodologies to nutritional immunology research, discovery, and development through “use cases” centered around the impact of nutrition on the gut microbiome and immune responses. Our systems nutritional immunology analyses, which include modeling and informatics methodologies in combination with pre-clinical and clinical studies, have the potential to discover emerging systems-wide properties at the interface of the immune system, nutrition, microbiome, and metabolism. PMID:26909350

  14. Modeling-Enabled Systems Nutritional Immunology

    Directory of Open Access Journals (Sweden)

    Meghna Verma

    2016-02-01

    Full Text Available This review highlights the fundamental role of nutrition in the maintenance of health, the immune response and disease prevention. Emerging global mechanistic insights in the field of nutritional immunology cannot be gained through reductionist methods alone or by analyzing a single nutrient at a time. We propose to investigate nutritional immunology as a massively interacting system of interconnected multistage and multiscale networks that encompass hidden mechanisms by which nutrition, microbiome, metabolism, genetic predisposition and the immune system interact to delineate health and disease. The review sets an unconventional path to applying complex science methodologies to nutritional immunology research, discovery and development through ‘use cases’ centered around the impact of nutrition on the gut microbiome and immune responses. Our systems nutritional immunology analyses, which include modeling and informatics methodologies in combination with pre-clinical and clinical studies, have the potential to discover emerging systems-wide properties at the interface of the immune system, nutrition, microbiome, and metabolism.

  15. Green communication: The enabler to multiple business models

    DEFF Research Database (Denmark)

    Lindgren, Peter; Clemmensen, Suberia; Taran, Yariv

    2010-01-01

    Companies stand at the forefront of a new business model reality with new potentials that will change their basic understanding and practice of running their business models radically. One of the drivers of this change is green communication, its strong relation to green business models, and its possibility to enable lower energy consumption. This paper shows how green communication enables innovation of green business models and of multiple business models running simultaneously in different markets to different customers.

  16. Sample-Data Modeling of a Zero Voltage Transition DC-DC Converter for On-Board Battery Charger in EV

    Directory of Open Access Journals (Sweden)

    Teresa R. Granados-Luna

    2014-01-01

    Full Text Available A battery charger is a key device in electric and hybrid electric vehicles. On-board and off-board topologies are available in the market. Light weight, small size, high performance, and simple control are desired characteristics of on-board chargers. Moreover, isolated single-phase topologies are the most common systems among Level 1 battery charger topologies. Following this trend, this paper proposes a sampled-data modelling strategy for a zero voltage transition (ZVT) DC-DC converter for an on-board battery charger. A piece-wise linear analysis of the converter is the basis of the presented technique, from which a large-signal model and, in turn, a small-signal model of the converter are derived. Numerical and simulation results from a 250 W test rig validate the model.
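
    The piecewise-linear, sampled-data idea can be sketched by exactly discretizing each linear switching stage and composing the stages into a one-cycle map. The state matrices, stage durations (in normalized time units) and input below are placeholders, not the ZVT converter's actual stages.

      import numpy as np
      from scipy.linalg import expm

      # Placeholder state matrices and normalized stage durations for a two-stage
      # switching cycle; the real ZVT converter has more stages and other values.
      A1 = np.array([[0.0, -1.0], [1.0, -0.1]]); B1 = np.array([[1.0], [0.0]])
      A2 = np.array([[0.0, -1.0], [1.0, -0.5]]); B2 = np.array([[0.0], [0.0]])
      t1, t2, vin = 0.2, 0.8, 48.0

      def stage(A, B, t):
          # Exact discretization of one linear stage: x+ = Phi @ x + Gamma * u
          n = A.shape[0]
          M = expm(np.block([[A, B], [np.zeros((1, n + 1))]]) * t)
          return M[:n, :n], M[:n, n:]

      P1, G1 = stage(A1, B1, t1)
      P2, G2 = stage(A2, B2, t2)
      Phi, Gamma = P2 @ P1, P2 @ G1 + G2     # composed one-cycle (large-signal) map
      x = np.zeros((2, 1))
      for _ in range(200):                   # iterate the sampled-data model
          x = Phi @ x + Gamma * vin
      print(x.ravel())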

  17. Statistical analysis of road-vehicle-driver interaction as an enabler to designing behavioural models

    International Nuclear Information System (INIS)

    Chakravarty, T; Chowdhury, A; Ghose, A; Bhaumik, C; Balamuralidhar, P

    2014-01-01

    Telematics is an important technology enabler for intelligent transportation systems. By deploying on-board diagnostic devices, the signatures of vehicle vibration, along with location and time, are recorded. Detailed analyses of the collected signatures offer deep insights into the state of the objects under study. Towards that objective, we carried out experiments by deploying a telematics device in one of the office buses that ferry employees to the office and back. Data were collected from a 3-axis accelerometer and GPS, together with speed and time, for all journeys. In this paper, we present initial results of the above exercise by applying statistical methods to derive information through systematic analysis of the data collected over four months. It is demonstrated that the higher-order derivative of the measured Z-axis acceleration samples displays the properties of a Weibull distribution when the time axis is replaced by the amplitude of the processed acceleration data. Such an observation offers a method to predict future behaviour, where deviations from the prediction are classified as context-based aberrations or progressive degradation of the system. In addition, we capture the relationship between the speed of the vehicle and the median of the jerk energy samples using regression analysis. Such results offer an opportunity to develop a robust method to model road-vehicle interaction, thereby enabling prediction of driving behaviour, condition-based maintenance, etc.
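
    A small sketch of the two statistical steps mentioned, using synthetic stand-in data rather than the collected telematics records: a Weibull fit to processed acceleration samples and a linear regression of median jerk energy against speed.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      # Synthetic stand-ins for the processed Z-axis acceleration samples and the
      # per-trip (speed, median jerk energy) pairs collected by the telematics unit.
      jerk = rng.weibull(1.8, size=5000) * 0.6
      shape, loc, scale = stats.weibull_min.fit(jerk, floc=0.0)
      print(f"Weibull fit: shape={shape:.2f}, scale={scale:.2f}")

      speed = rng.uniform(10.0, 80.0, size=200)                   # km/h
      median_jerk_energy = 0.05 * speed + rng.normal(0.0, 0.3, size=200)
      slope, intercept, r, *_ = stats.linregress(speed, median_jerk_energy)
      print(f"median jerk energy ~ {slope:.3f} * speed + {intercept:.2f} (r = {r:.2f})")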

  18. BIM-enabled Conceptual Modelling and Representation of Building Circulation

    OpenAIRE

    Lee, Jin Kook; Kim, Mi Jeong

    2014-01-01

    This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelling methodology.

  19. Sparsity enabled cluster reduced-order models for control

    Science.gov (United States)

    Kaiser, Eurika; Morzyński, Marek; Daviller, Guillaume; Kutz, J. Nathan; Brunton, Bingni W.; Brunton, Steven L.

    2018-01-01

    Characterizing and controlling nonlinear, multi-scale phenomena are central goals in science and engineering. Cluster-based reduced-order modeling (CROM) was introduced to exploit the underlying low-dimensional dynamics of complex systems. CROM builds a data-driven discretization of the Perron-Frobenius operator, resulting in a probabilistic model for ensembles of trajectories. A key advantage of CROM is that it embeds nonlinear dynamics in a linear framework, which enables the application of standard linear techniques to the nonlinear system. CROM is typically computed on high-dimensional data; however, access to and computations on this full-state data limit the online implementation of CROM for prediction and control. Here, we address this key challenge by identifying a small subset of critical measurements to learn an efficient CROM, referred to as sparsity-enabled CROM. In particular, we leverage compressive measurements to faithfully embed the cluster geometry and preserve the probabilistic dynamics. Further, we show how to identify fewer optimized sensor locations tailored to a specific problem that outperform random measurements. Both of these sparsity-enabled sensing strategies significantly reduce the burden of data acquisition and processing for low-latency in-time estimation and control. We illustrate this unsupervised learning approach on three different high-dimensional nonlinear dynamical systems from fluids with increasing complexity, with one application in flow control. Sparsity-enabled CROM is a critical facilitator for real-time implementation on high-dimensional systems where full-state information may be inaccessible.
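
    The basic CROM construction (before the sparsity-enabled sensing step) can be sketched as clustering snapshots and counting cluster-to-cluster transitions to form a row-stochastic matrix, i.e. a coarse discretization of the Perron-Frobenius operator. The toy trajectory below stands in for flow data.

      import numpy as np
      from sklearn.cluster import KMeans

      # Toy trajectory standing in for flow snapshots; cluster it and estimate
      # cluster-to-cluster transition probabilities (coarse Perron-Frobenius model).
      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 40.0 * np.pi, 4000)
      snapshots = np.c_[np.cos(t), np.sin(2.0 * t)] + 0.05 * rng.normal(size=(4000, 2))

      k = 6
      labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(snapshots)

      P = np.zeros((k, k))
      for a, b in zip(labels[:-1], labels[1:]):
          P[a, b] += 1.0                       # count observed transitions
      P /= P.sum(axis=1, keepdims=True)        # normalize rows -> probabilities
      print(np.round(P, 2))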

  20. Formal Modeling and Verification of Opportunity-enabled Risk Management

    OpenAIRE

    Aldini, Alessandro; Seigneur, Jean-Marc; Ballester Lafuente, Carlos; Titi, Xavier; Guislain, Jonathan

    2015-01-01

    With the advent of the Bring-Your-Own-Device (BYOD) trend, mobile work is achieving a widespread diffusion that challenges the traditional view of security standard and risk management. A recently proposed model, called opportunity-enabled risk management (OPPRIM), aims at balancing the analysis of the major threats that arise in the BYOD setting with the analysis of the potential increased opportunities emerging in such an environment, by combining mechanisms of risk estimation with trust an...

  1. BIM-Enabled Conceptual Modelling and Representation of Building Circulation

    Directory of Open Access Journals (Sweden)

    Jin Kook Lee

    2014-08-01

    Full Text Available This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelling methodology. Advances in BIM authoring tools, using space objects and their relations defined in the IFC schema, have made it possible to model, visualize and analyse circulation within buildings prior to their construction. Agent-based circulation has long been an interdisciplinary topic of research across several areas, including design computing, computer science, architectural morphology, human behaviour and environmental psychology. Such conventional approaches to building circulation are centred on navigational knowledge about built environments, and represent specific circulation paths and regulations. This paper, however, places emphasis on the use of ‘space objects’ in BIM-enabled design processes rather than on circulation agents, the latter of which are not defined in the IFCs' schemas. By introducing and reviewing some associated research and projects, this paper also surveys how such a circulation representation is applicable to the analysis of building circulation-related rules.

  2. MATHEMATICAL MODELS OF PROCESSES AND SYSTEMS OF TECHNICAL OPERATION FOR ONBOARD COMPLEXES AND FUNCTIONAL SYSTEMS OF AVIONICS

    Directory of Open Access Journals (Sweden)

    Sergey Viktorovich Kuznetsov

    2017-01-01

    Full Text Available Modern aircraft are equipped with complicated systems and complexes of avionics. The technical operation process of an aircraft and its avionics is observed as a process with changing operation states. Mathematical models of avionics processes and systems of technical operation are represented as Markov chains, Markov and semi-Markov processes. The purpose is to develop graph-models of avionics technical operation processes, describing their work in flight as well as during maintenance on the ground in the various systems of technical operation. Graph-models of processes and systems of on-board complexes and functional avionics systems in flight are proposed. They are based on state tables. The models are specified for the various technical operation systems: the system with control of the reliability level, the system with parameters control and the system with resource control. The events which cause the avionics complexes and functional systems to change their technical state are failures and faults of built-in test equipment. The avionics system of technical operation with reliability level control is applicable to objects with a constant or slowly varying failure rate. The avionics system of technical operation with resource control is mainly used for objects with a failure rate that increases over time. The avionics system of technical operation with parameters control is used for objects with a failure rate that increases over time and with generalized parameters which can provide forecasting and assign the borders of before-failure technical states. The proposed formal graphical approach to designing models of avionics complexes and systems is the basis for constructing models of complex systems and facilities, both for a single aircraft and for an airline aircraft fleet, or even for the entire fleet of some specific aircraft type. The ultimate graph-models for avionics in various systems of technical operation permit the beginning of

  3. DEFINE: A Service-Oriented Dynamically Enabling Function Model

    Directory of Open Access Journals (Sweden)

    Tan Wei-Yi

    2017-01-01

    In this paper, we introduce an innovative Dynamically Enable Function In Network Equipment (DEFINE) model to allow tenants to obtain network services quickly. First, DEFINE decouples an application into different functional components and connects these functional components in a reconfigurable manner. Second, DEFINE provides a programmable interface to third parties, who can develop their own processing modules according to their own needs. To verify the effectiveness of this model, we set up an evaluation network with an FPGA-based OpenFlow switch prototype and deployed several applications on it. Our results show that DEFINE has excellent flexibility and performance.

  4. Perspectives on Modelling BIM-enabled Estimating Practices

    Directory of Open Access Journals (Sweden)

    Willy Sher

    2014-12-01

    Full Text Available BIM-enabled estimating processes do not replace or provide a substitute for the traditional approaches used in the architecture, engineering and construction industries. This paper explores the impact of BIM on these traditional processes.  It identifies differences between the approaches used with BIM and other conventional methods, and between the various construction professionals that prepare estimates. We interviewed 17 construction professionals from client organizations, contracting organizations, consulting practices and specialist-project firms. Our analyses highlight several logical relationships between estimating processes and BIM attributes. Estimators need to respond to the challenges BIM poses to traditional estimating practices. BIM-enabled estimating circumvents long-established conventions and traditional approaches, and focuses on data management.  Consideration needs to be given to the model data required for estimating, to the means by which these data may be harnessed when exported, to the means by which the integrity of model data are protected, to the creation and management of tools that work effectively and efficiently in multi-disciplinary settings, and to approaches that narrow the gap between virtual reality and actual reality.  Areas for future research are also identified in the paper.

  5. ARCHITECTURES AND ALGORITHMS FOR COGNITIVE NETWORKS ENABLED BY QUALITATIVE MODELS

    DEFF Research Database (Denmark)

    Balamuralidhar, P.

    2013-01-01

    traditional limitations and potentially achieving better performance. The vision is that networks should be able to monitor themselves, reason upon changes in self and environment, act towards the achievement of specific goals, and learn from experience. The concept of a Cognitive Engine (CE) supporting cognitive functions, as part of network elements, enabling the above-mentioned autonomic capabilities, is gathering attention. Awareness of the self and the world is an important aspect of the cognitive engine being autonomic. This is achieved through embedding their models in the engine, but the complexity ... of the cognitive engine that incorporates a context-space-based information structure into its knowledge model. I propose a set of guiding principles for a cognitive system to be autonomic and use them, with additional requirements, to build a detailed architecture for the cognitive engine. I define a context space...

  6. A Modelling Framework for estimating Road Segment Based On-Board Vehicle Emissions

    International Nuclear Information System (INIS)

    Lin-Jun, Yu; Ya-Lan, Liu; Yu-Huan, Ren; Zhong-Ren, Peng; Meng, Liu Meng

    2014-01-01

    Traditional traffic emission inventory models aim to provide overall emissions at regional level which cannot meet planners' demand for detailed and accurate traffic emissions information at the road segment level. Therefore, a road segment-based emission model for estimating light duty vehicle emissions is proposed, where floating car technology is used to collect information of traffic condition of roads. The employed analysis framework consists of three major modules: the Average Speed and the Average Acceleration Module (ASAAM), the Traffic Flow Estimation Module (TFEM) and the Traffic Emission Module (TEM). The ASAAM is used to obtain the average speed and the average acceleration of the fleet on each road segment using FCD. The TFEM is designed to estimate the traffic flow of each road segment in a given period, based on the speed-flow relationship and traffic flow spatial distribution. Finally, the TEM estimates emissions from each road segment, based on the results of previous two modules. Hourly on-road light-duty vehicle emissions for each road segment in Shenzhen's traffic network are obtained using this analysis framework. The temporal-spatial distribution patterns of the pollutant emissions of road segments are also summarized. The results show high emission road segments cluster in several important regions in Shenzhen. Also, road segments emit more emissions during rush hours than other periods. The presented case study demonstrates that the proposed approach is feasible and easy-to-use to help planners make informed decisions by providing detailed road segment-based emission information
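
    The per-segment pipeline can be illustrated with a toy version of the three modules: average speed from floating-car records (ASAAM), an assumed hourly flow (standing in for TFEM output), and a placeholder speed-dependent emission factor (TEM). None of the numbers are the paper's calibrated values.

      import numpy as np

      # Toy per-segment pipeline; speeds, flows, lengths and the emission factor
      # are illustrative placeholders, not the paper's calibrated values.
      fcd = {                                   # segment -> floating-car speeds (km/h)
          "seg_01": [22.0, 18.5, 25.1, 20.3],
          "seg_02": [55.0, 61.2, 58.7],
      }
      flow = {"seg_01": 900, "seg_02": 400}     # vehicles/hour (assumed TFEM output)
      length_km = {"seg_01": 0.8, "seg_02": 1.5}

      def ef_co_g_per_km(v):
          # placeholder U-shaped emission factor: high when crawling, minimum near 60 km/h
          return 2.0 + 800.0 / max(v, 5.0) ** 1.5 + 0.0004 * v ** 2

      for seg, speeds in fcd.items():
          v_avg = float(np.mean(speeds))                                          # ASAAM step
          kg_per_h = ef_co_g_per_km(v_avg) * flow[seg] * length_km[seg] / 1000.0  # TEM step
          print(f"{seg}: mean speed {v_avg:.1f} km/h, CO ~ {kg_per_h:.1f} kg/h")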

  7. Performance modeling of unmanned aerial vehicles with on-board energy harvesting

    Science.gov (United States)

    Anton, Steven R.; Inman, Daniel J.

    2011-03-01

    The concept of energy harvesting in unmanned aerial vehicles (UAVs) has received much attention in recent years. Solar powered flight of small aircraft dates back to the 1970s when the first fully solar flight of an unmanned aircraft took place. Currently, research has begun to investigate harvesting ambient vibration energy during the flight of UAVs. The authors have recently developed multifunctional piezoelectric self-charging structures in which piezoelectric devices are combined with thin-film lithium batteries and a substrate layer in order to simultaneously harvest energy, store energy, and carry structural load. When integrated into mass and volume critical applications, such as unmanned aircraft, multifunctional devices can provide great benefit over conventional harvesting systems. A critical aspect of integrating any energy harvesting system into a UAV, however, is the potential effect that the additional system has on the performance of the aircraft. Added mass and increased drag can significantly degrade the flight performance of an aircraft; therefore, it is important to ensure that the addition of an energy harvesting system does not adversely affect the efficiency of a host aircraft. In this work, a system level approach is taken to examine the effects of adding both solar and piezoelectric vibration harvesting to a UAV test platform. A formulation recently presented in the literature is applied to describe the changes to the flight endurance of a UAV based on the power available from added harvesters and the mass of the harvesters. Details of the derivation of the flight endurance model are reviewed and the formulation is applied to an EasyGlider remote control foam hobbyist airplane, which is selected as the test platform for this study. A theoretical study is performed in which the normalized change in flight endurance is calculated based on the addition of flexible thin-film solar panels to the upper surface of the wings, as well as the addition
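
    A hedged sketch of a normalized-endurance estimate of this kind is shown below; it assumes level-flight power scales with weight to the 3/2 power and that harvested power directly offsets the battery draw, which is a simplification rather than a reproduction of the published formulation, and all numbers are assumed.

      def endurance_ratio(p_harvest_w, m_added_kg, m0_kg, p_req0_w):
          # Assumed scaling: level-flight power ~ weight^(3/2); harvested power
          # directly offsets the battery draw. Simplified, not the published form.
          p_req = p_req0_w * ((m0_kg + m_added_kg) / m0_kg) ** 1.5
          return p_req0_w / max(p_req - p_harvest_w, 1e-9)

      # EasyGlider-like numbers (all assumed): 1.1 kg aircraft, 60 W cruise draw,
      # 8 W of thin-film solar panels adding 90 g of mass.
      print(f"normalized endurance change: {endurance_ratio(8.0, 0.09, 1.1, 60.0):.2f}x")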

  8. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    Science.gov (United States)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

    The automated and cost-effective detection of buildings at ultra-high spatial resolution is of major importance for various engineering and smart-city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery and low-cost imaging sensors. In particular, the developed approach computes a DSM and a true orthomosaic, through advanced structure from motion, bundle adjustment and dense image matching, from the numerous GoPro images, which are characterised by important geometric distortions and a fish-eye effect. An unsupervised multi-region graph-cut segmentation and a rule-based classification are responsible for delivering the initial multi-class classification map. The DTM is then calculated based on an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction, and the scene buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating detection rates at object level of 88% regarding correctness and above 75% regarding detection completeness.

  9. Onboard Autonomous Corrections for Accurate IRF Pointing.

    Science.gov (United States)

    Jorgensen, J. L.; Betto, M.; Denver, T.

    2002-05-01

    Over the past decade, the Noise Equivalent Angle (NEA) of onboard attitude reference instruments has decreased from tens of arcseconds to the sub-arcsecond level. This improved performance is partly due to improved sensor technology with enhanced signal-to-noise ratios, and partly due to improved processing electronics which allow for more sophisticated and faster signal processing. However, the main reason for the increased precision is the application of onboard autonomy, which apart from simple outlier rejection also allows for removal of "false positive" answers and other "unexpected" noise sources that otherwise would degrade the quality of the measurements (e.g. discrimination between signals caused by starlight and ionizing radiation). The utilization of autonomous signal processing has also provided the means for another onboard processing step, namely the autonomous recovery from lost in space, where the attitude instrument, without a priori knowledge, derives the absolute attitude, i.e. in IRF coordinates, within fractions of a second. Combined with precise orbital state or position data, the absolute attitude information opens up multiple ways to improve mission performance, either by reducing operations costs, by increasing pointing accuracy, by reducing mission expendables, or by providing backup decision information in case of anomalies. The Advanced Stellar Compass (ASC) is a miniature, high-accuracy attitude instrument which features fully autonomous operations. The autonomy encompasses all direct steps from automatic health checkout at power-on, over fully automatic SEU and SEL handling and proton-induced sparkle removal, to recovery from "lost in space", and optical disturbance detection and handling. But apart from these more obvious autonomy functions, the ASC also features functions to handle and remove the aforementioned residuals. These functions encompass diverse operators such as a full orbital state vector model with automatic cloud

  10. SpF: Enabling Petascale Performance for Pseudospectral Dynamo Models

    Science.gov (United States)

    Jiang, W.; Clune, T.; Vriesema, J.; Gutmann, G.

    2013-12-01

    Pseudospectral (PS) methods possess a number of characteristics (e.g., efficiency, accuracy, natural boundary conditions) that are extremely desirable for dynamo models. Unfortunately, dynamo models based upon PS methods face a number of daunting challenges, which include exposing additional parallelism, leveraging hardware accelerators, exploiting hybrid parallelism, and improving the scalability of global memory transposes. Although these issues are a concern for most models, solutions for PS methods tend to require far more pervasive changes to underlying data and control structures. Further, improvements in performance in one model are difficult to transfer to other models, resulting in significant duplication of effort across the research community. We have developed an extensible software framework for pseudospectral methods called SpF that is intended to enable extreme scalability and optimal performance. High-level abstractions provided by SpF unburden applications of the responsibility of managing domain decomposition and load balance while reducing the changes in code required to adapt to new computing architectures. The key design concept in SpF is that each phase of the numerical calculation is partitioned into disjoint numerical 'kernels' that can be performed entirely in-processor. The granularity of domain-decomposition provided by SpF is only constrained by the data-locality requirements of these kernels. SpF builds on top of optimized vendor libraries for common numerical operations such as transforms, matrix solvers, etc., but can also be configured to use open source alternatives for portability. SpF includes several alternative schemes for global data redistribution and is expected to serve as an ideal testbed for further research into optimal approaches for different network architectures. In this presentation, we will describe the basic architecture of SpF as well as preliminary performance data and experience with adapting legacy dynamo codes

  11. An IT-enabled supply chain model: a simulation study

    Science.gov (United States)

    Cannella, Salvatore; Framinan, Jose M.; Barbosa-Póvoa, Ana

    2014-11-01

    During the last decades, supply chain collaboration practices and the underlying enabling technologies have evolved from the classical electronic data interchange (EDI) approach to web-based and radio frequency identification (RFID)-enabled collaboration. In this field, most of the literature has focused on the study of optimal parameters for reducing the total cost of suppliers, by adopting operational research (OR) techniques. Herein we are interested in showing that the considered information technology (IT)-enabled structure is resilient, that is, it works well across a reasonably broad range of parameter settings. By adopting a methodological approach based on system dynamics, we study a multi-tier collaborative supply chain. Results show that the IT-enabled supply chain improves operational performance and customer service levels. Nonetheless, the benefits for geographically dispersed networks are of smaller magnitude.

  12. Creating Data and Modeling Enabled Hydrology Instruction Using Collaborative Approach

    Science.gov (United States)

    Merwade, V.; Rajib, A.; Ruddell, B. L.; Fox, S.

    2017-12-01

    Hydrology instruction typically involves teaching the hydrologic cycle and the processes associated with it, such as precipitation, evapotranspiration, infiltration, runoff generation and hydrograph analysis. With the availability of observed and remotely sensed data related to many hydrologic fluxes, there is an opportunity to use these data for place-based learning in hydrology classrooms. However, it is not always easy or possible for an instructor to complement an existing hydrology course with new material that requires both time and technical expertise, which the instructor may not have. The work presented here describes an effort in which students create the data- and modeling-driven instruction material as a part of their class assignment for a hydrology course at Purdue University. The data driven hydrology education project within the Science Education Resources Center (SERC) is used as a platform to publish and share the instruction material so it can be used by future students in the same course or any other course anywhere in the world. Students in the class were divided into groups, and each group was assigned a topic such as precipitation, evapotranspiration, streamflow, flow duration curves or frequency analysis. Each student in the group was then asked to get data and do some analysis for an area with specific land-use characteristics, such as urban, rural or agricultural. The student contributions were then organized into learning units such that someone can do a flow duration curve analysis or flood frequency analysis to see how it changes for a rural area versus an urban area. The hydrology education project within the SERC cyberinfrastructure enables any other instructor to adopt this material, as is or through modification, to suit his/her place-based instruction needs.
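
    A small example of the kind of student exercise described, here a flow duration curve comparison between two synthetic daily-flow records standing in for an urban and a rural gauge; the data are generated, not observed.

      import numpy as np

      # Two synthetic daily-flow records standing in for a rural and an urban gauge.
      rng = np.random.default_rng(7)
      q_rural = rng.lognormal(mean=1.0, sigma=0.6, size=3650)
      q_urban = rng.lognormal(mean=1.0, sigma=1.1, size=3650)   # flashier response

      def flow_duration(q):
          q_sorted = np.sort(q)[::-1]                            # descending flows
          exceedance = np.arange(1, q.size + 1) / (q.size + 1) * 100.0
          return exceedance, q_sorted

      for name, q in [("rural", q_rural), ("urban", q_urban)]:
          p, qs = flow_duration(q)
          q10, q90 = np.interp(10.0, p, qs), np.interp(90.0, p, qs)
          print(f"{name}: Q10={q10:.1f}, Q90={q90:.1f}, Q10/Q90={q10 / q90:.1f}")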

  13. Rapid Diagnostics of Onboard Sequences

    Science.gov (United States)

    Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.

    2012-01-01

    Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence an EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides an auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the RESTful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF. The SCMF is then indexed by a hash value that is automatically included in all command

  14. Enabling full field physics based OPC via dynamic model generation

    Science.gov (United States)

    Lam, Michael; Clifford, Chris; Raghunathan, Ananthan; Fenger, Germain; Adam, Kostas

    2017-03-01

    As EUV lithography marches closer to reality for high volume production, its peculiar modeling challenges related to both inter- and intra- field effects has necessitated building OPC infrastructure that operates with field position dependency. Previous state of the art approaches to modeling field dependency used piecewise constant models where static input models are assigned to specific x/y-positions within the field. OPC and simulation could assign the proper static model based on simulation-level placement. However, in the realm of 7nm and 5nm feature sizes, small discontinuities in OPC from piecewise constant model changes can cause unacceptable levels of EPE errors. The introduction of Dynamic Model Generation (DMG) can be shown to effectively avoid these dislocations by providing unique mask and optical models per simulation region, allowing a near continuum of models through field. DMG allows unique models for EMF, apodization, aberrations, etc to vary through the entire field and provides a capability to precisely and accurately model systematic field signatures.

  15. GeoPro: Technology to Enable Scientific Modeling

    International Nuclear Information System (INIS)

    C. Juan

    2004-01-01

    Development of the ground-water flow model for the Death Valley Regional Groundwater Flow System (DVRFS) required integration of numerous supporting hydrogeologic investigations. The results from recharge, discharge, hydraulic properties, water level, pumping, model boundaries, and geologic studies were integrated to develop the required conceptual and 3-D framework models, and the flow model itself. To support the complex modeling process and the needs of the multidisciplinary DVRFS team, a hardware and software system called GeoPro (Geoscience Knowledge Integration Protocol) was developed. A primary function of GeoPro is to manage the large volume of disparate data compiled for the 100,000-square-kilometer area of southern Nevada and California. The data are primarily from previous investigations and regional flow models developed for the Nevada Test Site and Yucca Mountain projects. GeoPro utilizes relational database technology (Microsoft SQL Server(trademark)) to store and manage these tabular point data, groundwater flow model ASCII data, 3-D hydrogeologic framework data, 2-D and 2.5-D GIS data, and text documents. Data management consists of versioning, tracking, and reporting data changes as multiple users access the centralized database. GeoPro also supports the modeling process by automating the routine data transformations required to integrate project software. This automation is also crucial to streamlining pre- and post-processing of model data during model calibration. Another function of GeoPro is to facilitate the dissemination and use of the model data and results through web-based documents by linking and allowing access to the underlying database and analysis tools. The intent is to convey to end-users the complex flow model product in a manner that is simple, flexible, and relevant to their needs. GeoPro is evolving from a prototype system to a production-level product. Currently the DVRFS pre- and post-processing modeling tools are being re

  16. Enabling Accessibility Through Model-Based User Interface Development.

    Science.gov (United States)

    Ziegler, Daniel; Peissner, Matthias

    2017-01-01

    Adaptive user interfaces (AUIs) can increase the accessibility of interactive systems. They provide personalized display and interaction modes to fit individual user needs. Most AUI approaches rely on model-based development, which is considered relatively demanding. This paper explores strategies to make model-based development more attractive for mainstream developers.

  17. On-board power supply with fuel cells. Liquid gas fuelled system enables stand alone off-grid power supply; Bordstromversorgung mit Brennstoffzellen. Fluessiggas-betriebenes System ermoeglicht autarke, netzunabhaengige Stromversorgung

    Energy Technology Data Exchange (ETDEWEB)

    Hirn, Gerhard

    2011-07-01

    You reach your final destination for the day, switch off the engine of your motorhome, and sit back to enjoy the view. Cicadas chirping and the music of nature are the only sounds you can hear. And then, far away from the nearest mains outlet, you get your laptop out to check your emails and plan your route for the next day. Quietly, and with low emissions, the electrical power you need is produced by your own on-board fuel cell generator. You know that with the fuel cell, your vehicle battery will always be fully charged. Now that the funded research and testing work has been done, fuel cell hybrid systems are ready for the market. (orig.)

  18. Enabling Cross-Discipline Collaboration Via a Functional Data Model

    Science.gov (United States)

    Lindholm, D. M.; Wilson, A.; Baltzer, T.

    2016-12-01

    Many research disciplines have very specialized data models that are used to express the detailed semantics that are meaningful to that community and easily utilized by their data analysis tools. While invaluable to members of that community, such expressive data structures and metadata are of little value to potential collaborators from other scientific disciplines. Many data interoperability efforts focus on the difficult task of computationally mapping concepts from one domain to another to facilitate discovery and use of data. Although these efforts are important and promising, we have found that a great deal of discovery and dataset understanding still happens at the level of less formal, personal communication. However, a significant barrier to inter-disciplinary data sharing that remains is one of data access. Scientists and data analysts continue to spend inordinate amounts of time simply trying to get data into their analysis tools. Providing data in a standard file format is often not sufficient since data can be structured in many ways. Adhering to more explicit community standards for data structure and metadata does little to help those in other communities. The Functional Data Model specializes the Relational Data Model (used by many database systems) by defining relations as functions between independent (domain) and dependent (codomain) variables. Given that arrays of data in many scientific data formats generally represent functionally related parameters (e.g. temperature as a function of space and time), the Functional Data Model is quite relevant for these datasets as well. The LaTiS software framework implements the Functional Data Model and provides a mechanism to expose an existing data source as a LaTiS dataset. LaTiS datasets can be manipulated using a Functional Algebra and output in any number of formats. LASP has successfully used the Functional Data Model and its implementation in the LaTiS software framework to bridge the gap between
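
    The core idea, a dataset as a function from independent (domain) variables to dependent (codomain) variables with a small algebra of operations on it, can be sketched in a few lines. This toy class is only an illustration of the concept, not the LaTiS API.

      from dataclasses import dataclass
      from typing import Dict, Tuple

      @dataclass
      class FunctionalDataset:
          # A dataset as a function: (time, altitude) -> temperature
          samples: Dict[Tuple[float, float], float]

          def __call__(self, time: float, altitude: float) -> float:
              return self.samples[(time, altitude)]

          def select(self, predicate):
              # One "functional algebra" operation: restrict the domain.
              return FunctionalDataset({k: v for k, v in self.samples.items()
                                        if predicate(*k)})

      temps = FunctionalDataset({(0.0, 100.0): 288.1, (0.0, 200.0): 287.4,
                                 (1.0, 100.0): 288.3, (1.0, 200.0): 287.6})
      print(temps(1.0, 200.0))
      print(temps.select(lambda t, z: z == 100.0).samples)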

  19. Domain-specific modeling enabling full code generation

    CERN Document Server

    Kelly, Steven

    2007-01-01

    Domain-Specific Modeling (DSM) is the latest approach to software development, promising to greatly increase the speed and ease of software creation. Early adopters of DSM have been enjoying productivity increases of 500–1000% in production for over a decade. This book introduces DSM and offers examples from various fields to illustrate to experienced developers how DSM can improve software development in their teams. Two authorities in the field explain what DSM is, why it works, and how to successfully create and use a DSM solution to improve productivity and quality. Divided into four parts, the book covers: background and motivation; fundamentals; in-depth examples; and creating DSM solutions. There is an emphasis throughout the book on practical guidelines for implementing DSM, including how to identify the necessary language constructs, how to generate full code from models, and how to provide tool support for a new DSM language. The example cases described in the book are available on the book's Website, www.dsmbook....

  20. Scidac-Data: Enabling Data Driven Modeling of Exascale Computing

    Science.gov (United States)

    Mubarak, Misbah; Ding, Pengfei; Aliaga, Leo; Tsaris, Aristeidis; Norman, Andrew; Lyon, Adam; Ross, Robert

    2017-10-01

    The SciDAC-Data project is a DOE-funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab data center on the organization, movement, and consumption of high energy physics (HEP) data. The project analyzes the analysis patterns and data organization that have been used by NOvA, MicroBooNE, MINERvA, CDF, D0, and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization, and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We present the use of a subset of the SciDAC-Data distributions, acquired from analysis of approximately 71,000 HEP workflows run on the Fermilab data center and corresponding to over 9 million individual analysis jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in high performance computing (HPC) and high throughput computing (HTC) environments. In particular we describe how the Sequential Access via Metadata (SAM) data-handling system in combination with the dCache/Enstore-based data archive facilities has been used to develop radically different models for analyzing the HEP data. We also show how the simulations may be used to assess the impact of design choices in archive facilities.

  1. On-board adaptive model for state of charge estimation of lithium-ion batteries based on Kalman filter with proportional integral-based error adjustment

    Science.gov (United States)

    Wei, Jingwen; Dong, Guangzhong; Chen, Zonghai

    2017-10-01

    With the rapid development of battery-powered electric vehicles, the lithium-ion battery plays a critical role in the reliability of the vehicle system. In order to provide timely management and protection for battery systems, it is necessary to develop a reliable battery model and accurate battery parameter estimation to describe battery dynamic behaviors. Therefore, this paper focuses on an on-board adaptive model for state-of-charge (SOC) estimation of lithium-ion batteries. Firstly, a first-order equivalent circuit battery model is employed to describe battery dynamic characteristics. Then, the recursive least squares algorithm and an off-line identification method are used to provide good initial values of the model parameters to ensure filter stability and reduce the convergence time. Thirdly, an extended Kalman filter (EKF) is applied to estimate battery SOC and model parameters on-line. Considering that the EKF is essentially a first-order Taylor approximation of the battery model, which contains inevitable model errors, a proportional integral-based error adjustment technique is employed to improve the performance of the EKF method and correct the model parameters. Finally, the experimental results on lithium-ion batteries indicate that the proposed EKF with proportional integral-based error adjustment method can provide a robust and accurate battery model and on-line parameter estimation.
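
    The following is a minimal Python sketch of the kind of pipeline the abstract describes: a first-order RC equivalent-circuit model combined with an extended Kalman filter for SOC estimation. It is not the authors' implementation; only the states (SOC and RC voltage) are estimated, the proportional integral-based error adjustment step is omitted, and the OCV curve, parameter values, noise covariances and synthetic measurements are illustrative.

      # Sketch: EKF SOC estimation on a first-order RC equivalent-circuit model.
      import numpy as np

      # Illustrative parameters (would come from off-line / RLS identification)
      R0, R1, C1 = 0.05, 0.015, 2400.0     # ohmic resistance, RC pair
      Q_cap = 2.0 * 3600.0                 # capacity [A s]
      dt = 1.0                             # sample time [s]

      def ocv(soc):
          return 3.2 + 0.7 * soc           # illustrative open-circuit-voltage curve

      def d_ocv(soc):
          return 0.7

      def f(x, i):
          """State propagation: x = [SOC, V1], discharge current i > 0."""
          soc, v1 = x
          a = np.exp(-dt / (R1 * C1))
          return np.array([soc - dt * i / Q_cap, a * v1 + R1 * (1 - a) * i])

      def h(x, i):
          """Terminal voltage measurement model."""
          soc, v1 = x
          return ocv(soc) - v1 - R0 * i

      A = np.array([[1.0, 0.0], [0.0, np.exp(-dt / (R1 * C1))]])
      Q = np.diag([1e-7, 1e-5])            # process noise (illustrative)
      R = 1e-3                             # measurement noise (illustrative)
      x = np.array([0.9, 0.0])             # initial SOC guess, RC voltage
      P = np.diag([0.01, 0.01])

      def ekf_step(x, P, i_meas, v_meas):
          x_pred = f(x, i_meas)                        # predict
          P_pred = A @ P @ A.T + Q
          H = np.array([d_ocv(x_pred[0]), -1.0])       # Jacobian of h
          y = v_meas - h(x_pred, i_meas)               # innovation
          S = H @ P_pred @ H + R
          K = P_pred @ H / S
          x_new = x_pred + K * y                       # update
          P_new = (np.eye(2) - np.outer(K, H)) @ P_pred
          return x_new, P_new

      # Example: constant 1 A discharge with placeholder terminal-voltage readings
      for k in range(10):
          v_meas = 3.85 - 0.001 * k
          x, P = ekf_step(x, P, 1.0, v_meas)
      print("estimated SOC:", round(x[0], 4))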

  2. Autonomous onboard optical processor for driving aid

    Science.gov (United States)

    Attia, Mondher; Servel, Alain; Guibert, Laurent

    1995-01-01

    We take advantage of recent technological advances in the field of ferroelectric liquid crystal silicon back plane optoelectronic devices. These are well suited to perform massively parallel processing tasks. That choice enables the design of low cost vision systems and allows the implementation of an on-board system. We focus on transport applications such as road sign recognition. Preliminary in-car experimental results are presented.

  3. Using Onboard Telemetry for MAVEN Orbit Determination

    Science.gov (United States)

    Lam, Try; Trawny, Nikolas; Lee, Clifford

    2013-01-01

    Determination of the spacecraft state has traditionally been done using radiometric tracking data before and after the atmospheric drag pass. This paper describes our approach and results to include onboard telemetry measurements in addition to radiometric observables to refine the reconstructed trajectory estimate for the Mars Atmosphere and Volatile Evolution Mission (MAVEN). Uncertainties in the Mars atmosphere models, combined with non-continuous tracking, degrade navigation accuracy, making MAVEN a key candidate for using onboard telemetry data to help complement its orbit determination process.

  4. Software Infrastructure to Enable Modeling & Simulation as a Service (M&SaaS), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR Phase 2 project will produce a software service infrastructure that enables most modeling and simulation (M&S) activities from code development and...

  5. Dust Measurements Onboard the Deep Space Gateway

    Science.gov (United States)

    Horanyi, M.; Kempf, S.; Malaspina, D.; Poppe, A.; Srama, R.; Sternovsky, Z.; Szalay, J.

    2018-02-01

    A dust instrument onboard the Deep Space Gateway will revolutionize our understanding of the dust environment at 1 AU, help our understanding of the evolution of the solar system, and improve dust hazard models for the safety of crewed and robotic missions.

  6. Thermal Modelling and Design of On-board DC-DC Power Converter using Finite Element Method

    DEFF Research Database (Denmark)

    Staliulionis, Z.; Zhang, Z.; Pittini, R.

    2014-01-01

    Power electronic converters are widely used and play a pivotal role in the electronics area. Temperature causes around 54 % of all power converter failures. Thermal loads are nowadays one of the bottlenecks in power system design, and the cooling efficiency of a system is primarily determined...... by numerical modelling techniques. Therefore, thermal design through thermal modelling and simulation is becoming an integral part of the design process, as it is less expensive compared to the experimental cut-and-try approach. Here the investigation is performed using finite element method-based modelling, and also...

  7. Thermal Modeling and Design of On-board DC-DC Power Converter using Finite Element Method

    DEFF Research Database (Denmark)

    Staliulionis, Zygimantas; Zhang, Zhe; Pittini, Riccardo

    2014-01-01

    Power electronic converters are widely used and play a pivotal role in the electronics area. Temperature causes around 54 % of all power converter failures. Thermal loads are nowadays one of the bottlenecks in power system design, and the cooling efficiency of a system is primarily determined...... by numerical modeling techniques. Therefore, thermal design through thermal modeling and simulation is becoming an integral part of the design process, as it is less expensive compared to the experimental cut-and-try approach. Here the investigation is performed using finite element method-based modeling...
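
    As a minimal illustration of finite element thermal modelling of the kind described in the two records above, the sketch below solves one-dimensional steady-state heat conduction with a uniform heat source using linear elements. The geometry, conductivity, source term and boundary temperatures are invented and bear no relation to the converter studied in the paper.

      # Sketch: 1-D steady-state heat conduction FEM (-k T'' = q) with linear elements.
      import numpy as np

      L = 0.02          # domain length [m]
      k = 200.0         # thermal conductivity [W/(m K)]
      q = 5.0e7         # uniform volumetric heat source [W/m^3]
      n_el = 20
      n_nodes = n_el + 1
      h = L / n_el

      K = np.zeros((n_nodes, n_nodes))   # global conductivity matrix
      F = np.zeros(n_nodes)              # global load vector

      # Assemble element stiffness (k/h * [[1,-1],[-1,1]]) and load (q*h/2 per node)
      for e in range(n_el):
          K[e:e + 2, e:e + 2] += (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
          F[e:e + 2] += q * h / 2.0

      # Dirichlet boundary conditions: both ends held at 40 degC (heat sink)
      T_bc = 40.0
      for node in (0, n_nodes - 1):
          K[node, :] = 0.0
          K[node, node] = 1.0
          F[node] = T_bc

      T = np.linalg.solve(K, F)
      print("peak temperature [degC]:", round(T.max(), 2))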

  8. Onboard Short Term Plan Viewer

    Science.gov (United States)

    Hall, Tim; LeBlanc, Troy; Ulman, Brian; McDonald, Aaron; Gramm, Paul; Chang, Li-Min; Keerthi, Suman; Kivlovitz, Dov; Hadlock, Jason

    2011-01-01

    Onboard Short Term Plan Viewer (OSTPV) is a computer program for electronic display of mission plans and timelines, both aboard the International Space Station (ISS) and in ISS ground control stations located in several countries. OSTPV was specifically designed both (1) for use within the limited ISS computing environment and (2) to be compatible with computers used in ground control stations. OSTPV supplants a prior system in which, aboard the ISS, timelines were printed on paper and incorporated into files that also contained other paper documents. Hence, the introduction of OSTPV has both reduced the consumption of resources and saved time in updating plans and timelines. OSTPV accepts, as input, the mission timeline output of a legacy, print-oriented, UNIX-based program called "Consolidated Planning System" and converts the timeline information for display in an interactive, dynamic, Windows Web-based graphical user interface that is used by both the ISS crew and ground control teams in real time. OSTPV enables the ISS crew to electronically indicate execution of timeline steps, launch electronic procedures, and efficiently report to ground control teams on the statuses of ISS activities, all by use of laptop computers aboard the ISS.

  9. Analytical calculation of electrolyte water content of a Proton Exchange Membrane Fuel Cell for on-board modelling applications

    Science.gov (United States)

    Ferrara, Alessandro; Polverino, Pierpaolo; Pianese, Cesare

    2018-06-01

    This paper proposes an analytical model of the water content of the electrolyte of a Proton Exchange Membrane Fuel Cell. The model is designed by accounting for several simplifying assumptions, which make it suitable for on-board/online water management applications while ensuring good accuracy of the considered phenomena with respect to advanced numerical solutions. The achieved analytical solution, expressing electrolyte water content, is compared with that obtained by means of a complex numerical approach used to solve the same mathematical problem. The results show that the mean error is below 5% for electrode water content values ranging from 2 to 15 (given as boundary conditions), and does not exceed 0.26% for electrode water content above 5. These results prove the capability of the solution to correctly model electrolyte water content at any operating condition, with a view to embedding it into more complex frameworks (e.g., cell or stack models) related to fuel cell simulation, monitoring, control, diagnosis and prognosis.

  10. Cultivating Innovative and Entrepreneurial Talent in the Higher Vocational Automotive Major with the "On-Board Educational Factory" Model

    Science.gov (United States)

    Wu, Zhuang-Wen; Zhu, Liang-Rong

    2017-01-01

    In this paper, we investigate the steps necessary to initiate reform in professional education. First, we analyze the advantages and disadvantages of the unified theory and practice model of education currently adopted in mainland China. Next, we suggest a talent cultivation strategy that prioritizes students and views industrial (factory)…

  11. Futures Business Models for an IoT Enabled Healthcare Sector: A Causal Layered Analysis Perspective

    OpenAIRE

    Julius Francis Gomes; Sara Moqaddemerad

    2016-01-01

    Purpose: To facilitate futures business research by proposing a novel way to combine business models as a conceptual tool with futures research techniques. Design: A futures perspective is adopted to foresight business models of the Internet of Things (IoT) enabled healthcare sector by using business models as a futures business research tool. In doing so, business models are coupled with one of the most prominent foresight methodologies, Causal Layered Analysis (CLA). Qualitative analysis...

  12. Modelling Angular Dependencies in Land Surface Temperatures From the SEVIRI Instrument onboard the Geostationary Meteosat Second Generation Satellites

    DEFF Research Database (Denmark)

    Rasmussen, Mads Olander; Pinheiro, AC; Proud, Simon Richard

    2010-01-01

    on vegetation structure and viewing and illumination geometry. Despite this, these effects are not considered in current operational LST products from neither polar-orbiting nor geostationary satellites. In this paper, we simulate the angular dependence that can be expected when estimating LST with the viewing...... geometry of the geostationary Meteosat Second Generation Spinning Enhanced Visible and Infrared Imager sensor across the African continent and compare it to a normalized view geometry. We use the modified geometric projection model that estimates the scene thermal infrared radiance from a surface covered...

  13. Futures Business Models for an IoT Enabled Healthcare Sector: A Causal Layered Analysis Perspective

    Directory of Open Access Journals (Sweden)

    Julius Francis Gomes

    2016-12-01

    Full Text Available Purpose: To facilitate futures business research by proposing a novel way to combine business models as a conceptual tool with futures research techniques. Design: A futures perspective is adopted to foresight business models of the Internet of Things (IoT) enabled healthcare sector by using business models as a futures business research tool. In doing so, business models are coupled with one of the most prominent foresight methodologies, Causal Layered Analysis (CLA). Qualitative analysis provides deeper understanding of the phenomenon through the layers of CLA: litany, social causes, worldview and myth. Findings: It is difficult to predict the far future for a technology-oriented sector like healthcare. This paper presents three scenarios for the short-, medium- and long-term future. Based on these scenarios we also present a set of business model elements for different future time frames. This paper shows a way to combine business models with CLA, a foresight methodology, in order to apply business models in futures business research. Besides offering early results for futures business research, this study proposes a conceptual space to work with individual business models for managerial stakeholders. Originality / Value: Much research on business models has offered conceptualization of the phenomenon, innovation through business models and transformation of business models. However, the existing literature does not offer much on using the business model as a futures research tool. Enabled by futures thinking, we collected key business model elements and building blocks for the futures market and analyzed them through the CLA framework.

  14. About the Big Graphs Arising when Forming the Diagnostic Models in a Reconfigurable Computing Field of Functional Monitoring and Diagnostics System of the Spacecraft Onboard Control Complex

    Directory of Open Access Journals (Sweden)

    L. V. Savkin

    2015-01-01

    Full Text Available One of the problems in the implementation of multipurpose complete systems based on reconfigurable computing fields (RCF) is the optimum redistribution of logical-arithmetic resources over a growing scope of functional tasks. Irrespective of complexity, all of them are transformed into an oriented graph whose functional and topological structure is appropriately imposed on the RCF, which is based, as a rule, on a field programmable gate array (FPGA). Due to limitations of the hardware configurations and functions realized by means of the switched logical blocks (SLB), the abovementioned problem becomes even more critical when there is a need, within a strictly allocated RCF fragment, to realize an even more complex task than the one solved during the previous computing step. In such cases it is possible to speak about graphs of big dimensions with respect to the allocated RCF fragment. The article considers this problem through the development of diagnostic algorithms to implement diagnostics and control of the onboard control complex of a spacecraft using the RCF. It gives examples of big graphs arising with respect to the allocated RCF fragment when forming the hardware levels of a diagnostic model, which, in this case, is any hardware-based algorithm of diagnostics in the RCF. The article reviews examples of big graphs arising when complicated diagnostic models are formed, due to drastic differences in the formation of hardware levels on closely located RCF fragments. It also pays attention to big graphs emerging when multichannel diagnostic models are formed. Three main ways to solve the problem of big graphs with respect to the allocated RCF fragment are given: splitting the graph into fragments, using pop-up windows that relocate and memorize intermediate values of functions of high hardware levels of diagnostic models, and deep adaptive updating of the diagnostic model. It is shown that the last of the three ways is the most efficient.

  15. Exploring How Usage-Focused Business Models Enable Circular Economy through Digital Technologies

    Directory of Open Access Journals (Sweden)

    Gianmarco Bressanelli

    2018-02-01

    Full Text Available Recent studies advocate that digital technologies are key enabling factors for the introduction of servitized business models. At the same time, these technologies support the implementation of the circular economy (CE paradigm into businesses. Despite this general agreement, the literature still overlooks how digital technologies enable such a CE transition. To fill the gap, this paper develops a conceptual framework, based on the literature and a case study of a company implementing a usage-focused servitized business model in the household appliance industry. This study focuses on the Internet of Things (IoT, Big Data, and analytics, and identifies eight specific functionalities enabled by such technologies (improving product design, attracting target customers, monitoring and tracking product activity, providing technical support, providing preventive and predictive maintenance, optimizing the product usage, upgrading the product, enhancing renovation and end-of-life activities. By investigating how these functionalities affect three CE value drivers (increasing resource efficiency, extending lifespan, and closing the loop, the conceptual framework developed in this paper advances knowledge about the role of digital technologies as an enabler of the CE within usage-focused business models. Finally, this study shows how digital technologies help overcome the drawback of usage-focused business models for the adoption of CE pointed out by previous literature.

  16. IT-enabled dynamic capability on performance: An empirical study of BSC model

    Directory of Open Access Journals (Sweden)

    Adilson Carlos Yoshikuni

    2017-05-01

    Full Text Available Few studies have investigated the influence of "information capital," through IT-enabled dynamic capability, on corporate performance, particularly in economic turbulence. Our study investigates the causal relationship between performance perspectives of the balanced scorecard using partial least squares path modeling. Using data on 845 Brazilian companies, we conduct a quantitative empirical study of firms during an economic crisis and observe the following interesting results. Operational and analytical IT-enabled dynamic capability had positive effects on business process improvement and corporate performance. Results pertaining to mediation (endogenous variables) and moderation (control variables) clarify IT's role in, and benefits for, corporate performance.

  17. Collaborative Cloud Manufacturing: Design of Business Model Innovations Enabled by Cyberphysical Systems in Distributed Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Erwin Rauch

    2016-01-01

    Full Text Available Collaborative cloud manufacturing, as a concept of distributed manufacturing, allows different opportunities for changing the logic of generating and capturing value. Cyberphysical systems and the technologies behind them are the enablers for new business models which have the potential to be disruptive. This paper introduces the topics of distributed manufacturing as well as cyberphysical systems. Furthermore, the main business model clusters of distributed manufacturing systems are described, including collaborative cloud manufacturing. The paper aims to provide support for developing business model innovations based on collaborative cloud manufacturing. Therefore, three business model architecture types of a differentiated business logic are discussed, taking into consideration the parameters which have an influence and the design of the business model and its architecture. As a result, new business models can be developed systematically and new ideas can be generated to boost the concept of collaborative cloud manufacturing within all sustainable business models.

  18. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The information model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type of interoperability most directly enabled by an information model.

  19. Multi-dimensional knowledge translation: enabling health informatics capacity audits using patient journey models.

    Science.gov (United States)

    Catley, Christina; McGregor, Carolyn; Percival, Jennifer; Curry, Joanne; James, Andrew

    2008-01-01

    This paper presents a multi-dimensional approach to knowledge translation, enabling results obtained from a survey evaluating the uptake of Information Technology within Neonatal Intensive Care Units to be translated into knowledge in the form of health informatics capacity audits. Survey data, spanning multiple roles, patient care scenarios, levels, and hospitals, are translated into patient journey models using a structured data modeling approach. The data model is defined such that users can develop queries to generate patient journey models based on a pre-defined Patient Journey Model architecture (PaJMa). PaJMa models are then analyzed to build capacity audits. Capacity audits offer a sophisticated view of health informatics usage, providing not only details of what IT solutions a hospital utilizes, but also answering the questions of when, how and why, by determining when the IT solutions are integrated into the patient journey, how they support the patient information flow, and why they improve the patient journey.

  20. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  1. Interpretive Structural Modeling Of Implementation Enablers For Just In Time In ICPI

    Directory of Open Access Journals (Sweden)

    Nitin Upadhye

    2014-12-01

    Full Text Available Indian Corrugated Packaging Industries (ICPI) have built up tough competition among themselves in terms of product cost, quality, product delivery, flexibility and, finally, customer demand. As their customers, mostly OEMs, are asking for Just in Time deliveries, ICPI must implement JIT in their systems. The term "JIT" denotes a system that utilizes less, in terms of all inputs, to create the same outputs as those created by a traditional mass production system, while contributing increased variety for the end customer (Womack et al. 1990). "JIT" focuses on abolishing or reducing Muda ("Muda", the Japanese word for waste) and on maximizing or fully utilizing activities that add value from the customer's perspective. There is a lack of awareness in identifying the right enablers of JIT implementation. Therefore, this study identifies the enablers from the literature and from experts' opinions in the corrugated packaging industries, and develops a relationship matrix to examine the driving power and dependence between them. In this study, modeling has been done in order to determine the interrelationships between the enablers with the help of Interpretive Structural Modeling and Cross Impact Matrix Multiplication Applied to Classification (MICMAC) analysis for the performance of Indian corrugated packaging industries.
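
    The MICMAC step mentioned above can be illustrated with a small sketch: given a final reachability matrix between enablers, driving power is the row sum and dependence is the column sum. The enabler names and matrix entries below are invented for illustration and are not those identified in the study.

      # Sketch: MICMAC driving power / dependence from a reachability matrix.
      import numpy as np

      enablers = ["top management", "training", "supplier relations", "layout"]
      R = np.array([[1, 1, 1, 1],    # 1 = row enabler influences column enabler
                    [0, 1, 0, 1],    # (self-loops included on the diagonal)
                    [0, 0, 1, 1],
                    [0, 0, 0, 1]])

      driving_power = R.sum(axis=1)   # how many enablers each one drives
      dependence = R.sum(axis=0)      # how many enablers drive each one
      for name, dp, dep in zip(enablers, driving_power, dependence):
          print(f"{name:20s} driving power = {dp}, dependence = {dep}")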

  2. An Integrated Architecture for On-Board Aircraft Engine Performance Trend Monitoring and Gas Path Fault Diagnostics

    Science.gov (United States)

    Simon, Donald L.

    2010-01-01

    Aircraft engine performance trend monitoring and gas path fault diagnostics are closely related technologies that assist operators in managing the health of their gas turbine engine assets. Trend monitoring is the process of monitoring the gradual performance change that an aircraft engine will naturally incur over time due to turbomachinery deterioration, while gas path diagnostics is the process of detecting and isolating the occurrence of any faults impacting engine flow-path performance. Today, performance trend monitoring and gas path fault diagnostic functions are performed by a combination of on-board and off-board strategies. On-board engine control computers contain logic that monitors for anomalous engine operation in real-time. Off-board ground stations are used to conduct fleet-wide engine trend monitoring and fault diagnostics based on data collected from each engine each flight. Continuing advances in avionics are enabling the migration of portions of the ground-based functionality on-board, giving rise to more sophisticated on-board engine health management capabilities. This paper reviews the conventional engine performance trend monitoring and gas path fault diagnostic architecture commonly applied today, and presents a proposed enhanced on-board architecture for future applications. The enhanced architecture gains real-time access to an expanded quantity of engine parameters, and provides advanced on-board model-based estimation capabilities. The benefits of the enhanced architecture include the real-time continuous monitoring of engine health, the early diagnosis of fault conditions, and the estimation of unmeasured engine performance parameters. A future vision to advance the enhanced architecture is also presented and discussed

  3. Reduced ENSO Variability at the LGM Revealed by an Isotope-Enabled Earth System Model

    Science.gov (United States)

    Zhu, Jiang; Liu, Zhengyu; Brady, Esther; Otto-Bliesner, Bette; Zhang, Jiaxu; Noone, David; Tomas, Robert; Nusbaumer, Jesse; Wong, Tony; Jahn, Alexandra

    2017-01-01

    Studying the El Nino Southern Oscillation (ENSO) in the past can help us better understand its dynamics and improve its future projections. However, both paleoclimate reconstructions and model simulations of ENSO strength at the Last Glacial Maximum (LGM; 21 ka B.P.) have led to contradicting results. Here we perform model simulations using the recently developed water isotope-enabled Community Earth System Model (iCESM). For the first time, model-simulated oxygen isotopes are directly compared with those from ENSO reconstructions using individual foraminifera analysis (IFA). We find that the LGM ENSO is most likely weaker compared with the preindustrial. The iCESM suggests that the total variance of the IFA records may only reflect changes in the annual cycle instead of ENSO variability as previously assumed. Furthermore, the interpretation of subsurface IFA records can be substantially complicated by the habitat depth of thermocline-dwelling foraminifera and their vertical migration with a temporally varying thermocline.

  4. Volterra representation enables modeling of complex synaptic nonlinear dynamics in large-scale simulations.

    Science.gov (United States)

    Hu, Eric Y; Bouteiller, Jean-Marie C; Song, Dong; Baudry, Michel; Berger, Theodore W

    2015-01-01

    Chemical synapses are comprised of a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spikes or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, these representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model capable of capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model is able to successfully track the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compared its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model is capable of efficiently replicating complex nonlinear dynamics that were represented in the original mechanistic model and provide a method to replicate complex and diverse synaptic transmission within neuron network simulations.
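
    A minimal sketch of a discrete-time Volterra series, truncated here at second order, illustrates the kind of input-output expansion the IO synapse model builds on. The kernel values and memory length are invented; in the paper the kernels are estimated from the mechanistic glutamatergic synapse model and the expansion is carried to third order.

      # Sketch: second-order discrete Volterra series response to a spike train.
      import numpy as np

      M = 5                                  # kernel memory (samples)
      h0 = 0.0                               # zeroth-order kernel
      h1 = np.exp(-np.arange(M) / 2.0)       # illustrative first-order kernel
      h2 = 0.05 * np.outer(h1, h1)           # illustrative second-order kernel

      def volterra_output(x):
          """y[n] = h0 + sum_i h1[i] x[n-i] + sum_ij h2[i,j] x[n-i] x[n-j]."""
          y = np.zeros(len(x))
          for n in range(len(x)):
              # past input window x[n], x[n-1], ..., x[n-M+1], zero-padded
              w = np.array([x[n - i] if n - i >= 0 else 0.0 for i in range(M)])
              y[n] = h0 + h1 @ w + w @ h2 @ w
          return y

      # Example: response to a short presynaptic spike train (1 = spike)
      spikes = np.array([0, 1, 0, 0, 1, 1, 0, 0, 0, 0], dtype=float)
      print(np.round(volterra_output(spikes), 3))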

  5. Implementing novel models of posttreatment care for cancer survivors: Enablers, challenges and recommendations.

    Science.gov (United States)

    Jefford, Michael; Kinnane, Nicole; Howell, Paula; Nolte, Linda; Galetakis, Spiridoula; Bruce Mann, Gregory; Naccarella, Lucio; Lai-Kwon, Julia; Simons, Katherine; Avery, Sharon; Thompson, Kate; Ashley, David; Haskett, Martin; Davies, Elise; Whitfield, Kathryn

    2015-12-01

    The American Society of Clinical Oncology and US Institute of Medicine emphasize the need to trial novel models of posttreatment care, and disseminate findings. In 2011, the Victorian State Government (Australia) established the Victorian Cancer Survivorship Program (VCSP), funding six 2-year demonstration projects, targeting end of initial cancer treatment. Projects considered various models, enrolling people of differing cancer types, age and residential areas. We sought to determine common enablers of success, as well as challenges/barriers. Throughout the duration of the projects, a formal "community of practice" met regularly to share experiences. Projects provided regular formal progress reports. An analysis framework was developed to synthesize key themes and identify critical enablers and challenges. Two external reviewers examined final project reports. Discussion with project teams clarified content. Survivors reported interventions to be acceptable, appropriate and effective. Strong clinical leadership was identified as a critical success factor. Workforce education was recognized as important. Partnerships with consumers, primary care and community organizations; risk stratified pathways with rapid re-access to specialist care; and early preparation for survivorship, self-management and shared care models supported positive project outcomes. Tailoring care to individual needs and predicted risks was supported. Challenges included: lack of valid assessment and prediction tools; limited evidence to support novel care models; workforce redesign; and effective engagement with community-based care and issues around survivorship terminology. The VCSP project outcomes have added to growing evidence around posttreatment care. Future projects should consider the identified enablers and challenges when designing and implementing survivorship care. © 2015 Wiley Publishing Asia Pty Ltd.

  6. An Observation Capability Metadata Model for EO Sensor Discovery in Sensor Web Enablement Environments

    Directory of Open Access Journals (Sweden)

    Chuli Hu

    2014-10-01

    Full Text Available Accurate and fine-grained discovery by diverse Earth observation (EO sensors ensures a comprehensive response to collaborative observation-required emergency tasks. This discovery remains a challenge in an EO sensor web environment. In this study, we propose an EO sensor observation capability metadata model that reuses and extends the existing sensor observation-related metadata standards to enable the accurate and fine-grained discovery of EO sensors. The proposed model is composed of five sub-modules, namely, ObservationBreadth, ObservationDepth, ObservationFrequency, ObservationQuality and ObservationData. The model is applied to different types of EO sensors and is formalized by the Open Geospatial Consortium Sensor Model Language 1.0. The GeosensorQuery prototype retrieves the qualified EO sensors based on the provided geo-event. An actual application to flood emergency observation in the Yangtze River Basin in China is conducted, and the results indicate that sensor inquiry can accurately achieve fine-grained discovery of qualified EO sensors and obtain enriched observation capability information. In summary, the proposed model enables an efficient encoding system that ensures minimum unification to represent the observation capabilities of EO sensors. The model functions as a foundation for the efficient discovery of EO sensors. In addition, the definition and development of this proposed EO sensor observation capability metadata model is a helpful step in extending the Sensor Model Language (SensorML 2.0 Profile for the description of the observation capabilities of EO sensors.

  7. Analysis of diet optimization models for enabling conditions for hypertrophic muscle enlargement in athletes

    Directory of Open Access Journals (Sweden)

    L. Matijević

    2013-01-01

    Full Text Available In this study, mathematical models were created and used for diet optimization for an athlete – a recreational bodybuilder – in the pre-tournament period. The main aim was to determine weekly menus that enable conditions for hypertrophic muscle enlargement and reduce body fat mass. Each daily menu was planned to contain six to seven meals, while respecting several of the user's personal demands. The optimal carbohydrate, fat and protein ratio in a diet for enabling hypertrophy, as recommended in the literature, is 43:30:27, and this was chosen as the target in this research. The variables included in the models were the offered dishes, and the constraints were the observed values of the menus: price, mass of consumed food, energy, water and content of different nutrients. The general idea was to create the models and to compare different programs for solving the problem. LINDO and MS Excel were recognized as widely used and were chosen for model testing and examination. Both programs suggested weekly menus that were acceptable to the user and met all recommendations and demands. The weekly menus were analysed and compared. Sensitivity tests from both programs were used to detect possible critical points in the menus. The programs produced slightly different results, but with very high correlation between the proposed weekly intakes (R2=0.99856, p<0.05), so both can be successfully used in the pre-tournament period of bodybuilding and are recommended for this complex task.
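
    The diet problem described above is, in essence, a linear programme. The sketch below solves a much-reduced version with SciPy instead of LINDO or MS Excel: the food compositions, prices and the 2800 kcal energy target are invented, and only the 43:30:27 carbohydrate:fat:protein energy ratio is taken from the abstract.

      # Sketch: least-cost diet meeting an energy target and the 43:30:27 macro ratio.
      import numpy as np
      from scipy.optimize import linprog

      # columns: carbs [g], fat [g], protein [g], price [EUR] per 100 g serving
      foods = {
          "rice":      (28.0,   0.3,  2.7, 0.15),
          "chicken":   ( 0.0,   3.6, 31.0, 0.90),
          "beans":     (21.0,   0.5,  9.0, 0.25),
          "eggs":      ( 1.1,  11.0, 13.0, 0.60),
          "olive_oil": ( 0.0, 100.0,  0.0, 0.70),
      }
      names = list(foods)
      C = np.array([foods[n][0] for n in names])
      F = np.array([foods[n][1] for n in names])
      P = np.array([foods[n][2] for n in names])
      price = np.array([foods[n][3] for n in names])

      E_target = 2800.0                                   # daily energy target [kcal]
      carb_E, fat_E = 0.43 * E_target, 0.30 * E_target    # 43:30:27 ratio (protein follows)

      # Equality constraints: total energy, carbohydrate energy, fat energy
      A_eq = np.array([4 * C + 9 * F + 4 * P,   # kcal per serving
                       4 * C,                   # carbohydrate kcal per serving
                       9 * F])                  # fat kcal per serving
      b_eq = np.array([E_target, carb_E, fat_E])

      res = linprog(price, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
      for name, servings in zip(names, res.x):
          print(f"{name:10s} {100 * servings:6.0f} g")
      print("daily cost:", round(res.fun, 2), "EUR")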

  8. Enabling Real-time Water Decision Support Services Using Model as a Service

    Science.gov (United States)

    Zhao, T.; Minsker, B. S.; Lee, J. S.; Salas, F. R.; Maidment, D. R.; David, C. H.

    2014-12-01

    Through application of computational methods and an integrated information system, data and river modeling services can help researchers and decision makers more rapidly understand river conditions under alternative scenarios. To enable this capability, workflows (i.e., analysis and model steps) are created and published as Web services delivered through an internet browser, including model inputs, a published workflow service, and visualized outputs. The RAPID model, which is a river routing model developed at University of Texas Austin for parallel computation of river discharge, has been implemented as a workflow and published as a Web application. This allows non-technical users to remotely execute the model and visualize results as a service through a simple Web interface. The model service and Web application has been prototyped in the San Antonio and Guadalupe River Basin in Texas, with input from university and agency partners. In the future, optimization model workflows will be developed to link with the RAPID model workflow to provide real-time water allocation decision support services.

  9. Genomic-Enabled Prediction Kernel Models with Random Intercepts for Multi-environment Trials

    Science.gov (United States)

    Cuevas, Jaime; Granato, Italo; Fritsche-Neto, Roberto; Montesinos-Lopez, Osval A.; Burgueño, Juan; Bandeira e Sousa, Massaine; Crossa, José

    2018-01-01

    In this study, we compared the prediction accuracy of the main genotypic effect model (MM) without G×E interactions, the multi-environment single variance G×E deviation model (MDs), and the multi-environment environment-specific variance G×E deviation model (MDe) where the random genetic effects of the lines are modeled with the markers (or pedigree). With the objective of further modeling the genetic residual of the lines, we incorporated the random intercepts of the lines (l) and generated another three models. Each of these 6 models were fitted with a linear kernel method (Genomic Best Linear Unbiased Predictor, GB) and a Gaussian Kernel (GK) method. We compared these 12 model-method combinations with another two multi-environment G×E interactions models with unstructured variance-covariances (MUC) using GB and GK kernels (4 model-method). Thus, we compared the genomic-enabled prediction accuracy of a total of 16 model-method combinations on two maize data sets with positive phenotypic correlations among environments, and on two wheat data sets with complex G×E that includes some negative and close to zero phenotypic correlations among environments. The two models (MDs and MDE with the random intercept of the lines and the GK method) were computationally efficient and gave high prediction accuracy in the two maize data sets. Regarding the more complex G×E wheat data sets, the prediction accuracy of the model-method combination with G×E, MDs and MDe, including the random intercepts of the lines with GK method had important savings in computing time as compared with the G×E interaction multi-environment models with unstructured variance-covariances but with lower genomic prediction accuracy. PMID:29476023

  10. Kinetic Modeling of Accelerated Stability Testing Enabled by Second Harmonic Generation Microscopy.

    Science.gov (United States)

    Song, Zhengtian; Sarkar, Sreya; Vogt, Andrew D; Danzer, Gerald D; Smith, Casey J; Gualtieri, Ellen J; Simpson, Garth J

    2018-04-03

    The low limits of detection afforded by second harmonic generation (SHG) microscopy coupled with image analysis algorithms enabled quantitative modeling of the temperature-dependent crystallization of active pharmaceutical ingredients (APIs) within amorphous solid dispersions (ASDs). ASDs, in which an API is maintained in an amorphous state within a polymer matrix, are finding increasing use to address solubility limitations of small-molecule APIs. Extensive stability testing is typically performed for ASD characterization, the time frame for which is often dictated by the earliest detectable onset of crystal formation. Here a study of accelerated stability testing on ritonavir, a human immunodeficiency virus (HIV) protease inhibitor, has been conducted. Under the condition for accelerated stability testing at 50 °C/75%RH and 40 °C/75%RH, ritonavir crystallization kinetics from amorphous solid dispersions were monitored by SHG microscopy. SHG microscopy coupled by image analysis yielded limits of detection for ritonavir crystals as low as 10 ppm, which is about 2 orders of magnitude lower than other methods currently available for crystallinity detection in ASDs. The four decade dynamic range of SHG microscopy enabled quantitative modeling with an established (JMAK) kinetic model. From the SHG images, nucleation and crystal growth rates were independently determined.
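
    For illustration, the sketch below fits the Johnson-Mehl-Avrami-Kolmogorov (JMAK) form X(t) = 1 - exp(-(kt)^n) to a synthetic set of crystalline fractions, roughly mimicking how SHG-derived crystallinity data could be modeled. The time points and fractions are invented and are not the ritonavir data from the study.

      # Sketch: fitting a JMAK crystallization curve to synthetic stability-test data.
      import numpy as np
      from scipy.optimize import curve_fit

      def jmak(t, k, n):
          """Crystallized fraction X(t) = 1 - exp(-(k*t)**n)."""
          return 1.0 - np.exp(-np.power(k * t, n))

      # Synthetic stability-testing time points [h] and crystalline fractions
      t = np.array([0.0, 24, 48, 96, 168, 336, 504])
      x_obs = np.array([0.0, 0.0008, 0.003, 0.012, 0.04, 0.15, 0.30])

      (k_fit, n_fit), _ = curve_fit(jmak, t, x_obs, p0=[1e-3, 1.5], bounds=(0, np.inf))
      print(f"rate constant k = {k_fit:.2e} 1/h, Avrami exponent n = {n_fit:.2f}")
      print("predicted fraction at 1000 h:", round(jmak(1000.0, k_fit, n_fit), 3))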

  11. Ames Culture Chamber System: Enabling Model Organism Research Aboard the international Space Station

    Science.gov (United States)

    Steele, Marianne

    2014-01-01

    Understanding the genetic, physiological, and behavioral effects of spaceflight on living organisms and elucidating the molecular mechanisms that underlie these effects are high priorities for NASA. Certain organisms, known as model organisms, are widely studied to help researchers better understand how all biological systems function. Small model organisms such as nematodes, slime mold, bacteria, green algae, yeast, and moss can be used to study the effects of micro- and reduced gravity at both the cellular and systems level over multiple generations. Many model organisms have sequenced genomes and published data sets on their transcriptomes and proteomes that enable scientific investigations of the molecular mechanisms underlying the adaptations of these organisms to space flight.

  12. Cloud-enabled large-scale land surface model simulations with the NASA Land Information System

    Science.gov (United States)

    Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.

    2017-12-01

    Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large amount of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists that are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple run time environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allows for an entire software package along with all dependences to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that have taken weeks and months can now be performed in minutes of time. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and

  13. Genomic-Enabled Prediction in Maize Using Kernel Models with Genotype × Environment Interaction.

    Science.gov (United States)

    Bandeira E Sousa, Massaine; Cuevas, Jaime; de Oliveira Couto, Evellyn Giselly; Pérez-Rodríguez, Paulino; Jarquín, Diego; Fritsche-Neto, Roberto; Burgueño, Juan; Crossa, Jose

    2017-06-07

    Multi-environment trials are routinely conducted in plant breeding to select candidates for the next selection cycle. In this study, we compare the prediction accuracy of four developed genomic-enabled prediction models: (1) single-environment, main genotypic effect model (SM); (2) multi-environment, main genotypic effects model (MM); (3) multi-environment, single variance G×E deviation model (MDs); and (4) multi-environment, environment-specific variance G×E deviation model (MDe). Each of these four models were fitted using two kernel methods: a linear kernel Genomic Best Linear Unbiased Predictor, GBLUP (GB), and a nonlinear kernel Gaussian kernel (GK). The eight model-method combinations were applied to two extensive Brazilian maize data sets (HEL and USP data sets), having different numbers of maize hybrids evaluated in different environments for grain yield (GY), plant height (PH), and ear height (EH). Results show that the MDe and the MDs models fitted with the Gaussian kernel (MDe-GK, and MDs-GK) had the highest prediction accuracy. For GY in the HEL data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 9 to 32%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 9 to 49%. For GY in the USP data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 0 to 7%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 34 to 70%. For traits PH and EH, gains in prediction accuracy of models with GK compared to models with GB were smaller than those achieved in GY. Also, these gains in prediction accuracy decreased when a more difficult prediction problem was studied. Copyright © 2017 Bandeira e Sousa et al.

  14. Genomic-Enabled Prediction in Maize Using Kernel Models with Genotype × Environment Interaction

    Directory of Open Access Journals (Sweden)

    Massaine Bandeira e Sousa

    2017-06-01

    Full Text Available Multi-environment trials are routinely conducted in plant breeding to select candidates for the next selection cycle. In this study, we compare the prediction accuracy of four developed genomic-enabled prediction models: (1 single-environment, main genotypic effect model (SM; (2 multi-environment, main genotypic effects model (MM; (3 multi-environment, single variance G×E deviation model (MDs; and (4 multi-environment, environment-specific variance G×E deviation model (MDe. Each of these four models were fitted using two kernel methods: a linear kernel Genomic Best Linear Unbiased Predictor, GBLUP (GB, and a nonlinear kernel Gaussian kernel (GK. The eight model-method combinations were applied to two extensive Brazilian maize data sets (HEL and USP data sets, having different numbers of maize hybrids evaluated in different environments for grain yield (GY, plant height (PH, and ear height (EH. Results show that the MDe and the MDs models fitted with the Gaussian kernel (MDe-GK, and MDs-GK had the highest prediction accuracy. For GY in the HEL data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 9 to 32%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 9 to 49%. For GY in the USP data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 0 to 7%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 34 to 70%. For traits PH and EH, gains in prediction accuracy of models with GK compared to models with GB were smaller than those achieved in GY. Also, these gains in prediction accuracy decreased when a more difficult prediction problem was studied.
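
    The two kernels compared in the records above can be sketched in a few lines: a linear GBLUP-type kernel G = XX'/p and a Gaussian kernel built from squared Euclidean distances between marker profiles. The marker matrix below is random, and scaling the distances by their median is a common convention assumed here rather than the exact choice made in the papers.

      # Sketch: linear (GBLUP) kernel vs. Gaussian kernel from marker data.
      import numpy as np

      rng = np.random.default_rng(0)
      n_lines, n_markers = 50, 500
      X = rng.choice([0.0, 1.0, 2.0], size=(n_lines, n_markers))   # marker scores
      X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)            # centre and scale

      # Linear (GBLUP) kernel: G = X X' / p
      G = X @ X.T / n_markers

      # Gaussian kernel: K_ij = exp(-h * d_ij^2 / median(d^2))
      sq_norms = np.sum(X**2, axis=1)
      D2 = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T   # squared distances
      h = 1.0
      K = np.exp(-h * D2 / np.median(D2[np.triu_indices(n_lines, k=1)]))

      print("GBLUP kernel diagonal mean:", round(np.diag(G).mean(), 2))
      print("Gaussian kernel range:", round(K.min(), 3), "to", round(K.max(), 3))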

  15. Comparison between linear and non-parametric regression models for genome-enabled prediction in wheat.

    Science.gov (United States)

    Pérez-Rodríguez, Paulino; Gianola, Daniel; González-Camacho, Juan Manuel; Crossa, José; Manès, Yann; Dreisigacker, Susanne

    2012-12-01

    In genome-enabled prediction, parametric, semi-parametric, and non-parametric regression models have been used. This study assessed the predictive ability of linear and non-linear models using dense molecular markers. The linear models were linear on marker effects and included the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B. The non-linear models (this refers to non-linearity on markers) were reproducing kernel Hilbert space (RKHS) regression, Bayesian regularized neural networks (BRNN), and radial basis function neural networks (RBFNN). These statistical models were compared using 306 elite wheat lines from CIMMYT genotyped with 1717 diversity array technology (DArT) markers and two traits, days to heading (DTH) and grain yield (GY), measured in each of 12 environments. It was found that the three non-linear models had better overall prediction accuracy than the linear regression specification. Results showed a consistent superiority of RKHS and RBFNN over the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B models.

  16. Method of optimization onboard communication network

    Science.gov (United States)

    Platoshin, G. A.; Selvesuk, N. I.; Semenov, M. E.; Novikov, V. M.

    2018-02-01

    In this article, optimization levels for the onboard communication network (OCN) are proposed. We define the basic parameters necessary for the evaluation and comparison of modern OCNs, and we identify a set of initial data for possible modeling of the OCN. We also propose a mathematical technique for implementing the OCN optimization procedure. This technique is based on the principles and ideas of binary programming. It is shown that the binary programming technique allows an inherently optimal solution to be obtained for avionics tasks. An example of applying the proposed approach to the problem of device assignment in an OCN is considered.
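
    As a toy illustration of the binary-programming view mentioned above, the sketch below assigns each device to exactly one bus so as to minimise total cost subject to a per-bus load limit, by enumerating the binary choices. The device names, costs, loads and capacities are invented; a realistic OCN model would be much larger and solved with a proper 0-1 programming solver rather than enumeration.

      # Sketch: device-to-bus assignment as a tiny 0-1 optimization problem.
      from itertools import product

      devices = ["IMU", "ADC", "radio", "display"]
      buses = ["bus_A", "bus_B"]
      cost = {                      # cost[device][bus], e.g. wiring/latency cost
          "IMU":     {"bus_A": 2, "bus_B": 5},
          "ADC":     {"bus_A": 4, "bus_B": 3},
          "radio":   {"bus_A": 6, "bus_B": 2},
          "display": {"bus_A": 3, "bus_B": 3},
      }
      load = {"IMU": 2, "ADC": 1, "radio": 3, "display": 2}   # bandwidth units
      capacity = {"bus_A": 5, "bus_B": 5}

      best = None
      for assignment in product(buses, repeat=len(devices)):   # one binary choice per device
          used = {b: 0 for b in buses}
          for dev, b in zip(devices, assignment):
              used[b] += load[dev]
          if any(used[b] > capacity[b] for b in buses):
              continue                                          # violates bus capacity
          total = sum(cost[dev][b] for dev, b in zip(devices, assignment))
          if best is None or total < best[0]:
              best = (total, dict(zip(devices, assignment)))

      print("optimal cost:", best[0])
      print("assignment:", best[1])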

  17. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    Science.gov (United States)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in performing collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on application of the model checker UPPAAL in order to verify interoperability requirements for the given collaborative process model. At first, this step entails translating the collaborative process model from BPMN into a UPPAAL modelling language called 'Network of Timed Automata'. Second, it becomes necessary to formalise interoperability requirements into properties with the dedicated UPPAAL language, i.e. the temporal logic TCTL.

  18. The shared circuits model (SCM): how control, mirroring, and simulation can enable imitation, deliberation, and mindreading.

    Science.gov (United States)

    Hurley, Susan

    2008-02-01

    Imitation, deliberation, and mindreading are characteristically human sociocognitive skills. Research on imitation and its role in social cognition is flourishing across various disciplines. Imitation is surveyed in this target article under headings of behavior, subpersonal mechanisms, and functions of imitation. A model is then advanced within which many of the developments surveyed can be located and explained. The shared circuits model (SCM) explains how imitation, deliberation, and mindreading can be enabled by subpersonal mechanisms of control, mirroring, and simulation. It is cast at a middle, functional level of description, that is, between the level of neural implementation and the level of conscious perceptions and intentional actions. The SCM connects shared informational dynamics for perception and action with shared informational dynamics for self and other, while also showing how the action/perception, self/other, and actual/possible distinctions can be overlaid on these shared informational dynamics. It avoids the common conception of perception and action as separate and peripheral to central cognition. Rather, it contributes to the situated cognition movement by showing how mechanisms for perceiving action can be built on those for active perception. The SCM is developed heuristically, in five layers that can be combined in various ways to frame specific ontogenetic or phylogenetic hypotheses. The starting point is dynamic online motor control, whereby an organism is closely attuned to its embedding environment through sensorimotor feedback. Onto this are layered functions of prediction and simulation of feedback, mirroring, simulation of mirroring, monitored inhibition of motor output, and monitored simulation of input. Finally, monitored simulation of input specifying possible actions plus inhibited mirroring of such possible actions can generate information about the possible as opposed to actual instrumental actions of others, and the

  19. A Novel Experimental and Modelling Strategy for Nanoparticle Toxicity Testing Enabling the Use of Small Quantities

    Directory of Open Access Journals (Sweden)

    Marinda van Pomeren

    2017-11-01

    Full Text Available Metallic nanoparticles (NPs differ from other metal forms with respect to their large surface to volume ratio and subsequent inherent reactivity. Each new modification to a nanoparticle alters the surface to volume ratio, fate and subsequently the toxicity of the particle. Newly-engineered NPs are commonly available only in low quantities whereas, in general, rather large amounts are needed for fate characterizations and effect studies. This challenge is especially relevant for those NPs that have low inherent toxicity combined with low bioavailability. Therefore, within our study, we developed new testing strategies that enable working with low quantities of NPs. The experimental testing method was tailor-made for NPs, whereas we also developed translational models based on different dose-metrics allowing to determine dose-response predictions for NPs. Both the experimental method and the predictive models were verified on the basis of experimental effect data collected using zebrafish embryos exposed to metallic NPs in a range of different chemical compositions and shapes. It was found that the variance in the effect data in the dose-response predictions was best explained by the minimal diameter of the NPs, whereas the data confirmed that the predictive model is widely applicable to soluble metallic NPs. The experimental and model approach developed in our study support the development of (ecotoxicity assays tailored to nano-specific features.

  20. Modeling of RFID-Enabled Real-Time Manufacturing Execution System in Mixed-Model Assembly Lines

    Directory of Open Access Journals (Sweden)

    Zhixin Yang

    2015-01-01

    Full Text Available To quickly respond to diverse product demands, mixed-model assembly lines are widely adopted in discrete manufacturing industries. Besides the complexity in material distribution, mixed-model assembly involves a variety of components, different process plans, and fast production changes, which greatly increase the difficulty of agile production management. Aiming to break through the bottlenecks in existing production management, a novel RFID-enabled manufacturing execution system (MES), featuring real-time and wireless information interaction capability, is proposed to identify various manufacturing objects, including WIPs, tools, and operators, and to trace their movements throughout the production processes. However, being subject to constraints in terms of safety stock, machine assignment, setup, and scheduling requirements, the optimization of the RFID-enabled MES model for production planning and scheduling is an NP-hard problem. A new heuristic generalized Lagrangian decomposition approach is proposed for model optimization, which decomposes the model into three subproblems: computation of the optimal configuration of RFID sensor networks, optimization of production planning subject to machine setup cost and safety stock constraints, and optimization of scheduling for minimized overtime. RFID signal processing methods that can handle unreliable, redundant, and missing tag events are also described in detail. The model validity is discussed through algorithm analysis and verified through numerical simulation. The proposed design scheme has important reference value for applications of RFID in multiple manufacturing fields, and also lays a research foundation for leveraging digital and networked manufacturing systems towards intelligence.

  1. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  2. Impact of convective activity on precipitation δ18O in isotope-enabled models

    Science.gov (United States)

    Hu, J.; Emile-Geay, J.; Dee, S.

    2017-12-01

    The δ18O signal preserved in paleo-archives (e.g., speleothems, tree-ring cellulose, ice cores) is widely used to reconstruct precipitation or temperature. In the tropics, the inverse relationship between precipitation δ18O and rainfall amount, namely the "amount effect" [Dansgaard, Tellus, 1964], is often used to interpret precipitation δ18O. However, recent studies have shown that precipitation δ18O is also influenced by precipitation type [Kurita et al, JGR, 2009; Moerman et al, EPSL, 2013], and recent observations indicate that it is negatively correlated with the fraction of precipitation associated with stratiform clouds [Aggarwal et al, Nature Geosci, 2016]. It is thus important to determine to what extent isotope-enabled climate models can reproduce these relationships. Here we do so using output from LMDZ, CAM2, and isoGSM from the Stable Water Isotope Intercomparison Group, Phase 2 (SWING2) project and results of SPEEDY-IER [Dee et al, JGR, 2015] from an AMIP-style experiment. The results show that these models simulate the "amount effect" well in the tropics, and the relationship between precipitation δ18O and precipitation is reversed in many places in the mid-latitudes, in accordance with observations [Bowen, JGR, 2008]. These models can also reproduce the negative correlation between monthly precipitation δ18O and stratiform precipitation proportion in the mid-latitudes (30°N-50°N; 50°S-30°S), but in the tropics (30°S-30°N) the models show a positive correlation instead. The reason for this bias will be investigated with idealized experiments with SPEEDY-IER. Correctly simulating the impact of convective activity on precipitation δ18O in isotope-enabled models will improve our interpretation of paleoclimate proxies with respect to hydroclimate variability. P. K. Aggarwal et al. (2016), Nature Geosci., 9, 624-629, doi:10.1038/ngeo2739. G. J. Bowen (2008), J. Geophys. Res., 113, D05113, doi:10.1029/2007JD009295. W. Dansgaard (1964), Tellus, 16(4), 436
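
    As a minimal sketch (not the authors' code) of how the sign of the "amount effect" can be checked in isotope-enabled model output, the snippet below correlates monthly precipitation δ18O with precipitation amount at each grid cell; the array shapes and random placeholder data are assumptions.

        # Minimal sketch: per-grid-cell Pearson correlation between monthly precipitation
        # d18O and precipitation amount. Negative values in the tropics would indicate
        # the classic "amount effect"; positive values indicate a reversed relationship.
        import numpy as np

        def amount_effect_map(d18o, precip):
            # d18o, precip: arrays of shape (months, lat, lon)
            d18o_a = d18o - d18o.mean(axis=0)
            prcp_a = precip - precip.mean(axis=0)
            cov = (d18o_a * prcp_a).mean(axis=0)
            return cov / (d18o_a.std(axis=0) * prcp_a.std(axis=0))

        rng = np.random.default_rng(0)                      # placeholder data only
        r = amount_effect_map(rng.normal(size=(120, 10, 20)),
                              rng.gamma(2.0, size=(120, 10, 20)))
        print(r.shape, float(r.mean()))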

  3. Testing, Modeling, and Monitoring to Enable Simpler, Cheaper, Longer-lived Surface Caps

    International Nuclear Information System (INIS)

    Piet, S. J.; Breckenridge, R. P.; Burns, D. E.

    2003-01-01

    Society has and will continue to generate hazardous wastes whose risks must be managed. For exceptionally toxic, long-lived, and feared waste, the solution is deep burial, e.g., deep geological disposal at Yucca Mtn. For some waste, recycle or destruction/treatment is possible. The alternative for other wastes is storage at or near the ground level (in someone's back yard); most of these storage sites include a surface barrier (cap) to prevent downward water migration. Some of the hazards will persist indefinitely. As society and regulators have demanded additional proof that caps are robust against more threats and for longer time periods, the caps have become increasingly complex and expensive. As in other industries, increased complexity will eventually increase the difficulty in estimating performance, in monitoring system/component performance, and in repairing or upgrading barriers as risks are managed. An approach leading to simpler, less expensive, longer-lived, more manageable caps is needed. Our project, which started in April 2002, aims to catalyze a Barrier Improvement Cycle (iterative learning and application) and thus enable Remediation System Performance Management (doing the right maintenance neither too early nor too late). The knowledge gained and the capabilities built will help verify the adequacy of past remedial decisions, improve barrier management, and enable improved solutions for future decisions. We believe it will be possible to develop simpler, longer-lived, less expensive caps that are easier to monitor, manage, and repair. The project is planned to: (a) improve the knowledge of degradation mechanisms in times shorter than service life; (b) improve modeling of barrier degradation dynamics; (c) develop sensor systems to identify early degradation; and (d) provide a better basis for developing and testing of new barrier systems. This project combines selected exploratory studies (benchtop and field scale), coupled effects accelerated

  4. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    Science.gov (United States)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the DRMs will be used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
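
    The histogram-based signal finding described above can be illustrated with a short sketch: photon event times are binned, a background rate is estimated, and bins whose counts are improbable under pure Poisson noise are flagged as surface signal. The bin width, threshold, and synthetic data are assumptions, not the flight algorithm.

        # Illustrative sketch of histogram-based signal finding for photon-counting data.
        import numpy as np
        from scipy.stats import poisson

        def find_signal_bins(event_times, bin_width, false_alarm_prob=1e-3):
            edges = np.arange(event_times.min(), event_times.max() + bin_width, bin_width)
            counts, _ = np.histogram(event_times, bins=edges)
            background = np.median(counts)                   # robust noise-rate estimate
            threshold = poisson.ppf(1.0 - false_alarm_prob, background)
            return edges, counts, np.where(counts > threshold)[0]

        # Synthetic example: uniform background noise plus a narrow cluster of surface returns
        rng = np.random.default_rng(1)
        events = np.concatenate([rng.uniform(0.0, 1.0, 5000), rng.normal(0.42, 0.002, 300)])
        edges, counts, signal_bins = find_signal_bins(events, bin_width=0.005)
        print("candidate signal bins:", signal_bins)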

  5. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions in both particle physics and astrophysics. International collaborative scientific experiments like KM3NeT are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres, providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction, and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support these demanding computing requirements, we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for KM3NeT users to utilize the EGI computing resources in a simulation-driven use case.

  6. Model-based MPC enables curvilinear ILT using either VSB or multi-beam mask writers

    Science.gov (United States)

    Pang, Linyong; Takatsukasa, Yutetsu; Hara, Daisuke; Pomerantsev, Michael; Su, Bo; Fujimura, Aki

    2017-07-01

    Inverse Lithography Technology (ILT) is becoming the choice for Optical Proximity Correction (OPC) at advanced technology nodes in IC design and production. Multi-beam mask writers promise significant mask writing time reduction for complex ILT-style masks. Before multi-beam mask writers become the mainstream working tools in mask production, VSB writers will continue to be the tool of choice to write both curvilinear ILT and Manhattanized ILT masks. To enable VSB mask writers for complex ILT-style masks, model-based mask process correction (MB-MPC) is required to do the following: (1) make reasonable corrections for complex edges of features that exhibit relatively large deviations from both curvilinear ILT and Manhattanized ILT designs; (2) control and manage both edge placement errors (EPE) and shot count; and (3) ease the migration to future multi-beam mask writers and serve as an effective backup solution during the transition. In this paper, a solution meeting all those requirements, MB-MPC with GPU acceleration, is presented. One model calibration per process allows accurate correction regardless of the target mask writer.

  7. New On-board Microprocessors

    Science.gov (United States)

    Weigand, R.

    (for SW development on PC etc.), or to consider using it as a PCI master controller in an on-board system. Advanced SEU fault tolerance is introduced by design, using triple modular redundancy (TMR) flip-flops for all registers and EDAC protection for all memories. The device will be manufactured in a radiation-hard Atmel 0.25 um technology, targeting 100 MHz processor clock frequency. The non-fault-tolerant LEON processor VHDL model is available as free source code, and the SPARC architecture is a well-known industry standard. Therefore, know-how, software tools and operating systems are widely available.

  8. The use of cloud enabled building information models – an expert analysis

    Directory of Open Access Journals (Sweden)

    Alan Redmond

    2015-10-01

    Full Text Available The dependency of today’s construction professionals on singular commercial applications for design possibilities creates the risk of being dictated to by the language-tools they use. Unknowingly converting to the constraints of a particular computer application’s style reduces one’s association with cutting-edge design, as no single computer application can support all of the tasks associated with building design and production. Interoperability depicts the need to pass data between applications, allowing multiple types of experts and applications to contribute to the work at hand. Cloud computing is a centralized heterogeneous platform that enables different applications to be connected to each other through remote data servers. However, the possibility of providing an interoperable process based on binding several construction applications through a single repository platform, ‘cloud computing’, required further analysis. The following Delphi questionnaires analysed the information-exchange opportunities of Building Information Modelling (BIM) as the possible solution for the integration of applications on a cloud platform. The survey structure is modelled to: (i) identify the most appropriate applications for advancing interoperability at the early design stage, (ii) detect the most severe barriers of BIM implementation from a business and legal viewpoint, (iii) examine the need for standards to address information exchange between the design team, and (iv) explore the use of the most common interfaces for exchanging information. The anticipated findings will assist in identifying a model that will enhance the standardized passing of information between systems at the feasibility design stage of a construction project.

  10. A grey DEMATEL-based approach for modeling enablers of green innovation in manufacturing organizations.

    Science.gov (United States)

    Gupta, Himanshu; Barua, Mukesh Kumar

    2018-04-01

    Incorporating green practices into the manufacturing process has gained momentum over the past few years and is a matter of great concern for manufacturers and researchers alike. Regulatory pressures in developed countries have forced organizations to adopt green practices; however, this issue still lacks attention in developing economies like India. There is an urgent need to identify enablers of green innovation for manufacturing organizations and to identify the prominent enablers among them. This study is an attempt to first identify enablers of green innovation and then establish a causal relationship among them, so as to identify the enablers that can drive the others. Grey DEMATEL (Decision Making Trial and Evaluation Laboratory) methodology is used for establishing the causal relationship among enablers. The novelty of this study lies in the fact that no previous study has identified the enablers of green innovation and then established the causal relationship among them. A total of 21 enablers of green innovation have been identified; the research indicates developing green manufacturing capabilities, resources for green innovation, ease of getting loans from financial institutions, and environmental regulations as the most influential enablers of green innovation. Managerial and practical implications of the research are also presented to assist managers of the case company in adopting green innovation practices at their end.
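
    For readers unfamiliar with DEMATEL, the core (crisp) calculation behind the grey variant used in the study is sketched below: the direct-influence matrix is normalised, the total-relation matrix T = N(I - N)^-1 is computed, and prominence (D+R) and relation (D-R) scores separate cause enablers from effect enablers. The grey-number whitening step is omitted, and the 4x4 matrix is a placeholder rather than the study's 21-enabler data.

        # Sketch of the crisp DEMATEL computation underlying grey DEMATEL (placeholder data).
        import numpy as np

        def dematel(direct):
            direct = np.asarray(direct, dtype=float)
            s = max(direct.sum(axis=1).max(), direct.sum(axis=0).max())
            n_mat = direct / s                                        # normalised direct-relation matrix
            t = n_mat @ np.linalg.inv(np.eye(len(direct)) - n_mat)    # total-relation matrix
            d, r = t.sum(axis=1), t.sum(axis=0)
            return t, d + r, d - r                                    # prominence (D+R), relation (D-R)

        direct = [[0, 3, 2, 1],          # hypothetical 0-4 influence scores among 4 enablers
                  [1, 0, 3, 2],
                  [2, 1, 0, 3],
                  [1, 2, 1, 0]]
        t, prominence, relation = dematel(direct)
        print("cause-group enablers (D - R > 0):", np.where(relation > 0)[0])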

  11. Expert judgment based multi-criteria decision model to address uncertainties in risk assessment of nanotechnology-enabled food products

    International Nuclear Information System (INIS)

    Flari, Villie; Chaudhry, Qasim; Neslo, Rabin; Cooke, Roger

    2011-01-01

    Currently, risk assessment of nanotechnology-enabled food products is considered difficult due to the large number of uncertainties involved. We developed an approach which could address some of the main uncertainties through the use of expert judgment. Our approach employs a multi-criteria decision model, based on probabilistic inversion that enables capturing experts’ preferences in regard to safety of nanotechnology-enabled food products, and identifying their opinions in regard to the significance of key criteria that are important in determining the safety of such products. An advantage of these sample-based techniques is that they provide out-of-sample validation and therefore a robust scientific basis. This validation in turn adds predictive power to the model developed. We achieved out-of-sample validation in two ways: (1) a portion of the expert preference data was excluded from the model’s fitting and was then predicted by the model fitted on the remaining rankings and (2) a (partially) different set of experts generated new scenarios, using the same criteria employed in the model, and ranked them; their ranks were compared with ranks predicted by the model. The degree of validation in each method was less than perfect but reasonably substantial. The validated model we applied captured and modelled experts’ preferences regarding safety of hypothetical nanotechnology-enabled food products. It appears therefore that such an approach can provide a promising route to explore further for assessing the risk of nanotechnology-enabled food products.

  12. Fullrmc, a rigid body Reverse Monte Carlo modeling package enabled with machine learning and artificial intelligence.

    Science.gov (United States)

    Aoun, Bachir

    2016-05-05

    A new Reverse Monte Carlo (RMC) package, "fullrmc", for atomic or rigid-body and molecular, amorphous, or crystalline materials is presented. fullrmc's main purpose is to provide fully modular, fast and flexible software that is thoroughly documented, supports complex molecules, is written in modern programming languages (Python, Cython, and C/C++ where performance is needed), and complies with modern programming practices. fullrmc's approach to solving an atomic or molecular structure differs from existing RMC algorithms and software. In a nutshell, traditional RMC methods and software randomly adjust atom positions until the whole system has the greatest consistency with a set of experimental data. In contrast, fullrmc applies smart moves endorsed with reinforcement machine learning to groups of atoms. While fullrmc allows running traditional RMC modeling, the uniqueness of this approach resides in its ability to customize the grouping of atoms in any convenient way with no additional programming effort and to apply smarter, more physically meaningful moves to the defined groups of atoms. In addition, fullrmc provides a unique way, with almost no additional computational cost, to recur a group's selection, allowing the system to escape local minima by refining a group's position or by exploring, through and beyond disallowed positions and energy barriers, the unrestricted three-dimensional space around a group. © 2016 Wiley Periodicals, Inc.
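
    To make the contrast with fullrmc concrete, the toy sketch below shows the traditional RMC baseline it generalises: single atoms are displaced at random, and a move is kept when it reduces the mismatch with a target pair-distance histogram. The box size, move size, and stand-in "experimental" data are all assumptions; fullrmc's group moves and learned move selection are not reproduced here.

        # Toy traditional Reverse Monte Carlo loop (illustration only, not fullrmc).
        import numpy as np

        rng = np.random.default_rng(2)
        box, n_atoms, n_bins = 10.0, 64, 40
        pos = rng.uniform(0.0, box, size=(n_atoms, 3))

        def pair_histogram(p):
            diff = p[:, None, :] - p[None, :, :]
            diff -= box * np.round(diff / box)                   # minimum-image convention
            dist = np.sqrt((diff ** 2).sum(-1))[np.triu_indices(len(p), k=1)]
            hist, _ = np.histogram(dist, bins=n_bins, range=(0.0, box / 2))
            return hist / hist.sum()

        target = pair_histogram(rng.uniform(0.0, box, size=(n_atoms, 3)))   # stand-in data
        chi2 = ((pair_histogram(pos) - target) ** 2).sum()

        for step in range(2000):
            i = rng.integers(n_atoms)
            trial = pos.copy()
            trial[i] = (trial[i] + rng.normal(0.0, 0.3, 3)) % box
            trial_chi2 = ((pair_histogram(trial) - target) ** 2).sum()
            # Accept improving moves, and occasionally worse ones, to escape local minima
            if trial_chi2 < chi2 or rng.random() < np.exp((chi2 - trial_chi2) / 1e-4):
                pos, chi2 = trial, trial_chi2
        print("final chi^2:", chi2)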

  13. The Cancer Cell Line Encyclopedia enables predictive modelling of anticancer drug sensitivity.

    Science.gov (United States)

    Barretina, Jordi; Caponigro, Giordano; Stransky, Nicolas; Venkatesan, Kavitha; Margolin, Adam A; Kim, Sungjoon; Wilson, Christopher J; Lehár, Joseph; Kryukov, Gregory V; Sonkin, Dmitriy; Reddy, Anupama; Liu, Manway; Murray, Lauren; Berger, Michael F; Monahan, John E; Morais, Paula; Meltzer, Jodi; Korejwa, Adam; Jané-Valbuena, Judit; Mapa, Felipa A; Thibault, Joseph; Bric-Furlong, Eva; Raman, Pichai; Shipway, Aaron; Engels, Ingo H; Cheng, Jill; Yu, Guoying K; Yu, Jianjun; Aspesi, Peter; de Silva, Melanie; Jagtap, Kalpana; Jones, Michael D; Wang, Li; Hatton, Charles; Palescandolo, Emanuele; Gupta, Supriya; Mahan, Scott; Sougnez, Carrie; Onofrio, Robert C; Liefeld, Ted; MacConaill, Laura; Winckler, Wendy; Reich, Michael; Li, Nanxin; Mesirov, Jill P; Gabriel, Stacey B; Getz, Gad; Ardlie, Kristin; Chan, Vivien; Myer, Vic E; Weber, Barbara L; Porter, Jeff; Warmuth, Markus; Finan, Peter; Harris, Jennifer L; Meyerson, Matthew; Golub, Todd R; Morrissey, Michael P; Sellers, William R; Schlegel, Robert; Garraway, Levi A

    2012-03-28

    The systematic translation of cancer genomic data into knowledge of tumour biology and therapeutic possibilities remains challenging. Such efforts should be greatly aided by robust preclinical model systems that reflect the genomic diversity of human cancers and for which detailed genetic and pharmacological annotation is available. Here we describe the Cancer Cell Line Encyclopedia (CCLE): a compilation of gene expression, chromosomal copy number and massively parallel sequencing data from 947 human cancer cell lines. When coupled with pharmacological profiles for 24 anticancer drugs across 479 of the cell lines, this collection allowed identification of genetic, lineage, and gene-expression-based predictors of drug sensitivity. In addition to known predictors, we found that plasma cell lineage correlated with sensitivity to IGF1 receptor inhibitors; AHR expression was associated with MEK inhibitor efficacy in NRAS-mutant lines; and SLFN11 expression predicted sensitivity to topoisomerase inhibitors. Together, our results indicate that large, annotated cell-line collections may help to enable preclinical stratification schemata for anticancer agents. The generation of genetic predictions of drug response in the preclinical setting and their incorporation into cancer clinical trial design could speed the emergence of 'personalized' therapeutic regimens.

  14. The Cancer Cell Line Encyclopedia enables predictive modeling of anticancer drug sensitivity

    Science.gov (United States)

    Barretina, Jordi; Caponigro, Giordano; Stransky, Nicolas; Venkatesan, Kavitha; Margolin, Adam A.; Kim, Sungjoon; Wilson, Christopher J.; Lehár, Joseph; Kryukov, Gregory V.; Sonkin, Dmitriy; Reddy, Anupama; Liu, Manway; Murray, Lauren; Berger, Michael F.; Monahan, John E.; Morais, Paula; Meltzer, Jodi; Korejwa, Adam; Jané-Valbuena, Judit; Mapa, Felipa A.; Thibault, Joseph; Bric-Furlong, Eva; Raman, Pichai; Shipway, Aaron; Engels, Ingo H.; Cheng, Jill; Yu, Guoying K.; Yu, Jianjun; Aspesi, Peter; de Silva, Melanie; Jagtap, Kalpana; Jones, Michael D.; Wang, Li; Hatton, Charles; Palescandolo, Emanuele; Gupta, Supriya; Mahan, Scott; Sougnez, Carrie; Onofrio, Robert C.; Liefeld, Ted; MacConaill, Laura; Winckler, Wendy; Reich, Michael; Li, Nanxin; Mesirov, Jill P.; Gabriel, Stacey B.; Getz, Gad; Ardlie, Kristin; Chan, Vivien; Myer, Vic E.; Weber, Barbara L.; Porter, Jeff; Warmuth, Markus; Finan, Peter; Harris, Jennifer L.; Meyerson, Matthew; Golub, Todd R.; Morrissey, Michael P.; Sellers, William R.; Schlegel, Robert; Garraway, Levi A.

    2012-01-01

    The systematic translation of cancer genomic data into knowledge of tumor biology and therapeutic avenues remains challenging. Such efforts should be greatly aided by robust preclinical model systems that reflect the genomic diversity of human cancers and for which detailed genetic and pharmacologic annotation is available [1]. Here we describe the Cancer Cell Line Encyclopedia (CCLE): a compilation of gene expression, chromosomal copy number, and massively parallel sequencing data from 947 human cancer cell lines. When coupled with pharmacologic profiles for 24 anticancer drugs across 479 of the lines, this collection allowed identification of genetic, lineage, and gene expression-based predictors of drug sensitivity. In addition to known predictors, we found that plasma cell lineage correlated with sensitivity to IGF1 receptor inhibitors; AHR expression was associated with MEK inhibitor efficacy in NRAS-mutant lines; and SLFN11 expression predicted sensitivity to topoisomerase inhibitors. Altogether, our results suggest that large, annotated cell line collections may help to enable preclinical stratification schemata for anticancer agents. The generation of genetic predictions of drug response in the preclinical setting and their incorporation into cancer clinical trial design could speed the emergence of “personalized” therapeutic regimens [2]. PMID:22460905
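
    A generic sketch of the kind of predictive modelling such a collection enables is given below: a regularised (elastic net) regression of a drug-sensitivity measure on cell-line features. The data are synthetic and the feature dimensions are placeholders; this is not the CCLE analysis pipeline itself.

        # Generic elastic-net sketch: predict drug sensitivity from cell-line features.
        import numpy as np
        from sklearn.linear_model import ElasticNetCV
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        n_lines, n_features = 479, 1000                 # e.g. cell lines x expression features
        X = rng.normal(size=(n_lines, n_features))
        true_coef = np.zeros(n_features)
        true_coef[:10] = rng.normal(size=10)            # a few informative features
        y = X @ true_coef + rng.normal(scale=0.5, size=n_lines)   # e.g. drug activity area

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5).fit(X_tr, y_tr)
        print("held-out R^2:", round(model.score(X_te, y_te), 3))
        print("non-zero predictors:", int(np.sum(model.coef_ != 0)))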

  15. Fractal Geometry Enables Classification of Different Lung Morphologies in a Model of Experimental Asthma

    Science.gov (United States)

    Obert, Martin; Hagner, Stefanie; Krombach, Gabriele A.; Inan, Selcuk; Renz, Harald

    2015-06-01

    Animal models represent the basis of our current understanding of the pathophysiology of asthma and are of central importance in the preclinical development of drug therapies. The characterization of irregular lung shapes is a major issue in radiological imaging of mice in these models. The aim of this study was to find out whether differences in lung morphology can be described by fractal geometry. Healthy and asthmatic mouse groups, before and after an acute asthma attack induced by methacholine, were studied. In vivo flat-panel-based high-resolution computed tomography (CT) was used for thorax imaging of the mice. The digital image data of the mice's lungs were segmented from the surrounding tissue. After that, the lungs were divided by image gray-level thresholds into two additional subsets. One subset contained essentially the air-transporting bronchial system; the other subset corresponds mainly to the blood vessel system. We estimated the fractal dimension of all sets of the different mouse groups using the mass-radius relation (mrr). We found that the air-transporting subset of the bronchial lung tissue enables a complete and significant differentiation between all four mouse groups (mean D of control mice before methacholine treatment: 2.64 ± 0.06; after treatment: 2.76 ± 0.03; asthma mice before methacholine treatment: 2.37 ± 0.16; after treatment: 2.71 ± 0.03; p < 0.05). We conclude that the concept of fractal geometry allows a well-defined, quantitative, numerical and objective differentiation of lung shapes — applicable most likely also in human asthma diagnostics.
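
    The mass-radius relation used above estimates D as the slope of log(mass within radius r) versus log(r) about a reference point. The sketch below applies this to a binary 3-D voxel set; the synthetic filled ball (for which D should be close to 3) stands in for the segmented bronchial subset and is not the study's CT data.

        # Mass-radius fractal dimension estimate for a binary 3-D voxel set (illustration).
        import numpy as np

        def mass_radius_dimension(voxels, centre, radii):
            idx = np.argwhere(voxels)                             # occupied voxel coordinates
            dist = np.linalg.norm(idx - np.asarray(centre), axis=1)
            mass = np.array([(dist <= r).sum() for r in radii])
            slope, _ = np.polyfit(np.log(radii), np.log(mass), 1)
            return slope                                          # estimate of D

        grid = np.zeros((64, 64, 64), dtype=bool)                 # synthetic filled ball
        z, y, x = np.ogrid[:64, :64, :64]
        grid[(z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 <= 30 ** 2] = True
        print(mass_radius_dimension(grid, (32, 32, 32), radii=np.arange(4, 28, 2)))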

  16. Enabling Parametric Optimal Ascent Trajectory Modeling During Early Phases of Design

    Science.gov (United States)

    Holt, James B.; Dees, Patrick D.; Diaz, Manuel J.

    2015-01-01

    -modal due to the interaction of various constraints. These obstacles are compounded by the Program to Optimize Simulated Trajectories (POST) [1], an industry-standard but difficult-to-use program for optimizing ascent trajectories, which requires expert trajectory analysts to effectively optimize a vehicle's ascent trajectory. As has been pointed out, the paradigm of trajectory optimization is still a very manual one, because applying modern computational resources to POST is still a challenging problem: the nuances and difficulties involved in correctly utilizing, and therefore automating, the program present a large problem. In order to address these issues, the authors discuss a methodology that has been developed. The methodology is twofold: first, a set of heuristics is introduced and discussed that were captured while working with expert analysts to replicate the current state of the art; second, the power of modern computing is leveraged to evaluate multiple trajectories simultaneously and thereby enable exploration of the trajectory's design space early during the pre-conceptual and conceptual phases of design. When this methodology is coupled with design of experiments in order to train surrogate models, the authors were able to visualize the trajectory design space, enabling parametric optimal ascent trajectory information to be introduced alongside other pre-conceptual and conceptual design tools. The potential impact of this methodology's success would be a fully automated POST evaluation suite for the purpose of conceptual and preliminary design trade studies. This will enable engineers to characterize the ascent trajectory's sensitivity to design changes in an arbitrary number of dimensions and to find settings for trajectory-specific variables which result in optimal performance for a "dialed-in" launch vehicle design. The effort described in this paper was developed for the Advanced Concepts Office [2] at NASA Marshall

  17. Onboard autonomous mineral detectors for Mars rovers

    Science.gov (United States)

    Gilmore, M. S.; Bornstein, B.; Castano, R.; Merrill, M.; Greenwood, J.

    2005-12-01

    Mars rovers and orbiters currently collect far more data than can be downlinked to Earth, which reduces mission science return; this problem will be exacerbated by future rovers of enhanced capabilities and lifetimes. We are developing onboard intelligence sufficient to extract geologically meaningful data from spectrometer measurements of soil and rock samples, and thus to guide the selection, measurement and return of these data from significant targets at Mars. Here we report on techniques to construct mineral detectors capable of running on current and future rover and orbital hardware. We focus on carbonate and sulfate minerals which are of particular geologic importance because they can signal the presence of water and possibly life. Sulfates have also been discovered at the Eagle and Endurance craters in Meridiani Planum by the Mars Exploration Rover (MER) Opportunity and at other regions on Mars by the OMEGA instrument aboard Mars Express. We have developed highly accurate artificial neural network (ANN) and Support Vector Machine (SVM) based detectors capable of identifying calcite (CaCO3) and jarosite (KFe3(SO4)2(OH)6) in the visible/NIR (350-2500 nm) spectra of both laboratory specimens and rocks in Mars analogue field environments. To train the detectors, we used a generative model to create 1000s of linear mixtures of library end-member spectra in geologically realistic percentages. We have also augmented the model to include nonlinear mixing based on Hapke's models of bidirectional reflectance spectroscopy. Both detectors perform well on the spectra of real rocks that contain intimate mixtures of minerals, rocks in natural field environments, calcite covered by Mars analogue dust, and AVIRIS hyperspectral cubes. We will discuss the comparison of ANN and SVM classifiers for this task, technical challenges (weathering rinds, atmospheric compositions, and computational complexity), and plans for integration of these detectors into both the Coupled Layer
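
    The training strategy described above (linear mixtures of library end-member spectra plus a supervised classifier) can be sketched as follows; the "library" spectra, mixture fractions, abundance threshold, and SVM settings are placeholders rather than the mission detectors.

        # Illustrative sketch: train an SVM mineral detector on synthetic linear mixtures.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(4)
        n_bands, n_endmembers, n_mixtures = 215, 6, 2000        # e.g. 350-2500 nm, resampled
        library = rng.uniform(0.05, 0.9, size=(n_endmembers, n_bands))   # end-member reflectances
        target_idx = 0                                           # index of the mineral of interest

        fractions = rng.dirichlet(np.ones(n_endmembers), size=n_mixtures)   # abundances sum to 1
        spectra = fractions @ library + rng.normal(scale=0.01, size=(n_mixtures, n_bands))
        labels = (fractions[:, target_idx] > 0.10).astype(int)   # "contains target" above 10%

        clf = SVC(kernel="rbf", C=10.0, gamma="scale")
        print("cross-validated accuracy:", cross_val_score(clf, spectra, labels, cv=5).mean())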

  18. Evaluating the skills of isotope-enabled general circulation models against in situ atmospheric water vapor isotope observations

    DEFF Research Database (Denmark)

    Steen-Larsen, Hans Christian; Risi, C.; Werner, M.

    2017-01-01

    The skills of isotope-enabled general circulation models are evaluated against atmospheric water vapor isotopes. We have combined in situ observations of surface water vapor isotopes spanning multiple field seasons (2010, 2011, and 2012) from the top of the Greenland Ice Sheet (NEEM site: 77.45°N......: 2014). This allows us to benchmark the ability to simulate the daily water vapor isotope variations from five different simulations using isotope-enabled general circulation models. Our model-data comparison documents clear isotope biases both on top of the Greenland Ice Sheet (1-11% for δ18O and 4...... boundary layer water vapor isotopes of the Baffin Bay region show strong influence on the water vapor isotopes at the NEEM deep ice core-drilling site in northwest Greenland. Our evaluation of the simulations using isotope-enabled general circulation models also documents wide intermodel spatial...

  19. Enabling School Structure, Collective Responsibility, and a Culture of Academic Optimism: Toward a Robust Model of School Performance in Taiwan

    Science.gov (United States)

    Wu, Jason H.; Hoy, Wayne K.; Tarter, C. John

    2013-01-01

    Purpose: The purpose of this research is twofold: to test a theory of academic optimism in Taiwan elementary schools and to expand the theory by adding new variables, collective responsibility and enabling school structure, to the model. Design/methodology/approach: Structural equation modeling was used to test, refine, and expand an…

  20. Model-Based Off-Nominal State Isolation and Detection System for Autonomous Fault Management, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed model-based Fault Management system addresses the need for cost-effective solutions that enable higher levels of onboard spacecraft autonomy to reliably...

  1. Model-Based Off-Nominal State Isolation and Detection System for Autonomous Fault Management, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed model-based Fault Management system addresses the need for cost-effective solutions that enable higher levels of onboard spacecraft autonomy to reliably...

  2. A New GPU-Enabled MODTRAN Thermal Model for the PLUME TRACKER Volcanic Emission Analysis Toolkit

    Science.gov (United States)

    Acharya, P. K.; Berk, A.; Guiang, C.; Kennett, R.; Perkins, T.; Realmuto, V. J.

    2013-12-01

    Real-time quantification of volcanic gaseous and particulate releases is important for (1) recognizing rapid increases in SO2 gaseous emissions which may signal an impending eruption; (2) characterizing ash clouds to enable safe and efficient commercial aviation; and (3) quantifying the impact of volcanic aerosols on climate forcing. The Jet Propulsion Laboratory (JPL) has developed state-of-the-art algorithms, embedded in their analyst-driven Plume Tracker toolkit, for performing SO2, NH3, and CH4 retrievals from remotely sensed multi-spectral Thermal InfraRed spectral imagery. While Plume Tracker provides accurate results, it typically requires extensive analyst time. A major bottleneck in this processing is the relatively slow but accurate FORTRAN-based MODTRAN atmospheric and plume radiance model, developed by Spectral Sciences, Inc. (SSI). To overcome this bottleneck, SSI in collaboration with JPL, is porting these slow thermal radiance algorithms onto massively parallel, relatively inexpensive and commercially-available GPUs. This paper discusses SSI's efforts to accelerate the MODTRAN thermal emission algorithms used by Plume Tracker. Specifically, we are developing a GPU implementation of the Curtis-Godson averaging and the Voigt in-band transmittances from near line center molecular absorption, which comprise the major computational bottleneck. The transmittance calculations were decomposed into separate functions, individually implemented as GPU kernels, and tested for accuracy and performance relative to the original CPU code. Speedup factors of 14 to 30× were realized for individual processing components on an NVIDIA GeForce GTX 295 graphics card with no loss of accuracy. Due to the separate host (CPU) and device (GPU) memory spaces, a redesign of the MODTRAN architecture was required to ensure efficient data transfer between host and device, and to facilitate high parallel throughput. Currently, we are incorporating the separate GPU kernels into a

  3. Onboard Plasmatron Hydrogen Production for Improved Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Daniel R. Cohn; Leslie Bromberg; Kamal Hadidi

    2005-12-31

    technology for onboard applications in internal combustion engine vehicles using diesel, gasoline and biofuels. This included the reduction of NOx and particulate matter emissions from diesel engines using plasmatron reformer generated hydrogen-rich gas, conversion of ethanol and bio-oils into hydrogen rich gas, and the development of new concepts for the use of plasmatron fuel reformers for enablement of HCCI engines.

  4. Toward quantitative prediction of charge mobility in organic semiconductors: tunneling enabled hopping model.

    Science.gov (United States)

    Geng, Hua; Peng, Qian; Wang, Linjun; Li, Haijiao; Liao, Yi; Ma, Zhiying; Shuai, Zhigang

    2012-07-10

    A tunneling-enabled hopping mechanism is proposed, providing a practical tool for quantitatively assessing charge mobility in organic semiconductors. The paradoxical phenomenon in TIPS-pentacene is thereby explained: the optical probe indicates localized charges while transport measurements show band-like charge transport. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
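
    As background only (the abstract does not reproduce the model's equations), hopping models of this kind typically start from a site-to-site transfer rate and convert it to a mobility through a diffusion coefficient and the Einstein relation; the semiclassical Marcus form below is the usual baseline, which the tunneling-enabled model refines (the refined rate itself is not given here):

        k_{ij} = \frac{V_{ij}^{2}}{\hbar}\,\sqrt{\frac{\pi}{\lambda k_{B} T}}\,
                 \exp\!\left[-\frac{(\Delta G_{ij}+\lambda)^{2}}{4\lambda k_{B} T}\right],
        \qquad
        D = \frac{1}{2d}\sum_{j} r_{ij}^{2}\, k_{ij}\, P_{ij},
        \qquad
        \mu = \frac{e D}{k_{B} T},

    where V_ij is the electronic coupling between neighbouring sites, λ the reorganization energy, ΔG_ij the site-energy difference, r_ij the hopping distance, P_ij the normalized hopping probability, d the dimensionality, and μ the charge mobility.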

  5. Modeling and Simulation With Operational Databases to Enable Dynamic Situation Assessment & Prediction

    Science.gov (United States)

    2010-11-01

    subsections discuss the design of the simulations. 3.12.1 Lanchester5D Simulation: A Lanchester simulation was developed to conduct performance...benchmarks using the WarpIV Kernel and HyperWarpSpeed. The Lanchester simulation contains a user-definable number of grid cells in which blue and red...forces engage in battle using Lanchester equations. Having a user-definable number of grid cells enables the simulation to be stressed with high entity
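
    For reference, the aimed-fire Lanchester equations used for in-cell attrition are dB/dt = -a R and dR/dt = -b B; a minimal integration sketch is given below, with placeholder attrition coefficients and initial strengths unrelated to the benchmark itself.

        # Minimal Euler integration of the aimed-fire Lanchester equations (placeholder values).
        def lanchester(blue, red, a, b, dt=0.01, t_end=10.0):
            for _ in range(int(t_end / dt)):
                blue, red = max(blue - a * red * dt, 0.0), max(red - b * blue * dt, 0.0)
                if blue == 0.0 or red == 0.0:
                    break
            return blue, red

        print(lanchester(blue=1000.0, red=800.0, a=0.05, b=0.08))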

  6. Enabling proactive agricultural drainage reuse for improved water quality through collaborative networks and low-complexity data-driven modelling

    OpenAIRE

    Zia, Huma

    2015-01-01

    With increasing prevalence of Wireless Sensor Networks (WSNs) in agriculture and hydrology, there exists an opportunity for providing a technologically viable solution for the conservation of already scarce fresh water resources. In this thesis, a novel framework is proposed for enabling a proactive management of agricultural drainage and nutrient losses at farm scale where complex models are replaced by in-situ sensing, communication and low complexity predictive models suited to an autonomo...

  7. Enabling Integrated Decision Making for Electronic-Commerce by Modelling an Enterprise's Sharable Knowledge.

    Science.gov (United States)

    Kim, Henry M.

    2000-01-01

    An enterprise model, a computational model of knowledge about an enterprise, is a useful tool for integrated decision-making by e-commerce suppliers and customers. Sharable knowledge, once represented in an enterprise model, can be integrated by the modeled enterprise's e-commerce partners. Presents background on enterprise modeling, followed by…

  8. On-board Data Mining

    Science.gov (United States)

    Tanner, Steve; Stein, Cara; Graves, Sara J.

    Networks of remote sensors are becoming more common as technology improves and costs decline. In the past, a remote sensor was usually a device that collected data to be retrieved at a later time by some other mechanism. These collected data were usually processed well after the fact at a computer greatly removed from the in situ sensing location. This has begun to change as sensor technology, on-board processing, and network communication capabilities have increased and their prices have dropped. There has been an explosion in the number of sensors and sensing devices, not just around the world, but literally throughout the solar system. These sensors are not only becoming vastly more sophisticated, accurate, and detailed in the data they gather but they are also becoming cheaper, lighter, and smaller. At the same time, engineers have developed improved methods to embed computing systems, memory, storage, and communication capabilities into the platforms that host these sensors. Now, it is not unusual to see large networks of sensors working in cooperation with one another. Nor does it seem strange to see the autonomous operation of sensor-based systems, from space-based satellites to smart vacuum cleaners that keep our homes clean and robotic toys that help to entertain and educate our children. But access to sensor data and computing power is only part of the story. For all the power of these systems, there are still substantial limits to what they can accomplish. These include the well-known limits to current Artificial Intelligence capabilities and our limited ability to program the abstract concepts, goals, and improvisation needed for fully autonomous systems. But it also includes much more basic engineering problems such as lack of adequate power, communications bandwidth, and memory, as well as problems with the geolocation and real-time georeferencing required to integrate data from multiple sensors to be used together.

  9. Rapid Onboard Trajectory Design for Autonomous Spacecraft in Multibody Systems

    Science.gov (United States)

    Trumbauer, Eric Michael

    This research develops automated, on-board trajectory planning algorithms in order to support current and new mission concepts. These include orbiter missions to Phobos or Deimos, Outer Planet Moon orbiters, and robotic and crewed missions to small bodies. The challenges stem from the limited on-board computing resources which restrict full trajectory optimization with guaranteed convergence in complex dynamical environments. The approach taken consists of leveraging pre-mission computations to create a large database of pre-computed orbits and arcs. Such a database is used to generate a discrete representation of the dynamics in the form of a directed graph, which acts to index these arcs. This allows the use of graph search algorithms on-board in order to provide good approximate solutions to the path planning problem. Coupled with robust differential correction and optimization techniques, this enables the determination of an efficient path between any boundary conditions with very little time and computing effort. Furthermore, the optimization methods developed here based on sequential convex programming are shown to have provable convergence properties, as well as generating feasible major iterates in case of a system interrupt -- a key requirement for on-board application. The outcome of this project is thus the development of an algorithmic framework which allows the deployment of this approach in a variety of specific mission contexts. Test cases related to missions of interest to NASA and JPL such as a Phobos orbiter and a Near Earth Asteroid interceptor are demonstrated, including the results of an implementation on the RAD750 flight processor. This method fills a gap in the toolbox being developed to create fully autonomous space exploration systems.
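
    The arc-database idea described above can be illustrated with a short sketch: pre-computed trajectory arcs are indexed as edges of a directed graph (nodes are orbits or boundary states, edge weights a cost such as delta-v), and a graph search returns an approximate path that seeds onboard differential correction and optimization. The graph, node names, and costs below are placeholders.

        # Sketch: Dijkstra search over a placeholder database of pre-computed trajectory arcs.
        import heapq

        def dijkstra(graph, start, goal):
            queue, best, parent = [(0.0, start)], {start: 0.0}, {start: None}
            while queue:
                cost, node = heapq.heappop(queue)
                if node == goal:                      # reconstruct the cheapest path found
                    path = []
                    while node is not None:
                        path.append(node)
                        node = parent[node]
                    return cost, path[::-1]
                for nxt, w in graph.get(node, []):
                    new = cost + w
                    if new < best.get(nxt, float("inf")):
                        best[nxt], parent[nxt] = new, node
                        heapq.heappush(queue, (new, nxt))
            return float("inf"), []

        arcs = {"park_orbit": [("dro_A", 120.0), ("halo_B", 150.0)],   # hypothetical arcs
                "dro_A": [("phobos_approach", 60.0)],
                "halo_B": [("phobos_approach", 20.0)],
                "phobos_approach": []}
        print(dijkstra(arcs, "park_orbit", "phobos_approach"))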

  10. Aerial Logistics Management for Carrier Onboard Delivery

    Science.gov (United States)

    2016-09-01

    Naval Postgraduate School, Monterey, California; thesis by Samuel L. Chen, September 2016. Carrier onboard delivery (COD) is the use of aircraft to transport people and cargo from a forward logistics site (FLS) to a carrier strike group (CSG). The goal of

  11. Mesospheric CO2 ice clouds on Mars observed by Planetary Fourier Spectrometer onboard Mars Express

    Science.gov (United States)

    Aoki, S.; Sato, Y.; Giuranna, M.; Wolkenberg, P.; Sato, T. M.; Nakagawa, H.; Kasaba, Y.

    2018-03-01

    We have investigated mesospheric CO2 ice clouds on Mars through analysis of near-infrared spectra acquired by the Planetary Fourier Spectrometer (PFS) onboard Mars Express (MEx) from MY 27 to MY 32. With the highest spectral resolution achieved thus far in the relevant spectral range among remote-sensing experiments orbiting Mars, PFS enables precise identification of the scattering peak of CO2 ice at the bottom of the 4.3 μm CO2 band. A total of 111 occurrences of CO2 ice cloud features have been detected over the period investigated. Data from the OMEGA imaging spectrometer onboard MEx confirm all of the PFS detections from times when OMEGA operated simultaneously with PFS. The spatial and seasonal distributions of the CO2 ice clouds detected by PFS are consistent with previous observations by other instruments. We find CO2 ice clouds between Ls = 0° and 140° in distinct longitudinal corridors around the equatorial region (± 20°N). Moreover, CO2 ice clouds were preferentially detected in the observational LT range between 15 and 16 h in MY 29. However, observational biases prevent us from distinguishing local time dependency from inter-annual variation. PFS also enables us to investigate the shape of mesospheric CO2 ice cloud spectral features in detail. In all cases, peaks were found between 4.240 and 4.265 μm. Relatively small secondary peaks were occasionally observed around 4.28 μm (8 occurrences). These spectral features cannot be reproduced using our radiative transfer model, which may be because the available CO2 ice refractive indices are inappropriate for the mesospheric temperatures of Mars, or because of the assumption in our model that the CO2 ice crystals are spherical and composed of pure CO2 ice.

  12. Proton exchange membrane fuel cells for electrical power generation on-board commercial airplanes

    International Nuclear Information System (INIS)

    Pratt, Joseph W.; Klebanoff, Leonard E.; Munoz-Ramos, Karina; Akhil, Abbas A.; Curgus, Dita B.; Schenkman, Benjamin L.

    2013-01-01

    Highlights: ► We examine proton exchange membrane fuel cells on-board commercial airplanes. ► We model the added fuel cell system’s effect on overall airplane performance. ► It is feasible to implement an on-board fuel cell system with current technology. ► Systems that maximize waste heat recovery are the best performing. ► Current PEM and H2 storage technology results in an airplane performance penalty. -- Abstract: Deployed on a commercial airplane, proton exchange membrane (PEM) fuel cells may offer emissions reductions, thermal efficiency gains, and enable locating the power near the point of use. This work seeks to understand whether on-board fuel cell systems are technically feasible and, if so, whether they could offer a performance advantage for the airplane when using today’s off-the-shelf technology. We also examine the effects of the fuel cell system on airplane performance with (1) different electrical loads, (2) different locations on the airplane, and (3) expected advances in fuel cell and hydrogen storage technologies. Through hardware analysis and thermodynamic simulation, we found that an additional fuel cell system on a commercial airplane is technically feasible using current technology. Although applied to a Boeing 787-type airplane, the method presented is applicable to other airframes as well. Recovery and on-board use of the heat and water generated by the fuel cell is an important way to increase the benefit of such a system. The best performance is achieved when the fuel cell is coupled to a load that utilizes the full output of the fuel cell for the entire flight. The effects of location are small, and location may be better determined by other considerations such as safety and modularity. Although the PEM fuel cell generates power more efficiently than the gas turbine generators currently used, when considering the effect of the fuel cell system on the airplane’s overall performance we found that an overall

  13. Enabling full-field physics-based optical proximity correction via dynamic model generation

    Science.gov (United States)

    Lam, Michael; Clifford, Chris; Raghunathan, Ananthan; Fenger, Germain; Adam, Kostas

    2017-07-01

    As extreme ultraviolet lithography becomes closer to reality for high volume production, its peculiar modeling challenges related to both inter and intrafield effects have necessitated building an optical proximity correction (OPC) infrastructure that operates with field position dependency. Previous state-of-the-art approaches to modeling field dependency used piecewise constant models where static input models are assigned to specific x/y-positions within the field. OPC and simulation could assign the proper static model based on simulation-level placement. However, in the realm of 7 and 5 nm feature sizes, small discontinuities in OPC from piecewise constant model changes can cause unacceptable levels of edge placement errors. The introduction of dynamic model generation (DMG) can be shown to effectively avoid these dislocations by providing unique mask and optical models per simulation region, allowing a near continuum of models through the field. DMG allows unique models for electromagnetic field, apodization, aberrations, etc. to vary through the entire field and provides a capability to precisely and accurately model systematic field signatures.

  14. Contagion effect of enabling or coercive use of costing model within the managerial couple in lean organizations

    DEFF Research Database (Denmark)

    Kristensen, Thomas; Israelsen, Poul

    In the lean strategy, enabling formalization behaviour is expected at the lower levels of management for the strategy to be successful. We study the contagion effect between the superior (the middle manager) and the lower-level manager. This effect is proposed to be a dominant contingency variable for the use of costing models at the lower levels of management. Thus the use of costing models at the middle manager level is an important key to being successful with the lean package.

  15. Help seeking in older Asian people with dementia in Melbourne: using the Cultural Exchange Model to explore barriers and enablers.

    Science.gov (United States)

    Haralambous, Betty; Dow, Briony; Tinney, Jean; Lin, Xiaoping; Blackberry, Irene; Rayner, Victoria; Lee, Sook-Meng; Vrantsidis, Freda; Lautenschlager, Nicola; Logiudice, Dina

    2014-03-01

    The prevalence of dementia is increasing in Australia. Limited research is available on access to Cognitive Dementia and Memory Services (CDAMS) for people with dementia from Culturally and Linguistically Diverse (CALD) communities. This study aimed to determine the barriers and enablers to accessing CDAMS for people with dementia and their families of Chinese and Vietnamese backgrounds. Consultations with community members, community workers and health professionals were conducted using the "Cultural Exchange Model" framework. For carers, barriers to accessing services included the complexity of the health system, lack of time, travel required to get to services, language barriers, interpreters and lack of knowledge of services. Similarly, community workers and health professionals identified language, interpreters, and community perceptions as key barriers to service access. Strategies to increase knowledge included providing information via radio, printed material and education in community group settings. The "Cultural Exchange Model" enabled engagement with and modification of the approaches to meet the needs of the targeted CALD communities.

  16. Annotation of rule-based models with formal semantics to enable creation, analysis, reuse and visualization

    Science.gov (United States)

    Misirli, Goksel; Cavaliere, Matteo; Waites, William; Pocock, Matthew; Madsen, Curtis; Gilfellon, Owen; Honorato-Zimmer, Ricardo; Zuliani, Paolo; Danos, Vincent; Wipat, Anil

    2016-01-01

    Motivation: Biological systems are complex and challenging to model and therefore model reuse is highly desirable. To promote model reuse, models should include both information about the specifics of simulations and the underlying biology in the form of metadata. The availability of computationally tractable metadata is especially important for the effective automated interpretation and processing of models. Metadata are typically represented as machine-readable annotations which enhance programmatic access to information about models. Rule-based languages have emerged as a modelling framework to represent the complexity of biological systems. Annotation approaches have been widely used for reaction-based formalisms such as SBML. However, rule-based languages still lack a rich annotation framework to add semantic information, such as machine-readable descriptions, to the components of a model. Results: We present an annotation framework and guidelines for annotating rule-based models, encoded in the commonly used Kappa and BioNetGen languages. We adapt widely adopted annotation approaches to rule-based models. We initially propose a syntax to store machine-readable annotations and describe a mapping between rule-based modelling entities, such as agents and rules, and their annotations. We then describe an ontology to both annotate these models and capture the information contained therein, and demonstrate annotating these models using examples. Finally, we present a proof of concept tool for extracting annotations from a model that can be queried and analyzed in a uniform way. The uniform representation of the annotations can be used to facilitate the creation, analysis, reuse and visualization of rule-based models. Although examples are given, using specific implementations the proposed techniques can be applied to rule-based models in general. Availability and implementation: The annotation ontology for rule-based models can be found at http

  17. Systemic therapy and the social relational model of disability: enabling practices with people with intellectual disability

    OpenAIRE

    Haydon-Laurelut, Mark

    2009-01-01

    Therapy has been critiqued for personalizing the political (Kitzinger, 1993). The social-relational model (Thomas, 1999) is one theoretical resource for understanding the practices of therapy through a political lens. The social model(s) have viewed therapy with suspicion. This paper highlights – using composite case examples and the author's primary therapeutic modality, systemic therapy – some systemic practices with adults with Intellectual Disability (ID) that enact a position that it is s...

  18. Enabling intelligent copernicus services for carbon and water balance modeling of boreal forest ecosystems - North State

    Science.gov (United States)

    Häme, Tuomas; Mutanen, Teemu; Rauste, Yrjö; Antropov, Oleg; Molinier, Matthieu; Quegan, Shaun; Kantzas, Euripides; Mäkelä, Annikki; Minunno, Francesco; Atli Benediktsson, Jon; Falco, Nicola; Arnason, Kolbeinn; Storvold, Rune; Haarpaintner, Jörg; Elsakov, Vladimir; Rasinmäki, Jussi

    2015-04-01

    The objective of project North State, funded by Framework Program 7 of the European Union, is to develop innovative data fusion methods that exploit the new generation of multi-source data from Sentinels and other satellites in an intelligent, self-learning framework. The remote sensing outputs are interfaced with state-of-the-art carbon and water flux models for monitoring the fluxes over boreal Europe to reduce current large uncertainties. This will provide a paradigm for the development of products for future Copernicus services. The models to be interfaced are a dynamic vegetation model and a light use efficiency model. We have identified four groups of variables that will be estimated with remotely sensed data: land cover variables, forest characteristics, vegetation activity, and hydrological variables. The estimates will be used as model inputs and to validate the model outputs. The earth observation variables are computed as automatically as possible, with the objective of fully automatic estimation. North State has two sites for intensive studies in southern and northern Finland, respectively, one in Iceland and one in the Komi Republic of Russia. Additionally, the model input variables will be estimated and models applied over the European boreal and sub-arctic region from the Ural Mountains to Iceland. The accuracy assessment of the earth observation variables will follow a statistical sampling design. Model output predictions are compared to earth observation variables. Flux tower measurements are also applied in the model assessment. In the paper, results of hyperspectral, Sentinel-1, and Landsat data and their use in the models are presented. An example of completely automatic land cover class prediction is also reported.

  19. ICoNOs MM: The IT-enabled Collaborative Networked Organizations Maturity Model

    NARCIS (Netherlands)

    Santana Tapia, R.G.

    2009-01-01

    The focus of this paper is to introduce a comprehensive model for assessing and improving maturity of business-IT alignment (B-ITa) in collaborative networked organizations (CNOs): the ICoNOs MM. This two dimensional maturity model (MM) addresses five levels of maturity as well as four domains to

  20. Investigating dye performance and crosstalk in fluorescence enabled bioimaging using a model system

    DEFF Research Database (Denmark)

    Arppe, Riikka; R. Carro-Temboury, Miguel; Hempel, Casper

    2017-01-01

    ...cross-talk of fluorophores on the detected fluorescence signal. The described model system comprises lanthanide(III) ion-doped Linde Type A zeolites dispersed in a PVA film stained with fluorophores. We tested: F18, MitoTracker Red and ATTO647N. This model system allowed comparing the performance of the fluorophores...

  1. A controlled human malaria infection model enabling evaluation of transmission-blocking interventions

    NARCIS (Netherlands)

    Collins, K.A.; Wang, C.Y.; Adams, M.; Mitchell, H.; Rampton, M.; Elliott, S.; Reuling, I.J.; Bousema, T.; Sauerwein, R.; Chalon, S.; Mohrle, J.J.; McCarthy, J.S.

    2018-01-01

    BACKGROUND: Drugs and vaccines that can interrupt the transmission of Plasmodium falciparum will be important for malaria control and elimination. However, models for early clinical evaluation of candidate transmission-blocking interventions are currently unavailable. Here, we describe a new model

  2. Open Knee: Open Source Modeling & Simulation to Enable Scientific Discovery and Clinical Care in Knee Biomechanics

    Science.gov (United States)

    Erdemir, Ahmet

    2016-01-01

    Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical function of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor-intensive reproduction of model development steps can be avoided. The interested parties can immediately utilize readily available models for scientific discovery and for clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes detailed anatomical representation of the joint's major tissue structures, their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators is also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next generation knee models is noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age

  3. STOL terminal area operating systems (aircraft and onboard avionics, ATC, navigation aids)

    Science.gov (United States)

    Burrous, C.; Erzberger, H.; Johnson, N.; Neuman, F.

    1974-01-01

    Operational procedures and systems onboard the STOL aircraft which are required to enable the aircraft to perform acceptably in restricted airspace in all types of atmospheric conditions and weather are discussed. Results of simulation and flight investigations to establish operational criteria are presented.

  4. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  5. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  6. A GIS-Enabled, Michigan-Specific, Hierarchical Groundwater Modeling and Visualization System

    Science.gov (United States)

    Liu, Q.; Li, S.; Mandle, R.; Simard, A.; Fisher, B.; Brown, E.; Ross, S.

    2005-12-01

    Efficient management of groundwater resources relies on a comprehensive database that represents the characteristics of the natural groundwater system as well as analysis and modeling tools to describe the impacts of decision alternatives. Many agencies in Michigan have spent several years compiling expensive and comprehensive surface water and groundwater inventories and other related spatial data that describe their respective areas of responsibility. However, most often this wealth of descriptive data has only been utilized for basic mapping purposes. The benefits from analyzing these data, using GIS analysis functions or externally developed analysis models or programs, have yet to be systematically realized. In this talk, we present a comprehensive software environment that allows Michigan groundwater resources managers and frontline professionals to make more effective use of the available data and improve their ability to manage and protect groundwater resources, address potential conflicts, design cleanup schemes, and prioritize investigation activities. In particular, we take advantage of the Interactive Ground Water (IGW) modeling system and convert it to a customized software environment specifically for analyzing, modeling, and visualizing the Michigan statewide groundwater database. The resulting Michigan IGW modeling system (IGW-M) is completely window-based, fully interactive, and seamlessly integrated with a GIS mapping engine. The system operates in real-time (on the fly) providing dynamic, hierarchical mapping, modeling, spatial analysis, and visualization. Specifically, IGW-M allows water resources and environmental professionals in Michigan to: * Access and utilize the extensive data from the statewide groundwater database, interactively manipulate GIS objects, and display and query the associated data and attributes; * Analyze and model the statewide groundwater database, interactively convert GIS objects into numerical model features

  7. The DSET Tool Library: A software approach to enable data exchange between climate system models

    Energy Technology Data Exchange (ETDEWEB)

    McCormick, J. [Lawrence Livermore National Lab., CA (United States)

    1994-12-01

    Climate modeling is a computationally intensive process. Until recently, computers were not powerful enough to perform the complex calculations required to simulate the earth's climate. As a result, standalone programs were created that represent components of the earth's climate (e.g., Atmospheric Circulation Model). However, recent advances in computing, including massively parallel computing, make it possible to couple the components forming a complete earth climate simulation. The ability to couple different climate model components will significantly improve our ability to predict climate accurately and reliably. Historically, each major component of the coupled earth simulation is a standalone program designed independently with different coordinate systems and data representations. In order for two component models to be coupled, the data of one model must be mapped to the coordinate system of the second model. The focus of this project is to provide a general tool to facilitate the mapping of data between simulation components, with an emphasis on using object-oriented programming techniques to provide polynomial interpolation, line and area weighting, and aggregation services.
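    The record's core service is regridding: data defined on one component's grid must be mapped onto another component's grid. As a generic illustration of that kind of mapping (a minimal sketch using SciPy bilinear interpolation; the grids, field, and names are hypothetical and not the DSET Tool Library's actual interfaces):

        # Illustrative grid-to-grid mapping between two climate model components
        # (hypothetical grids and names; not the DSET Tool Library API).
        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Source: a field on a coarse "atmosphere" latitude/longitude grid.
        src_lat = np.linspace(-90, 90, 46)          # 4-degree spacing
        src_lon = np.linspace(0, 360, 73)           # 5-degree spacing
        field = np.cos(np.deg2rad(src_lat))[:, None] * np.ones(src_lon.size)[None, :]

        # Target: a finer "ocean" grid that needs the same field as input.
        tgt_lat = np.linspace(-90, 90, 181)
        tgt_lon = np.linspace(0, 360, 361)

        # Bilinear interpolation stands in for the polynomial/area-weighting services.
        interp = RegularGridInterpolator((src_lat, src_lon), field, method="linear")
        lat2d, lon2d = np.meshgrid(tgt_lat, tgt_lon, indexing="ij")
        points = np.stack([lat2d.ravel(), lon2d.ravel()], axis=-1)
        regridded = interp(points).reshape(lat2d.shape)
        print(regridded.shape)                      # (181, 361)

    Conservative area-weighted remapping would normally be preferred for flux fields; interpolation is shown here only because it is the simplest mapping to sketch.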

  8. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    Science.gov (United States)

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with the public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports both modes, step-by-step analysis and auto-computing process, respectively for preliminary evaluation and real-time computation. The proposed model was evaluated by recomputing prior research studies in relation to the epidemiological measurement of diseases caused by either heavy metal exposure in the environment or clinical complications in hospital. The simulation validity was verified against commercial statistics software. The model was installed in a stand-alone computer and in a cloud-server workstation to verify computing performance for a data amount of more than 230K sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. IMPEx : enabling model/observational data comparison in planetary plasma sciences

    Science.gov (United States)

    Génot, V.; Khodachenko, M.; Kallio, E. J.; Al-Ubaidi, T.; Alexeev, I. I.; Topf, F.; Gangloff, M.; André, N.; Bourrel, N.; Modolo, R.; Hess, S.; Perez-Suarez, D.; Belenkaya, E. S.; Kalegaev, V.

    2013-09-01

    The FP7 IMPEx infrastructure, whose general goal is to encourage and facilitate inter-comparison between observational and model data in planetary plasma sciences, has now been established for two years. This presentation will focus on a tour of the different achievements which occurred during this period. Within the project, data originate from multiple sources: large observational databases (CDAWeb, AMDA at CDPP, CLWeb at IRAP), simulation databases for hybrid and MHD codes (FMI, LATMOS), planetary magnetic field models database and online services (SINP). Each of these databases proposes dedicated access to their models and runs (HWA@FMI, LATHYS@LATMOS, SMDC@SINP). To gather this large data ensemble, IMPEx offers a distributed framework in which these data may be visualized, analyzed, and shared thanks to interoperable tools; they comprise AMDA - an online space physics analysis tool -, 3DView - a tool for data visualization in a 3D planetary context -, and CLWeb - an online space physics visualization tool. A simulation data model, based on SPASE, has been designed to ease data exchange within the infrastructure. From the communication point of view, the VO paradigm has been retained and the architecture is based on web services and the IVOA protocol SAMP. The presentation will focus on how the tools may be operated synchronously to manipulate these heterogeneous data sets. Use cases based on in-flight missions and associated model runs will be proposed for the demonstration. Finally, the motivation and functionalities of the future IMPEx portal will be presented. As requirements to and potentialities of joining the IMPEx infrastructure will be shown, the presentation could be seen as an invitation to other modeling teams in the community which may be interested in promoting their results via IMPEx.

  10. Integrating semantics and procedural generation: key enabling factors for declarative modeling of virtual worlds

    NARCIS (Netherlands)

    Bidarra, R.; Kraker, K.J. de; Smelik, R.M.; Tutenel, T.

    2010-01-01

    Manual content creation for virtual worlds can no longer satisfy the increasing demand arising from areas such as entertainment and serious games, simulations, movies, etc. Furthermore, currently deployed modeling tools basically do not scale up: while they become more and more specialized and complex,

  11. Developmental Impact Analysis of an ICT-Enabled Scalable Healthcare Model in BRICS Economies

    Directory of Open Access Journals (Sweden)

    Dhrubes Biswas

    2012-06-01

    Full Text Available This article highlights the need for initiating a healthcare business model in a grassroots, emerging-nation context. This article’s backdrop is a history of chronic anomalies afflicting the healthcare sector in India and similarly placed BRICS nations. In these countries, a significant percentage of populations remain deprived of basic healthcare facilities and emergency services. Community (primary care) services are being offered by public and private stakeholders as a panacea to the problem. Yet, there is an urgent need for specialized (tertiary care) services at all levels. As a response to this challenge, an all-inclusive health-exchange system (HES) model, which utilizes information communication technology (ICT) to provide solutions in rural India, has been developed. The uniqueness of the model lies in its innovative hub-and-spoke architecture and its emphasis on affordability, accessibility, and availability to the masses. This article describes a developmental impact analysis (DIA) that was used to assess the impact of this model. The article contributes to the knowledge base of readers by making them aware of the healthcare challenges emerging nations are facing and ways to mitigate those challenges using entrepreneurial solutions.

  12. Neonatal tolerance induction enables accurate evaluation of gene therapy for MPS I in a canine model.

    Science.gov (United States)

    Hinderer, Christian; Bell, Peter; Louboutin, Jean-Pierre; Katz, Nathan; Zhu, Yanqing; Lin, Gloria; Choa, Ruth; Bagel, Jessica; O'Donnell, Patricia; Fitzgerald, Caitlin A; Langan, Therese; Wang, Ping; Casal, Margret L; Haskins, Mark E; Wilson, James M

    2016-09-01

    High fidelity animal models of human disease are essential for preclinical evaluation of novel gene and protein therapeutics. However, these studies can be complicated by exaggerated immune responses against the human transgene. Here we demonstrate that dogs with a genetic deficiency of the enzyme α-l-iduronidase (IDUA), a model of the lysosomal storage disease mucopolysaccharidosis type I (MPS I), can be rendered immunologically tolerant to human IDUA through neonatal exposure to the enzyme. Using MPS I dogs tolerized to human IDUA as neonates, we evaluated intrathecal delivery of an adeno-associated virus serotype 9 vector expressing human IDUA as a therapy for the central nervous system manifestations of MPS I. These studies established the efficacy of the human vector in the canine model, and allowed for estimation of the minimum effective dose, providing key information for the design of first-in-human trials. This approach can facilitate evaluation of human therapeutics in relevant animal models, and may also have clinical applications for the prevention of immune responses to gene and protein replacement therapies. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    Science.gov (United States)

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose a graph-theoretic approach for quantification of quality-enabled…

  14. Single-shot spiral imaging enabled by an expanded encoding model: Demonstration in diffusion MRI.

    Science.gov (United States)

    Wilm, Bertram J; Barmet, Christoph; Gross, Simon; Kasper, Lars; Vannesjo, S Johanna; Haeberlin, Max; Dietrich, Benjamin E; Brunner, David O; Schmid, Thomas; Pruessmann, Klaas P

    2017-01-01

    The purpose of this work was to improve the quality of single-shot spiral MRI and demonstrate its application for diffusion-weighted imaging. Image formation is based on an expanded encoding model that accounts for dynamic magnetic fields up to third order in space, nonuniform static B0, and coil sensitivity encoding. The encoding model is determined by B0 mapping, sensitivity mapping, and concurrent field monitoring. Reconstruction is performed by iterative inversion of the expanded signal equations. Diffusion-tensor imaging with single-shot spiral readouts is performed in a phantom and in vivo, using a clinical 3T instrument. Image quality is assessed in terms of artefact levels, image congruence, and the influence of the different encoding factors. Using the full encoding model, diffusion-weighted single-shot spiral imaging of high quality is accomplished both in vitro and in vivo. Accounting for actual field dynamics, including higher orders, is found to be critical to suppress blurring, aliasing, and distortion. Enhanced image congruence permitted data fusion and diffusion tensor analysis without coregistration. Use of an expanded signal model largely overcomes the traditional vulnerability of spiral imaging with long readouts. It renders single-shot spirals competitive with echo-planar readouts and thus deploys shorter echo times and superior readout efficiency for diffusion imaging and further prospective applications. Magn Reson Med 77:83-91, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
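    For orientation, an expanded encoding model of this general kind expresses the signal of receive coil \gamma as (generic notation, not necessarily the authors' exact formulation):

        s_\gamma(t) = \int_V c_\gamma(\mathbf{r})\, m(\mathbf{r})\,
                      \exp\!\Big( i \sum_{l} k_l(t)\, b_l(\mathbf{r}) + i\, \Delta\omega_0(\mathbf{r})\, t \Big)\, \mathrm{d}\mathbf{r}

    where c_\gamma is the coil sensitivity, m the transverse magnetization, b_l spatial basis functions up to third order (e.g., spherical harmonics), k_l(t) the concurrently monitored phase coefficients, and \Delta\omega_0 the static off-resonance; reconstruction then iteratively inverts the discretized form of this equation.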

  15. Attention Deficit Hyperactivity Disorder and Scholastic Achievement: A Model of Mediation via Academic Enablers

    Science.gov (United States)

    Volpe, Robert J.; DuPaul, George J.; DiPerna, James C.; Jitendra, Asha K.; Lutz, J. Gary; Tresco, Katy; Junod, Rosemary Vile

    2006-01-01

    The current study examined the influence of symptoms of attention deficit hyperactivity disorder (ADHD) on student academic achievement in reading and in mathematics in a sample of 146 first- through fourth-grade students, 103 of which were identified as having ADHD and academic problems in reading and/or math. A theoretical model was examined…

  16. Energy Consumption Model and Measurement Results for Network Coding-enabled IEEE 802.11 Meshed Wireless Networks

    DEFF Research Database (Denmark)

    Paramanathan, Achuthan; Rasmussen, Ulrik Wilken; Hundebøll, Martin

    2012-01-01

    This paper presents an energy model and energy measurements for network coding enabled wireless meshed networks based on IEEE 802.11 technology. The energy model and the energy measurement testbed are limited to a simple Alice and Bob scenario. For this toy scenario we compare the energy usages... for a system with and without network coding support. While network coding reduces the number of radio transmissions, the operational activity on the devices due to coding will be increased. We derive an analytical model for the energy consumption and compare it to real measurements for which we build... a flexible, low-cost tool to be able to measure at any given node in a meshed network. We verify the precision of our tool by comparing it to a sophisticated device. Our main results in this paper are the derivation of an analytical energy model, the implementation of a distributed energy measurement testbed...
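    The qualitative trade-off described above can be made concrete with a toy transmission count for the Alice-relay-Bob exchange (illustrative per-operation energy costs only; not the paper's measurements or model):

        # Toy energy comparison for one bidirectional packet exchange via a relay
        # (illustrative numbers; not the measured values from the paper).
        E_TX, E_RX, E_CODE = 1.0, 0.6, 0.05    # energy per packet: transmit, receive, XOR coding

        def exchange_without_coding():
            # Alice->relay, relay->Bob, Bob->relay, relay->Alice: 4 transmissions, 4 receptions
            return 4 * E_TX + 4 * E_RX

        def exchange_with_coding():
            # Alice->relay, Bob->relay, relay broadcasts the XOR to both:
            # 3 transmissions, 4 receptions, plus one coding and two decoding operations
            return 3 * E_TX + 4 * E_RX + 3 * E_CODE

        print(exchange_without_coding(), exchange_with_coding())

    Whether coding pays off in practice depends on how the saved transmission compares with the extra processing energy, which is what the measurement testbed quantifies.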

  17. Describing the clinical reasoning process: application of a model of enablement to a pediatric case.

    Science.gov (United States)

    Furze, Jennifer; Nelson, Kelly; O'Hare, Megan; Ortner, Amanda; Threlkeld, A Joseph; Jensen, Gail M

    2013-04-01

    Clinical reasoning is a core tenet of physical therapy practice leading to optimal patient care. The purpose of this case was to describe the outcomes, subjective experience, and reflective clinical reasoning process for a child with cerebral palsy using the International Classification of Functioning, Disability, and Health (ICF) model. Application of the ICF framework to a 9-year-old boy with spastic triplegic cerebral palsy was utilized to capture the interwoven factors present in this case. Interventions in the pool occurred twice weekly for 1 h over a 10-week period. Immediately post and 4 months post-intervention, the child made functional and meaningful gains. The family unit also developed an enjoyment of exercising together. Each individual family member described psychological, emotional, or physical health improvements. Reflection using the ICF model as a framework to discuss clinical reasoning can highlight important factors contributing to effective patient management.

  18. A probabilistic generative model for quantification of DNA modifications enables analysis of demethylation pathways.

    Science.gov (United States)

    Äijö, Tarmo; Huang, Yun; Mannerström, Henrik; Chavez, Lukas; Tsagaratou, Ageliki; Rao, Anjana; Lähdesmäki, Harri

    2016-03-14

    We present a generative model, Lux, to quantify DNA methylation modifications from any combination of bisulfite sequencing approaches, including reduced, oxidative, TET-assisted, chemical-modification assisted, and methylase-assisted bisulfite sequencing data. Lux models all cytosine modifications (C, 5mC, 5hmC, 5fC, and 5caC) simultaneously together with experimental parameters, including bisulfite conversion and oxidation efficiencies, as well as various chemical labeling and protection steps. We show that Lux improves the quantification and comparison of cytosine modification levels and that Lux can process any oxidized methylcytosine sequencing data sets to quantify all cytosine modifications. Analysis of targeted data from Tet2-knockdown embryonic stem cells and T cells during development demonstrates DNA modification quantification at unprecedented detail, quantifies active demethylation pathways and reveals 5hmC localization in putative regulatory regions.

  19. A model to enable indirect manufacturing options transactions between organisations: An application to the ceramic industry

    International Nuclear Information System (INIS)

    Rodriguez-Rodriguez, R.; Gomez-Gasquet, P.; Oltra-Badenes, R. F.

    2014-01-01

    In current competitive contexts, it is widely accepted and proven that inter-enterprise collaboration often leads to better results. The Spanish ceramic industry must improve by reducing its manufacturing costs in order to be able to compete with low-cost products coming from Asia. In this sense, this work presents the main results obtained from applying an innovative model that facilitates the transfer of manufacturing options between two ceramic enterprises that share a common supplier, in the scenario where one of them needs more manufacturing capacity than it booked according to its demand forecast and the other needs less. Decisional mechanisms are then applied that output the values of certain parameters in order to increase the benefit of all three participants. With the application of this model, better organisational results, both economic and in service level, are achieved. (Author)

  20. A User-Centric Knowledge Creation Model in a Web of Object-Enabled Internet of Things Environment

    Science.gov (United States)

    Kibria, Muhammad Golam; Fattah, Sheik Mohammad Mostakim; Jeong, Kwanghyeon; Chong, Ilyoung; Jeong, Youn-Kwae

    2015-01-01

    User-centric service features in a Web of Object-enabled Internet of Things environment can be provided by using a semantic ontology that classifies and integrates objects on the World Wide Web as well as shares and merges context-aware information and accumulated knowledge. The semantic ontology is applied on a Web of Object platform to virtualize the real world physical devices and information to form virtual objects that represent the features and capabilities of devices in the virtual world. Detailed information and functionalities of multiple virtual objects are combined with service rules to form composite virtual objects that offer context-aware knowledge-based services, where context awareness plays an important role in enabling automatic modification of the system to reconfigure the services based on the context. Converting the raw data into meaningful information and connecting the information to form the knowledge and storing and reusing the objects in the knowledge base can both be expressed by semantic ontology. In this paper, a knowledge creation model that synchronizes a service logistic model and a virtual world knowledge model on a Web of Object platform has been proposed. To realize the context-aware knowledge-based service creation and execution, a conceptual semantic ontology model has been developed and a prototype has been implemented for a use case scenario of emergency service. PMID:26393609

  1. A User-Centric Knowledge Creation Model in a Web of Object-Enabled Internet of Things Environment

    Directory of Open Access Journals (Sweden)

    Muhammad Golam Kibria

    2015-09-01

    Full Text Available User-centric service features in a Web of Object-enabled Internet of Things environment can be provided by using a semantic ontology that classifies and integrates objects on the World Wide Web as well as shares and merges context-aware information and accumulated knowledge. The semantic ontology is applied on a Web of Object platform to virtualize the real world physical devices and information to form virtual objects that represent the features and capabilities of devices in the virtual world. Detailed information and functionalities of multiple virtual objects are combined with service rules to form composite virtual objects that offer context-aware knowledge-based services, where context awareness plays an important role in enabling automatic modification of the system to reconfigure the services based on the context. Converting the raw data into meaningful information and connecting the information to form the knowledge and storing and reusing the objects in the knowledge base can both be expressed by semantic ontology. In this paper, a knowledge creation model that synchronizes a service logistic model and a virtual world knowledge model on a Web of Object platform has been proposed. To realize the context-aware knowledge-based service creation and execution, a conceptual semantic ontology model has been developed and a prototype has been implemented for a use case scenario of emergency service.

  2. A User-Centric Knowledge Creation Model in a Web of Object-Enabled Internet of Things Environment.

    Science.gov (United States)

    Kibria, Muhammad Golam; Fattah, Sheik Mohammad Mostakim; Jeong, Kwanghyeon; Chong, Ilyoung; Jeong, Youn-Kwae

    2015-09-18

    User-centric service features in a Web of Object-enabled Internet of Things environment can be provided by using a semantic ontology that classifies and integrates objects on the World Wide Web as well as shares and merges context-aware information and accumulated knowledge. The semantic ontology is applied on a Web of Object platform to virtualize the real world physical devices and information to form virtual objects that represent the features and capabilities of devices in the virtual world. Detailed information and functionalities of multiple virtual objects are combined with service rules to form composite virtual objects that offer context-aware knowledge-based services, where context awareness plays an important role in enabling automatic modification of the system to reconfigure the services based on the context. Converting the raw data into meaningful information and connecting the information to form the knowledge and storing and reusing the objects in the knowledge base can both be expressed by semantic ontology. In this paper, a knowledge creation model that synchronizes a service logistic model and a virtual world knowledge model on a Web of Object platform has been proposed. To realize the context-aware knowledge-based service creation and execution, a conceptual semantic ontology model has been developed and a prototype has been implemented for a use case scenario of emergency service.

  3. Spatiotemporal Stochastic Modeling of IoT Enabled Cellular Networks: Scalability and Stability Analysis

    KAUST Repository

    Gharbieh, Mohammad; Elsawy, Hesham; Bader, Ahmed; Alouini, Mohamed-Slim

    2017-01-01

    The Internet of Things (IoT) is large-scale by nature, which is manifested by the massive number of connected devices as well as their vast spatial existence. Cellular networks, which provide ubiquitous, reliable, and efficient wireless access, will play a fundamental role in delivering the first-mile access for the data tsunami to be generated by the IoT. However, cellular networks may have scalability problems in providing uplink connectivity to massive numbers of connected things. To characterize the scalability of cellular uplink in the context of IoT networks, this paper develops a traffic-aware spatiotemporal mathematical model for IoT devices supported by cellular uplink connectivity. The developed model is based on stochastic geometry and queueing theory to account for the traffic requirement per IoT device, the different transmission strategies, and the mutual interference between the IoT devices. To this end, the developed model is utilized to characterize the extent to which cellular networks can accommodate IoT traffic as well as to assess and compare three different transmission strategies that incorporate a combination of transmission persistency, backoff, and power-ramping. The analysis and the results clearly illustrate the scalability problem imposed by the IoT on cellular networks and offer insights into effective scenarios for each transmission strategy.
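    To give a flavour of how such a spatiotemporal model couples per-device queues with interference (a deliberately simplified toy simulation; the path-loss, fading, and access parameters are placeholders, and the paper's analytical stochastic-geometry treatment is not reproduced):

        # Toy uplink simulation: randomly placed IoT devices with packet queues,
        # a persistent transmission strategy, and SINR-based success.
        import numpy as np

        rng = np.random.default_rng(3)
        n = rng.poisson(50)                          # devices in one cell (Poisson-distributed)
        r = np.sqrt(rng.uniform(0.01, 1.0, size=n))  # distances to the base station
        arrival_p, alpha, noise, sinr_min, n_slots = 0.05, 4.0, 1e-3, 1.0, 2000
        queues = np.zeros(n, dtype=int)
        delivered = 0

        for _ in range(n_slots):
            queues += rng.random(n) < arrival_p                 # Bernoulli packet arrivals
            active = queues > 0                                 # persistent strategy: send if backlogged
            power = r ** (-alpha) * rng.exponential(size=n)     # path loss with Rayleigh fading
            power[~active] = 0.0
            sinr = power / (noise + power.sum() - power)        # all other active devices interfere
            success = active & (sinr > sinr_min)
            queues[success] -= 1
            delivered += int(success.sum())

        print(f"devices: {n}, mean backlog: {queues.mean():.2f}, delivered: {delivered}")

    Growing backlogs as the device density or arrival rate increases are the discrete-time signature of the scalability and stability problem the paper analyzes.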

  4. Spatiotemporal Stochastic Modeling of IoT Enabled Cellular Networks: Scalability and Stability Analysis

    KAUST Repository

    Gharbieh, Mohammad

    2017-05-02

    The Internet of Things (IoT) is large-scale by nature, which is manifested by the massive number of connected devices as well as their vast spatial existence. Cellular networks, which provide ubiquitous, reliable, and efficient wireless access, will play a fundamental role in delivering the first-mile access for the data tsunami to be generated by the IoT. However, cellular networks may have scalability problems in providing uplink connectivity to massive numbers of connected things. To characterize the scalability of cellular uplink in the context of IoT networks, this paper develops a traffic-aware spatiotemporal mathematical model for IoT devices supported by cellular uplink connectivity. The developed model is based on stochastic geometry and queueing theory to account for the traffic requirement per IoT device, the different transmission strategies, and the mutual interference between the IoT devices. To this end, the developed model is utilized to characterize the extent to which cellular networks can accommodate IoT traffic as well as to assess and compare three different transmission strategies that incorporate a combination of transmission persistency, backoff, and power-ramping. The analysis and the results clearly illustrate the scalability problem imposed by the IoT on cellular networks and offer insights into effective scenarios for each transmission strategy.

  5. Optimization of Planck-LFI on-board data handling

    Energy Technology Data Exchange (ETDEWEB)

    Maris, M; Galeotta, S; Frailis, M; Zacchei, A; Fogliani, S; Gasparo, F [INAF-OATs, Via G.B. Tiepolo 11, 34131 Trieste (Italy); Tomasi, M; Bersanelli, M [Universita di Milano, Dipartimento di Fisica, Via G. Celoria 16, 20133 Milano (Italy); Miccolis, M [Thales Alenia Space Italia S.p.A., S.S. Padana Superiore 290, 20090 Vimodrone (Italy); Hildebrandt, S; Chulani, H; Gomez, F [Instituto de Astrofisica de Canarias (IAC), C/o Via Lactea, s/n E38205 - La Laguna, Tenerife (Spain); Rohlfs, R; Morisset, N; Binko, P [ISDC Data Centre for Astrophysics, University of Geneva, ch. d' Ecogia 16, 1290 Versoix (Switzerland); Burigana, C; Butler, R C; Cuttaia, F; Franceschi, E [INAF-IASF Bologna, Via P. Gobetti, 101, 40129 Bologna (Italy); D' Arcangelo, O, E-mail: maris@oats.inaf.i [IFP-CNR, via Cozzi 53, 20125 Milano (Italy)

    2009-12-15

    To assess stability against 1/f noise, the Low Frequency Instrument (LFI) on-board the Planck mission will acquire data at a rate much higher than the data rate allowed by the science telemetry bandwidth of 35.5 kbps. The data are processed by an on-board pipeline, followed on-ground by a decoding and reconstruction step, to reduce the volume of data to a level compatible with the bandwidth while minimizing the loss of information. This paper illustrates the on-board processing of the scientific data used by Planck/LFI to fit the allowed data rate, an intrinsically lossy process which distorts the signal in a manner which depends on a set of five free parameters (N_aver, r_1, r_2, q, O) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the on-board processing as a function of these parameters. It describes the method of tuning the on-board processing chain to cope with the limited bandwidth while keeping the signal distortion to a minimum. Tuning is sensitive to the statistics of the signal and has to be constantly adapted during flight. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided either by simulations, pre-launch tests or data taken in flight from LFI operating in a special diagnostic acquisition mode. All the needed optimization steps are performed by an automated tool, OCA2, which simulates the on-board processing, explores the space of possible combinations of parameters, and produces a set of statistical indicators, among them: the compression rate C_r and the processing noise epsilon_Q. For Planck/LFI it is required that C_r = 2.4 while, as for other systematics, epsilon_Q would have to be less than 10% of the rms of the instrumental white noise. An analytical model is developed that is able to extract most of the relevant information on the processing errors and the compression rate as a function of the signal
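    As a rough illustration of what such a tunable, lossy on-board stage does (a minimal sketch: averaging of N_aver samples followed by requantization with step q and offset O; the r_1, r_2 mixing stage and the actual OCA2 optimization are omitted, and all numbers are placeholders rather than flight settings):

        # Illustrative on-board downsampling and requantization (not the LFI/OCA2 pipeline).
        import numpy as np

        def onboard_process(samples, n_aver=64, q=0.03, offset=0.0):
            # 1) average N_aver consecutive samples to reduce the data rate
            n = (len(samples) // n_aver) * n_aver
            averaged = samples[:n].reshape(-1, n_aver).mean(axis=1)
            # 2) requantize with step q and offset O; these integers are what gets telemetered
            quantized = np.round((averaged - offset) / q).astype(np.int64)
            return averaged, quantized

        rng = np.random.default_rng(0)
        raw = rng.normal(0.0, 1.0, 1_000_000)        # white-noise stand-in for sky + noise data
        averaged, quantized = onboard_process(raw, q=0.03)
        reconstructed = quantized * 0.03 + 0.0       # ground-side decoding step
        eps_q = np.std(reconstructed - averaged)     # processing (quantization) noise
        print(f"processing noise ~ {eps_q:.4f}  (ideal q/sqrt(12) = {0.03 / np.sqrt(12):.4f})")

    Tightening q lowers the processing noise but also lowers the achievable compression rate, which is the trade-off the tuning procedure balances against the required C_r and the 10% noise budget.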

  6. SciDAC-Data, A Project to Enabling Data Driven Modeling of Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mubarak, M.; Ding, P.; Aliaga, L.; Tsaris, A.; Norman, A.; Lyon, A.; Ross, R.

    2016-10-10

    The SciDAC-Data project is a DOE funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab Data Center on the organization, movement, and consumption of High Energy Physics data. The project will analyze the analysis patterns and data organization that have been used by the NOvA, MicroBooNE, MINERvA and other experiments, to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership class exascale computing facilities. We will address the use of the SciDAC-Data distributions acquired from Fermilab Data Center’s analysis workflows and corresponding to around 71,000 HEP jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in HPC environments. In particular we describe in detail how the Sequential Access via Metadata (SAM) data handling system in combination with the dCache/Enstore based data archive facilities have been analyzed to develop the radically different models of the analysis of HEP data. We present how the simulation may be used to analyze the impact of design choices in archive facilities.

  7. Remote patient management: technology-enabled innovation and evolving business models for chronic disease care.

    Science.gov (United States)

    Coye, Molly Joel; Haselkorn, Ateret; DeMello, Steven

    2009-01-01

    Remote patient management (RPM) is a transformative technology that improves chronic care management while reducing net spending for chronic disease. Broadly deployed within the Veterans Health Administration and in many small trials elsewhere, RPM has been shown to support patient self-management, shift responsibilities to non-clinical providers, and reduce the use of emergency department and hospital services. Because transformative technologies offer major opportunities to advance national goals of improved quality and efficiency in health care, it is important to understand their evolution, the experiences of early adopters, and the business models that may support their deployment.

  8. Automation of On-Board Flightpath Management

    Science.gov (United States)

    Erzberger, H.

    1981-01-01

    The status of concepts and techniques for the design of onboard flight path management systems is reviewed. Such systems are designed to increase flight efficiency and safety by automating the optimization of flight procedures onboard aircraft. After a brief review of the origins and functions of such systems, two complementary methods are described for attacking the key design problem, namely, the synthesis of efficient trajectories. One method optimizes en route flight, the other terminal-area flight; both methods are rooted in optimal control theory. Simulation and flight test results are reviewed to illustrate the potential of these systems for fuel and cost savings.

  9. Genome-scale modeling enables metabolic engineering of Saccharomyces cerevisiae for succinic acid production.

    Science.gov (United States)

    Agren, Rasmus; Otero, José Manuel; Nielsen, Jens

    2013-07-01

    In this work, we describe the application of a genome-scale metabolic model and flux balance analysis for the prediction of succinic acid overproduction strategies in Saccharomyces cerevisiae. The top three single gene deletion strategies, Δmdh1, Δoac1, and Δdic1, were tested using knock-out strains cultivated anaerobically on glucose, coupled with physiological and DNA microarray characterization. While Δmdh1 and Δoac1 strains failed to produce succinate, Δdic1 produced 0.02 C-mol/C-mol glucose, in close agreement with model predictions (0.03 C-mol/C-mol glucose). Transcriptional profiling suggests that succinate formation is coupled to mitochondrial redox balancing, and more specifically, reductive TCA cycle activity. While far from industrial titers, this proof-of-concept suggests that in silico predictions coupled with experimental validation can be used to identify novel and non-intuitive metabolic engineering strategies.
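    The knockout-prediction step itself is ordinary flux balance analysis: maximize a growth objective subject to steady-state mass balance, then re-solve with the deleted reaction's flux forced to zero. A generic toy example (a five-reaction network solved with linear programming; not the yeast genome-scale model or its gene-reaction mapping):

        # Toy flux balance analysis: a knockout reroutes flux through a byproduct-forming branch.
        import numpy as np
        from scipy.optimize import linprog

        # Reactions: v1 uptake(->A), v2 A->B, v3 A->0.5B+S, v4 B->biomass, v5 S->secretion
        S_matrix = np.array([
            [1, -1, -1,   0,  0],   # metabolite A
            [0,  1,  0.5, -1,  0],  # metabolite B
            [0,  0,  1,    0, -1],  # byproduct S (a succinate stand-in)
        ])
        c = np.array([0, 0, 0, -1, 0])              # maximize biomass flux v4 (linprog minimizes)
        bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000), (0, 1000)]

        def fba(bnds):
            res = linprog(c, A_eq=S_matrix, b_eq=np.zeros(3), bounds=bnds, method="highs")
            return res.x

        wild_type = fba(bounds)
        knockout = fba([bounds[0], (0, 0)] + bounds[2:])    # "delete" reaction v2
        print("wild type: biomass %.1f, byproduct %.1f" % (wild_type[3], wild_type[4]))
        print("knockout:  biomass %.1f, byproduct %.1f" % (knockout[3], knockout[4]))

    In this toy network the knockout halves growth but couples all remaining growth to byproduct secretion, which is the qualitative behaviour such in silico strategies look for.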

  10. Machine learning methods enable predictive modeling of antibody feature:function relationships in RV144 vaccinees.

    Science.gov (United States)

    Choi, Ickwon; Chung, Amy W; Suscovich, Todd J; Rerks-Ngarm, Supachai; Pitisuttithum, Punnee; Nitayaphan, Sorachai; Kaewkungwal, Jaranit; O'Connell, Robert J; Francis, Donald; Robb, Merlin L; Michael, Nelson L; Kim, Jerome H; Alter, Galit; Ackerman, Margaret E; Bailey-Kellogg, Chris

    2015-04-01

    The adaptive immune response to vaccination or infection can lead to the production of specific antibodies to neutralize the pathogen or recruit innate immune effector cells for help. The non-neutralizing role of antibodies in stimulating effector cell responses may have been a key mechanism of the protection observed in the RV144 HIV vaccine trial. In an extensive investigation of a rich set of data collected from RV144 vaccine recipients, we here employ machine learning methods to identify and model associations between antibody features (IgG subclass and antigen specificity) and effector function activities (antibody dependent cellular phagocytosis, cellular cytotoxicity, and cytokine release). We demonstrate via cross-validation that classification and regression approaches can effectively use the antibody features to robustly predict qualitative and quantitative functional outcomes. This integration of antibody feature and function data within a machine learning framework provides a new, objective approach to discovering and assessing multivariate immune correlates.
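    The modeling step is, in essence, cross-validated supervised learning from an antibody-feature matrix to a functional readout. A hedged sketch of that step on synthetic stand-in data (scikit-learn; not the RV144 data or the exact models used in the study):

        # Cross-validated prediction of an effector function from antibody features
        # (synthetic data; illustrates the approach, not the RV144 analysis itself).
        import numpy as np
        from sklearn.linear_model import LassoCV
        from sklearn.model_selection import KFold, cross_val_score

        rng = np.random.default_rng(1)
        n_subjects, n_features = 100, 40                   # e.g., IgG subclass x antigen features
        X = rng.normal(size=(n_subjects, n_features))
        weights = np.zeros(n_features)
        weights[:5] = [1.0, -0.8, 0.6, 0.5, -0.4]          # only a few features drive the function
        y = X @ weights + rng.normal(scale=0.5, size=n_subjects)   # e.g., a phagocytosis score

        model = LassoCV(cv=5)                              # sparse regression, one plausible choice
        cv = KFold(n_splits=5, shuffle=True, random_state=0)
        scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
        print("cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))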

  11. Machine learning methods enable predictive modeling of antibody feature:function relationships in RV144 vaccinees.

    Directory of Open Access Journals (Sweden)

    Ickwon Choi

    2015-04-01

    Full Text Available The adaptive immune response to vaccination or infection can lead to the production of specific antibodies to neutralize the pathogen or recruit innate immune effector cells for help. The non-neutralizing role of antibodies in stimulating effector cell responses may have been a key mechanism of the protection observed in the RV144 HIV vaccine trial. In an extensive investigation of a rich set of data collected from RV144 vaccine recipients, we here employ machine learning methods to identify and model associations between antibody features (IgG subclass and antigen specificity) and effector function activities (antibody dependent cellular phagocytosis, cellular cytotoxicity, and cytokine release). We demonstrate via cross-validation that classification and regression approaches can effectively use the antibody features to robustly predict qualitative and quantitative functional outcomes. This integration of antibody feature and function data within a machine learning framework provides a new, objective approach to discovering and assessing multivariate immune correlates.

  12. Lazy Updating of hubs can enable more realistic models by speeding up stochastic simulations

    International Nuclear Information System (INIS)

    Ehlert, Kurt; Loewe, Laurence

    2014-01-01

    To respect the nature of discrete parts in a system, stochastic simulation algorithms (SSAs) must update for each action (i) all part counts and (ii) each action's probability of occurring next and its timing. This makes it expensive to simulate biological networks with well-connected “hubs” such as ATP that affect many actions. Temperature and volume also affect many actions and may be changed significantly in small steps by the network itself during fever and cell growth, respectively. Such trends matter for evolutionary questions, as cell volume determines doubling times and fever may affect survival, both key traits for biological evolution. Yet simulations often ignore such trends and assume constant environments to avoid many costly probability updates. Such computational convenience precludes analyses of important aspects of evolution. Here we present “Lazy Updating,” an add-on for SSAs designed to reduce the cost of simulating hubs. When a hub changes, Lazy Updating postpones all probability updates for reactions depending on this hub, until a threshold is crossed. Speedup is substantial if most computing time is spent on such updates. We implemented Lazy Updating for the Sorting Direct Method and it is easily integrated into other SSAs such as Gillespie's Direct Method or the Next Reaction Method. Testing on several toy models and a cellular metabolism model showed >10× faster simulations for its use-cases—with a small loss of accuracy. Thus we see Lazy Updating as a valuable tool for some special but important simulation problems that are difficult to address efficiently otherwise
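    The essence of the method can be sketched inside a direct-method SSA loop: propensities of reactions that depend on the hub are refreshed in bulk only once the hub count has drifted past a threshold, while the reaction that actually fired is always updated (an illustrative toy, not the authors' Sorting Direct Method implementation):

        # Sketch of Lazy Updating in a Gillespie direct-method SSA (illustrative only).
        import math, random

        random.seed(0)
        N_RXN = 20
        ATP = 1_000_000                      # the well-connected "hub" species
        S = [1000] * N_RXN                   # non-hub substrate of reaction i
        k = [1e-9] * N_RXN                   # R_i: ATP + S_i -> P_i (each firing consumes one ATP)
        LAZY_DELTA = 0.01                    # refresh hub-dependent propensities after 1% drift

        a = [k[i] * ATP * S[i] for i in range(N_RXN)]   # cached propensities
        atp_snapshot = ATP                   # hub value the cached propensities are based on
        done, skipped, t, t_end = 0, 0, 0.0, 1e4

        while t < t_end:
            a0 = sum(a)
            if a0 <= 0:
                break
            t += -math.log(random.random()) / a0
            r, j, acc = random.random() * a0, 0, a[0]   # pick a reaction (direct method)
            while acc < r and j < N_RXN - 1:
                j += 1
                acc += a[j]
            ATP -= 1                                    # fire R_j
            S[j] -= 1
            if abs(ATP - atp_snapshot) > LAZY_DELTA * atp_snapshot:
                a = [k[i] * ATP * S[i] for i in range(N_RXN)]   # threshold crossed: full refresh
                atp_snapshot = ATP
                done += N_RXN
            else:
                a[j] = k[j] * atp_snapshot * S[j]       # lazily keep the slightly stale hub value
                done += 1
                skipped += N_RXN - 1

        print(f"t={t:.1f}, ATP={ATP}, propensity updates done={done}, skipped={skipped}")

    The skipped updates are the source of the speedup; the stale hub values are the controlled, threshold-bounded source of the small accuracy loss.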

  13. Enabling Persistent Autonomy for Underwater Gliders with Ocean Model Predictions and Terrain Based Navigation

    Directory of Open Access Journals (Sweden)

    Andrew eStuntz

    2016-04-01

    Full Text Available Effective study of ocean processes requires sampling over the duration of long (weeks to months) oscillation patterns. Such sampling requires persistent, autonomous underwater vehicles that have a similarly long deployment duration. The spatiotemporal dynamics of the ocean environment, coupled with limited communication capabilities, make navigation and localization difficult, especially in coastal regions where the majority of interesting phenomena occur. In this paper, we consider the combination of two methods for reducing navigation and localization error; a predictive approach based on ocean model predictions and a prior information approach derived from terrain-based navigation. The motivation for this work is not only for real-time state estimation, but also for accurately reconstructing the actual path that the vehicle traversed to contextualize the gathered data, with respect to the science question at hand. We present an application for the practical use of priors and predictions for large-scale ocean sampling. This combined approach builds upon previous works by the authors, and accurately localizes the traversed path of an underwater glider over long-duration, ocean deployments. The proposed method takes advantage of the reliable, short-term predictions of an ocean model, and the utility of priors used in terrain-based navigation over areas of significant bathymetric relief to bound uncertainty error in dead-reckoning navigation. This method improves upon our previously published works by (1) demonstrating the utility of our terrain-based navigation method with multiple field trials, and (2) presenting a hybrid algorithm that combines both approaches to bound navigational error and uncertainty for long-term deployments of underwater vehicles. We demonstrate the approach by examining data from actual field trials with autonomous underwater gliders, and demonstrate an ability to estimate geographical location of an underwater glider to 2

  14. Lunar Landing Trajectory Design for Onboard Hazard Detection and Avoidance

    Science.gov (United States)

    Paschall, Steve; Brady, Tye; Sostaric, Ron

    2009-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing the software and hardware technology needed to support a safe and precise landing for the next generation of lunar missions. ALHAT provides this capability through terrain-relative navigation measurements to enhance global-scale precision, an onboard hazard detection system to select safe landing locations, and an Autonomous Guidance, Navigation, and Control (AGNC) capability to process these measurements and safely direct the vehicle to a landing location. This paper focuses on the key trajectory design issues relevant to providing an onboard Hazard Detection and Avoidance (HDA) capability for the lander. Hazard detection can be accomplished by the crew visually scanning the terrain through a window, a sensor system imaging the terrain, or some combination of both. For ALHAT, this hazard detection activity is provided by a sensor system, which either augments the crew's perception or entirely replaces the crew in the case of a robotic landing. Detecting hazards influences the trajectory design by requiring the proper perspective, range to the landing site, and sufficient time to view the terrain. Following this, the trajectory design must provide additional time to process this information and make a decision about where to safely land. During the final part of the HDA process, the trajectory design must provide sufficient margin to enable a hazard avoidance maneuver. In order to demonstrate the effects of these constraints on the landing trajectory, a tradespace of trajectory designs was created for the initial ALHAT Design Analysis Cycle (ALDAC-1) and each case evaluated with these HDA constraints active. The ALHAT analysis process, described in this paper, narrows down this tradespace and subsequently better defines the trajectory design needed to support onboard HDA. Future ALDACs will enhance this trajectory design by balancing these issues and others in an overall system

  15. Modeling ductal carcinoma in situ: a HER2-Notch3 collaboration enables luminal filling.

    LENUS (Irish Health Repository)

    Pradeep, C-R

    2012-02-16

    A large fraction of ductal carcinoma in situ (DCIS), a non-invasive precursor lesion of invasive breast cancer, overexpresses the HER2/neu oncogene. The ducts of DCIS are abnormally filled with cells that evade apoptosis, but the underlying mechanisms remain incompletely understood. We overexpressed HER2 in mammary epithelial cells and observed growth factor-independent proliferation. When grown in extracellular matrix as three-dimensional spheroids, control cells developed a hollow lumen, but HER2-overexpressing cells populated the lumen by evading apoptosis. We demonstrate that HER2 overexpression in this cellular model of DCIS drives transcriptional upregulation of multiple components of the Notch survival pathway. Importantly, luminal filling required upregulation of a signaling pathway comprising Notch3, its cleaved intracellular domain and the transcriptional regulator HES1, resulting in elevated levels of c-MYC and cyclin D1. In line with HER2-Notch3 collaboration, drugs intercepting either arm reverted the DCIS-like phenotype. In addition, we report upregulation of Notch3 in hyperplastic lesions of HER2 transgenic animals, as well as an association between HER2 levels and expression levels of components of the Notch pathway in tumor specimens of breast cancer patients. Therefore, it is conceivable that the integration of the Notch and HER2 signaling pathways contributes to the pathophysiology of DCIS.

  16. Conceptual model and economic experiments to explain nonpersistence and enable mechanism designs fostering behavioral change.

    Science.gov (United States)

    Djawadi, Behnud Mir; Fahr, René; Turk, Florian

    2014-12-01

    Medical nonpersistence is a worldwide problem of striking magnitude. Although many fields of studies including epidemiology, sociology, and psychology try to identify determinants for medical nonpersistence, comprehensive research to explain medical nonpersistence from an economics perspective is rather scarce. The aim of the study was to develop a conceptual framework that augments standard economic choice theory with psychological concepts of behavioral economics to understand how patients' preferences for discontinuing with therapy arise over the course of the medical treatment. The availability of such a framework allows the targeted design of mechanisms for intervention strategies. Our conceptual framework models the patient as an active economic agent who evaluates the benefits and costs for continuing with therapy. We argue that a combination of loss aversion and mental accounting operations explains why patients discontinue with therapy at a specific point in time. We designed a randomized laboratory economic experiment with a student subject pool to investigate the behavioral predictions. Subjects continue with therapy as long as experienced utility losses have to be compensated. As soon as previous losses are evened out, subjects perceive the marginal benefit of persistence lower than in the beginning of the treatment. Consequently, subjects start to discontinue with therapy. Our results highlight that concepts of behavioral economics capture the dynamic structure of medical nonpersistence better than does standard economic choice theory. We recommend that behavioral economics should be a mandatory part of the development of possible intervention strategies aimed at improving patients' compliance and persistence behavior. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  17. DNA sequence+shape kernel enables alignment-free modeling of transcription factor binding.

    Science.gov (United States)

    Ma, Wenxiu; Yang, Lin; Rohs, Remo; Noble, William Stafford

    2017-10-01

    Transcription factors (TFs) bind to specific DNA sequence motifs. Several lines of evidence suggest that TF-DNA binding is mediated in part by properties of the local DNA shape: the width of the minor groove, the relative orientations of adjacent base pairs, etc. Several methods have been developed to jointly account for DNA sequence and shape properties in predicting TF binding affinity. However, a limitation of these methods is that they typically require a training set of aligned TF binding sites. We describe a sequence + shape kernel that leverages DNA sequence and shape information to better understand protein-DNA binding preference and affinity. This kernel extends an existing class of k-mer based sequence kernels, based on the recently described di-mismatch kernel. Using three in vitro benchmark datasets, derived from universal protein binding microarrays (uPBMs), genomic context PBMs (gcPBMs) and SELEX-seq data, we demonstrate that incorporating DNA shape information improves our ability to predict protein-DNA binding affinity. In particular, we observe that (i) the k-spectrum + shape model performs better than the classical k-spectrum kernel, particularly for small k values; (ii) the di-mismatch kernel performs better than the k-mer kernel, for larger k; and (iii) the di-mismatch + shape kernel performs better than the di-mismatch kernel for intermediate k values. The software is available at https://bitbucket.org/wenxiu/sequence-shape.git. rohs@usc.edu or william-noble@uw.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
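    For orientation, the baseline k-spectrum kernel that these kernels extend simply takes the inner product of k-mer count vectors; the di-mismatch and shape-augmented variants additionally tolerate mismatches and append local DNA-shape features, which this minimal sketch omits:

        # Minimal k-spectrum kernel between two DNA sequences (baseline only; the
        # di-mismatch and sequence + shape kernels described above extend this idea).
        from collections import Counter

        def kmer_counts(seq, k):
            return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

        def spectrum_kernel(s1, s2, k=3):
            c1, c2 = kmer_counts(s1, k), kmer_counts(s2, k)
            # inner product of the two (sparse) k-mer count vectors
            return sum(c1[km] * c2[km] for km in c1.keys() & c2.keys())

        print(spectrum_kernel("ACGTACGTGG", "ACGTTTGGAC", k=3))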

  18. MALDI-TOF-MS with PLS Modeling Enables Strain Typing of the Bacterial Plant Pathogen Xanthomonas axonopodis

    Science.gov (United States)

    Sindt, Nathan M.; Robison, Faith; Brick, Mark A.; Schwartz, Howard F.; Heuberger, Adam L.; Prenni, Jessica E.

    2018-02-01

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a fast and effective tool for microbial species identification. However, current approaches are limited to species-level identification even when genetic differences are known. Here, we present a novel workflow that applies the statistical method of partial least squares discriminant analysis (PLS-DA) to MALDI-TOF-MS protein fingerprint data of Xanthomonas axonopodis, an important bacterial plant pathogen of fruit and vegetable crops. Mass spectra of 32 X. axonopodis strains were used to create a mass spectral library and PLS-DA was employed to model the closely related strains. A robust workflow was designed to optimize the PLS-DA model by assessing the model performance over a range of signal-to-noise ratios (s/n) and mass filter (MF) thresholds. The optimized parameters were observed to be s/n = 3 and MF = 0.7. The model correctly classified 83% of spectra withheld from the model as a test set. A new decision rule was developed, termed the rolled-up Maximum Decision Rule (ruMDR), and this method improved identification rates to 92%. These results demonstrate that MALDI-TOF-MS protein fingerprints of bacterial isolates can be utilized to enable identification at the strain level. Furthermore, the open-source framework of this workflow allows for broad implementation across various instrument platforms as well as integration with alternative modeling and classification algorithms.
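
    PLS-DA as used here is, in essence, PLS regression against one-hot class labels followed by an argmax decision. A minimal sketch of that construction with scikit-learn is shown below; the spectra, strain labels and number of latent variables are placeholders, and the optimized s/n and mass-filter preprocessing and the ruMDR rule from the paper are not reproduced.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # Placeholder data: rows are binned/aligned MALDI-TOF intensity vectors.
        rng = np.random.default_rng(0)
        X = rng.random((60, 500))            # 60 spectra x 500 m/z bins
        y = rng.integers(0, 4, size=60)      # 4 hypothetical strains

        Y = np.eye(4)[y]                     # one-hot encode class membership
        pls = PLSRegression(n_components=10).fit(X, Y)

        # Classify a spectrum by the largest predicted class score (basic PLS-DA rule).
        scores = pls.predict(X[:5])
        print(scores.argmax(axis=1))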

  19. Arctic summer school onboard an icebreaker

    Science.gov (United States)

    Alexeev, Vladimir A.; Repina, Irina A.

    2014-05-01

    The International Arctic Research Center (IARC) of the University of Alaska Fairbanks conducted a summer school for PhD students, post-docs and early career scientists in August-September 2013, jointly with an arctic expedition as a part of the NABOS project (Nansen and Amundsen Basin Observational System) onboard the Russian research vessel "Akademik Fedorov". Both the summer school and NABOS expedition were funded by the National Science Foundation. The one-month long summer school brought together graduate students and young scientists with specialists in arctic oceanography and climate to convey to a new generation of scientists the opportunities and challenges of arctic climate observations and modeling. Young scientists gained hands-on experience during the field campaign and learned about key issues in arctic climate from observational, diagnostic, and modeling perspectives. The summer school consisted of background lectures, participation in fieldwork and mini-projects. The mini-projects were performed in collaboration with summer school instructors and members of the expedition. Key topics covered in the lectures included: - arctic climate: key characteristics and processes; - physical processes in the Arctic Ocean; - sea ice and the Arctic Ocean; - trace gases, aerosols, and chemistry: importance for climate changes; - feedbacks in the arctic system (e.g., surface albedo, clouds, water vapor, circulation); - arctic climate variations: past, ongoing, and projected; - global climate models: an overview. An outreach specialist from the Miami Science Museum was writing a blog from the icebreaker with some very impressive statistics (results as of January 1, 2014): total number of blog posts: 176; blog posts written/contributed by scientists: 42; blog views: 22,684; comments: 1,215; number of countries that viewed the blog: 89 (on 6 continents). The 33-day long NABOS expedition started on August 22, 2013 from Kirkenes, Norway. The vessel ("Akademik Fedorov") returned to

  20. Iterative maximum a posteriori (IMAP-DOAS) for retrieval of strongly absorbing trace gases: Model studies for CH4 and CO2 retrieval from near infrared spectra of SCIAMACHY onboard ENVISAT

    Directory of Open Access Journals (Sweden)

    C. Frankenberg

    2005-01-01

    Full Text Available In the past, differential optical absorption spectroscopy (DOAS) has mostly been employed for atmospheric trace gas retrieval in the UV/Vis spectral region. New spectrometers such as SCIAMACHY onboard ENVISAT also provide near infrared channels and thus allow for the detection of greenhouse gases like CH4, CO2, or N2O. However, modifications of the classical DOAS algorithm are necessary to account for the idiosyncrasies of this spectral region, i.e. the temperature and pressure dependence of the high resolution absorption lines. Furthermore, understanding the sensitivity of the measurement of these high resolution, strong absorption lines by means of a non-ideal device, i.e. having finite spectral resolution, is of special importance. This applies not only in the NIR, but can also prove to be an issue for the UV/Vis spectral region. This paper presents a modified iterative maximum a posteriori-DOAS (IMAP-DOAS) algorithm based on optimal estimation theory introduced to the remote sensing community by Rodgers (1976). This method directly iterates the vertical column densities of the absorbers of interest until the modeled total optical density fits the measurement. Although the discussion in this paper lays emphasis on satellite retrieval, the basic principles of the algorithm also hold for arbitrary measurement geometries. This new approach is applied to modeled spectra based on a comprehensive set of atmospheric temperature and pressure profiles. This analysis reveals that the sensitivity of measurement strongly depends on the prevailing pressure-height. The IMAP-DOAS algorithm properly accounts for the sensitivity of measurement on pressure due to pressure broadening of the absorption lines. Thus, biases in the retrieved vertical columns that would arise in classical algorithms, are obviated. Here, we analyse and quantify these systematic biases as well as errors due to variations in the temperature and pressure profiles, which is indispensable for
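
    At the heart of an optimal-estimation retrieval of this kind is a Gauss-Newton iteration toward the maximum a posteriori state (here, the vertical column densities). The sketch below implements one generic step of the standard Rodgers-type update with a placeholder linear forward model; it is an illustration of the principle, not the IMAP-DOAS code.

        import numpy as np

        def map_update(x_i, x_a, y, forward, jacobian, S_a, S_e):
            """One Gauss-Newton step toward the MAP state estimate (Rodgers-style):
            x_{i+1} = x_a + (K^T S_e^-1 K + S_a^-1)^-1 K^T S_e^-1 [y - F(x_i) + K (x_i - x_a)]"""
            K = jacobian(x_i)
            Se_inv, Sa_inv = np.linalg.inv(S_e), np.linalg.inv(S_a)
            gain = np.linalg.solve(K.T @ Se_inv @ K + Sa_inv, K.T @ Se_inv)
            return x_a + gain @ (y - forward(x_i) + K @ (x_i - x_a))

        # Placeholder linear forward model: optical densities = K_true @ columns.
        K_true = np.array([[1.0, 0.2], [0.1, 0.9], [0.3, 0.4]])
        forward = lambda x: K_true @ x
        jacobian = lambda x: K_true
        x_a = np.array([1.0, 1.0])                     # a priori vertical columns
        y = forward(np.array([1.2, 0.8])) + 0.01       # "measured" optical densities
        x = x_a
        for _ in range(3):
            x = map_update(x, x_a, y, forward, jacobian, np.eye(2), 0.01 * np.eye(3))
        print(x)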

  1. Construction and Optimization of a Heterologous Pathway for Protocatechuate Catabolism in Escherichia coli Enables Bioconversion of Model Aromatic Compounds.

    Science.gov (United States)

    Clarkson, Sonya M; Giannone, Richard J; Kridelbaugh, Donna M; Elkins, James G; Guss, Adam M; Michener, Joshua K

    2017-09-15

    The production of biofuels from lignocellulose yields a substantial lignin by-product stream that currently has few applications. Biological conversion of lignin-derived compounds into chemicals and fuels has the potential to improve the economics of lignocellulose-derived biofuels, but few microbes are able both to catabolize lignin-derived aromatic compounds and to generate valuable products. While Escherichia coli has been engineered to produce a variety of fuels and chemicals, it is incapable of catabolizing most aromatic compounds. Therefore, we engineered E. coli to catabolize protocatechuate, a common intermediate in lignin degradation, as the sole source of carbon and energy via heterologous expression of a nine-gene pathway from Pseudomonas putida KT2440. We next used experimental evolution to select for mutations that increased growth with protocatechuate more than 2-fold. Increasing the strength of a single ribosome binding site in the heterologous pathway was sufficient to recapitulate the increased growth. After optimization of the core pathway, we extended the pathway to enable catabolism of a second model compound, 4-hydroxybenzoate. These engineered strains will be useful platforms to discover, characterize, and optimize pathways for conversions of lignin-derived aromatics. IMPORTANCE Lignin is a challenging substrate for microbial catabolism due to its polymeric and heterogeneous chemical structure. Therefore, engineering microbes for improved catabolism of lignin-derived aromatic compounds will require the assembly of an entire network of catabolic reactions, including pathways from genetically intractable strains. Constructing defined pathways for aromatic compound degradation in a model host would allow rapid identification, characterization, and optimization of novel pathways. We constructed and optimized one such pathway in E. coli to enable catabolism of a model aromatic compound, protocatechuate, and then extended the pathway to a related

  2. GATE V6: a major enhancement of the GATE simulation platform enabling modelling of CT and radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Jan, S; Becheva, E [DSV/I2BM/SHFJ, Commissariat a l'Energie Atomique, Orsay (France); Benoit, D; Rehfeld, N; Stute, S; Buvat, I [IMNC-UMR 8165 CNRS-Paris 7 and Paris 11 Universities, 15 rue Georges Clemenceau, 91406 Orsay Cedex (France); Carlier, T [INSERM U892-Cancer Research Center, University of Nantes, Nantes (France); Cassol, F; Morel, C [Centre de physique des particules de Marseille, CNRS-IN2P3 and Universite de la Mediterranee, Aix-Marseille II, 163, avenue de Luminy, 13288 Marseille Cedex 09 (France); Descourt, P; Visvikis, D [INSERM, U650, Laboratoire du Traitement de l'Information Medicale (LaTIM), CHU Morvan, Brest (France); Frisson, T; Grevillot, L; Guigues, L; Sarrut, D; Zahra, N [Universite de Lyon, CREATIS, CNRS UMR5220, Inserm U630, INSA-Lyon, Universite Lyon 1, Centre Leon Berard (France); Maigne, L; Perrot, Y [Laboratoire de Physique Corpusculaire, 24 Avenue des Landais, 63177 Aubiere Cedex (France); Schaart, D R [Delft University of Technology, Radiation Detection and Medical Imaging, Mekelweg 15, 2629 JB Delft (Netherlands); Pietrzyk, U, E-mail: buvat@imnc.in2p3.fr [Research Center Juelich, Institute of Neurosciences and Medicine and Department of Physics, University of Wuppertal (Germany)

    2011-02-21

    GATE (Geant4 Application for Emission Tomography) is a Monte Carlo simulation platform developed by the OpenGATE collaboration since 2001 and first publicly released in 2004. Dedicated to the modelling of planar scintigraphy, single photon emission computed tomography (SPECT) and positron emission tomography (PET) acquisitions, this platform is widely used to assist PET and SPECT research. A recent extension of this platform, released by the OpenGATE collaboration as GATE V6, now also enables modelling of x-ray computed tomography and radiation therapy experiments. This paper presents an overview of the main additions and improvements implemented in GATE since the publication of the initial GATE paper (Jan et al 2004 Phys. Med. Biol. 49 4543-61). This includes new models available in GATE to simulate optical and hadronic processes, novelties in modelling tracer, organ or detector motion, new options for speeding up GATE simulations, examples illustrating the use of GATE V6 in radiotherapy applications and CT simulations, and preliminary results regarding the validation of GATE V6 for radiation therapy applications. Upon completion of extensive validation studies, GATE is expected to become a valuable tool for simulations involving both radiotherapy and imaging.

  3. A study on the real-time reliability of on-board equipment of train control system

    Science.gov (United States)

    Zhang, Yong; Li, Shiwei

    2018-05-01

    Real-time reliability evaluation is conducive to establishing a condition-based maintenance system for the purpose of guaranteeing continuous train operation. According to the inherent characteristics of the on-board equipment, the connotation of reliability evaluation of on-board equipment is defined and an evaluation index of real-time reliability is provided in this paper. From the perspective of methodology and practical application, the real-time reliability of the on-board equipment is discussed in detail, and a method of evaluating the real-time reliability of on-board equipment at the component level based on a Hidden Markov Model (HMM) is proposed. In this method, performance degradation data are used directly to accurately perceive the hidden state transition process of the on-board equipment, which achieves a better description of the real-time reliability of the equipment.
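
    To illustrate how an HMM turns observed performance data into a belief about a hidden degradation state, the sketch below runs the standard forward (filtering) recursion on an invented two-state model; the transition and emission probabilities and the observation sequence are assumptions for illustration only.

        import numpy as np

        def hmm_filter(pi, A, B, observations):
            """Forward (filtering) recursion: P(hidden state | observations so far)."""
            belief = pi * B[:, observations[0]]
            belief /= belief.sum()
            beliefs = [belief]
            for obs in observations[1:]:
                belief = (A.T @ belief) * B[:, obs]
                belief /= belief.sum()
                beliefs.append(belief)
            return np.array(beliefs)

        # Hypothetical 2-state model: 0 = healthy, 1 = degraded.
        pi = np.array([0.95, 0.05])               # initial state probabilities
        A = np.array([[0.98, 0.02],               # state transition probabilities
                      [0.00, 1.00]])
        B = np.array([[0.90, 0.10],               # P(obs | state); obs 0 = nominal, 1 = anomalous
                      [0.30, 0.70]])
        obs = [0, 0, 1, 1, 1]                     # discretized performance readings
        print(hmm_filter(pi, A, B, obs))          # rising P(degraded) signals falling reliability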

  4. Applying CASE Tools for On-Board Software Development

    Science.gov (United States)

    Brammer, U.; Hönle, A.

    For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix), which features UML, and ISG (BSSE), which provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  5. Application of the Perceptual Factors, Enabling and Reinforcing Model on Pap Smear Screening in Northern Iranian Women

    Directory of Open Access Journals (Sweden)

    Abolhassan Naghibi

    2016-03-01

    Full Text Available Background and Purpose: Cervical cancer is the most prevalent cancer among women in the world. Cervical cancer has no symptoms and can be treated if diagnosed in the first stage of the disease. The aim of this study was to survey the factors affecting Pap smear testing in terms of the perceptual, enabling and reinforcing (PEN-3) model constructs in women. Materials and Methods: This was a descriptive cross-sectional study. The sample comprised 416 married women selected by random sampling. The questionnaire had 50 questions based on the PEN-3 model structures. Data were analyzed by descriptive statistics and the logistic regression method in SPSS 20. Results: The mean age of the women was 32.70 ± 21.00 years. Knowledge of risk factors and screening methods for cervical cancer was 37.2. About 40% of the women had a history of Pap smears. The most important perceptual factors were a family history of the disease, encouragement of people to have a Pap smear, and fear of cervical cancer being detected. The most important enabling factors were the presence of expert health personnel to provide training and perform the Pap smear test (50.3%) and lack of time or being too busy to have the test (23.2%). The reinforcing factors were media advice (41.3%), doctor's advice (32.5%) and neglect and forgetfulness (36.2%). Conclusion: This study has shown that Pap smear screening behavior is affected by personal, family, cultural and economic factors. Application of PEN-3 can be effective in planning and designing intervention programs for cervical cancer screening.

  6. Enablers and inhibitors of the implementation of the Casalud Model, a Mexican innovative healthcare model for non-communicable disease prevention and control.

    Science.gov (United States)

    Tapia-Conyer, Roberto; Saucedo-Martinez, Rodrigo; Mujica-Rosales, Ricardo; Gallardo-Rincon, Hector; Campos-Rivera, Paola Abril; Lee, Evan; Waugh, Craig; Guajardo, Lucia; Torres-Beltran, Braulio; Quijano-Gonzalez, Ursula; Soni-Gallardo, Lidia

    2016-07-22

    The Mexican healthcare system is under increasing strain due to the rising prevalence of non-communicable diseases (especially type 2 diabetes), mounting costs, and a reactive curative approach focused on treating existing diseases and their complications rather than preventing them. Casalud is a comprehensive primary healthcare model that enables proactive prevention and disease management throughout the continuum of care, using innovative technologies and a patient-centred approach. Data were collected over a 2-year period in eight primary health clinics (PHCs) in two states in central Mexico to identify and assess enablers and inhibitors of the implementation process of Casalud. We used mixed quantitative and qualitative data collection tools: surveys, in-depth interviews, and participant and non-participant observations. Transcripts and field notes were analyzed and coded using Framework Analysis, focusing on defining and describing enablers and inhibitors of the implementation process. We identified seven recurring topics in the analyzed textual data. Four topics were categorized as enablers: political support for the Casalud model, alignment with current healthcare trends, ongoing technical improvements (to ease adoption and support), and capacity building. Three topics were categorized as inhibitors: administrative practices, health clinic human resources, and the lack of a shared vision of the model. Enablers are located at PHCs and across all levels of government, and include political support for, and the technological validity of, the model. The main inhibitor is the persistence of obsolete administrative practices at both state and PHC levels, which puts the administrative feasibility of the model's implementation in jeopardy. Constructing a shared vision around the model could facilitate the implementation of Casalud as well as circumvent administrative inhibitors. In order to overcome PHC-level barriers, it is crucial to have an efficient and

  7. Fresh water generators onboard a floating platform

    International Nuclear Information System (INIS)

    Tewari, P.K.; Verma, R.K.; Misra, B.M.; Sadhulkan, H.K.

    1997-01-01

    A dependable supply of fresh water is essential for any ocean going vessel. The operating and maintenance personnel on offshore platforms and marine structures also require a constant and regular supply of fresh water to meet their essential daily needs. A seawater thermal desalination unit onboard delivers good quality fresh water from seawater. The desalination units developed by Bhabha Atomic Research Centre (BARC) suitable for ocean going vessels and offshore platforms have been discussed. Design considerations of such units with reference to floating platforms and corrosive environments have been presented. The feasibility of coupling a low temperature vacuum evaporation (LTVE) desalination plant suitable for an onboard floating platform to a PHWR nuclear power plant has also been discussed. (author). 1 ref., 3 figs, 2 tabs

  8. On-boarding the Middle Manager.

    Science.gov (United States)

    O'Connor, Mary

    The trend of promoting clinical experts into management roles continues. New middle managers need a transitional plan that includes support, mentoring, and direction from senior leaders, including the chief nursing officer (CNO). This case study demonstrates how the CNO of one organization collaborated with a faculty member colleague to develop and implement a yearlong personalized on-boarding program for a group of new nurse middle managers.

  9. The AGILE on-board Kalman filter

    International Nuclear Information System (INIS)

    Giuliani, A.; Cocco, V.; Mereghetti, S.; Pittori, C.; Tavani, M.

    2006-01-01

    On-board reduction of particle background is one of the main challenges for space instruments dedicated to gamma-ray astrophysics. We present in this paper a discussion of the method and main simulation results of the on-board background filter of the Gamma-Ray Imaging Detector (GRID) of the AGILE mission. The GRID is capable of detecting and imaging gamma-ray photons in the range 30 MeV-30 GeV with an optimal point spread function. The AGILE planned orbit is equatorial, with an altitude of 550 km. This is an optimal orbit from the point of view of the expected particle background. For this orbit, electrons and positrons of kinetic energies between 20 MeV and hundreds of MeV dominate the particle background, with significant contributions from high-energy (primary) and low-energy protons, and gamma-ray albedo-photons. We present here the main results obtained by extensive simulations of the on-board AGILE-GRID particle/photon background rejection algorithms based on a special application of Kalman filter techniques. This filter is applied (Level-2) sequentially after other data processing techniques characterizing the Level-1 processing. We show that, in conjunction with the Level-1 processing, the adopted Kalman filtering is expected to reduce the total particle/albedo-photon background rate to a value (≤10-30 Hz) that is compatible with the AGILE telemetry. The AGILE on-board Kalman filter is also effective in reducing the Earth-albedo-photon background rate, and therefore contributes to substantially increasing the AGILE exposure for celestial gamma-ray sources
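
    For readers unfamiliar with the technique, the sketch below shows a generic linear Kalman filter predict/update cycle of the kind used to fit particle tracks plane by plane; the toy state model, noise levels and hit positions are placeholders and do not reproduce the AGILE-GRID implementation.

        import numpy as np

        def kalman_step(x, P, z, F, H, Q, R):
            """One predict/update cycle of a linear Kalman filter."""
            # Predict
            x_pred = F @ x
            P_pred = F @ P @ F.T + Q
            # Update with measurement z
            S = H @ P_pred @ H.T + R                    # innovation covariance
            K = P_pred @ H.T @ np.linalg.inv(S)         # Kalman gain
            x_new = x_pred + K @ (z - H @ x_pred)
            P_new = (np.eye(len(x)) - K @ H) @ P_pred
            return x_new, P_new

        # Toy 1D track: state = [position, slope], measured position at each detector plane.
        F = np.array([[1.0, 1.0], [0.0, 1.0]])          # propagate one plane ahead
        H = np.array([[1.0, 0.0]])                      # only position is measured
        Q, R = 1e-4 * np.eye(2), np.array([[0.05]])
        x, P = np.zeros(2), np.eye(2)
        for z in [0.1, 0.22, 0.29, 0.41]:               # hit positions on successive planes
            x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
        print(x)   # estimated position and slope of the candidate track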

  10. Onboard Data Processors for Planetary Ice-Penetrating Sounding Radars

    Science.gov (United States)

    Tan, I. L.; Friesenhahn, R.; Gim, Y.; Wu, X.; Jordan, R.; Wang, C.; Clark, D.; Le, M.; Hand, K. P.; Plaut, J. J.

    2011-12-01

    Among the many concerns faced by outer planetary missions, science data storage and transmission hold special significance. Such missions must contend with limited onboard storage, brief data downlink windows, and low downlink bandwidths. A potential solution to these issues lies in employing onboard data processors (OBPs) to convert raw data into products that are smaller and closely capture relevant scientific phenomena. In this paper, we present the implementation of two OBP architectures for ice-penetrating sounding radars tasked with exploring Europa and Ganymede. Our first architecture utilizes an unfocused processing algorithm extended from the Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS, Jordan et al. 2009). Compared to downlinking raw data, we are able to reduce data volume by approximately 100 times through OBP usage. To ensure the viability of our approach, we have implemented, simulated, and synthesized this architecture using both VHDL and Matlab models (with fixed-point and floating-point arithmetic) in conjunction with Modelsim. Creation of a VHDL model of our processor is the principal step in transitioning to actual digital hardware, whether in a FPGA (field-programmable gate array) or an ASIC (application-specific integrated circuit), and successful simulation and synthesis strongly indicate feasibility. In addition, we examined the tradeoffs faced in the OBP between fixed-point accuracy, resource consumption, and data product fidelity. Our second architecture is based upon a focused fast back projection (FBP) algorithm that requires a modest amount of computing power and on-board memory while yielding high along-track resolution and improved slope detection capability. We present an overview of the algorithm and details of our implementation, also in VHDL. With the appropriate tradeoffs, the use of OBPs can significantly reduce data downlink requirements without sacrificing data product fidelity. Through the development
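
    As a rough illustration of the focused back-projection idea mentioned above, the sketch below forms an image by summing phase-corrected, range-compressed echoes over a synthetic aperture for each output pixel. It is a simplified nadir-geometry sketch (no range interpolation, windowing or motion compensation) and is not the VHDL implementation described in the record.

        import numpy as np

        def backproject(echoes, x_traj, dt, f0, c, x_out, z_out, aperture):
            """Minimal time-domain back projection for a nadir-looking sounder.
            echoes[i, n] is the complex range-compressed sample of aperture i at
            two-way delay n*dt; f0 is the carrier frequency used for phase correction."""
            image = np.zeros((len(z_out), len(x_out)), dtype=complex)
            for j, x in enumerate(x_out):
                # use only apertures within the synthetic-aperture window around x
                idx = np.flatnonzero(np.abs(x_traj - x) <= aperture / 2)
                for k, z in enumerate(z_out):
                    for i in idx:
                        r = np.hypot(x_traj[i] - x, z)          # slant range to the pixel
                        n = int(round(2 * r / (c * dt)))        # nearest two-way delay sample
                        if n < echoes.shape[1]:
                            image[k, j] += echoes[i, n] * np.exp(4j * np.pi * f0 * r / c)
            return image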

  11. Gas monitoring onboard ISS using FTIR spectroscopy

    Science.gov (United States)

    Gisi, Michael; Stettner, Armin; Seurig, Roland; Honne, Atle; Witt, Johannes; Rebeyre, Pierre

    2017-06-01

    In the confined, enclosed environment of a spacecraft, the air quality must be monitored continuously in order to safeguard the crew's health. For this reason, OHB builds the ANITA2 (Analysing Interferometer for Ambient Air) technology demonstrator for trace gas monitoring onboard the International Space Station (ISS). The measurement principle of ANITA2 is based on Fourier Transform Infrared (FTIR) technology with dedicated gas analysis software from the Norwegian partner SINTEF. This combination proved to provide high sensitivity, accuracy and precision for the simultaneous measurement of 33 trace gases onboard the ISS by the precursor instrument ANITA1. The paper gives a technical overview of the opto-mechanical components of ANITA2, such as the interferometer, the reference laser, the infrared source and the gas cell design, and a brief overview of the gas analysis. ANITA2 is very well suited for measuring gas concentrations, specifically but not exclusively onboard spacecraft, as no consumables are required and measurements are performed autonomously. ANITA2 is a programme under contract to the European Space Agency, and the air quality monitoring system is a stepping stone into the future as a precursor system for manned exploration missions.
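
    The gas retrieval behind an FTIR monitor of this kind ultimately rests on the Beer-Lambert law: to first order, the measured absorbance spectrum is a linear combination of reference spectra weighted by concentration and path length. The least-squares sketch below illustrates only that principle; the reference spectra, path length and noise level are invented, and ANITA2's actual multivariate calibration is considerably more sophisticated.

        import numpy as np

        # Hypothetical reference absorption spectra (per ppm·m) on a common wavenumber grid.
        rng = np.random.default_rng(1)
        references = np.abs(rng.normal(size=(3, 200)))   # 3 gases x 200 spectral points
        path_length = 10.0                                # metres (assumed multi-pass cell)

        true_conc = np.array([5.0, 0.5, 2.0])             # ppm
        absorbance = path_length * references.T @ true_conc + rng.normal(0, 0.01, 200)

        # Beer-Lambert inversion: ordinary least squares for the concentration vector.
        A = path_length * references.T
        conc, *_ = np.linalg.lstsq(A, absorbance, rcond=None)
        print(conc)        # should recover roughly [5.0, 0.5, 2.0]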

  12. Models everywhere. How a fully integrated model-based test environment can enable progress in the future

    Energy Technology Data Exchange (ETDEWEB)

    Ben Gaid, Mongi; Lebas, Romain; Fremovici, Morgan; Font, Gregory; Le Solliec, Gunael [IFP Energies nouvelles, Rueil-Malmaison (France); Albrecht, Antoine [D2T Powertrain Engineering, Rueil-Malmaison (France)

    2011-07-01

    The aim of this paper is to demonstrate how advanced modelling approaches coupled with powerful tools make it possible to set up a complete and coherent test environment suite. Based on a real study focused on the development of a Euro 6 hybrid powertrain with a Euro 5 turbocharged diesel engine, the authors present how a diesel engine simulator including an in-cylinder phenomenological approach to predict the raw emissions can be coupled with a DOC and DPF after-treatment system and embedded in the complete hybrid powertrain to be used in various test environments: - coupled with the control software in a multi-model multi-core simulation platform with test automation features, allowing the simulation speed to be faster than real time; - exported to a real-time hardware-in-the-loop platform with the ECU and hardware actuators; - embedded at the experimental engine test bed to perform driving cycles such as NEDC or FTP cycles with the hybrid powertrain management. Thanks to this complete and versatile test platform suite (xMOD/Morphee), all the key issues of a full hybrid powertrain can be addressed efficiently and at low cost compared to experimental powertrain prototypes: consumption minimisation, energy optimisation, thermal exhaust management, NOx/soot trade-offs, NO/NO2 ratios, etc. Having a good balance between versatility and compliance of model-oriented test platforms such as presented in this paper is the best way to take maximum benefit of the models developed at each stage of the powertrain development. (orig.)

  13. Construction and Optimization of a Heterologous Pathway for Protocatechuate Catabolism in Escherichia coli Enables Bioconversion of Model Aromatic Compounds

    Energy Technology Data Exchange (ETDEWEB)

    Clarkson, Sonya M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Biosciences Division; Giannone, Richard J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Chemical Sciences Division; Kridelbaugh, Donna M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Biosciences Division; Elkins, James G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Biosciences Division; Guss, Adam M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Biosciences Division; Michener, Joshua K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Biosciences Division, BioEnergy Science Center; Vieille, Claire [Michigan State Univ., East Lansing, MI (United States)

    2017-07-21

    The production of biofuels from lignocellulose yields a substantial lignin by-product stream that currently has few applications. Biological conversion of lignin-derived compounds into chemicals and fuels has the potential to improve the economics of lignocellulose-derived biofuels, but few microbes are able both to catabolize lignin-derived aromatic compounds and to generate valuable products. While Escherichia coli has been engineered to produce a variety of fuels and chemicals, it is incapable of catabolizing most aromatic compounds. Therefore, we engineered E. coli to catabolize protocatechuate, a common intermediate in lignin degradation, as the sole source of carbon and energy via heterologous expression of a nine-gene pathway from Pseudomonas putida KT2440. Then, we used experimental evolution to select for mutations that increased growth with protocatechuate more than 2-fold. Increasing the strength of a single ribosome binding site in the heterologous pathway was sufficient to recapitulate the increased growth. After optimization of the core pathway, we extended the pathway to enable catabolism of a second model compound, 4-hydroxybenzoate. These engineered strains will be useful platforms to discover, characterize, and optimize pathways for conversions of lignin-derived aromatics.

    IMPORTANCE Lignin is a challenging substrate for microbial catabolism due to its polymeric and heterogeneous chemical structure. Therefore, engineering microbes for improved catabolism of lignin-derived aromatic compounds will require the assembly of an entire network of catabolic reactions, including pathways from genetically intractable strains. Constructing defined pathways for aromatic compound degradation in a model host would allow rapid

  14. EarthServer - an FP7 project to enable the web delivery and analysis of 3D/4D models

    Science.gov (United States)

    Laxton, John; Sen, Marcus; Passmore, James

    2013-04-01

    EarthServer aims at open access and ad-hoc analytics on big Earth Science data, based on the OGC geoservice standards Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS). The WCS model defines "coverages" as a unifying paradigm for multi-dimensional raster data, point clouds, meshes, etc., thereby addressing a wide range of Earth Science data including 3D/4D models. WCPS allows declarative SQL-style queries on coverages. The project is developing a pilot implementing these standards, and will also investigate the use of GeoSciML to describe coverages. Integration of WCPS with XQuery will in turn allow coverages to be queried in combination with their metadata and GeoSciML description. The unified service will support navigation, extraction, aggregation, and ad-hoc analysis of coverage data through SQL-style queries. Clients will range from mobile devices to high-end immersive virtual reality, and will enable 3D model visualisation using web browser technology coupled with developing web standards. EarthServer is establishing open-source client and server technology intended to be scalable to Petabyte/Exabyte volumes, based on distributed processing, supercomputing, and cloud virtualization. Implementation will be based on the existing rasdaman server technology. Services using rasdaman technology are being installed to serve the atmospheric, oceanographic, geological, cryospheric, planetary and general earth observation communities. The geology service (http://earthserver.bgs.ac.uk/) is being provided by BGS and at present includes satellite imagery, superficial thickness data, onshore DTMs and 3D models for the Glasgow area. It is intended to extend the data sets available to include 3D voxel models. Use of the WCPS standard allows queries to be constructed against single or multiple coverages. For example, on a single coverage, data for a particular area or data with a particular range of pixel values can be selected. Queries on multiple surfaces can be
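
    To make the WCPS query style concrete, the snippet below submits a minimal query over HTTP; the endpoint URL, coverage name and request parameter name are assumptions for illustration (not the actual EarthServer service details), and the query simply selects a coverage and returns it encoded as TIFF.

        import requests

        # Hypothetical endpoint and coverage name -- adjust for a real rasdaman/WCPS deployment.
        ENDPOINT = "http://example.org/rasdaman/ows"

        # Minimal WCPS query: select a coverage and encode it as TIFF.
        wcps_query = 'for c in (glasgow_dtm) return encode(c, "tiff")'

        # The "query" form parameter is an assumption about the server interface.
        response = requests.post(ENDPOINT, data={"query": wcps_query}, timeout=60)
        response.raise_for_status()
        with open("glasgow_dtm.tif", "wb") as f:
            f.write(response.content)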

  15. TECHNICAL MAINTENANCE EFFICIENCY OF THE AIRCRAFT MAINTENANCE-FREE ON-BOARD SYSTEM BETWEEN SCHEDULED MAINTENANCES

    Directory of Open Access Journals (Sweden)

    A. M. Bronnikov

    2017-01-01

    Full Text Available The avionics concept of maintenance-free on-board equipment implies that there is no need to maintain on-board systems between scheduled maintenance while the required operational and technical characteristics are preserved; this is to be achieved by automatic diagnosis of the technical condition and by active means of ensuring a fail-safe design, allowing the structure of the system to be changed to maintain its functions in case of failure. It is expected that such equipment will substantially reduce, and in the limit eliminate, traditional maintenance of aircraft between scheduled maintenance, ensuring maximum readiness for use along with improved safety. The paper proposes a methodology for evaluating the efficiency of an on-board aircraft system with homogeneous redundancy that operates maintenance-free between scheduled maintenance. The excess redundant elements allow the system to accumulate failures, which are repaired during routine maintenance. If the number of failures of any reserve approaches a critical value, the on-board system is restored (all failures are eliminated) between scheduled maintenance by conducting rescue and recovery operations. It is believed that service work leads to the elimination of all failures and completely renews the on-board system. The process of changes in the system's operational status is described with a discrete-continuous model in flight time. The average losses per sortie and the average cost of operation are used as integrated indicators of the efficiency of system operation. As an example, the evaluation of the operational efficiency of a formalized on-board system with homogeneous redundancy demonstrates the effectiveness of the proposed methodology and the possibility of its use in analyzing the efficiency of maintenance-free operation of equipment between scheduled periods, as well as a comparative analysis of the maintenance-free operation efficiency of an on-board system with excessive

  16. Towards a Good Practice Model for an Entrepreneurial HEI: Perspectives of Academics, Enterprise Enablers and Graduate Entrepreneurs

    Science.gov (United States)

    Williams, Perri; Fenton, Mary

    2013-01-01

    This paper reports on an examination of the perspectives of academics, enterprise enablers and graduate entrepreneurs of an entrepreneurial higher education institution (HEI). The research was conducted in Ireland among 30 graduate entrepreneurs and 15 academics and enterprise enablers (enterprise development agency personnel) to provide a…

  17. Use of eHealth technologies to enable the implementation of musculoskeletal Models of Care: Evidence and practice.

    Science.gov (United States)

    Slater, Helen; Dear, Blake F; Merolli, Mark A; Li, Linda C; Briggs, Andrew M

    2016-06-01

    Musculoskeletal (MSK) conditions are the second leading cause of morbidity-related burden of disease globally. EHealth is a potentially critical factor that enables the implementation of accessible, sustainable and more integrated MSK models of care (MoCs). MoCs serve as a vehicle to drive evidence into policy and practice through changes at a health system, clinician and patient level. The use of eHealth to implement MoCs is intuitive, given the capacity to scale technologies to deliver system and economic efficiencies, to contribute to sustainability, to adapt to low-resource settings and to mitigate access and care disparities. We follow a practice-oriented approach to describing the 'what' and 'how' to harness eHealth in the implementation of MSK MoCs. We focus on the practical application of eHealth technologies across care settings to those MSK conditions contributing most substantially to the burden of disease, including osteoarthritis and inflammatory arthritis, skeletal fragility-associated conditions and persistent MSK pain. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Proposal for a Conceptual Model for Evaluating Lean Product Development Performance: A Study of LPD Enablers in Manufacturing Companies

    Science.gov (United States)

    Osezua Aikhuele, Daniel; Mohd Turan, Faiz

    2016-02-01

    The instability in today's market and the emerging demands for mass-customized products by customers are driving companies to seek cost-effective and time-efficient improvements in their production systems, and this has led to real pressure for the adoption of new developmental architectures and operational parameters to remain competitive in the market. Among the developmental architectures adopted is the integration of lean thinking in the product development process. However, due to a lack of clear understanding of lean performance and its measurement, many companies are unable to implement and fully integrate the lean principle into their product development process, and without proper performance measurement, the performance level of the organizational value stream will be unknown and the specific areas of improvement relating to the LPD program cannot be tracked. Hence, this results in poor decision making in LPD implementation. This paper therefore seeks to present a conceptual model for evaluation of LPD performance by identifying and analysing the core existing LPD enablers (Chief Engineer, Cross-functional teams, Set-based engineering, Poka-yoke (mistake-proofing), Knowledge-based environment, Value-focused planning and development, Top management support, Technology, Supplier integration, Workforce commitment and Continuous improvement culture) for assessing LPD performance.

  19. On-Board Rendezvous Targeting for Orion

    Science.gov (United States)

    Weeks, Michael W.; DSouza, Christopher N.

    2010-01-01

    The Orion on-board GNC system is among the most complex ever developed for a space mission. It is designed to operate autonomously (independent of the ground). The rendezvous system in particular was designed to operate on the far side of the Moon, and in the case of loss of communications with the ground. The vehicle GNC system is designed to retarget the rendezvous maneuvers, given a mission plan. As such, all the maneuvers which will be performed by Orion have been designed and are being incorporated into the flight code.

  20. On-board processing for telecommunications satellites

    Science.gov (United States)

    Nuspl, P. P.; Dong, G.

    1991-01-01

    In this decade, communications satellite systems will probably face dramatic challenges from alternative transmission means. To balance and overcome such competition, and to prepare for new requirements, INTELSAT has developed several on-board processing techniques, including Satellite-Switched TDMA (SS-TDMA), Satellite-Switched FDMA (SS-FDMA), several Modulators/Demodulators (Modem), a Multicarrier Multiplexer and Demodulator (MCDD), an International Business Service (IBS)/Intermediate Data Rate (IDR) BaseBand Processor (BBP), etc. Some proof-of-concept hardware and software were developed, and tested recently in the INTELSAT Technical Laboratories. These techniques and some test results are discussed.

  1. Genome-Enabled Modeling of Biogeochemical Processes Predicts Metabolic Dependencies that Connect the Relative Fitness of Microbial Functional Guilds

    Science.gov (United States)

    Brodie, E.; King, E.; Molins, S.; Karaoz, U.; Steefel, C. I.; Banfield, J. F.; Beller, H. R.; Anantharaman, K.; Ligocki, T. J.; Trebotich, D.

    2015-12-01

    Pore-scale processes mediated by microorganisms underlie a range of critical ecosystem services, regulating carbon stability, nutrient flux, and the purification of water. Advances in cultivation-independent approaches now provide us with the ability to reconstruct thousands of genomes from microbial populations from which functional roles may be assigned. With this capability to reveal microbial metabolic potential, the next step is to put these microbes back where they belong to interact with their natural environment, i.e. the pore scale. At this scale, microorganisms communicate, cooperate and compete across their fitness landscapes with communities emerging that feedback on the physical and chemical properties of their environment, ultimately altering the fitness landscape and selecting for new microbial communities with new properties and so on. We have developed a trait-based model of microbial activity that simulates coupled functional guilds that are parameterized with unique combinations of traits that govern fitness under dynamic conditions. Using a reactive transport framework, we simulate the thermodynamics of coupled electron donor-acceptor reactions to predict energy available for cellular maintenance, respiration, biomass development, and enzyme production. From metagenomics, we directly estimate some trait values related to growth and identify the linkage of key traits associated with respiration and fermentation, macromolecule depolymerizing enzymes, and other key functions such as nitrogen fixation. Our simulations were carried out to explore abiotic controls on community emergence such as seasonally fluctuating water table regimes across floodplain organic matter hotspots. Simulations and metagenomic/metatranscriptomic observations highlighted the many dependencies connecting the relative fitness of functional guilds and the importance of chemolithoautotrophic lifestyles. Using an X-Ray microCT-derived soil microaggregate physical model combined
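
    One common way to encode this kind of guild-level kinetics is a dual-Monod rate law for the electron donor and acceptor scaled by a thermodynamic limitation factor. The sketch below shows a generic form of that rate expression with invented parameter values; it is not the specific parameterization used in the study.

        import numpy as np

        R = 8.314e-3  # gas constant, kJ mol^-1 K^-1

        def guild_rate(biomass, donor, acceptor, mu_max, K_d, K_a, dG_rxn, dG_min, chi, T=298.15):
            """Dual-Monod rate with a thermodynamic potential factor F_T.
            dG_rxn: actual Gibbs energy of the catabolic reaction (kJ/mol, negative if favourable);
            dG_min: minimum energy that must be conserved; chi: average stoichiometric number."""
            f_kin = (donor / (K_d + donor)) * (acceptor / (K_a + acceptor))
            f_thermo = max(0.0, 1.0 - np.exp((dG_rxn + dG_min) / (chi * R * T)))
            return mu_max * biomass * f_kin * f_thermo

        # Illustrative numbers only.
        print(guild_rate(biomass=1e-4, donor=2.0, acceptor=0.25,
                         mu_max=2.0, K_d=0.5, K_a=0.05,
                         dG_rxn=-40.0, dG_min=10.0, chi=1.0))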

  2. Spacecube: A Family of Reconfigurable Hybrid On-Board Science Data Processors

    Science.gov (United States)

    Flatley, Thomas P.

    2015-01-01

    SpaceCube is a family of Field Programmable Gate Array (FPGA) based on-board science data processing systems developed at the NASA Goddard Space Flight Center (GSFC). The goal of the SpaceCube program is to provide 10x to 100x improvements in on-board computing power while lowering relative power consumption and cost. SpaceCube is based on the Xilinx Virtex family of FPGAs, which include processor, FPGA logic and digital signal processing (DSP) resources. These processing elements are leveraged to produce a hybrid science data processing platform that accelerates the execution of algorithms by distributing computational functions to the most suitable elements. This approach enables the implementation of complex on-board functions that were previously limited to ground based systems, such as on-board product generation, data reduction, calibration, classification, event/feature detection, data mining and real-time autonomous operations. The system is fully reconfigurable in flight, including data parameters, software and FPGA logic, through either ground commanding or autonomously in response to detected events/features in the instrument data stream.

  3. On-board data management study for EOPAP

    Science.gov (United States)

    Davisson, L. D.

    1975-01-01

    The requirements, implementation techniques, and mission analysis associated with on-board data management for EOPAP were studied. SEASAT-A was used as a baseline, and the storage requirements, data rates, and information extraction requirements were investigated for each of the following proposed SEASAT sensors: a short pulse 13.9 GHz radar, a long pulse 13.9 GHz radar, a synthetic aperture radar, a multispectral passive microwave radiometer facility, and an infrared/visible very high resolution radiometer (VHRR). Rate distortion theory was applied to determine theoretical minimum data rates and compared with the rates required by practical techniques. It was concluded that practical techniques can be used which approach the theoretically optimum based upon an empirically determined source random process model. The results of the preceding investigations were used to recommend an on-board data management system for (1) data compression through information extraction, optimal noiseless coding, source coding with distortion, data buffering, and data selection under command or as a function of data activity, (2) for command handling, (3) for spacecraft operation and control, and (4) for experiment operation and monitoring.
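
    The theoretical minimum rates referred to above come from rate-distortion theory; for a memoryless Gaussian source with variance sigma^2 under a mean-squared-error criterion, the bound is R(D) = max(0, 0.5 log2(sigma^2/D)) bits per sample. The sketch below simply evaluates this textbook bound for a few illustrative distortion levels (the numbers are not from the study).

        import math

        def gaussian_rate_distortion(sigma2, D):
            """Rate-distortion function (bits/sample) of a memoryless Gaussian source
            with variance sigma2 under mean-squared-error distortion D."""
            return max(0.0, 0.5 * math.log2(sigma2 / D))

        # Example: allowing an MSE of 1% of the source variance costs about 3.3 bits/sample.
        sigma2 = 1.0
        for D in (0.5, 0.1, 0.01):
            print(D, gaussian_rate_distortion(sigma2, D))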

  4. Orienting and Onboarding Clinical Nurse Specialists: A Process Improvement Project.

    Science.gov (United States)

    Garcia, Mayra G; Watt, Jennifer L; Falder-Saeed, Karie; Lewis, Brennan; Patton, Lindsey

    Clinical nurse specialists (CNSs) have a unique advanced practice role. This article describes a process useful in establishing a comprehensive orientation and onboarding program for a newly hired CNS. The project team used the National Association of Clinical Nurse Specialists core competencies as a guide to construct a process for effectively onboarding and orienting newly hired CNSs. Standardized documents were created for the orientation process including a competency checklist, needs assessment template, and professional evaluation goals. In addition, other documents were revised to streamline the orientation process. Standardizing the onboarding and orientation process has demonstrated favorable results. As of 2016, 3 CNSs have successfully been oriented and onboarded using the new process. Unique healthcare roles require special focus when onboarding and orienting into a healthcare system. The use of the National Association of Clinical Nurse Specialists core competencies guided the project in establishing a successful orientation and onboarding process for newly hired CNSs.

  5. Estimation of waves and ship responses using onboard measurements

    DEFF Research Database (Denmark)

    Montazeri, Najmeh

    This thesis focuses on estimation of waves and ship responses using ship-board measurements. This is useful for development of operational safety and performance efficiency in connection with the broader concept of onboard decision support systems. Estimation of sea state is studied using a set...... of measured ship responses, a parametric description of directional wave spectra (a generalised JONSWAP model) and the transfer functions of the ship responses. The difference between the spectral moments of the measured ship responses and the corresponding theoretically calculated moments formulates a cost...... information. The model is tested on simulated data based on known unimodal and bimodal wave scenarios. The wave parameters in the output are then compared with the true wave parameters. In addition to the numerical experiments, two sets of full-scale measurements from container ships are analysed. Herein...
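
    The estimation principle described in this record can be phrased as a least-squares fit: choose the parametric wave spectrum whose predicted response spectral moments best match the moments of the measured responses. The sketch below shows only that outer structure, with a simplified two-parameter spectrum and toy transfer functions standing in for the generalised JONSWAP model and the ship's response amplitude operators.

        import numpy as np
        from scipy.optimize import minimize

        def wave_spectrum(theta, w):
            """Two-parameter (Bretschneider-type) wave spectrum; theta = (Hs, wp).
            Stand-in for the generalised JONSWAP model used in the thesis."""
            Hs, wp = theta
            return (5.0 / 16.0) * Hs**2 * wp**4 / w**5 * np.exp(-1.25 * (wp / w) ** 4)

        def predicted_moments(theta, raos, w):
            """Zeroth and second spectral moments of each response, from its RAO."""
            S = wave_spectrum(theta, w)
            dw = w[1] - w[0]
            out = []
            for H in raos:
                Sr = np.abs(H(w)) ** 2 * S
                out.append([np.sum(Sr) * dw, np.sum(w**2 * Sr) * dw])
            return np.array(out)

        def cost(theta, measured, raos, w):
            """Squared mismatch between measured and predicted response moments."""
            return np.sum((predicted_moments(theta, raos, w) - measured) ** 2)

        # Toy RAOs (placeholders for heave/pitch transfer functions) and synthetic "measurements".
        w = np.linspace(0.2, 2.5, 200)
        raos = [lambda w: 1.0 / (1.0 + (w / 0.8) ** 4), lambda w: w / (1.0 + (w / 1.1) ** 4)]
        measured = predicted_moments((3.0, 0.6), raos, w)
        theta_hat = minimize(cost, x0=(2.0, 0.8), args=(measured, raos, w),
                             bounds=[(0.5, 10.0), (0.3, 1.5)]).x
        print(theta_hat)   # should be close to Hs = 3.0, wp = 0.6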

  6. Identifying Onboarding Heuristics for Free-to-Play Mobile Games

    DEFF Research Database (Denmark)

    Thomsen, Line Ebdrup; Weigert Petersen, Falko; Drachen, Anders

    2016-01-01

    A set of heuristics for the design of onboarding phases in mobile games is presented. The heuristics are identified by a lab-based mixed-methods experiment, utilizing lightweight psycho-physiological measures together with self-reported player responses, across three titles that cross the genres...... of puzzle games, base builders and arcade games, and utilize different onboarding phase design approaches. Results showcase how heuristics can be used to design engaging onboarding phases in mobile games....

  7. Flight Hardware Virtualization for On-Board Science Data Processing

    Data.gov (United States)

    National Aeronautics and Space Administration — Utilize Hardware Virtualization technology to benefit on-board science data processing by investigating new real time embedded Hardware Virtualization solutions and...

  8. Control of the Onboard Microgravity Environment and Extension of the Service Life of the Long-Term Space Station

    Science.gov (United States)

    Titov, V. A.

    2018-03-01

    The problem of control of the on-board microgravity environment in order to extend the service life of the long-term space station has been discussed. Software developed for the ISS and the results of identifying dynamic models and external impacts based on telemetry data have been presented. Proposals for controlling the onboard microgravity environment for future long-term space stations have been formulated.

  9. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    International Nuclear Information System (INIS)

    Iliopoulos, AS; Sun, X; Pitsianis, N; Yin, FF; Ren, L

    2015-01-01

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g. regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub

  10. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Iliopoulos, AS; Sun, X [Duke University, Durham, NC (United States); Pitsianis, N [Aristotle University of Thessaloniki (Greece); Duke University, Durham, NC (United States); Yin, FF; Ren, L [Duke University Medical Center, Durham, NC (United States)

    2015-06-15

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g. regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub
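
    The separation of static and dynamic parts described in these two records amounts to precomputing a (sparse) projection operator once per treatment plan and reusing it at every iteration. The sketch below illustrates that pattern with a toy 2D geometry; the random "intersection weights" stand in for actual ray tracing, and the code is not the LIVE/GPU implementation.

        import numpy as np
        from scipy.sparse import csr_matrix

        def precompute_projector(n_vox, n_rays, rng):
            """Static part: build a sparse ray-to-voxel weight matrix once. Here toy
            geometry (random intersections) stands in for ray tracing of the planned angles."""
            rows, cols, vals = [], [], []
            for ray in range(n_rays):
                hit = rng.choice(n_vox, size=8, replace=False)   # voxels 'intersected' by this ray
                rows += [ray] * len(hit)
                cols += hit.tolist()
                vals += rng.random(len(hit)).tolist()            # toy intersection lengths
            return csr_matrix((vals, (rows, cols)), shape=(n_rays, n_vox))

        rng = np.random.default_rng(0)
        A = precompute_projector(n_vox=32 * 32, n_rays=200, rng=rng)   # reused across iterations

        # Dynamic part: apply the cached operator to the current volume estimate each iteration.
        volume = rng.random(32 * 32)
        projections = A @ volume            # forward projection
        backproj = A.T @ projections        # matched back-projection
        print(projections.shape, backproj.shape)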

  11. A Comprehensive Onboarding and Orientation Plan for Neurocritical Care Advanced Practice Providers.

    Science.gov (United States)

    Langley, Tamra M; Dority, Jeremy; Fraser, Justin F; Hatton, Kevin W

    2018-06-01

    As the role of advanced practice providers (APPs) expands to include increasingly complex patient care within the intensive care unit, the educational needs of these providers must also be expanded. An onboarding process was designed for APPs in the neurocritical care service line. Onboarding for new APPs revolved around 5 specific areas: candidate selection, proctor assignment, 3-phased orientation process, remediation, and mentorship. To ensure effective training for APPs, using the most time-conscious approach, the backbone of the process is a structured curriculum. This was developed and integrated within the standard orientation and onboarding process. The curriculum design incorporated measurable learning goals, objective assessments of phased goal achievements, and opportunities for remediation. The neurocritical care service implemented an onboarding process in 2014. Four APPs (3 nurse practitioners and 1 physician assistant) were employed by the department before the implementation of the orientation program. The length of employment ranged from 1 to 4 years. Lack of clinical knowledge and/or sufficient training was cited as reasons for departure from the position in 2 of the 4 APPs, as either self-expression or peer evaluation. Since implementation of this program, 12 APPs have completed the program, of which 10 remain within the division, creating an 83% retention rate. The onboarding process, including a 3-phased, structured orientation plan for neurocritical care, has increased APP retention since its implementation. The educational model, along with proctoring and mentorship, has improved clinical knowledge and increased nurse practitioner retention. A larger-scale study would help to support the validity of this onboarding process.

  12. Dosimetric verification of lung cancer treatment using the CBCTs estimated from limited-angle on-board projections.

    Science.gov (United States)

    Zhang, You; Yin, Fang-Fang; Ren, Lei

    2015-08-01

    Lung cancer treatment is susceptible to treatment errors caused by interfractional anatomical and respirational variations of the patient. On-board treatment dose verification is especially critical for the lung stereotactic body radiation therapy due to its high fractional dose. This study investigates the feasibility of using cone-beam (CB)CT images estimated by a motion modeling and free-form deformation (MM-FD) technique for on-board dose verification. Both digital and physical phantom studies were performed. Various interfractional variations featuring patient motion pattern change, tumor size change, and tumor average position change were simulated from planning CT to on-board images. The doses calculated on the planning CT (planned doses), the on-board CBCT estimated by MM-FD (MM-FD doses), and the on-board CBCT reconstructed by the conventional Feldkamp-Davis-Kress (FDK) algorithm (FDK doses) were compared to the on-board dose calculated on the "gold-standard" on-board images (gold-standard doses). The absolute deviations of minimum dose (ΔDmin), maximum dose (ΔDmax), and mean dose (ΔDmean), and the absolute deviations of prescription dose coverage (ΔV100%) were evaluated for the planning target volume (PTV). In addition, 4D on-board treatment dose accumulations were performed using 4D-CBCT images estimated by MM-FD in the physical phantom study. The accumulated doses were compared to those measured using optically stimulated luminescence (OSL) detectors and radiochromic films. Compared with the planned doses and the FDK doses, the MM-FD doses matched much better with the gold-standard doses. For the digital phantom study, the average (± standard deviation) ΔDmin, ΔDmax, ΔDmean, and ΔV100% (values normalized by the prescription dose or the total PTV) between the planned and the gold-standard PTV doses were 32.9% (±28.6%), 3.0% (±2.9%), 3.8% (±4.0%), and 15.4% (±12.4%), respectively. The corresponding values of FDK PTV doses were 1.6% (±1

  13. On-board processing of video image sequences

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl; Chanrion, Olivier Arnaud; Forchhammer, Søren

    2008-01-01

    and evaluated. On-board there are six video cameras each capturing images of 1024 × 1024 pixels of 12 bpp at a frame rate of 15 fps, thus totalling 1080 Mbits/s. In comparison the average downlink data rate for these images is projected to be 50 kbit/s. This calls for efficient on-board processing to select
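
    The figures quoted above imply a very large on-board data reduction requirement; the arithmetic below reproduces the stated 1080 Mbit/s aggregate rate (which assumes 1 Mbit = 2^20 bits) and the average compression factor needed to match the projected 50 kbit/s downlink.

        # Aggregate raw camera data rate vs. projected downlink (figures from the record above).
        cameras, width, height, bits_per_pixel, fps = 6, 1024, 1024, 12, 15
        raw_bps = cameras * width * height * bits_per_pixel * fps   # 1,132,462,080 bit/s
        downlink_bps = 50_000

        print(raw_bps / 2**20)          # 1080 "Mbit/s" when 1 Mbit is taken as 2^20 bits
        print(raw_bps / downlink_bps)   # average reduction factor of roughly 22,600x needed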

  14. Defense Threat Reduction Agency > Careers > Onboarding > Special Programs

    Science.gov (United States)

    We offer Work/Life programs and practices to help our employees and Service members balance work and family responsibilities. We have put in place family-friendly Work/Life programs and policies designed to create a more

  15. ON-BOARD COMPUTER SYSTEM FOR KITSAT-1 AND 2

    Directory of Open Access Journals (Sweden)

    H. S. Kim

    1996-06-01

    Full Text Available KITSAT-1 and 2 are microsatellites weighing 50 kg, and all on-board data are processed by the on-board computer system. Hence, these on-board computers are required to be highly reliable and to be designed within tight power-consumption, mass and size constraints. The on-board computer (OBC) systems for KITSAT-1 and 2 are accordingly designed with simple, flexible hardware for reliability, with the software taking more responsibility than the hardware. The KITSAT-1 and 2 on-board computer system consists of OBC186 as the primary OBC and OBC80 as its backup. OBC186 runs the spacecraft operating system (SCOS), which has real-time multi-tasking capability. Since their launch, OBC186 and OBC80 have been operating successfully to the present day. In this paper, we describe the development of the OBC186 hardware and software and analyze its in-orbit operation performance.

  16. The cloud services innovation platform- enabling service-based environmental modelling using infrastructure-as-a-service cloud computing

    Science.gov (United States)

    Service-oriented architectures allow modelling engines to be hosted over the Internet, abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on users' personal computers (PCs). Migration ...

  17. Memory-Efficient Onboard Rock Segmentation

    Science.gov (United States)

    Burl, Michael C.; Thompson, David R.; Bornstein, Benjamin J.; deGranville, Charles K.

    2013-01-01

    Rockster-MER is an autonomous perception capability that was uploaded to the Mars Exploration Rover Opportunity in December 2009. This software provides the vision front end for a larger software system known as AEGIS (Autonomous Exploration for Gathering Increased Science), which was recently named 2011 NASA Software of the Year. As the first step in AEGIS, Rockster-MER analyzes an image captured by the rover, and detects and automatically identifies the boundary contours of rocks and regions of outcrop present in the scene. This initial segmentation step reduces the data volume from millions of pixels into hundreds (or fewer) of rock contours. Subsequent stages of AEGIS then prioritize the best rocks according to scientist-defined preferences and take high-resolution, follow-up observations. Rockster-MER has performed robustly from the outset on the Mars surface under challenging conditions. Rockster-MER is a specially adapted, embedded version of the original Rockster algorithm ("Rock Segmentation Through Edge Regrouping," (NPO-44417) Software Tech Briefs, September 2008, p. 25). Although the new version performs the same basic task as the original code, the software has been (1) significantly upgraded to overcome the severe onboard resource limitations (CPU, memory, power, time) and (2) "bulletproofed" through code reviews and extensive testing and profiling to avoid the occurrence of faults. Because of the limited computational power of the RAD6000 flight processor on Opportunity (roughly two orders of magnitude slower than a modern workstation), the algorithm was heavily tuned to improve its speed. Several functional elements of the original algorithm were removed as a result of an extensive cost/benefit analysis conducted on a large set of archived rover images. The algorithm was also required to operate below a stringent 4 MB high-water memory ceiling; hence, numerous tricks and strategies were introduced to reduce the memory footprint. Local filtering
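
    Rockster's edge-regrouping approach is not reproduced here; the following is only a minimal, generic sketch (synthetic image, hypothetical thresholds, assuming NumPy and SciPy are available) of how a segmentation step can collapse a megapixel image into a short list of candidate region contours:

      import numpy as np
      from scipy import ndimage

      # Minimal illustration only (not the Rockster algorithm): threshold an
      # edge-strength map and keep the largest connected regions, reducing a
      # megapixel image to a handful of candidate "rock" boundary regions.
      rng = np.random.default_rng(1)
      image = rng.normal(0.0, 0.05, (1024, 1024))
      image[300:400, 500:650] += 1.0                  # synthetic bright "rock"

      gy, gx = np.gradient(ndimage.gaussian_filter(image, sigma=2.0))
      edges = np.hypot(gx, gy) > 0.02                 # hypothetical edge threshold

      labels, n = ndimage.label(edges)
      sizes = ndimage.sum(edges, labels, index=range(1, n + 1))
      keep = (np.argsort(sizes)[::-1] + 1)[:10]       # ten largest boundary regions
      print(f"{image.size} pixels reduced to {len(keep)} candidate contours")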

  18. Enabling benchmarking and improving operational efficiency at nuclear power plants through adoption of a common process model: SNPM (standard nuclear performance model)

    International Nuclear Information System (INIS)

    Pete Karns

    2006-01-01

    others. The goal of the SNPM is to give the people maintaining and operating nuclear power stations a common model on which to base their business processes and measure/benchmark themselves against others. The importance of benchmarking and comparing 'apples to apples' has driven, and will continue to safely drive, improvement and efficiencies throughout the business. For example, in the mid-1990s it was quite difficult to compare work management statistics and programs between plants. The introduction of several INPO documents, which eventually became the SNPM work management process (AP 928), enabled plants to benchmark and compare information on many aspects of work management; in fact, INPO began to evaluate the nuclear plants on their implementation and usage of AP 928. The standardization also enabled the identification and benchmarking of innovations in plant processes and performance, which in turn helped those innovations gain acceptance in other plants, thus furthering the cycle of continuous improvement. Using a master plan, all communities of practice are able to identify specific improvement projects and coordinate the implementation of the processes to ensure smooth transitions between the various process interfaces or intersection points. In essence, the nuclear energy industry in the United States is working as one company, driving efficiencies and operational improvements. Key enablers in adopting best practices like the SNPM are work, asset and supply chain management solutions, both from a functional and a technological point of view. In addition to the importance of supporting industry best practices, there are two additional attributes a nuclear power operating company should evaluate regarding software solutions for work, asset, and supply chain management: the breadth of assets managed, and the architecture of the solution. (author)

  19. IBIS: the imager on-board integral

    International Nuclear Information System (INIS)

    Ubertini, P.; Bazzano, A.; Lebrun, F.; Goldwurm, A.; Laurent, P.; Mirabel, I.F.; Vigroux, L.; Di Cocco, G.; Labanti, C.; Bird, A.J.; Broenstad, K.; La Rosa, G.; Sacco, B.; Quadrini, E.M.; Ramsey, B.; Weisskopf, M.C.; Reglero, V.; Sabau, L.; Staubert, R.; Zdziarski, A.A.

    2003-01-01

    The IBIS telescope is the high angular resolution gamma-ray imager on-board the INTEGRAL Observatory, successfully launched from Baikonur (Kazakhstan) in October 2002. This medium-size ESA project, planned for a 2-year mission with a possible extension to 5, is devoted to the observation of the gamma-ray sky in the energy range from 3 keV to 10 MeV (Winkler 2001). The IBIS imaging system is based on two independent solid-state detector arrays optimised for low (15-1000 keV) and high (0.175-10.0 MeV) energies, surrounded by an active veto system. This high-efficiency shield is essential to minimise the background induced by high-energy particles in the highly eccentric orbit outside the van Allen belts. A tungsten coded-aperture mask, 16 mm thick and ∼1 m² in area, is the imaging device. The IBIS telescope will serve the scientific community at large, providing a unique combination of unprecedented wide-field, high-energy imaging capability coupled with broad-band spectroscopy and high-resolution timing over the energy range from X-rays to gamma rays. To date, the IBIS telescope has been working nominally in orbit for more than 9 months. (authors)

  20. Digibaro pressure instrument onboard the Phoenix Lander

    Science.gov (United States)

    Harri, A.-M.; Polkko, J.; Kahanpää, H. H.; Schmidt, W.; Genzer, M. M.; Haukka, H.; Savijärvi, H.; Kauhanen, J.

    2009-04-01

    The Phoenix Lander landed successfully in the Martian northern polar region. The mission is part of the National Aeronautics and Space Administration's (NASA's) Scout program. Pressure observations onboard the Phoenix lander were performed by an FMI (Finnish Meteorological Institute) instrument, based on a silicon diaphragm sensor head manufactured by Vaisala Inc., combined with MDA data processing electronics. The pressure instrument performed successfully throughout the Phoenix mission. The pressure instrument had 3 pressure sensor heads. One of these was the primary sensor head and the other two were used for monitoring the condition of the primary sensor head during the mission. During the mission the primary sensor was read with a sampling interval of 2 s and the other two were read less frequently as a check of instrument health. The pressure sensor system had a real-time data-processing and calibration algorithm that allowed the removal of temperature-dependent calibration effects. In the same manner as for the temperature sensor, a total of 256 data records (8.53 min) were buffered, and they could either be stored at full resolution or processed to provide mean, standard deviation, maximum and minimum values for storage on the Phoenix Lander's Meteorological (MET) unit. The time constant was approximately 3 s due to locational constraints and dust filtering requirements. Using algorithms compensating for the time-constant effect, the temporal resolution was good enough to detect pressure drops associated with the passage of nearby dust devils.
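
    A minimal sketch of the buffer-and-summarize scheme described above (the 800 Pa level and noise are illustrative assumptions, not mission data):

      import numpy as np

      SAMPLE_INTERVAL_S = 2.0      # primary sensor read every 2 s
      BUFFER_LEN = 256             # 256 records = 512 s = 8.53 min

      def summarize(buffer):
          """Reduce a full buffer to the statistics stored instead of full resolution."""
          buffer = np.asarray(buffer)
          return {"mean": buffer.mean(), "std": buffer.std(ddof=1),
                  "min": buffer.min(), "max": buffer.max()}

      rng = np.random.default_rng(2)
      samples = 800.0 + rng.normal(0.0, 0.5, BUFFER_LEN)   # illustrative pressures (Pa)
      print(BUFFER_LEN * SAMPLE_INTERVAL_S / 60.0)          # 8.53 min per buffer
      print(summarize(samples))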

  1. Improving BDS Autonomous Orbit Determination Performance Using Onboard Accelerometers

    Directory of Open Access Journals (Sweden)

    QIAO Jing

    2017-05-01

    Full Text Available Autonomous orbit determination is a crucial step for GNSS development to reduce GNSS vulnerability and improve integrity, reliability and robustness. The newly launched BeiDou (BD) satellites are capable of conducting satellite-to-satellite tracking (SST), which can be used for autonomous orbit determination. However, using SST data only, the BD satellite system (BDS) will exhibit whole-constellation rotation over time in the absence of absolute constraints from the ground or other celestial bodies, due to various force perturbations. The perturbations can be categorized into conservative and non-conservative forces. The conservative forces, such as the Earth's non-spherical perturbations, tidal perturbations, and the solar, lunar and other third-body perturbations, can be precisely modeled with the latest force models. The non-conservative forces (i.e., solar radiation pressure, SRP), on the other hand, are difficult to model precisely and are the main factors affecting satellite orbit determination accuracy. In recent years, accelerometers onboard satellites have been used to directly measure the non-conservative forces for gravity recovery and atmosphere studies, as in the GRACE, CHAMP and GOCE missions. This study investigates the feasibility of using accelerometers onboard BD satellites to improve BD autonomous orbit determination accuracy and service span. Using simulated BD orbit and SST data, together with the error models of existing space-borne accelerometers, the orbit determination accuracy for the BD constellation is evaluated using either SST data only or SST data with accelerometers. An empirical SRP model is used to extract the non-conservative forces. The simulation results show that the orbit determination accuracy using SST with accelerometers is significantly better than that with SST data only. Assuming 0.33 m random noise and decimeter-level signal transponder system biases in the SST data, decimeter-level orbit accuracy for IGSO and MEO satellites can be

  2. Enabling innovative healthcare delivery through the use of focussed factory model: case of spine clinic of the future

    NARCIS (Netherlands)

    Wickramasinghe, N.; Bloemendal, J.W.; de Bruin, A.K.; Krabbendam, Johannes Jacobus

    2005-01-01

    Abstract: This paper discusses the concept of the focused factory model. We highlight that the focused factory model combines one of the key generic strategies identified by Michael Porter (1985) with ideas and concepts from manufacturing. The genesis of this model has its roots in trying to

  3. High-Speed On-Board Data Processing for Science Instruments

    Science.gov (United States)

    Beyon, Jeffrey Y.; Ng, Tak-Kwong; Lin, Bing; Hu, Yongxiang; Harrison, Wallace

    2014-01-01

    Development of a new on-board data processing platform has been in progress at NASA Langley Research Center since April 2012, and an overall review of this work is presented in this paper. The project is called High-Speed On-Board Data Processing for Science Instruments (HOPS) and focuses on a high-speed, scalable data processing platform for three National Research Council Decadal Survey missions: Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS), Aerosol-Cloud-Ecosystems (ACE), and Doppler Aerosol Wind Lidar (DAWN) 3-D Winds. HOPS utilizes advanced general-purpose computing with Field Programmable Gate Array (FPGA) based algorithm implementation techniques. The significance of HOPS is that it enables high-speed on-board data processing for current and future science missions with a reconfigurable and scalable data processing platform. A single HOPS processing board is expected to provide approximately 66 times faster data processing for ASCENDS, more than 70% reduction in both power and weight, and about two orders of magnitude in cost reduction compared to the state-of-the-art (SOA) on-board data processing system. These benchmark predictions are based on data from when HOPS was originally proposed in August 2011. The details of these improvement measures are also presented. The two facets of HOPS development are identifying the most computationally intensive algorithm segments of each mission and implementing them in an FPGA-based data processing board. A general introduction to these facets is also a purpose of this paper.

  4. LAT Onboard Science: Gamma-Ray Burst Identification

    International Nuclear Information System (INIS)

    Kuehn, Frederick; Hughes, Richard; Smith, Patrick; Winer, Brian; Bonnell, Jerry; Norris, Jay; Ritz, Steven; Russell, James

    2007-01-01

    The main goal of the Large Area Telescope (LAT) onboard science program is to provide quick identification and localization of Gamma-Ray Bursts (GRBs) onboard the LAT for follow-up observations by other observatories. The GRB identification and localization algorithm will provide celestial coordinates with an error region that will be distributed via the Gamma-ray burst Coordinate Network (GCN). We present results that show our sensitivity to bursts as characterized using Monte Carlo simulations of the GLAST observatory. We describe and characterize the method of onboard track determination and the GRB identification and localization algorithm. Onboard track determination is considerably different from the on-ground case, resulting in a substantially altered point spread function. The algorithm contains tunable parameters which may be adjusted after launch, once the characteristics of real bursts at very high energies have been identified.

  5. Onboard Blackbody Calibrator Component Development for IR Remote Sensing Instrumentation

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this study is to apply and to provide a reliable, stable durable onboard blackbody calibrator to future Earth Science missions by infusing the new...

  6. Mechanistic models enable the rational use of in vitro drug-target binding kinetics for better drug effects in patients.

    Science.gov (United States)

    de Witte, Wilhelmus E A; Wong, Yin Cheong; Nederpelt, Indira; Heitman, Laura H; Danhof, Meindert; van der Graaf, Piet H; Gilissen, Ron A H J; de Lange, Elizabeth C M

    2016-01-01

    Drug-target binding kinetics are major determinants of the time course of drug action for several drugs, as clearly described for the irreversible binders omeprazole and aspirin. This supports the increasing interest in incorporating newly developed high-throughput assays for drug-target binding kinetics in drug discovery. A meaningful application of in vitro drug-target binding kinetics in drug discovery requires insight into the relation between the in vivo drug effect and in vitro measured drug-target binding kinetics. In this review, the authors discuss both the relation between in vitro and in vivo measured binding kinetics and the relation between in vivo binding kinetics, target occupancy and effect profiles. More scientific evidence is required for the rational selection and development of drug candidates on the basis of in vitro estimates of drug-target binding kinetics. To elucidate the value of in vitro binding kinetics measurements, it is necessary to obtain information on the system-specific properties which influence the kinetics of target occupancy and drug effect. Mathematical integration of this information enables the identification of drug-specific properties which lead to optimal target occupancy and drug effect in patients.

  7. Development of three-dimensional patient face model that enables real-time collision detection and cutting operation for a dental simulator.

    Science.gov (United States)

    Yamaguchi, Satoshi; Yamada, Yuya; Yoshida, Yoshinori; Noborio, Hiroshi; Imazato, Satoshi

    2012-01-01

    The virtual reality (VR) simulator is a useful tool for developing dental hand skills. However, VR simulations that include patient reactions leave limited computational time for reproducing a face model. Our aim was to develop a patient face model that enables real-time collision detection and cutting operations by using stereolithography (STL) and deterministic finite automaton (DFA) data files. We evaluated the dependence of computational cost on the way the STL and DFA data files are combined, constructed the patient face model using the optimum combination, and assessed the computational costs of the do-nothing, collision, cutting, and combined collision-and-cutting operations. The face model was successfully constructed with low computational costs of 11.3, 18.3, 30.3, and 33.5 ms for do-nothing, collision, cutting, and collision-and-cutting, respectively. The patient face model could be useful for developing dental hand skills with VR.
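
    The paper's STL/DFA scheme is not reproduced here; the sketch below only illustrates a generic broad-phase collision query between a spherical tool tip and mesh vertices (hypothetical mesh and radius), the kind of test that must complete within a few milliseconds per frame:

      import numpy as np
      from scipy.spatial import cKDTree

      # Generic illustration (not the authors' STL/DFA method): a broad-phase
      # collision query between a spherical tool tip and the vertices of a
      # triangulated face model, fast enough to run every graphics frame.
      rng = np.random.default_rng(3)
      vertices = rng.uniform(-0.1, 0.1, size=(50_000, 3))   # hypothetical mesh (m)
      tree = cKDTree(vertices)                               # built once, reused

      def tool_collides(tip_position, tool_radius=0.002):
          """Return True if any mesh vertex lies within the tool-tip sphere."""
          return len(tree.query_ball_point(tip_position, r=tool_radius)) > 0

      print(tool_collides(np.array([0.0, 0.0, 0.0])))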

  8. Three-Dimensional Human iPSC-Derived Artificial Skeletal Muscles Model Muscular Dystrophies and Enable Multilineage Tissue Engineering

    Directory of Open Access Journals (Sweden)

    Sara Martina Maffioletti

    2018-04-01

    Full Text Available Summary: Generating human skeletal muscle models is instrumental for investigating muscle pathology and therapy. Here, we report the generation of three-dimensional (3D) artificial skeletal muscle tissue from human pluripotent stem cells, including induced pluripotent stem cells (iPSCs) from patients with Duchenne, limb-girdle, and congenital muscular dystrophies. 3D skeletal myogenic differentiation of pluripotent cells was induced within hydrogels under tension to provide myofiber alignment. Artificial muscles recapitulated characteristics of human skeletal muscle tissue and could be implanted into immunodeficient mice. Pathological cellular hallmarks of incurable forms of severe muscular dystrophy could be modeled with high fidelity using this 3D platform. Finally, we show generation of fully human iPSC-derived, complex, multilineage muscle models containing key isogenic cellular constituents of skeletal muscle, including vascular endothelial cells, pericytes, and motor neurons. These results lay the foundation for a human skeletal muscle organoid-like platform for disease modeling, regenerative medicine, and therapy development. Maffioletti et al. generate human 3D artificial skeletal muscles from healthy donors and patient-specific pluripotent stem cells. These human artificial muscles accurately model severe genetic muscle diseases. They can be engineered to include other cell types present in skeletal muscle, such as vascular cells and motor neurons. Keywords: skeletal muscle, pluripotent stem cells, iPS cells, myogenic differentiation, tissue engineering, disease modeling, muscular dystrophy, organoids

  9. Genome-scale modeling using flux ratio constraints to enable metabolic engineering of clostridial metabolism in silico.

    Science.gov (United States)

    McAnulty, Michael J; Yen, Jiun Y; Freedman, Benjamin G; Senger, Ryan S

    2012-05-14

    Genome-scale metabolic networks and flux models are an effective platform for linking an organism's genotype to its phenotype. However, few modeling approaches offer predictive capabilities to evaluate potential metabolic engineering strategies in silico. A new method called "flux balance analysis with flux ratios (FBrAtio)" was developed in this research and applied to a new genome-scale model of Clostridium acetobutylicum ATCC 824 (iCAC490) that contains 707 metabolites and 794 reactions. FBrAtio was used to model wild-type metabolism and metabolically engineered strains of C. acetobutylicum where only flux ratio constraints and thermodynamic reversibility of reactions were required. The FBrAtio approach allowed solutions to be found through standard linear programming. Five flux ratio constraints were required to achieve a qualitative picture of wild-type metabolism for C. acetobutylicum for the production of: (i) acetate, (ii) lactate, (iii) butyrate, (iv) acetone, (v) butanol, (vi) ethanol, (vii) CO2 and (viii) H2. Results of this simulation study coincide with published experimental results and show that knockdown of the acetoacetyl-CoA transferase increases butanol-to-acetone selectivity, while simultaneous over-expression of the aldehyde/alcohol dehydrogenase greatly increases ethanol production. FBrAtio is a promising new method for constraining genome-scale models using internal flux ratios. The method was effective for modeling wild-type and engineered strains of C. acetobutylicum.
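
    As a sketch of the idea only (a toy three-reaction network, not the iCAC490 model, with hypothetical reactions, bounds and ratio), a flux-ratio constraint can be linearized and handed to an ordinary LP solver such as scipy.optimize.linprog:

      import numpy as np
      from scipy.optimize import linprog

      # Toy network: uptake -> A; A -> P1 (v2); A -> P2 (v3).  Fluxes v = [v1, v2, v3].
      A_eq = np.array([[1.0, -1.0, -1.0]])      # steady state for metabolite A
      b_eq = np.array([0.0])

      # Flux ratio constraint v2 / (v2 + v3) <= 0.7, linearized: 0.3*v2 - 0.7*v3 <= 0
      A_ub = np.array([[0.0, 0.3, -0.7]])
      b_ub = np.array([0.0])

      c = np.array([0.0, -1.0, 0.0])            # maximize v2 (linprog minimizes)
      bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 (arbitrary units)

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                    bounds=bounds, method="highs")
      print(res.x)                              # [10, 7, 3]: the ratio constraint binds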

  10. Adaptive learning in a compartmental model of visual cortex—how feedback enables stable category learning and refinement

    Science.gov (United States)

    Layher, Georg; Schrodt, Fabian; Butz, Martin V.; Neumann, Heiko

    2014-01-01

    The categorization of real world objects is often reflected in the similarity of their visual appearances. Such categories of objects do not necessarily form disjunct sets of objects, neither semantically nor visually. The relationship between categories can often be described in terms of a hierarchical structure. For instance, tigers and leopards build two separate mammalian categories, both of which are subcategories of the category Felidae. In the last decades, the unsupervised learning of categories of visual input stimuli has been addressed by numerous approaches in machine learning as well as in computational neuroscience. However, the question of what kind of mechanisms might be involved in the process of subcategory learning, or category refinement, remains a topic of active investigation. We propose a recurrent computational network architecture for the unsupervised learning of categorial and subcategorial visual input representations. During learning, the connection strengths of bottom-up weights from input to higher-level category representations are adapted according to the input activity distribution. In a similar manner, top-down weights learn to encode the characteristics of a specific stimulus category. Feedforward and feedback learning in combination realize an associative memory mechanism, enabling the selective top-down propagation of a category's feedback weight distribution. We suggest that the difference between the expected input encoded in the projective field of a category node and the current input pattern controls the amplification of feedforward-driven representations. Large enough differences trigger the recruitment of new representational resources and the establishment of additional (sub-) category representations. We demonstrate the temporal evolution of such learning and show how the proposed combination of an associative memory with a modulatory feedback integration successfully establishes category and subcategory representations
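
    The following is a minimal sketch of only the recruit-or-refine mechanism described above (hypothetical similarity measure, mismatch threshold and learning rate; not the compartmental network itself): an input that mismatches every stored prototype beyond a threshold recruits a new category, otherwise the best-matching prototype is adapted toward it.

      import numpy as np

      def learn(inputs, mismatch_threshold=0.4, learning_rate=0.1):
          """Recruit a new prototype when the best cosine match is too poor,
          otherwise refine the best-matching prototype toward the input."""
          prototypes = []
          for x in inputs:
              x = x / np.linalg.norm(x)
              if prototypes:
                  sims = [p @ x for p in prototypes]        # cosine similarity
                  best = int(np.argmax(sims))
                  if 1.0 - sims[best] < mismatch_threshold:
                      p = prototypes[best] + learning_rate * (x - prototypes[best])
                      prototypes[best] = p / np.linalg.norm(p)
                      continue
              prototypes.append(x)                          # recruit new category
          return prototypes

      rng = np.random.default_rng(4)
      centers = rng.normal(size=(2, 8))                     # two synthetic categories
      data = np.vstack([c + rng.normal(0, 0.2, (50, 8)) for c in centers])
      print(len(learn(rng.permutation(data))), "categories recruited")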

  11. Adaptive learning in a compartmental model of visual cortex - how feedback enables stable category learning and refinement

    Directory of Open Access Journals (Sweden)

    Georg eLayher

    2014-12-01

    Full Text Available The categorization of real world objects is often reflected in the similarity of their visual appearances. Such categories of objects do not necessarily form disjunct sets of objects, neither semantically nor visually. The relationship between categories can often be described in terms of a hierarchical structure. For instance, tigers and leopards build two separate mammalian categories, but both belong to the category of felines. In other words, tigers and leopards are subcategories of the category Felidae. In the last decades, the unsupervised learning of categories of visual input stimuli has been addressed by numerous approaches in machine learning as well as in the computational neurosciences. However, the question of what kind of mechanisms might be involved in the process of subcategory learning, or category refinement, remains a topic of active investigation. We propose a recurrent computational network architecture for the unsupervised learning of categorial and subcategorial visual input representations. During learning, the connection strengths of bottom-up weights from input to higher-level category representations are adapted according to the input activity distribution. In a similar manner, top-down weights learn to encode the characteristics of a specific stimulus category. Feedforward and feedback learning in combination realize an associative memory mechanism, enabling the selective top-down propagation of a category's feedback weight distribution. We suggest that the difference between the expected input encoded in the projective field of a category node and the current input pattern controls the amplification of feedforward-driven representations. Large enough differences trigger the recruitment of new representational resources and the establishment of (sub-)category representations. We demonstrate the temporal evolution of such learning and show how the approach successfully establishes category and subcategory

  12. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies to bridge the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway to model these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest, such as tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project

  13. An Onboard ISS Virtual Reality Trainer

    Science.gov (United States)

    Miralles, Evelyn

    2013-01-01

    Prior to the retirement of the Space Shuttle, many exterior repairs on the International Space Station (ISS) were carried out by shuttle astronauts, trained on the ground and flown to the Station to perform these specific repairs. With the retirement of the shuttle, this is no longer an available option. As such, the need for ISS crew members to review scenarios while in flight, either for tasks they already trained for on the ground or for contingency operations, has become a very critical issue. NASA astronauts prepare for Extra-Vehicular Activities (EVAs) or spacewalks through numerous training media, such as self-study, part-task training, underwater training in the Neutral Buoyancy Laboratory (NBL), hands-on hardware reviews and training at the Virtual Reality Laboratory (VRLab). In many situations, the time between the last training session and an EVA task might be 6 to 8 months. EVA tasks are critical for a mission, and as time passes the crew members may lose proficiency on previously trained tasks; their options to refresh or learn a new skill while in flight are limited to reading training materials and watching videos. In addition, there is an increased need for unplanned contingency repairs to fix problems arising as the Station ages. In order to help the ISS crew members maintain EVA proficiency or train for contingency repairs during their mission, the Johnson Space Center's VRLab designed an immersive ISS Virtual Reality Trainer (VRT). The VRT incorporates a unique optical system that makes use of the already successful Dynamic On-board Ubiquitous Graphics (DOUG) software to assist crew members with procedure reviews and contingency EVAs while on board the Station. The need to train and re-train crew members for EVAs and contingency scenarios is crucial and extremely demanding. ISS crew members are now asked to perform EVA tasks for which they have not been trained and potentially have never seen before. The Virtual Reality Trainer (VRT

  14. Investigating a model for lecturer training that enables lecturers to plan and carry out meaningful e-learning activities

    DEFF Research Database (Denmark)

    Kjær, Christopher; Hansen, Pernille Stenkil; Christensen, Inger-Marie F.

    2014-01-01

    This paper reports on the effect of a lecturer training model in the shape of an e-learning project based on research on adult and work-based learning. A survey was conducted to explore participants' learning experiences. Findings show high overall satisfaction, motivation and engagement...

  15. Three-Dimensional Human iPSC-Derived Artificial Skeletal Muscles Model Muscular Dystrophies and Enable Multilineage Tissue Engineering.

    Science.gov (United States)

    Maffioletti, Sara Martina; Sarcar, Shilpita; Henderson, Alexander B H; Mannhardt, Ingra; Pinton, Luca; Moyle, Louise Anne; Steele-Stallard, Heather; Cappellari, Ornella; Wells, Kim E; Ferrari, Giulia; Mitchell, Jamie S; Tyzack, Giulia E; Kotiadis, Vassilios N; Khedr, Moustafa; Ragazzi, Martina; Wang, Weixin; Duchen, Michael R; Patani, Rickie; Zammit, Peter S; Wells, Dominic J; Eschenhagen, Thomas; Tedesco, Francesco Saverio

    2018-04-17

    Generating human skeletal muscle models is instrumental for investigating muscle pathology and therapy. Here, we report the generation of three-dimensional (3D) artificial skeletal muscle tissue from human pluripotent stem cells, including induced pluripotent stem cells (iPSCs) from patients with Duchenne, limb-girdle, and congenital muscular dystrophies. 3D skeletal myogenic differentiation of pluripotent cells was induced within hydrogels under tension to provide myofiber alignment. Artificial muscles recapitulated characteristics of human skeletal muscle tissue and could be implanted into immunodeficient mice. Pathological cellular hallmarks of incurable forms of severe muscular dystrophy could be modeled with high fidelity using this 3D platform. Finally, we show generation of fully human iPSC-derived, complex, multilineage muscle models containing key isogenic cellular constituents of skeletal muscle, including vascular endothelial cells, pericytes, and motor neurons. These results lay the foundation for a human skeletal muscle organoid-like platform for disease modeling, regenerative medicine, and therapy development. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  16. The chain of care enabling tPA treatment in acute ischemic stroke : a comprehensive review of organisational models

    NARCIS (Netherlands)

    Lahr, Maarten M. H.; Luijckx, Gert-Jan; Vroomen, Patrick; van der Zee, D.J.; Buskens, Erik

    Protracted and partial implementation of treatment with intravenous tissue plasminogen activator (tPA) within 4.5 h after acute stroke onset results in potentially eligible patients not receiving optimal treatment. The goal of this study was to review the performance of various organisational models

  17. Toward a Conceptual Model for Social Mechanisms Enabling Knowledge Sharing: Dynamic Relationships among Three Dimensions of Social Capital

    Science.gov (United States)

    Jo, Sung Jun

    2008-01-01

    Knowledge sharing is important because individual knowledge is not transformed into organizational knowledge until it is shared. The conceptual model presents how social factors create the conditions for effective knowledge sharing. It illustrates how the three dimensions of social capital interact with each other and with knowledge sharing. Social…

  18. Fiber optical sensing on-board communication satellites

    Science.gov (United States)

    Hurni, A.; Lemke, N. M. K.; Roner, M.; Obermaier, J.; Putzer, P.; Kuhenuri Chami, N.

    2017-11-01

    Striving constantly to reduce the mass, AIT effort and overall cost of the classical point-to-point wired temperature sensor harness on-board telecommunication satellites, OHB System (formerly Kayser-Threde) has introduced the Hybrid Sensor Bus (HSB) system. As a future spacecraft platform element, HSB relies on electrical remote sensor units as well as fiber-optical sensors, both of which can be serially connected in a bus architecture. HSB is a modular measurement system with many applications, thanks also to the opportunities offered by the digital I²C bus. The emphasis, however, is on the introduction of fiber optics, and especially fiber-Bragg grating (FBG) temperature sensors, as a disruptive innovation for the company's satellite platforms. The lightweight FBG sensors are directly inscribed in mechanically robust and radiation-tolerant fibers, reducing the need for optical fiber connectors and splices to a minimum. Wherever an FBG sensor is to be used, the fiber is glued, together with a corresponding temperature transducer, to the satellite's structure or to a subsystem. The transducer is necessary to provide decoupling of mechanical stress while simultaneously ensuring high thermal conductivity. HSB has been developed in the frame of an ESA-ARTES program with European and German co-funding and will be verified as a flight demonstrator on-board the German Heinrich Hertz satellite (H2Sat). In this paper the Engineering Model development of HSB is presented and a Fiber-optical Sensor Multiplexer for a more flexible sensor bus architecture is introduced. The HSB system aims at telecommunication satellite platforms with an operational lifetime beyond 15 years in geostationary orbit. It offers high compatibility with existing platforms in terms of performance and interfaces, while having been designed with future applications with increased radiation exposure already in mind. In its basic configuration HSB consists of four modules, which are the Power Supply Unit, the HSB
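
    For orientation only, the usual first-order FBG temperature relation (with textbook silica coefficients, not the HSB calibration) converts a Bragg-wavelength shift into a temperature change:

      # Illustrative only: delta_lambda = lambda_B * (alpha + xi) * delta_T,
      # with assumed textbook values for silica, not HSB calibration data.
      LAMBDA_B = 1550e-9        # Bragg wavelength (m), assumed
      ALPHA    = 0.55e-6        # thermal expansion coefficient of silica (1/K)
      XI       = 6.7e-6         # thermo-optic coefficient of silica (1/K)

      def temperature_shift(delta_lambda_m):
          """Temperature change inferred from a Bragg-wavelength shift."""
          return delta_lambda_m / (LAMBDA_B * (ALPHA + XI))

      # A 10 pm shift corresponds to roughly +0.9 K at 1550 nm.
      print(temperature_shift(10e-12))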

  19. High throughput ADME screening: practical considerations, impact on the portfolio and enabler of in silico ADME models.

    Science.gov (United States)

    Hop, Cornelis E C A; Cole, Mark J; Davidson, Ralph E; Duignan, David B; Federico, James; Janiszewski, John S; Jenkins, Kelly; Krueger, Suzanne; Lebowitz, Rebecca; Liston, Theodore E; Mitchell, Walter; Snyder, Mark; Steyn, Stefan J; Soglia, John R; Taylor, Christine; Troutman, Matt D; Umland, John; West, Michael; Whalen, Kevin M; Zelesky, Veronica; Zhao, Sabrina X

    2008-11-01

    Evaluation and optimization of drug metabolism and pharmacokinetic data play an important role in drug discovery and development, and several reliable in vitro ADME models are available. Recently, higher-throughput in vitro ADME screening facilities have been established in order to evaluate an appreciable fraction of synthesized compounds. The ADME screening process can be dissected into five distinct steps: (1) plate management of compounds in need of in vitro ADME data, (2) optimization of the MS/MS method for the compounds, (3) in vitro ADME experiments and sample clean-up, (4) collection and reduction of the raw LC-MS/MS data and (5) archival of the processed ADME data. All steps are described in detail, and the value of the data to drug discovery projects is discussed as well. Finally, in vitro ADME screening can generate large quantities of data obtained under identical conditions, allowing the building of reliable in silico models.

  20. Shifts in nitrogen acquisition strategies enable enhanced terrestrial carbon storage under elevated CO2 in a global model

    Science.gov (United States)

    Sulman, B. N.; Brzostek, E. R.; Menge, D.; Malyshev, S.; Shevliakova, E.

    2017-12-01

    Earth System Model (ESM) projections of terrestrial carbon (C) uptake are critical to understanding the future of the global C cycle. Current ESMs include intricate representations of photosynthetic C fixation in plants, allowing them to simulate the stimulatory effect of increasing atmospheric CO2 levels on photosynthesis. However, they lack sophisticated representations of plant nutrient acquisition, calling into question their ability to project the future land C sink. We conducted simulations using a new model of terrestrial C and nitrogen (N) cycling within the Geophysical Fluid Dynamics Laboratory (GFDL) global land model LM4 that uses a return on investment framework to simulate global patterns of N acquisition via fixation of N2 from the atmosphere, scavenging of inorganic N from soil solution, and mining of organic N from soil organic matter (SOM). We show that these strategies drive divergent C cycle responses to elevated CO2 at the ecosystem scale, with the scavenging strategy leading to N limitation of plant growth and the mining strategy facilitating stimulation of plant biomass accumulation over decadal time scales. In global simulations, shifts in N acquisition from inorganic N scavenging to organic N mining along with increases in N fixation supported long-term acceleration of C uptake under elevated CO2. Our results indicate that the ability of the land C sink to mitigate atmospheric CO2 levels is tightly coupled to the functional diversity of ecosystems and their capacity to change their N acquisition strategies over time. Incorporation of these mechanisms into ESMs is necessary to improve confidence in model projections of the global C cycle.
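
    Purely as an illustration of the return-on-investment idea (the numbers below are invented, not taken from the model), a strategy choice can be framed as picking the lowest carbon cost per unit nitrogen under the prevailing conditions:

      # Hypothetical carbon costs (gC per gN acquired); not model output.
      def cheapest_strategy(carbon_cost_per_gN):
          return min(carbon_cost_per_gN, key=carbon_cost_per_gN.get)

      ambient      = {"scavenging": 2.0, "mining": 4.0, "fixation": 9.0}
      elevated_co2 = {"scavenging": 6.0, "mining": 4.5, "fixation": 8.0}  # inorganic N depleted

      print(cheapest_strategy(ambient))        # scavenging
      print(cheapest_strategy(elevated_co2))   # mining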

  1. A New Approach to Predict Microbial Community Assembly and Function Using a Stochastic, Genome-Enabled Modeling Framework

    Science.gov (United States)

    King, E.; Brodie, E.; Anantharaman, K.; Karaoz, U.; Bouskill, N.; Banfield, J. F.; Steefel, C. I.; Molins, S.

    2016-12-01

    Characterizing and predicting the microbial and chemical compositions of subsurface aquatic systems necessitates an understanding of the metabolism and physiology of organisms that are often uncultured or studied under conditions not relevant to the environment of interest. Cultivation-independent approaches are therefore important and have greatly enhanced our ability to characterize functional microbial diversity. The capability to reconstruct genomes representing thousands of populations from microbial communities using metagenomic techniques provides a foundation for the development of predictive models for community structure and function. Here, we discuss a genome-informed stochastic trait-based model incorporated into a reactive transport framework to represent the activities of coupled guilds of hypothetical microorganisms. Metabolic pathways for each microbe within a functional guild are parameterized from metagenomic data with a unique combination of traits governing organism fitness under dynamic environmental conditions. We simulate the thermodynamics of coupled electron donor and acceptor reactions to predict the energy available for cellular maintenance, respiration, biomass development, and enzyme production. While 'omics analyses can now characterize the metabolic potential of microbial communities, it is functionally redundant as well as computationally prohibitive to explicitly include the thousands of recovered organisms in biogeochemical models. However, one can derive potential metabolic pathways from genomes, along with trait linkages, to build probability distributions of traits. These distributions are used to assemble groups of microbes that couple one or more of these pathways. From the initial ensemble of microbes, only a subset will persist based on the interaction of their physiological and metabolic traits with environmental conditions, competing organisms, etc. Here, we analyze the predicted niches of these hypothetical microbes and
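
    One common way to couple kinetics and thermodynamics in such models (a sketch with hypothetical parameter values, not necessarily the formulation used here) is a dual-Monod uptake rate multiplied by a thermodynamic limiting factor:

      import numpy as np

      R = 8.314e-3      # kJ/(mol*K)
      T = 298.15        # K

      def rate(donor, acceptor, dG_rxn,
               vmax=1.0, K_d=0.1, K_a=0.05, dG_min=10.0):
          """Dual-Monod uptake rate limited by a thermodynamic factor.
          dG_rxn: Gibbs energy of the catabolic reaction (kJ/mol, negative = favorable).
          dG_min: minimum energy that must be conserved (kJ/mol), assumed."""
          kinetic = (donor / (K_d + donor)) * (acceptor / (K_a + acceptor))
          f_t = max(0.0, 1.0 - np.exp((dG_rxn + dG_min) / (R * T)))
          return vmax * kinetic * f_t

      # Far from equilibrium the reaction runs near its kinetic limit;
      # close to equilibrium the thermodynamic factor shuts it down.
      print(rate(donor=1.0, acceptor=1.0, dG_rxn=-80.0))
      print(rate(donor=1.0, acceptor=1.0, dG_rxn=-12.0))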

  2. A non-parametric mixture model for genome-enabled prediction of genetic value for a quantitative trait.

    Science.gov (United States)

    Gianola, Daniel; Wu, Xiao-Lin; Manfredi, Eduardo; Simianer, Henner

    2010-10-01

    A Bayesian nonparametric form of regression based on Dirichlet process priors is adapted to the analysis of quantitative traits possibly affected by cryptic forms of gene action, and to the context of SNP-assisted genomic selection, where the main objective is to predict a genomic signal on phenotype. The procedure clusters unknown genotypes into groups with distinct genetic values, but in a setting in which the number of clusters is unknown a priori, so that standard methods for finite mixture analysis do not work. The central assumption is that genetic effects follow an unknown distribution with some "baseline" family, which is a normal process in the cases considered here. A Bayesian analysis based on the Gibbs sampler produces estimates of the number of clusters, posterior means of genetic effects, a measure of credibility in the baseline distribution, as well as estimates of parameters of the latter. The procedure is illustrated with a simulation representing two populations. In the first one, there are 3 unknown QTL, with additive, dominance and epistatic effects; in the second, there are 10 QTL with additive, dominance and additive × additive epistatic effects. In the two populations, baseline parameters are inferred correctly. The Dirichlet process model infers the number of unique genetic values correctly in the first population, but it understates the number in the second one; here, the true number of clusters is over 900, and the model gives a posterior mean estimate of about 140, probably because more replication of genotypes is needed for correct inference. The impact on inferences of the prior distribution of a key parameter (M), and of the extent of replication, was examined via an analysis of mean body weight in 192 paternal half-sib families of broiler chickens, where each sire was genotyped for nearly 7,000 SNPs. In this small sample, it was found that inference about the number of clusters was affected by the prior distribution of M. For a
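
    To see how the concentration parameter M shapes the prior number of clusters, the Chinese restaurant process that underlies a Dirichlet process prior can be simulated directly (a sketch only, not the paper's Gibbs sampler; the sample size 192 simply echoes the half-sib families mentioned above):

      import numpy as np

      def crp_clusters(n, M, rng):
          """Number of clusters from one Chinese restaurant process draw: item i
          joins an existing cluster with probability proportional to its size,
          or starts a new cluster with probability proportional to M."""
          counts = []
          for _ in range(n):
              probs = np.array(counts + [M], dtype=float)
              probs /= probs.sum()
              k = rng.choice(len(probs), p=probs)
              if k == len(counts):
                  counts.append(1)
              else:
                  counts[k] += 1
          return len(counts)

      rng = np.random.default_rng(5)
      for M in (0.5, 2.0, 10.0):
          draws = [crp_clusters(192, M, rng) for _ in range(200)]
          print(M, np.mean(draws))     # prior mean number of clusters grows with M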

  3. Reduced-order modeling (ROM) for simulation and optimization powerful algorithms as key enablers for scientific computing

    CERN Document Server

    Milde, Anja; Volkwein, Stefan

    2018-01-01

    This edited monograph collects research contributions and addresses the advancement of efficient numerical procedures in the area of model order reduction (MOR) for simulation, optimization and control. The topical scope includes, but is not limited to, new out-of-the-box algorithmic solutions for scientific computing, e.g. reduced basis methods for industrial problems and MOR approaches for electrochemical processes. The target audience comprises research experts and practitioners in the field of simulation, optimization and control, but the book may also be beneficial for graduate students.

  4. SuperAGILE onboard electronics and ground test instrumentation

    International Nuclear Information System (INIS)

    Pacciani, Luigi; Morelli, Ennio; Rubini, Alda; Mastropietro, Marcello; Porrovecchio, Geiland; Costa, Enrico; Del Monte, Ettore; Donnarumma, Immacolata; Evangelista, Yuri; Feroci, Marco; Lazzarotto, Francesco; Rapisarda, Massimo; Soffitta, Paolo

    2007-01-01

    In this paper we describe the electronics of the SuperAGILE X-ray imager on-board the AGILE satellite and the instrumentation developed to test and improve the front-end and digital electronics of the flight model of the imager. Although the working principle of the instrument is very well established, and the conceptual scheme simple, the budget and mechanical constraints of the AGILE small mission made it necessary to introduce new elements in SuperAGILE, regarding both the mechanics and the electronics. In fact, the instrument is contained in a ∼44x44x16 cm³ volume, but the required performance is quite ambitious, leading us to equip a sensitive area of ∼1350 cm² with 6144 silicon microstrip detectors with a pitch of 121 μm and a total length of ∼18.2 cm. The result is a very light and power-efficient imager with good sensitivity (∼15 mCrab in 1 day in 15-45 keV), high angular resolution (6 arcmin) and coarse spectral resolution. The test equipment is versatile, and can easily be modified to test front-end electronics (FEE) based on self-triggered, data-driven and sparse-readout ASICs such as the XA family of chips
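
    A one-line consistency check of the detector geometry quoted above (strip count × pitch × strip length) reproduces the stated sensitive area:

      n_strips  = 6144
      pitch_cm  = 121e-4        # 121 micrometres, expressed in cm
      length_cm = 18.2

      print(round(n_strips * pitch_cm * length_cm))   # ~1353 cm^2, consistent with ~1350 cm^2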

  5. Development of a canine model to enable the preclinical assessment of pH-dependent absorption of test compounds.

    Science.gov (United States)

    Fancher, R Marcus; Zhang, Hongjian; Sleczka, Bogdan; Derbin, George; Rockar, Richard; Marathe, Punit

    2011-07-01

    A preclinical canine model capable of predicting a compound's potential for pH-dependent absorption in humans was developed. This involved the surgical insertion of a gastrostomy feeding tube into the stomach of a beagle dog. The tube was sutured in position to allow frequent withdrawal of gastric fluid for pH measurement. Therefore, it was possible to measure pH in the stomach and assess the effect of gastric pH-modifying agents on the absorption of various test compounds. Fasted gastric pH in the dog showed considerable inter- and intra-animal variability. Pretreatment of pentagastrin (6 µg/kg intramuscularly) 20 min prior to test compound administration was determined to be adequate for simulating fasting stomach pH in humans. Pretreatment with famotidine [40 mg orally] 1 h prior to test compound administration was determined to be adequate for simulating human gastric pH when acid-reducing agents are coadministered. Pentagastrin and famotidine pretreatments were used to test two discovery compounds and distinct differences in their potential for pH-dependent absorption were observed. The model described herein can be used preclinically to screen out compounds, differentiate compounds, and support the assessment of various formulation- and prodrug-based strategies to mitigate the pH effect. Copyright © 2011 Wiley-Liss, Inc. and the American Pharmacists Association

  6. Computational Laboratory Astrophysics to Enable Transport Modeling of Protons and Hydrogen in Stellar Winds, the ISM, and other Astrophysical Environments

    Science.gov (United States)

    Schultz, David

    As recognized prominently by the APRA program, interpretation of NASA astrophysical mission observations requires significant products of laboratory astrophysics, for example, spectral lines and transition probabilities, electron-, proton-, or heavy-particle collision data. Availability of these data underpin robust and validated models of astrophysical emissions and absorptions, energy, momentum, and particle transport, dynamics, and reactions. Therefore, measured or computationally derived, analyzed, and readily available laboratory astrophysics data significantly enhances the scientific return on NASA missions such as HST, Spitzer, and JWST. In the present work a comprehensive set of data will be developed for the ubiquitous proton-hydrogen and hydrogen-hydrogen collisions in astrophysical environments including ISM shocks, supernova remnants and bubbles, HI clouds, young stellar objects, and winds within stellar spheres, covering the necessary wide range of energy- and charge-changing channels, collision energies, and most relevant scattering parameters. In addition, building on preliminary work, a transport and reaction simulation will be developed incorporating the elastic and inelastic collision data collected and produced. The work will build upon significant previous efforts of the principal investigators and collaborators, will result in a comprehensive data set required for modeling these environments and interpreting NASA astrophysical mission observations, and will benefit from feedback from collaborators who are active users of the work proposed.

  7. Immunoassay for Capsular Antigen of Bacillus anthracis Enables Rapid Diagnosis in a Rabbit Model of Inhalational Anthrax.

    Directory of Open Access Journals (Sweden)

    Marcellene A Gates-Hollingsworth

    Full Text Available Inhalational anthrax is a serious biothreat. Effective antibiotic treatment of inhalational anthrax requires early diagnosis; the further the disease has progressed, the less the likelihood of cure. Current means of diagnosis, such as blood culture, require several days to produce a result and require advanced laboratory infrastructure. An alternative approach to diagnosis is detection of a Bacillus anthracis antigen that is shed into blood and can be detected by rapid immunoassay. The goal of the study was to evaluate detection of poly-γ-D-glutamic acid (PGA), the capsular antigen of B. anthracis, as a biomarker surrogate for blood culture in a rabbit model of inhalational anthrax. The mean time to a positive blood culture was 26 ± 5.7 h (mean ± standard deviation), whereas the mean time to a positive ELISA was 22 ± 4.2 h (P = 0.005 in comparison with blood culture). A lateral flow immunoassay was constructed for detection of PGA in plasma at concentrations of less than 1 ng PGA/ml. Use of the lateral flow immunoassay for detection of PGA in the rabbit model found that antigen was detected somewhat earlier than the earliest time point at which the blood culture became positive. The low cost, ease of use, and rapid time to result of the lateral flow immunoassay format make an immunoassay for PGA a viable surrogate for blood culture for detection of infection in individuals who have a likelihood of exposure to B. anthracis.

  8. Identification of Value Proposition and Development of Innovative Business Models for Demand Response Products and Services Enabled by the DR-BOB Solution

    Directory of Open Access Journals (Sweden)

    Mario Sisinni

    2017-10-01

    Full Text Available The work presented is the result of an ongoing European H2020 project entitled Demand Response in Blocks of Buildings (DR-BOB), which seeks to integrate existing technologies to create a scalable solution for Demand Response (DR) in blocks of buildings. In most EU countries, DR programs are currently limited to the industrial sector and to direct asset control. The DR-BOB solution extends applicability to the building sector, providing predictive building management in blocks of buildings, enabling facilities managers to respond to implicit and explicit DR schemes, and enabling the aggregation of the DR potential of many blocks of buildings for use in demand response markets. The solution consists of three main components: the Local Energy Manager (LEM), which adds intelligence and provides the capacity for predictive building management in blocks of buildings, a Consumer Portal (CP) to enable building managers and building occupants to interact with the system and be engaged in demand response operations, and a Decentralized Energy Management System (DEMS®; Siemens plc, Nottingham, England, UK), which enables the aggregation of the DR potential of many blocks of buildings, thus allowing participation in incentive-based demand response with or without an aggregator. The paper reports the key results around business model development for demand response products and services enabled by the DR-BOB solution. The scope is threefold: (1) illustrate how the functionality of the demand response solution can provide a value proposition to underpin its exploitation by four specific customer segments, namely aggregators and three types of owners of blocks of buildings in different market conditions, (2) explore key aspects of the business model from the point of view of a demand response solution provider, in particular around the most suitable revenue stream and key partnerships, and (3) assess the importance of key variables such as market maturity, user

  9. Genome-enabled Modeling of Microbial Biogeochemistry using a Trait-based Approach. Does Increasing Metabolic Complexity Increase Predictive Capabilities?

    Science.gov (United States)

    King, E.; Karaoz, U.; Molins, S.; Bouskill, N.; Anantharaman, K.; Beller, H. R.; Banfield, J. F.; Steefel, C. I.; Brodie, E.

    2015-12-01

    The biogeochemical functioning of ecosystems is shaped in part by genomic information stored in the subsurface microbiome. Cultivation-independent approaches allow us to extract this information through reconstruction of thousands of genomes from a microbial community. Analysis of these genomes, in turn, gives an indication of the organisms present and their functional roles. However, metagenomic analyses can currently deliver thousands of different genomes that range in abundance/importance, requiring the identification and assimilation of key physiologies and metabolisms to be represented as traits for successful simulation of subsurface processes. Here we focus on incorporating -omics information into BioCrunch, a genome-informed trait-based model that represents the diversity of microbial functional processes within a reactive transport framework. This approach models the rate of nutrient uptake and the thermodynamics of coupled electron donors and acceptors for a range of microbial metabolisms including heterotrophs and chemolithotrophs. Metabolism of exogenous substrates fuels catabolic and anabolic processes, with the proportion of energy used for cellular maintenance, respiration, biomass development, and enzyme production based upon dynamic intracellular and environmental conditions. This internal resource partitioning represents a trade-off against biomass formation and results in microbial community emergence across a fitness landscape. BioCrunch was used here in simulations that included organisms and metabolic pathways derived from a dataset of ~1200 non-redundant genomes reflecting a microbial community in a floodplain aquifer. Metagenomic data were directly used to parameterize trait values related to growth and to identify trait linkages associated with respiration, fermentation, and key enzymatic functions such as plant polymer degradation. Simulations spanned a range of metabolic complexities and highlight benefits originating from simulations

  10. Psycho-Motor and Error Enabled Simulations: Modeling Vulnerable Skills in the Pre Mastery Phase - Medical Practice Initiative Procedural Skill Decay and Maintenance (MPI-PSD)

    Science.gov (United States)

    2016-12-01

    B were set between 10% and 90% of the maximum closed-loop force handled by the device (14.5 N/mm), or between 1.45 and 13.05 N/mm. The effective...include administration of vasoactive medications, rapid resuscitation, total parenteral nutrition, and delivery of caustic medications.2 When considering...Award Number: W81XWH-13-1-0080 TITLE: "Psycho-Motor and Error Enabled Simulations: Modeling Vulnerable Skills in the Pre-Mastery Phase - Medical

  11. Integrated management model. Methodology and software-enabled tool designed to assist a utility in developing a station-wide optimization

    International Nuclear Information System (INIS)

    Llovet, R.; Ibanez, R.; Woodcock, J.

    2005-01-01

    A key concern for utilities today is optimizing station aging and reliability management activities in a manner that maximizes the value of those activities within an affordable budget. The Westinghouse Proactive Asset Management Model is a methodology and software-enabled tool designed to assist a utility in developing a station-wide optimization of those activities. The process and tool support the development of an optimized, station-wide plan for inspection, testing, maintenance, repair and replacement of aging components. The optimization identifies the benefit and optimal timing of those activities based on minimizing unplanned outage costs (avoided costs) and maximizing station Net Present Value. (Author)

  12. Enabling the Analysis of Emergent Behavior in Future Electrical Distribution Systems Using Agent-Based Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Sonja Kolen

    2018-01-01

    Full Text Available In future electrical distribution systems, component heterogeneity and their cyber-physical interactions through electrical lines and communication lead to emergent system behavior. As the distribution systems represent the largest part of an energy system with respect to the number of nodes and components, large-scale studies of their emergent behavior are vital for the development of decentralized control strategies. This paper presents and evaluates DistAIX, a novel agent-based modeling and simulation tool to conduct such studies. The major novelty is a parallelization of the entire model—including the power system, communication system, control, and all interactions—using processes instead of threads. Thereby, a distribution of the simulation to multiple computing nodes with a distributed memory architecture becomes possible. This makes DistAIX scalable and allows the inclusion of as many processing units in the simulation as desired. The scalability of DistAIX is demonstrated by simulations of large-scale scenarios. Additionally, the capability of observing emergent behavior is demonstrated for an exemplary distribution grid with a large number of interacting components.
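
    As a rough illustration of the process-based (rather than thread-based) parallelization highlighted above, the following sketch runs two toy grid components as separate OS processes that exchange state only through message queues. It is not DistAIX code; the component names, coupling rule and message format are invented for the example.

```python
# Toy illustration of process-level parallelism with message passing
# (not the DistAIX code base): each component is an OS process and
# communicates only via queues, as it would with distributed memory.
from multiprocessing import Process, Queue


def component(name, state, inbox, outbox, steps=5):
    for _ in range(steps):
        outbox.put(state)                           # publish own state to the neighbour
        neighbour_state = inbox.get()               # receive the neighbour's state
        state = 0.5 * (state + neighbour_state)     # toy coupling rule
    print(f"{name}: final state {state:.3f}")


if __name__ == "__main__":
    q_ab, q_ba = Queue(), Queue()
    a = Process(target=component, args=("node_A", 1.0, q_ba, q_ab))
    b = Process(target=component, args=("node_B", 0.0, q_ab, q_ba))
    for p in (a, b):
        p.start()
    for p in (a, b):
        p.join()
```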

  13. Reactive Goal Decomposition Hierarchies for On-Board Autonomy

    Science.gov (United States)

    Hartmann, L.

    2002-01-01

    to state and environment and in general can terminate the execution of a decomposition and attempt a new decomposition at any level in the hierarchy. This goal decomposition system is suitable for workstation, microprocessor and fpga implementation and thus is able to support the full range of prototyping activities, from mission design in the laboratory to development of the fpga firmware for the flight system. This approach is based on previous artificial intelligence work including (1) Brooks' subsumption architecture for robot control, (2) Firby's Reactive Action Package System (RAPS) for mediating between high level automated planning and low level execution and (3) hierarchical task networks for automated planning. Reactive goal decomposition hierarchies can be used for a wide variety of on-board autonomy applications including automating low level operation sequences (such as scheduling prerequisite operations, e.g., heaters, warm-up periods, monitoring power constraints), coordinating multiple spacecraft as in formation flying and constellations, robot manipulator operations, rendez-vous, docking, servicing, assembly, on-orbit maintenance, planetary rover operations, solar system and interstellar probes, intelligent science data gathering and disaster early warning. Goal decomposition hierarchies can support high level fault tolerance. Given models of on-board resources and goals to accomplish, the decomposition hierarchy could allocate resources to goals taking into account existing faults and in real-time reallocating resources as new faults arise. Resources to be modeled include memory (e.g., ROM, FPGA configuration memory, processor memory, payload instrument memory), processors, on-board and interspacecraft network nodes and links, sensors, actuators (e.g., attitude determination and control, guidance and navigation) and payload instruments. A goal decomposition hierarchy could be defined to map mission goals and tasks to available on-board resources. As

  14. A New Cyber-enabled Platform for Scale-independent Interoperability of Earth Observations with Hydrologic Models

    Science.gov (United States)

    Rajib, A.; Zhao, L.; Merwade, V.; Shin, J.; Smith, J.; Song, C. X.

    2017-12-01

    Despite the significant potential of remotely sensed earth observations, their application is still not full-fledged in water resources research, management and education. Inconsistent storage structures, data formats and spatial resolution among different platforms/sources of earth observations hinder the use of these data. Available web-services can help bulk data downloading and visualization, but they are not sufficiently tailored to meet the degree of interoperability required for direct application of earth observations in hydrologic modeling at user-defined spatio-temporal scales. Similarly, the least ambiguous way for educators and watershed managers is to instantaneously obtain a time-series at any watershed of interest without spending time and computational resources on data download and post-processing activities. To address this issue, an open access, online platform, named HydroGlobe, is developed that minimizes all these processing tasks and delivers ready-to-use data from different earth observation sources. HydroGlobe can provide spatially-averaged time series of earth observations by using the following inputs: (i) data source, (ii) temporal extent in the form of start/end date, and (iii) geographic units (e.g., grid cell or sub-basin boundary) and extent in the form of GIS shapefile. In its preliminary version, HydroGlobe simultaneously handles five data sources including the surface and root zone soil moisture from SMAP (Soil Moisture Active Passive Mission), actual and potential evapotranspiration from MODIS (Moderate Resolution Imaging Spectroradiometer), and precipitation from GPM (Global Precipitation Measurements). This presentation will demonstrate the HydroGlobe interface and its applicability using a few test cases on watersheds from different parts of the globe.
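
    The core operation described above, reducing a gridded earth-observation field to a spatially averaged time series over a user-supplied geographic unit, can be sketched in a few lines. The snippet below is only an illustration on synthetic data with invented variable names; it is not the HydroGlobe interface.

```python
# Sketch of a HydroGlobe-style reduction: average a gridded field over a
# watershed mask to obtain one time series (synthetic data, not the real API).
import numpy as np


def basin_time_series(field, mask, weights=None):
    """field: (time, y, x) array; mask: (y, x) boolean in-basin cells;
    weights: optional (y, x) cell areas. Returns a (time,) averaged series."""
    w = np.ones(mask.shape) if weights is None else weights
    w = np.where(mask, w, 0.0)
    return (field * w).sum(axis=(1, 2)) / w.sum()


# toy example: 4 daily soil-moisture grids of 3x3 cells, 5-cell basin
rng = np.random.default_rng(0)
field = rng.uniform(0.1, 0.4, size=(4, 3, 3))
mask = np.array([[1, 1, 0], [1, 1, 0], [1, 0, 0]], dtype=bool)
print(basin_time_series(field, mask))
```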

  15. Assessing the ability of isotope-enabled General Circulation Models to simulate the variability of Iceland water vapor isotopic composition

    Science.gov (United States)

    Erla Sveinbjornsdottir, Arny; Steen-Larsen, Hans Christian; Jonsson, Thorsteinn; Ritter, Francois; Riser, Camilla; Messon-Delmotte, Valerie; Bonne, Jean Louis; Dahl-Jensen, Dorthe

    2014-05-01

    During the fall of 2010 we installed an autonomous water vapor spectroscopy laser (Los Gatos Research analyzer) in a lighthouse on the Southwest coast of Iceland (63.83°N, 21.47°W). Despite initial significant problems with volcanic ash, high wind, and attacks by sea gulls, the system has been continuously operational since the end of 2011 with limited down time. The system automatically performs calibration every 2 hours, which results in high accuracy and precision allowing for analysis of the second order parameter, d-excess, in the water vapor. We find a strong linear relationship between d-excess and local relative humidity (RH) when normalized to SST. The observed slope of approximately -45 o/oo/% is similar to theoretical predictions by Merlivat and Jouzel [1979] for a smooth surface, but the calculated intercept is significantly lower than predicted. Despite this good linear agreement with theoretical calculations, mismatches arise between our data and the seasonal cycle of water vapour isotopic composition simulated with the LMDZiso GCM nudged to large-scale winds from atmospheric analyses. The GCM is not able to capture seasonal variations in local RH, nor seasonal variations in d-excess. Based on daily data, the ability of LMDZiso to resolve day-to-day variability is measured by the strength of the correlation coefficient between observations and model outputs. This correlation coefficient reaches ~0.8 for surface absolute humidity, but decreases to ~0.6 for δD and ~0.45 for d-excess. Moreover, the magnitude of day-to-day humidity variations is also underestimated by LMDZiso, which can explain the underestimated magnitude of isotopic depletion. Finally, the simulated and observed d-excess vs. RH relationships have similar slopes. We conclude that the under-estimation of d-excess variability may partly arise from the poor performance of the humidity simulations.
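
    For readers unfamiliar with the second-order parameter referred to above, d-excess is defined as d = δD - 8·δ¹⁸O, and the quoted slope comes from a linear fit of d-excess against SST-normalised relative humidity. The short sketch below shows that computation on synthetic data; the numerical values are illustrative, not the Iceland observations.

```python
# Definition of deuterium excess and a linear fit against relative humidity,
# on synthetic data (slope and intercept below are arbitrary illustrations).
import numpy as np

rng = np.random.default_rng(1)
rh = rng.uniform(60, 90, size=200)            # RH normalised to SST, in %
d18o = rng.normal(-12.0, 0.5, size=200)       # permil
# construct synthetic dD so that d-excess depends linearly on RH
dD = 8.0 * d18o + (40.0 - 0.4 * rh) + rng.normal(0.0, 0.5, size=200)

d_excess = dD - 8.0 * d18o                    # d-excess definition
slope, intercept = np.polyfit(rh, d_excess, 1)
print(f"fitted slope = {slope:.2f} permil per % RH, intercept = {intercept:.1f} permil")
```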

  16. Get Your Hotel Operations Team Onboard The Tricycle of Guest Service

    OpenAIRE

    Kennedy, Doug

    2018-01-01

    As hospitality industry trainers know, using symbols and models can help trainees grasp abstract concepts and make seemingly complex paradigms easy to understand. It seems like a good time for the hotel industry to update its model, so let's get your team onboard The Tricycle of Guest Service. When you think about it, a tricycle is a perfect model for a positive guest experience. For one, it has three wheels, just like the three components of a memorable guest stay. The back wheels repres...

  17. Onboard Decision Making For a New Class of AUV Science

    Science.gov (United States)

    Rajan, K.; McGann, C.; Py, F.; Thomas, H.; Henthorn, R.; McEwen, R.

    2007-12-01

    Autonomous Underwater Vehicles (AUVs) are an increasingly important tool for oceanographic research. They routinely and cost-effectively sample the water column at depths far beyond what humans are capable of visiting. However, control of these platforms has relied on fixed sequences for execution of pre-planned actions, limiting their effectiveness for measuring dynamic and episodic ocean phenomena. At the Monterey Bay Aquarium Research Institute (MBARI), we are developing an advanced Artificial Intelligence (AI) based control system to enable our AUVs to dynamically adapt to the environment by deliberating in-situ about mission plans while tracking onboard resource consumption, dealing with plan failures by allowing dynamic re-planning, and being cognizant of vehicle health and safety in the course of executing science plans. Existing behavior-based approaches require an operator to script plans a priori while anticipating where and how the vehicle will transect the water column. While adequate for current needs to do routine pre-defined transects, this has limited flexibility in dealing with opportunistic science needs, is unable to deal with uncertainty in the oceanic environment, and puts undue burden on the mission operators to manage complex interactions between behaviors. Our approach, informed by a decade's worth of experience in intelligent control of NASA spacecraft, uses a constraint-based representation to manage mission goals, react to exogenous or endogenous failure conditions, and respond to sensory feedback by using AI-based search techniques to sort through a space of likely responses and pick one which satisfies the mission goals. The system encapsulates the long-standing notion of a sense-deliberate-act cycle at the heart of a control loop and reflects the goal-oriented nature of control, allowing operators to specify abstract mission goals rather than detailed command sequences. To date we have tested T-REX (the Teleo

  18. Enablers and barriers to physical activity in overweight and obese pregnant women: an analysis informed by the theoretical domains framework and COM-B model.

    Science.gov (United States)

    Flannery, C; McHugh, S; Anaba, A E; Clifford, E; O'Riordan, M; Kenny, L C; McAuliffe, F M; Kearney, P M; Byrne, M

    2018-05-21

    Obesity during pregnancy is associated with increased risk of gestational diabetes mellitus (GDM) and other complications. Physical activity is a modifiable lifestyle factor that may help to prevent these complications but many women reduce their physical activity levels during pregnancy. Interventions targeting physical activity in pregnancy are on-going but few identify the underlying behaviour change mechanisms by which the intervention is expected to work. To enhance intervention effectiveness, recent tools in behavioural science such as the Theoretical Domains Framework (TDF) and COM-B model (capability, opportunity, motivation and behaviour) have been employed to understand behaviours for intervention development. Using these behaviour change methods, this study aimed to identify the enablers and barriers to physical activity in overweight and obese pregnant women. Semi-structured interviews were conducted with a purposive sample of overweight and obese women at different stages of pregnancy attending a public antenatal clinic in a large academic maternity hospital in Cork, Ireland. Interviews were recorded and transcribed into NVivo V.10 software. Data analysis followed the framework approach, drawing on the TDF and the COM-B model. Twenty-one themes were identified and these mapped directly onto the COM-B model of behaviour change and ten of the TDF domains. Having the social opportunity to engage in physical activity was identified as an enabler; pregnant women suggested being active was easier when supported by their partners. Knowledge was a commonly reported barrier with women lacking information on safe activities during pregnancy and describing the information received from their midwife as 'limited'. Having the physical capability and physical opportunity to carry out physical activity were also identified as barriers; experiencing pain, a lack of time, having other children, and working prevented women from being active. A wide range of barriers

  19. High-G Survivability of an Unpotted Onboard Recorder

    Science.gov (United States)

    2017-10-01

    AD-E403 949, Technical Report ARMET-TR-16081: High-G Survivability of an Unpotted Onboard Recorder. ...Arsenal, New Jersey. Only report front matter is preserved in this record: the views, opinions, and/or findings contained in this report are those...documentation. The citation in this report of the names of commercial firms or commercially available products or services does not constitute

  20. Digital tomosynthesis with an on-board kilovoltage imaging device

    International Nuclear Information System (INIS)

    Godfrey, Devon J.; Yin, F.-F.; Oldham, Mark; Yoo, Sua; Willett, Christopher

    2006-01-01

    Purpose: To generate on-board digital tomosynthesis (DTS) and reference DTS images for three-dimensional image-guided radiation therapy (IGRT) as an alternative to conventional portal imaging or on-board cone-beam computed tomography (CBCT). Methods and Materials: Three clinical cases (prostate, head-and-neck, and liver) were selected to illustrate the capabilities of on-board DTS for IGRT. Corresponding reference DTS images were reconstructed from digitally reconstructed radiographs computed from planning CT image sets. The effect of scan angle on DTS slice thickness was examined by computing the mutual information between coincident CBCT and DTS images, as the DTS scan angle was varied from 0° to 165°. A breath-hold DTS acquisition strategy was implemented to remove respiratory motion artifacts. Results: Digital tomosynthesis slices appeared similar to coincident CBCT planes and yielded substantially more anatomic information than either kilovoltage or megavoltage radiographs. Breath-hold DTS acquisition improved soft-tissue visibility by suppressing respiratory motion. Conclusions: Improved bony and soft-tissue visibility in DTS images is likely to improve target localization compared with radiographic verification techniques and might allow for daily localization of a soft-tissue target. Breath-hold DTS is a potential alternative to on-board CBCT for sites prone to respiratory motion
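
    The mutual-information comparison used above to gauge DTS slice thickness can be estimated from a joint grey-level histogram of two co-registered slices. The sketch below uses synthetic arrays in place of the CBCT and DTS planes; it is a generic estimator, not the authors' analysis code.

```python
# Histogram-based mutual information between two co-registered image slices
# (synthetic stand-ins for a CBCT plane and a DTS slice).
import numpy as np


def mutual_information(img_a, img_b, bins=64):
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))


rng = np.random.default_rng(2)
cbct = rng.normal(size=(128, 128))
dts = 0.7 * cbct + 0.3 * rng.normal(size=(128, 128))   # partially correlated slice
print(f"MI = {mutual_information(cbct, dts):.3f} nats")
```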

  1. On-Board Mining in the Sensor Web

    Science.gov (United States)

    Tanner, S.; Conover, H.; Graves, S.; Ramachandran, R.; Rushing, J.

    2004-12-01

    On-board data mining can contribute to many research and engineering applications, including natural hazard detection and prediction, intelligent sensor control, and the generation of customized data products for direct distribution to users. The ability to mine sensor data in real time can also be a critical component of autonomous operations, supporting deep space missions, unmanned aerial and ground-based vehicles (UAVs, UGVs), and a wide range of sensor meshes, webs and grids. On-board processing is expected to play a significant role in the next generation of NASA, Homeland Security, Department of Defense and civilian programs, providing for greater flexibility and versatility in measurements of physical systems. In addition, the use of UAV and UGV systems is increasing in military, emergency response and industrial applications. As research into the autonomy of these vehicles progresses, especially in fleet or web configurations, the applicability of on-board data mining is expected to increase significantly. Data mining in real time on board sensor platforms presents unique challenges. Most notably, the data to be mined is a continuous stream, rather than a fixed store such as a database. This means that the data mining algorithms must be modified to make only a single pass through the data. In addition, the on-board environment requires real time processing with limited computing resources, thus the algorithms must use fixed and relatively small amounts of processing time and memory. The University of Alabama in Huntsville is developing an innovative processing framework for the on-board data and information environment. The Environment for On-Board Processing (EVE) and the Adaptive On-board Data Processing (AODP) projects serve as proofs-of-concept of advanced information systems for remote sensing platforms. The EVE real-time processing infrastructure will upload, schedule and control the execution of processing plans on board remote sensors. These plans
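
    The single-pass, fixed-memory constraint described above is exactly the setting of streaming statistics. As a hedged illustration (not EVE or AODP code), the sketch below keeps a running mean and variance with Welford's update and flags samples that deviate strongly from the stream seen so far; threshold and data are invented.

```python
# Single-pass anomaly screen with fixed memory (Welford running statistics).
class StreamingDetector:
    def __init__(self, threshold_sigmas=4.0, warmup=10):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0
        self.k = threshold_sigmas
        self.warmup = warmup

    def update(self, x):
        """Consume one sample; return True if it looks anomalous."""
        anomalous = False
        if self.n > self.warmup:
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) > self.k * std
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous


det = StreamingDetector()
stream = [0.1, 0.2, 0.15, 0.12, 0.18, 0.14, 0.16, 0.13, 0.17, 0.15, 0.16, 5.0]
print([det.update(x) for x in stream])   # only the last sample is flagged
```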

  2. Enablers and barriers to physical activity in overweight and obese pregnant women: an analysis informed by the theoretical domains framework and COM-B model.

    LENUS (Irish Health Repository)

    Flannery, C

    2018-05-21

    Obesity during pregnancy is associated with increased risk of gestational diabetes mellitus (GDM) and other complications. Physical activity is a modifiable lifestyle factor that may help to prevent these complications but many women reduce their physical activity levels during pregnancy. Interventions targeting physical activity in pregnancy are on-going but few identify the underlying behaviour change mechanisms by which the intervention is expected to work. To enhance intervention effectiveness, recent tools in behavioural science such as the Theoretical Domains Framework (TDF) and COM-B model (capability, opportunity, motivation and behaviour) have been employed to understand behaviours for intervention development. Using these behaviour change methods, this study aimed to identify the enablers and barriers to physical activity in overweight and obese pregnant women.

  3. The Ionospheric Bubble Index deduced from magnetic field and plasma observations onboard Swarm

    DEFF Research Database (Denmark)

    Park, Jaeheung; Noja, Max; Stolle, Claudia

    2013-01-01

    This product, called L2-IBI, is generated from magnetic field and plasma observations onboard Swarm, and gives information as to whether a Swarm magnetic field observation is affected by EPBs. We validate the performance of the L2-IBI product by using magnetic field and plasma measurements from the CHAMP... satellite, which provided observations similar to those of Swarm. The L2-IBI product is of interest not only for ionospheric studies, but also for geomagnetic field modeling; modelers can de-select magnetic data which are affected by EPBs or other unphysical artifacts.

  4. Using remotely piloted aircraft and onboard processing to optimize and expand data collection

    Science.gov (United States)

    Fladeland, M. M.; Sullivan, D. V.; Chirayath, V.; Instrella, R.; Phelps, G. A.

    2016-12-01

    Remotely piloted aircraft (RPA) have the potential to revolutionize local to regional data collection for geophysicists as platform and payload size decrease while aircraft capabilities increase. In particular, data from RPAs combine high-resolution imagery available from low flight elevations with comprehensive areal coverage, unattainable from ground investigations and difficult to acquire from manned aircraft due to budgetary and logistical costs. Low flight elevations are particularly important for detecting signals that decay exponentially with distance, such as electromagnetic fields. Onboard data processing coupled with high-bandwidth telemetry open up opportunities for real-time and near real-time data processing, producing more efficient flight plans through the use of payload-directed flight, machine learning and autonomous systems. Such applications not only strive to enhance data collection, but also enable novel sensing modalities and temporal resolution. NASA's Airborne Science Program has been refining the capabilities and applications of RPA in support of satellite calibration and data product validation for several decades. In this paper, we describe current platforms, payloads, and onboard data systems available to the research community. Case studies include Fluid Lensing for littoral zone 3D mapping, structure from motion for terrestrial 3D multispectral imaging, and airborne magnetometry on medium and small RPAs.

  5. Design Space Toolbox V2: Automated Software Enabling a Novel Phenotype-Centric Modeling Strategy for Natural and Synthetic Biological Systems

    Directory of Open Access Journals (Sweden)

    Jason Gunther Lomnitz

    2016-07-01

    Full Text Available Mathematical models of biochemical systems provide a means to elucidate the link between the genotype, environment and phenotype. A subclass of mathematical models, known as mechanistic models, quantitatively describe the complex non-linear mechanisms that capture the intricate interactions between biochemical components. However, the study of mechanistic models is challenging because most are analytically intractable and involve large numbers of system parameters. Conventional methods to analyze them rely on local analyses about a nominal parameter set and they do not reveal the vast majority of potential phenotypes possible for a given system design. We have recently developed a new modeling approach that does not require estimated values for the parameters initially and inverts the typical steps of the conventional modeling strategy. Instead, this approach relies on architectural features of the model to identify the phenotypic repertoire and then predict values for the parameters that yield specific instances of the system that realize desired phenotypic characteristics. Here, we present a collection of software tools, the Design Space Toolbox V2 based on the System Design Space method, that automates (1) enumeration of the repertoire of model phenotypes, (2) prediction of values for the parameters for any model phenotype and (3) analysis of model phenotypes through analytical and numerical methods. The result is an enabling technology that facilitates this radically new, phenotype-centric, modeling approach. We illustrate the power of these new tools by applying them to a synthetic gene circuit that can exhibit multi-stability. We then predict values for the system parameters such that the design exhibits 2, 3 and 4 stable steady states. In one example, inspection of the basins of attraction reveals that the circuit can count between 3 stable states by transient stimulation through one of two input channels: a positive channel that increases
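
    The phenotype-counting question at the heart of this approach, namely how many stable steady states a design admits for a given parameter set, can be illustrated numerically on a one-variable gene circuit with positive autoregulation. The sketch below is not the Design Space Toolbox; the rate law and parameter values are assumptions chosen so that one parameter set is bistable and another monostable.

```python
# Count stable steady states of a toy autoregulatory circuit
# dx/dt = a + b*x^n/(K^n + x^n) - g*x by locating downward zero crossings.
import numpy as np


def dxdt(x, a=0.1, b=4.0, K=1.0, n=4, g=1.0):
    return a + b * x**n / (K**n + x**n) - g * x


def stable_steady_states(**pars):
    xs = np.linspace(0.0, 10.0, 20001)
    f = dxdt(xs, **pars)
    stable = []
    for i in range(len(xs) - 1):
        if f[i] > 0.0 and f[i + 1] <= 0.0:      # sign change + -> - means stable
            stable.append(0.5 * (xs[i] + xs[i + 1]))
    return stable


print(stable_steady_states())        # default parameters: bistable, two states
print(stable_steady_states(b=0.5))   # weaker feedback: monostable, one state
```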

  6. Design Space Toolbox V2: Automated Software Enabling a Novel Phenotype-Centric Modeling Strategy for Natural and Synthetic Biological Systems

    Science.gov (United States)

    Lomnitz, Jason G.; Savageau, Michael A.

    2016-01-01

    Mathematical models of biochemical systems provide a means to elucidate the link between the genotype, environment, and phenotype. A subclass of mathematical models, known as mechanistic models, quantitatively describe the complex non-linear mechanisms that capture the intricate interactions between biochemical components. However, the study of mechanistic models is challenging because most are analytically intractable and involve large numbers of system parameters. Conventional methods to analyze them rely on local analyses about a nominal parameter set and they do not reveal the vast majority of potential phenotypes possible for a given system design. We have recently developed a new modeling approach that does not require estimated values for the parameters initially and inverts the typical steps of the conventional modeling strategy. Instead, this approach relies on architectural features of the model to identify the phenotypic repertoire and then predict values for the parameters that yield specific instances of the system that realize desired phenotypic characteristics. Here, we present a collection of software tools, the Design Space Toolbox V2 based on the System Design Space method, that automates (1) enumeration of the repertoire of model phenotypes, (2) prediction of values for the parameters for any model phenotype, and (3) analysis of model phenotypes through analytical and numerical methods. The result is an enabling technology that facilitates this radically new, phenotype-centric, modeling approach. We illustrate the power of these new tools by applying them to a synthetic gene circuit that can exhibit multi-stability. We then predict values for the system parameters such that the design exhibits 2, 3, and 4 stable steady states. In one example, inspection of the basins of attraction reveals that the circuit can count between three stable states by transient stimulation through one of two input channels: a positive channel that increases the count

  7. Design Space Toolbox V2: Automated Software Enabling a Novel Phenotype-Centric Modeling Strategy for Natural and Synthetic Biological Systems.

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    2016-01-01

    Mathematical models of biochemical systems provide a means to elucidate the link between the genotype, environment, and phenotype. A subclass of mathematical models, known as mechanistic models, quantitatively describe the complex non-linear mechanisms that capture the intricate interactions between biochemical components. However, the study of mechanistic models is challenging because most are analytically intractable and involve large numbers of system parameters. Conventional methods to analyze them rely on local analyses about a nominal parameter set and they do not reveal the vast majority of potential phenotypes possible for a given system design. We have recently developed a new modeling approach that does not require estimated values for the parameters initially and inverts the typical steps of the conventional modeling strategy. Instead, this approach relies on architectural features of the model to identify the phenotypic repertoire and then predict values for the parameters that yield specific instances of the system that realize desired phenotypic characteristics. Here, we present a collection of software tools, the Design Space Toolbox V2 based on the System Design Space method, that automates (1) enumeration of the repertoire of model phenotypes, (2) prediction of values for the parameters for any model phenotype, and (3) analysis of model phenotypes through analytical and numerical methods. The result is an enabling technology that facilitates this radically new, phenotype-centric, modeling approach. We illustrate the power of these new tools by applying them to a synthetic gene circuit that can exhibit multi-stability. We then predict values for the system parameters such that the design exhibits 2, 3, and 4 stable steady states. In one example, inspection of the basins of attraction reveals that the circuit can count between three stable states by transient stimulation through one of two input channels: a positive channel that increases the count

  8. YOLO Object Detector for Onboard Driving Images

    OpenAIRE

    Soto i Serrano, Albert

    2017-01-01

    With the evolution of artificial intelligence and, especially, machine learning, tech and car manufacturing companies are researching the car of the future. Along with the arrival of new powerful hardware, deep learning is expected to be one of the most outstanding fields in the automotive sector. In this paper, we will be developing an object detection system with neural networks using the You Only Look Once (YOLO) network architecture. We will train and evaluate the model using various da...
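
    The abstract above is descriptive, but one small piece of every YOLO-style pipeline can be shown concretely: scoring box overlap with intersection-over-union (IoU) and pruning duplicate detections with greedy non-maximum suppression. The boxes and scores below are made up for illustration; this is not code from the thesis.

```python
# IoU and greedy non-maximum suppression over candidate detection boxes.
def iou(a, b):
    """Boxes as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)


def nms(boxes, scores, iou_thresh=0.5):
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_thresh]
    return keep


boxes = [(10, 10, 60, 60), (12, 12, 58, 62), (100, 100, 150, 160)]
scores = [0.9, 0.75, 0.8]
print(nms(boxes, scores))   # -> [0, 2]: the overlapping duplicate is suppressed
```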

  9. Enabling immersive simulation.

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Josh (University of California Santa Cruz, Santa Cruz, CA); Mateas, Michael (University of California Santa Cruz, Santa Cruz, CA); Hart, Derek H.; Whetzel, Jonathan; Basilico, Justin Derrick; Glickman, Matthew R.; Abbott, Robert G.

    2009-02-01

    The object of the 'Enabling Immersive Simulation for Complex Systems Analysis and Training' LDRD has been to research, design, and engineer a capability to develop simulations which (1) provide a rich, immersive interface for participation by real humans (exploiting existing high-performance game-engine technology wherever possible), and (2) can leverage Sandia's substantial investment in high-fidelity physical and cognitive models implemented in the Umbra simulation framework. We report here on these efforts. First, we describe the integration of Sandia's Umbra modular simulation framework with the open-source Delta3D game engine. Next, we report on Umbra's integration with Sandia's Cognitive Foundry, specifically to provide for learning behaviors for 'virtual teammates' directly from observed human behavior. Finally, we describe the integration of Delta3D with the ABL behavior engine, and report on research into establishing the theoretical framework that will be required to make use of tools like ABL to scale up to increasingly rich and realistic virtual characters.

  10. Onboard software of Plasma Wave Experiment aboard Arase: instrument management and signal processing of Waveform Capture/Onboard Frequency Analyzer

    Science.gov (United States)

    Matsuda, Shoya; Kasahara, Yoshiya; Kojima, Hirotsugu; Kasaba, Yasumasa; Yagitani, Satoshi; Ozaki, Mitsunori; Imachi, Tomohiko; Ishisaka, Keigo; Kumamoto, Atsushi; Tsuchiya, Fuminori; Ota, Mamoru; Kurita, Satoshi; Miyoshi, Yoshizumi; Hikishima, Mitsuru; Matsuoka, Ayako; Shinohara, Iku

    2018-05-01

    We developed the onboard processing software for the Plasma Wave Experiment (PWE) onboard the Exploration of energization and Radiation in Geospace, Arase satellite. The PWE instrument has three receivers: Electric Field Detector, Waveform Capture/Onboard Frequency Analyzer (WFC/OFA), and the High-Frequency Analyzer. We designed a pseudo-parallel processing scheme with a time-sharing system and achieved simultaneous signal processing for each receiver. Since electric and magnetic field signals are processed by the different CPUs, we developed a synchronized observation system by using shared packets on the mission network. The OFA continuously measures the power spectra, spectral matrices, and complex spectra. The OFA obtains not only the entire ELF/VLF plasma waves' activity but also the detailed properties (e.g., propagation direction and polarization) of the observed plasma waves. We performed simultaneous observation of electric and magnetic field data and successfully obtained clear wave properties of whistler-mode chorus waves using these data. In order to measure raw waveforms, we developed two modes for the WFC, `chorus burst mode' (65,536 samples/s) and `EMIC burst mode' (1024 samples/s), for the purpose of the measurement of the whistler-mode chorus waves (typically in a frequency range from several hundred Hz to several kHz) and the EMIC waves (typically in a frequency range from a few Hz to several hundred Hz), respectively. We successfully obtained the waveforms of electric and magnetic fields of whistler-mode chorus waves and ion cyclotron mode waves along the Arase's orbit. We also designed the software-type wave-particle interaction analyzer mode. In this mode, we measure electric and magnetic field waveforms continuously and transfer them to the mission data recorder onboard the Arase satellite. We also installed an onboard signal calibration function (onboard SoftWare CALibration; SWCAL). We performed onboard electric circuit diagnostics and
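
    As a simplified illustration of the OFA-style spectral products mentioned above (power spectra computed onboard from raw waveforms), the sketch below runs a windowed FFT periodogram over a synthetic 2 kHz test tone sampled at the chorus-burst rate quoted in the abstract. It is a generic example, not the PWE flight software.

```python
# Windowed FFT periodogram of a synthetic waveform at the chorus burst rate.
import numpy as np

fs = 65536                                   # samples per second (chorus burst mode)
t = np.arange(fs // 8) / fs                  # 125 ms of waveform
wave = np.sin(2 * np.pi * 2000.0 * t) + 0.1 * np.random.default_rng(3).normal(size=t.size)

window = np.hanning(t.size)
spectrum = np.fft.rfft(wave * window)
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
power = (np.abs(spectrum) ** 2) / (fs * np.sum(window ** 2))   # one periodogram segment

print(f"spectral peak at {freqs[np.argmax(power)]:.0f} Hz")    # ~2 kHz test tone
```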

  11. PROCESS OF CHANGES OF MAINTENANCE-FREE ONBOARD SYSTEM OPERATIONAL STATUS BETWEEN SCHEDULED MAINTENANCES

    Directory of Open Access Journals (Sweden)

    Andrey Mikhaylovich Bronnikov

    2017-01-01

    Full Text Available In this article the authors consider the problem of simulating the process of operational status changes in a maintenance-free onboard aircraft system between scheduled maintenances, where a failure during flight leads to a catastrophe. On-board equipment with automatic self-repair between routine maintenances in the event that components fail is called maintenance-free. During operation, such onboard equipment accumulates failures while maintaining its functions at a safety level not lower than the required minimum. Troubleshooting is carried out either at the end of the between-maintenance period (as a rule), or after a failure which has disrupted the equipment's functions or reduced flight safety below the target level (as an exception). The system contains both redundant and non-redundant units and elements with known failure rates. The system can be in one of three states: operable, extreme, or failed. The excess redundant elements allow the system to accumulate failures, which are repaired during routine maintenance. The process of system operational status changes is described with a discrete-continuous model in flight time. Based on the probabilities of the on-board equipment being in an operable, extreme or failed state, it is possible to calculate such complex efficiency indicators as the average loss of sorties, the average operating costs, the expected number of emergency recovery operations and others. Numerical studies have been conducted to validate the proposed model. It is assumed that maintenance work completely restores the system. The analysis of these indicators allows the efficiency of maintenance-free operation of aircraft equipment to be evaluated and compared with other methods of technical operation. The model can also be used to assess the performance of technical operation systems and to optimize the period between maintenances.
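
    A discrete-time reading of the three-state model described above can be sketched by propagating the probabilities of the operable, extreme and failed states flight by flight across the between-maintenance period. The transition probabilities below are invented for illustration and are not taken from the article.

```python
# Flight-by-flight propagation of state probabilities for a three-state
# (operable / extreme / failed) onboard system between maintenances.
import numpy as np

P = np.array([[0.990, 0.008, 0.002],    # operable -> operable / extreme / failed
              [0.000, 0.950, 0.050],    # extreme  -> extreme / failed
              [0.000, 0.000, 1.000]])   # failed is absorbing until maintenance

p = np.array([1.0, 0.0, 0.0])           # fully operable right after maintenance
for flight in range(50):                # 50 flights in the between-maintenance period
    p = p @ P

print(f"after 50 flights: P(operable)={p[0]:.3f}, P(extreme)={p[1]:.3f}, P(failed)={p[2]:.3f}")
```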

  12. Combining the Generic Entity-Attribute-Value Model and Terminological Models into a Common Ontology to Enable Data Integration and Decision Support.

    Science.gov (United States)

    Bouaud, Jacques; Guézennec, Gilles; Séroussi, Brigitte

    2018-01-01

    The integration of clinical information models and termino-ontological models into a unique ontological framework is highly desirable for it facilitates data integration and management using the same formal mechanisms for both data concepts and information model components. This is particularly true for knowledge-based decision support tools that aim to take advantage of all facets of semantic web technologies in merging ontological reasoning, concept classification, and rule-based inferences. We present an ontology template that combines generic data model components with (parts of) existing termino-ontological resources. The approach is developed for the guideline-based decision support module on breast cancer management within the DESIREE European project. The approach is based on the entity attribute value model and could be extended to other domains.
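
    For readers unfamiliar with the entity-attribute-value pattern the ontology template builds on, the sketch below stores clinical facts as (entity, attribute, value) triples and pivots them into per-patient records. The attribute names and codes are illustrative placeholders, not the DESIREE ontology.

```python
# Minimal entity-attribute-value (EAV) representation and a pivot into
# one dictionary per entity (attribute names are illustrative only).
from collections import defaultdict

triples = [
    ("patient_001", "condition_code", "breast_carcinoma"),
    ("patient_001", "tumour_size_mm", 23),
    ("patient_001", "receptor_status:ER", "positive"),
]


def eav_view(rows):
    """Pivot EAV triples into one dictionary per entity."""
    entities = defaultdict(dict)
    for entity, attribute, value in rows:
        entities[entity][attribute] = value
    return dict(entities)


print(eav_view(triples))
```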

  13. Logistic Model to Support Service Modularity for the Promotion of Reusability in a Web Objects-Enabled IoT Environment.

    Science.gov (United States)

    Kibria, Muhammad Golam; Ali, Sajjad; Jarwar, Muhammad Aslam; Kumar, Sunil; Chong, Ilyoung

    2017-09-22

    Due to a very large number of connected virtual objects in the surrounding environment, intelligent service features in the Internet of Things require the reuse of existing virtual objects and composite virtual objects. If a new virtual object were created for each new service request, then the number of virtual objects would increase exponentially. The Web of Objects applies the principle of service modularity in terms of virtual objects and composite virtual objects. Service modularity is a key concept in the Web Objects-Enabled Internet of Things (IoT) environment which allows for the reuse of existing virtual objects and composite virtual objects in heterogeneous ontologies. In the case of similar service requests occurring at the same or different locations, the already-instantiated virtual objects and their composites that exist in the same or different ontologies can be reused. In this case, similar types of virtual objects and composite virtual objects are searched and matched. Their reuse avoids duplication under similar circumstances, and reduces the time it takes to search and instantiate them from their repositories, where similar functionalities are provided by similar types of virtual objects and their composites. Controlling and maintaining a virtual object means controlling and maintaining a real-world object in the real world. Even though the functional costs of virtual objects are just a fraction of those for deploying and maintaining real-world objects, this article focuses on reusing virtual objects and composite virtual objects, as well as discussing similarity matching of virtual objects and composite virtual objects. This article proposes a logistic model that supports service modularity for the promotion of reusability in the Web Objects-enabled IoT environment. Necessary functional components and a flowchart of an algorithm for reusing composite virtual objects are discussed. Also, to realize the service modularity, a use case scenario is

  14. A systematic evaluation of solubility enhancing excipients to enable the generation of permeability data for poorly soluble compounds in Caco-2 model.

    Science.gov (United States)

    Shah, Devang; Paruchury, Sundeep; Matta, Muralikrishna; Chowan, Gajendra; Subramanian, Murali; Saxena, Ajay; Soars, Matthew G; Herbst, John; Haskell, Roy; Marathe, Punit; Mandlekar, Sandhya

    2014-01-01

    The study presented here identified and utilized a panel of solubility enhancing excipients to enable the generation of flux data in the human colon carcinoma (Caco-2) system for compounds with poor solubility. The solubility enhancing excipients dimethyl acetamide (DMA) 1% v/v, polyethylene glycol (PEG) 400 1% v/v, povidone 1% w/v, poloxamer 188 2.5% w/v and bovine serum albumin (BSA) 4% w/v did not compromise Caco-2 monolayer integrity as assessed by trans-epithelial resistance measurement (TEER) and Lucifer yellow (LY) permeation. Further, these excipients did not affect P-glycoprotein (P-gp) mediated bidirectional transport of digoxin or the permeabilities of high (propranolol) or low permeability (atenolol) compounds, and were found to be inert to breast cancer resistance protein (BCRP) mediated transport of cladribine. This approach was validated further using the poorly soluble tool compounds atazanavir (poloxamer 188 2.5% w/v) and cyclosporine A (BSA 4% w/v) and also applied to the new chemical entity (NCE) BMS-A in BSA 4% w/v, for which Caco-2 data could not be generated using the traditional methodology due to poor solubility. Poloxamer 188 2.5% w/v increased the solubility of atazanavir by >8 fold, whereas BSA 4% w/v increased the solubility of cyclosporine A and BMS-A by >2-4 fold, thereby enabling permeability as well as efflux liability estimation in the Caco-2 model with reasonable recovery values. To conclude, the addition of excipients such as poloxamer 188 2.5% w/v and BSA 4% w/v to HBSS leads to a significant improvement in the solubility of poorly soluble compounds, resulting in enhanced recoveries without modulating transporter-mediated efflux and expanding the applicability of Caco-2 assays to poorly soluble compounds.
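
    The flux data referred to above are normally reported as an apparent permeability, Papp = (dQ/dt)/(A·C0), together with an efflux ratio Papp(B to A)/Papp(A to B). The worked example below uses made-up numbers purely to show the unit handling; it is not data from the study.

```python
# Apparent permeability and efflux ratio from a Caco-2 flux experiment
# (illustrative numbers only).
def papp(dq_dt_nmol_per_s, area_cm2, c0_mM):
    """Return Papp in cm/s; 1 mM equals 1000 nmol/cm^3."""
    return dq_dt_nmol_per_s / (area_cm2 * c0_mM * 1000.0)


a_to_b = papp(dq_dt_nmol_per_s=2.0e-3, area_cm2=1.12, c0_mM=10.0)
b_to_a = papp(dq_dt_nmol_per_s=1.2e-2, area_cm2=1.12, c0_mM=10.0)
print(f"Papp A->B = {a_to_b:.2e} cm/s, efflux ratio = {b_to_a / a_to_b:.1f}")
```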

  15. Organising to Enable Innovation

    DEFF Research Database (Denmark)

    Brink, Tove

    2016-01-01

    The purpose of this conceptual paper is to reveal how organising can enable innovation across organisational layers and organisational units. This approach calls for a cross-disciplinary literature review. The aim is to provide an integrated understanding of innovation in an organisational approach....... The findings reveal a continuous organising process between individual/team creativity and organisational structures/control to enable innovation at firm level. Organising provides a dynamic approach and contains the integrated reconstruction of creativity, structures and boundaries for enhanced balance...... of explorative and exploitative learning in uncertain environments. Shedding light on the cross-disciplinary theories to organise innovation provides a contribution at the firm level to enable innovation....

  16. Proton exchange membrane fuel cells for electrical power generation on-board commercial airplanes.

    Energy Technology Data Exchange (ETDEWEB)

    Curgus, Dita Brigitte; Munoz-Ramos, Karina (Sandia National Laboratories, Albuquerque, NM); Pratt, Joseph William; Akhil, Abbas Ali (Sandia National Laboratories, Albuquerque, NM); Klebanoff, Leonard E.; Schenkman, Benjamin L. (Sandia National Laboratories, Albuquerque, NM)

    2011-05-01

    Deployed on a commercial airplane, proton exchange membrane fuel cells may offer emissions reductions, thermal efficiency gains, and enable locating the power near the point of use. This work seeks to understand whether on-board fuel cell systems are technically feasible, and, if so, if they offer a performance advantage for the airplane as a whole. Through hardware analysis and thermodynamic and electrical simulation, we found that while adding a fuel cell system using today's technology for the PEM fuel cell and hydrogen storage is technically feasible, it will not likely give the airplane a performance benefit. However, when we re-did the analysis using DOE-target technology for the PEM fuel cell and hydrogen storage, we found that the fuel cell system would provide a performance benefit to the airplane (i.e., it can save the airplane some fuel), depending on the way it is configured.

  17. Proton Exchange Membrane Fuel Cells for Electrical Power Generation On-Board Commercial Airplanes

    Energy Technology Data Exchange (ETDEWEB)

    Pratt, Joesph W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klebanoff, Leonard E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Munoz-Ramos, Karina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Akhil, Abbas A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Curgus, Dita B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schenkman, Benjamin L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2011-05-01

    Deployed on a commercial airplane, proton exchange membrane fuel cells may offer emissions reductions, thermal efficiency gains, and enable locating the power near the point of use. This work seeks to understand whether on-board fuel cell systems are technically feasible, and, if so, if they offer a performance advantage for the airplane as a whole. Through hardware analysis and thermodynamic and electrical simulation, we found that while adding a fuel cell system using today’s technology for the PEM fuel cell and hydrogen storage is technically feasible, it will not likely give the airplane a performance benefit. However, when we re-did the analysis using DOE-target technology for the PEM fuel cell and hydrogen storage, we found that the fuel cell system would provide a performance benefit to the airplane (i.e., it can save the airplane some fuel), depending on the way it is configured.

  18. Identifying Onboarding Heuristics for Free-to-Play Mobile Games: A Mixed Methods Approach

    DEFF Research Database (Denmark)

    Thomsen, Line Ebdrup; Weigert Petersen, Falko; Mirza-Babaei, Pejman

    2016-01-01

    The onboarding phase of Free-to-Play mobile games, covering the first few minutes of play, typically sees a substantial retention rate amongst players. It is therefore crucial to the success of these games that the onboarding phase promotes engagement to the widest degree possible. In this paper ...... of puzzle games, base builders and arcade games, and utilize different onboarding phase design approaches. Results showcase how heuristics can be used to design engaging onboarding phases in mobile games....

  19. Model My Watershed - A Robust Online App to Enable Citizen Scientists to Model Watershed Hydrology and Water Quality at Regulatory-Level Standards

    Science.gov (United States)

    Daniels, M.; Kerlin, S.; Arscott, D.

    2017-12-01

    Citizen-based watershed monitoring has historically lacked scientific rigor and geographic scope due to limitations in access to watershed-level data and the high-level skills and resources required to adequately model watershed dynamics. Public access to watershed information is currently routed through a variety of governmental data portals and often requires advanced geospatial skills to collect and present in usable forms. At the same time, tremendous financial resources are being invested in watershed restoration and management efforts, and often these resources pass through local stakeholder groups such as conservation NGOs, watershed interest groups, and local municipalities without extensive hydrologic knowledge or access to sophisticated modeling resources. Even governmental agencies struggle to understand how to best steer or prioritize restoration investments. A new app, Model My Watershed, was built to improve access to watershed data and modeling capabilities in a fast, accessible, free web-app format. Working across the contiguous United States, the Model My Watershed app provides land cover, soils, aerial imagery and relief, watershed delineation, and stream network delineation. Users can model watersheds or areas of interest and create management scenarios to evaluate the implementation of land cover changes and best management practices, with both hydrologic and water quality outputs that meet TMDL regulatory standards.

  20. The EGSE science software of the IBIS instrument on-board INTEGRAL satellite

    International Nuclear Information System (INIS)

    La Rosa, Giovanni; Fazio, Giacomo; Segreto, Alberto; Gianotti, Fulvio; Stephen, John; Trifoglio, Massimo

    2000-01-01

    IBIS (Imager on Board INTEGRAL Satellite) is one of the key instruments on-board the INTEGRAL satellite, the follow-up mission to the high-energy missions CGRO and Granat. The EGSE of IBIS is composed of a Satellite Interface Simulator, a Control Station and a Science Station. The solutions adopted for the architectural design of the software running on the Science Station are described here. Some preliminary results are used to show the science functionality, which made it possible to understand the instrument behavior throughout the test and calibration campaigns of the Engineering Model of IBIS

  1. Information processing requirements for on-board monitoring of automatic landing

    Science.gov (United States)

    Sorensen, J. A.; Karmarkar, J. S.

    1977-01-01

    A systematic procedure is presented for determining the information processing requirements for on-board monitoring of automatic landing systems. The monitoring system detects landing anomalies through use of appropriate statistical tests. The time-to-correct aircraft perturbations is determined from covariance analyses using a sequence of suitable aircraft/autoland/pilot models. The covariance results are used to establish landing safety and a fault recovery operating envelope via an event outcome tree. This procedure is demonstrated with examples using the NASA Terminal Configured Vehicle (B-737 aircraft). The procedure can also be used to define decision height, assess monitoring implementation requirements, and evaluate alternate autoland configurations.
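
    One simple instance of the "appropriate statistical tests" such a monitor could apply is a chi-square test on normalised residuals (innovations) between the expected and measured approach state. The sketch below is a generic illustration with synthetic numbers; it is not the monitoring scheme of the report.

```python
# Chi-square screen on normalised two-channel approach residuals.
import math
import numpy as np

rng = np.random.default_rng(4)
sigma = np.array([0.5, 0.3])               # assumed std of glideslope/localizer errors
residuals = rng.normal(0.0, sigma, size=(20, 2))
residuals[-1] = [2.5, 1.5]                 # inject a late-approach anomaly

q = np.sum((residuals / sigma) ** 2, axis=1)         # chi-square statistic, 2 DOF
threshold = -2.0 * math.log(1.0 - 0.999)              # 99.9 % point of chi-square(2)
print("anomalous samples:", np.nonzero(q > threshold)[0])
```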

  2. Semantic Information Extraction of Lanes Based on Onboard Camera Videos

    Science.gov (United States)

    Tang, L.; Deng, T.; Ren, C.

    2018-04-01

    In the field of autonomous driving, semantic information of lanes is very important. This paper proposes a method of automatic detection of lanes and extraction of semantic information from onboard camera videos. The proposed method firstly detects the edges of lanes by the grayscale gradient direction, and improves the Probabilistic Hough transform to fit them; then, it uses the vanishing point principle to calculate the lane geometrical position, and uses lane characteristics to extract lane semantic information by the classification of decision trees. In the experiment, 216 road video images captured by a camera mounted onboard a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
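
    The first two stages described above, edge detection on the greyscale image and probabilistic Hough line fitting, map directly onto standard OpenCV calls. The sketch below applies them to an arbitrary dash-cam frame; the file names and thresholds are assumptions, and the vanishing-point and decision-tree stages of the paper are not reproduced.

```python
# Edge detection plus probabilistic Hough line fitting on a dash-cam frame.
import cv2
import numpy as np

frame = cv2.imread("dashcam_frame.png")            # placeholder file name
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)

# keep only the lower half of the image, where lane markings usually appear
mask = np.zeros_like(edges)
mask[edges.shape[0] // 2:, :] = 255
edges = cv2.bitwise_and(edges, mask)

lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                        minLineLength=40, maxLineGap=20)
for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
    cv2.line(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)
cv2.imwrite("lanes_overlay.png", frame)
```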

  3. Development of Onboard Computer Complex for Russian Segment of ISS

    Science.gov (United States)

    Branets, V.; Brand, G.; Vlasov, R.; Graf, I.; Clubb, J.; Mikrin, E.; Samitov, R.

    1998-01-01

    The report presents a description of the Onboard Computer Complex (CC) that was developed during the period 1994-1998 for the Russian Segment of ISS. The system was developed in co-operation with NASA and ESA. ESA developed a new computation system under the RSC Energia Technical Assignment, called DMS-R. The CC also includes elements developed by Russian experts and organizations. A general architecture of the computer system and the characteristics of the primary elements of this system are described. The system was integrated at RSC Energia with the participation of American and European specialists. The report contains information on the software simulators, verification and de-bugging facilities which were developed for both stand-alone and integrated tests and verification. This CC serves as the basis for the Russian Segment Onboard Control Complex on ISS.

  4. Experimental study on ceramic membrane technology for onboard oxygen generation

    Directory of Open Access Journals (Sweden)

    Jiang Dongsheng

    2016-08-01

    Full Text Available The ceramic membrane oxygen generation technology has the advantages of high concentration of produced oxygen and potential nuclear and biochemical protection capability. The present paper studies the ceramic membrane technology for onboard oxygen generation. Comparisons are made to assess the effects of two kinds of ceramic membrane separation technologies on oxygen generation, namely electricity driven ceramic membrane separation oxygen generation technology (EDCMSOGT) and pressure driven ceramic membrane separation oxygen generation technology (PDCMSOGT). Experiments were conducted under different temperatures, pressures of feed air and produced oxygen flow rates. On the basis of these experiments, the flow rate of feed air, electric power provided, oxygen recovery rate and concentration of produced oxygen are compared under each working condition. It is concluded that the EDCMSOGT is the oxygen generation means more suitable for onboard conditions.

  5. Development of on-board fuel metering and sensing system

    Science.gov (United States)

    Hemanth, Y.; Manikanta, B. S. S.; Thangaraja, J.; Bharanidaran, R.

    2017-11-01

    Usage of biodiesel fuels and their blends with diesel fuel has the potential to reduce tailpipe emissions and the dependence on crude oil imports. Further, biodiesel fuels exhibit favourable greenhouse gas emission and energy balance characteristics. While fossil fuel technology is well established, the technological implications of biofuels, particularly biodiesel, are not clearly laid out. Hence, the objective is to provide on-board metering control for selecting different proportions of diesel and bio-diesel blends. An on-board fuel metering system is being developed using a PID controller, stepper motors and a capacitance sensor. The accuracy was tested with blends of propanol-1 and diesel and was found to be within 1.3% error. The developed unit was tested in a twin cylinder diesel engine with biodiesel blended diesel fuel. There was a marginal increase (5%) in nitric oxide and a 14% increase in smoke emission with 10% biodiesel blended diesel at part load conditions.

  6. Onboard radiation shielding estimates for interplanetary manned missions

    International Nuclear Information System (INIS)

    Totemeier, A.; Jevremovic, T.; Hounshel, D.

    2004-01-01

    The main focus of space-related shielding design is to protect operating systems, personnel and key structural components from outer space and onboard radiation. This paper summarizes the feasibility of a lightweight neutron radiation shield design for a nuclear powered, manned space vehicle. The Monte Carlo code MCNP5 is used to determine the radiation transport characteristics of the different materials and find the optimized shield configuration. A phantom torso encased in air is used to determine a dose rate for a crew member on the ship. Calculation results indicate that onboard shielding against neutron radiation coming from the nuclear engine can be achieved with very little addition of weight to the space vehicle. The selection of materials and neutron transport analysis as presented in this paper are useful starting data for designing shields against neutrons generated when high-energy particles from outer space interact with matter on the space vehicle. (authors)

  7. Lunar Penetrating Radar onboard the Chang'e-3 mission

    Science.gov (United States)

    Fang, Guang-You; Zhou, Bin; Ji, Yi-Cai; Zhang, Qun-Ying; Shen, Shao-Xiang; Li, Yu-Xi; Guan, Hong-Fei; Tang, Chuan-Jun; Gao, Yun-Ze; Lu, Wei; Ye, Sheng-Bo; Han, Hai-Dong; Zheng, Jin; Wang, Shu-Zhi

    2014-12-01

    Lunar Penetrating Radar (LPR) is one of the important scientific instruments onboard the Chang'e-3 spacecraft. Its scientific goals are the mapping of lunar regolith and detection of subsurface geologic structures. This paper describes the goals of the mission, as well as the basic principles, design, composition and achievements of the LPR. Finally, experiments on a glacier and the lunar surface are analyzed.

  8. Experimental study on ceramic membrane technology for onboard oxygen generation

    OpenAIRE

    Jiang Dongsheng; Bu Xueqin; Sun Bing; Lin Guiping; Zhao Hongtao; Cai Yan; Fang Ling

    2016-01-01

    The ceramic membrane oxygen generation technology has advantages of high concentration of produced oxygen and potential nuclear and biochemical protection capability. The present paper studies the ceramic membrane technology for onboard oxygen generation. Comparisons are made to have knowledge of the effects of two kinds of ceramic membrane separation technologies on oxygen generation, namely electricity driven ceramic membrane separation oxygen generation technology (EDCMSOGT) and pressure d...

  9. MOBS - A modular on-board switching system

    Science.gov (United States)

    Berner, W.; Grassmann, W.; Piontek, M.

    The authors describe a multibeam satellite system that is designed for business services and for communications at a high bit rate. The repeater is regenerative with a modular onboard switching system. It acts not only as baseband switch but also as the central node of the network, performing network control and protocol evaluation. The hardware is based on a modular bus/memory architecture with associated processors.

  10. STS-59 crewmembers in training for onboard Earth observations

    Science.gov (United States)

    1993-01-01

    The six astronauts in training for the STS-59 mission are shown receiving onboard Earth observations tips from Justin Wilkinson (standing, foreground) of the Space Shuttle Earth Observations Project (SSEOP) group. Astronaut Sidney M. Gutierrez, mission commander, is at center on the left side of the table. Others, left to right, are Astronauts Kevin P. Chilton, pilot; Jerome (Jay) Apt and Michael R.U. (Rich) Clifford, both mission specialists; Linda M. Godwin, payload commander; and Thomas D. Jones, mission specialist.

  11. The end-to-end testbed of the optical metrology system on-board LISA Pathfinder

    Energy Technology Data Exchange (ETDEWEB)

    Steier, F; Cervantes, F Guzman; Marin, A F GarcIa; Heinzel, G; Danzmann, K [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut) and Universitaet Hannover (Germany); Gerardi, D, E-mail: frank.steier@aei.mpg.d [EADS Astrium Satellites GmbH, Friedrichshafen (Germany)

    2009-05-07

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on-board LISA Pathfinder is the so-called LISA Technology Package (LTP) which has the aim to measure the differential acceleration between two free-falling test masses with an accuracy of 3 × 10^{-14} m s^{-2} Hz^{-1/2} between 1 mHz and 30 mHz. This measurement is performed interferometrically by the optical metrology system (OMS) on-board LISA Pathfinder. In this paper, we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer backend which is a phasemeter, and the processing of the phasemeter output data. Furthermore, three-axis piezo-actuated mirrors are used instead of the free-falling test masses for the characterization of the dynamic behaviour of the system and some parts of the drag-free and attitude control system (DFACS) which controls the test masses and the satellite. The end-to-end testbed includes all parts of the LTP that can reasonably be tested on earth without free-falling test masses. At its present status it consists mainly of breadboard components. Some of those have already been replaced by engineering models of the LTP experiment. In the next steps, further engineering and flight models will also be inserted in this testbed and tested against well-characterized breadboard components. The presented testbed is an important reference for the unit tests and can also be used for validation of the on-board experiment during the mission.

  12. Radiation dosimetry onboard the International Space Station ISS

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Thomas [German Aerospace Center - DLR, Inst. of Aerospace Medicine, Radiation Biology, Cologne (Germany)

    2008-07-01

    Besides the effects of the microgravity environment, and the psychological and psychosocial problems encountered in confined spaces, radiation is the main health detriment for long duration human space missions. The radiation environment encountered in space differs in nature from that on earth, consisting mostly of highly energetic ions from protons up to iron, resulting in radiation levels far exceeding the ones encountered on earth by occupational radiation workers. Therefore the determination and control of the radiation load on astronauts is a moral obligation of the space faring nations. The requirements for radiation detectors in space are very different from those on earth. Limitations in mass, power consumption and the complex nature of the space radiation environment define and limit the overall construction of radiation detectors. Radiation dosimetry onboard the International Space Station (ISS) is accomplished in part as 'operational' dosimetry, aimed at area monitoring of the radiation environment as well as astronaut surveillance. Another part focuses on 'scientific' dosimetry, aimed at a better understanding of the radiation environment and its constituents. Various research activities for a more detailed quantification of the radiation environment, as well as its distribution in and outside the space station, have been accomplished in the last years onboard the ISS. The paper will focus on the current radiation detectors onboard the ISS, their results, as well as on future planned activities. (orig.)

  13. Radiation dosimetry onboard the International Space Station ISS

    International Nuclear Information System (INIS)

    Berger, Thomas

    2008-01-01

    Besides the effects of the microgravity environment, and the psychological and psychosocial problems encountered in confined spaces, radiation is the main health detriment for long duration human space missions. The radiation environment encountered in space differs in nature from that on earth, consisting mostly of highly energetic ions from protons up to iron, resulting in radiation levels far exceeding the ones encountered on earth by occupational radiation workers. Therefore the determination and control of the radiation load on astronauts is a moral obligation of the space faring nations. The requirements for radiation detectors in space are very different from those on earth. Limitations in mass, power consumption and the complex nature of the space radiation environment define and limit the overall construction of radiation detectors. Radiation dosimetry onboard the International Space Station (ISS) is accomplished in part as 'operational' dosimetry, aimed at area monitoring of the radiation environment as well as astronaut surveillance. Another part focuses on 'scientific' dosimetry, aimed at a better understanding of the radiation environment and its constituents. Various research activities for a more detailed quantification of the radiation environment, as well as its distribution in and outside the space station, have been accomplished in the last years onboard the ISS. The paper will focus on the current radiation detectors onboard the ISS, their results, as well as on future planned activities. (orig.)

  14. Dosimetry and microdosimetry using LET spectrometer based on the track-etch detector: radiotherapy Bremsstrahlung beam, onboard aircraft radiation field

    International Nuclear Information System (INIS)

    Jadrnickova, I.; Spurny, F.

    2006-01-01

    The spectrometer of linear energy transfer (LET) based on the chemically etched poly-allyl-diglycol-carbonate (PADC) track-etch detector was developed several years ago in our institute. This LET spectrometer enables the determination of the LET of particles from approximately 10 to 700 keV/μm. From the LET spectra, dose characteristics can be calculated. The contribution presents the LET spectra and other dosimetric characteristics obtained onboard a commercial aircraft during an exposure lasting more than 6 months and in the 18 MV radiotherapy Bremsstrahlung beam. (authors)
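
    As a reminder of how "dose characteristics" follow from a measured LET spectrum, the relations below sketch the standard formulation commonly used with track-etch LET spectrometers; the abstract does not state the exact expressions the authors applied, so these are illustrative.

```latex
% Standard dose quantities derived from an LET spectrum (illustrative, not the
% authors' exact formulation): \Phi(L) is the particle fluence differential in
% LET, \rho the density of the reference medium (water), Q(L) a quality factor.
D = \frac{1}{\rho}\int_{L_{\min}}^{L_{\max}} L\,\Phi(L)\,\mathrm{d}L ,
\qquad
H = \frac{1}{\rho}\int_{L_{\min}}^{L_{\max}} Q(L)\, L\,\Phi(L)\,\mathrm{d}L
```

    Here the absorbed dose D is the fluence-weighted integral of LET over the measured range, and the dose equivalent H weights it additionally by a quality factor such as the ICRP recommendation.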

  15. Development of the high-order decoupled direct method in three dimensions for particulate matter: enabling advanced sensitivity analysis in air quality models

    Directory of Open Access Journals (Sweden)

    W. Zhang

    2012-03-01

    The high-order decoupled direct method in three dimensions for particulate matter (HDDM-3D/PM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to enable advanced sensitivity analysis. The major effort of this work is to develop high-order DDM sensitivity analysis of ISORROPIA, the inorganic aerosol module of CMAQ. A case-specific approach has been applied, and the sensitivities of activity coefficients and water content are explicitly computed. Stand-alone tests are performed for ISORROPIA by comparing the sensitivities (first- and second-order) computed by HDDM and the brute force (BF) approximations. A similar comparison has also been carried out for CMAQ sensitivities simulated using a week-long winter episode for a continental US domain. Second-order sensitivities of aerosol species (e.g., sulfate, nitrate, and ammonium) with respect to domain-wide SO2, NOx, and NH3 emissions show agreement with BF results, yet exhibit less noise in locations where BF results are demonstrably inaccurate. Second-order sensitivity analysis elucidates poorly understood nonlinear responses of secondary inorganic aerosols to their precursors and competing species. Adding second-order sensitivity terms to the Taylor series projection of the nitrate concentrations with a 50% reduction in domain-wide NOx or SO2 emissions rates improves the prediction with statistical significance.
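
    For readers unfamiliar with the projection mentioned at the end of the abstract, the second-order Taylor expansion it refers to has the generic form below, where the perturbation is a fractional emission scaling (for example -0.5 for a 50% reduction); the notation is illustrative rather than taken from the paper.

```latex
% Second-order Taylor projection of a concentration C under a fractional
% emission perturbation \Delta\varepsilon (e.g. \Delta\varepsilon = -0.5 for a
% 50% reduction). Notation is illustrative, not taken from the paper.
C(\varepsilon + \Delta\varepsilon) \approx C(\varepsilon)
  + S^{(1)}\,\Delta\varepsilon
  + \tfrac{1}{2}\,S^{(2)}\,(\Delta\varepsilon)^{2},
\qquad
S^{(1)} = \frac{\partial C}{\partial \varepsilon},\quad
S^{(2)} = \frac{\partial^{2} C}{\partial \varepsilon^{2}}
```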

  16. The Nordic Housing Enabler

    DEFF Research Database (Denmark)

    Helle, Tina; Slaug, Bjørn; Brandt, Åse

    2010-01-01

    This study addresses development of a content valid cross-Nordic version of the Housing Enabler and investigation of its inter-rater reliability when used in occupational therapy rating situations, involving occupational therapists, clients and their home environments. The instrument was translated from the original Swedish version of the Housing Enabler, and adapted according to accessibility norms and guidelines for housing design in Sweden, Denmark, Finland and Iceland. This iterative process involved occupational therapists, architects, building engineers and professional translators, resulting in the Nordic Housing Enabler. For reliability testing, the sampling strategy and data collection procedures used were the same in all countries. Twenty voluntary occupational therapists, pair-wise but independently of each other, collected data from 106 cases by means of the Nordic Housing Enabler.

  17. Applied Questions of Onboard Laser Radar Equipment Development

    Directory of Open Access Journals (Sweden)

    E. I. Starovoitov

    2015-01-01

    During development of spacecraft laser radar systems (LRS) it is a problem to choose laser sources and photo-detectors, both because of the specifics of their use in onboard equipment and because of the limited number of domestic and foreign manufacturers. Previous publications did not consider in detail the accuracy versus laser pulse repetition frequency, the impact of photo-detector sensitivity and dynamic range on the LRS characteristics, and protection of the photo-detector against overload by the signal power. The objective of this work is to analyze how the range, accuracy, and reliability of an onboard LRS depend on different types of laser sources and photo-detectors, and on the availability of an electromechanical optical attenuator. The paper describes design solutions that are used to compensate for a decreased sensitivity of the photo-detector and the impact of these changes on the LRS characteristics. It is shown that due to its high pulse repetition frequency a fiber laser is the preferred type of laser source in an onboard LRS, which can be used at ranges below 500 m for two purposes: determining the orientation of the passive spacecraft with an accuracy of 0.3 and measuring the range rate during the rendezvous of spacecraft with an accuracy of 0.003...0.006 m/s. The work identifies the attenuation level of the optical attenuator versus measured range. In close proximity to a diffusely reflecting passive spacecraft and a corner reflector this attenuator protects the photo-detector. It is found that the optical attenuator is advisable when a photo-detector based on an avalanche photodiode is used. There is no need for an optical attenuator (if a geometric factor is available) in the case of sounding a corner reflector when a photo-detector based on a PIN photodiode is used. Excluding the electromechanical optical attenuator can increase the reliability function of the LRS from P(t) = 0.9991 to P(t) = 0.9993. The results obtained in this work can be used

  18. Pilot project as enabler?

    DEFF Research Database (Denmark)

    Neisig, Margit; Glimø, Helle; Holm, Catrine Granzow

    This article deals with a systemic perspective on transition. The field of study addressed is a pilot project as an enabler of transition in a highly complex polycentric context. From a Luhmannian systemic approach, a framework is created to understand and address the barriers to change that occur when pilot projects are used as enablers of transition. Aspects of how to create trust and deal with distrust during a transition are addressed. The transition in focus is the concept of New Public Management and how it is applied in the management of the Employment Service in Denmark. The transition regards...

  19. Enabling distributed collaborative science

    DEFF Research Database (Denmark)

    Hudson, T.; Sonnenwald, Diane H.; Maglaughlin, K.

    2000-01-01

    To enable collaboration over distance, a collaborative environment that uses a specialized scientific instrument called a nanoManipulator is evaluated. The nanoManipulator incorporates visualization and force feedback technology to allow scientists to see, feel, and modify biological samples bein...

  20. The Nordic Housing Enabler

    DEFF Research Database (Denmark)

    Helle, T.; Nygren, C.; Slaug, B.

    2014-01-01

    This study addresses development of a content-valid cross-Nordic version of the Housing Enabler and investigation of its inter-rater reliability when used in occupational therapy rating situations, involving occupational therapists, clients, and their home environments. The instrument was translated from the original Swedish version of the Housing Enabler, and adapted according to accessibility norms and guidelines for housing design in Sweden, Denmark, Finland, and Iceland. This iterative process involved occupational therapists, architects, building engineers, and professional translators, resulting in the Nordic Housing Enabler. For reliability testing, the sampling strategy and data collection procedures used were the same in all countries. Twenty voluntary occupational therapists, pair-wise but independently of each other, collected data from 106 cases by means of the Nordic Housing Enabler.

  1. High spatio-temporal resolution pollutant measurements of on-board vehicle emissions using ultra-fast response gas analyzers

    Directory of Open Access Journals (Sweden)

    M. Irwin

    2018-06-01

    Existing ultra-fast response engine exhaust emissions analyzers have been adapted for on-board vehicle use combined with GPS data. We present, for the first time, how high spatio-temporal resolution data products allow transient features associated with internal combustion engines to be examined in detail during on-road driving. Such data are useful both for examining the circumstances leading to high emissions and for revealing the precise position of urban air quality hot spots as deposited by the candidate vehicle, which is useful for source attribution and dispersion modelling. The fast response time of the analyzers, which results in 100 Hz data, makes accurate time-alignment with the vehicle's engine control unit (ECU) signals possible. This enables correlation with transient air-fuel ratio, engine speed, load, and other engine parameters, which helps to explain the causes of the emissions spikes that portable emissions measurement systems (PEMS) and conventional slow response analyzers would miss or smooth out due to mixing within their sampling systems. The data presented are from NO and NOx analyzers, but other fast analyzers (e.g. total hydrocarbons (THC), CO and CO2) can be used similarly. The high levels of NOx pollution associated with accelerating on motorway entry ramps, driving over speed bumps, and accelerating away from traffic lights are explored in detail. The time-aligned ultra-fast analyzers offer unique insight, allowing more accurate quantification and better interpretation of engine and driver activity and the associated emissions impact on local air quality.
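
    The abstract emphasises accurate time-alignment of the 100 Hz analyzer traces with the ECU signals but does not say how the alignment is done. A minimal sketch of one plausible approach, lag estimation by cross-correlation on a common 100 Hz time base, is shown below; the function, window length and sign convention are assumptions, not the authors' procedure.

```python
import numpy as np

def alignment_lag(analyzer, ecu, fs=100.0, max_lag_s=5.0):
    """Return the time shift (s) that maximizes the cross-correlation between
    a fast-analyzer trace (e.g. NOx) and an ECU channel (e.g. engine speed),
    both resampled to the same 100 Hz time base. Illustrative only; the
    authors' exact alignment procedure is not described in the abstract."""
    a = (analyzer - analyzer.mean()) / analyzer.std()
    e = (ecu - ecu.mean()) / ecu.std()
    corr = np.correlate(a, e, mode="full")           # lags from -(len(e)-1) to +(len(a)-1)
    lags = np.arange(-len(e) + 1, len(a))
    window = np.abs(lags) <= int(max_lag_s * fs)     # restrict to plausible lags
    best = lags[window][np.argmax(corr[window])]
    return best / fs                                 # positive => analyzer lags the ECU
```

    The estimated lag would then be applied as a fixed shift before correlating emission spikes with engine load, speed or air-fuel ratio.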

  2. Thermal Imaging Performance of TIR Onboard the Hayabusa2 Spacecraft

    Science.gov (United States)

    Arai, Takehiko; Nakamura, Tomoki; Tanaka, Satoshi; Demura, Hirohide; Ogawa, Yoshiko; Sakatani, Naoya; Horikawa, Yamato; Senshu, Hiroki; Fukuhara, Tetsuya; Okada, Tatsuaki

    2017-07-01

    The thermal infrared imager (TIR) is a thermal infrared camera onboard the Hayabusa2 spacecraft. TIR will perform thermography of a C-type asteroid, 162173 Ryugu (1999 JU3), and estimate its surface physical properties, such as surface thermal emissivity ɛ, surface roughness, and thermal inertia Γ, through remote in-situ observations in 2018 and 2019. In prelaunch tests of TIR, detector calibrations and evaluations, along with imaging demonstrations, were performed. The present paper introduces the experimental results of a prelaunch test conducted using a large-aperture collimator in conjunction with TIR under atmospheric conditions. A blackbody source, controlled at constant temperature, was measured using TIR in order to construct a calibration curve for obtaining temperatures from observed digital data. As a target of known thermal emissivity, a sandblasted black almite plate warmed from the back using a flexible heater was measured by TIR in order to evaluate the accuracy of the calibration curve. As an analog target of a C-type asteroid, carbonaceous chondrites (50 mm × 2 mm in thickness) were also warmed from the back and measured using TIR in order to clarify the imaging performance of TIR. The calibration curve, which was fitted by a specific model of the Planck function, allowed for conversion to the target temperature within an error of 1°C (3σ standard deviation) for the temperature range of 30 to 100°C. The observed temperature of the black almite plate was consistent with the temperature measured using K-type thermocouples, within the accuracy of temperature conversion using the calibration curve, when the temperature variation exhibited a random error of 0.3°C (1σ) for each pixel at a target temperature of 50°C. TIR can resolve the fine surface structure of meteorites, including cracks and pits, with the specified field of view of 0.051° (328 × 248 pixels). There were spatial distributions with a temperature variation of 3°C at the setting
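
    The abstract states only that the calibration curve was fitted "by a specific model of the Planck function". The generic form such a fit usually takes relates the detector's digital output to the band-integrated Planck radiance of the blackbody, as sketched below; the exact model used for TIR is not given in the abstract, so this is illustrative.

```latex
% Generic form of a Planck-function-based calibration: detector counts D as a
% function of blackbody temperature T, with gain G, offset D_0 and system
% spectral response \tau(\lambda). Illustrative; the specific model fitted for
% TIR is not given in the abstract.
D(T) = G \int_{\lambda_1}^{\lambda_2} \tau(\lambda)\,
       \frac{2 h c^{2}}{\lambda^{5}}\,
       \frac{1}{\exp\!\left(h c / \lambda k_{B} T\right) - 1}\,
       \mathrm{d}\lambda \; + \; D_{0}
```

    The fitted curve is then inverted to convert observed digital counts to target temperature, which is where the quoted 1°C (3σ) conversion error applies.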

  3. Science objectives of the magnetic field experiment onboard Aditya-L1 spacecraft

    Science.gov (United States)

    Yadav, Vipin K.; Srivastava, Nandita; Ghosh, S. S.; Srikar, P. T.; Subhalakshmi, Krishnamoorthy

    2018-01-01

    Aditya-L1 is the first Indian solar mission, scheduled to be placed in a halo orbit around the first Lagrangian point (L1) of the Sun-Earth system in the year 2018-19. The approved scientific payloads onboard the Aditya-L1 spacecraft include a Fluxgate Digital Magnetometer (FGM) to measure the local magnetic field, which is necessary to supplement the outcome of other scientific experiments onboard. The in-situ vector magnetic field data at L1 are essential for better understanding of the data provided by the particle and plasma analysis experiments onboard the Aditya-L1 mission. Also, the dynamics of Coronal Mass Ejections (CMEs) can be better understood with the help of in-situ magnetic field data at the L1 point region. These data will also serve as crucial input for short lead-time space weather forecasting models. The proposed FGM is a dual-range magnetic sensor on a 6 m long boom mounted on the Sun-viewing panel deck and configured to deploy along the negative roll direction of the spacecraft. Two sets of sensors (tri-axial each) are proposed to be mounted, one at the tip of the boom (6 m from the spacecraft) and the other midway (3 m from the spacecraft). The main science objective of this experiment is to measure the magnitude and nature of the interplanetary magnetic field (IMF) locally and to study disturbed magnetic conditions and extreme solar events by detecting CMEs from the Sun as transient events. The proposed secondary science objectives are to study the impact of interplanetary structures and shock-solar wind interaction on the geospace environment and to detect low-frequency plasma waves emanating from the solar corona at the L1 point. This will provide a better understanding of how the Sun affects interplanetary space. In this paper, we give the main scientific objectives of the magnetic field experiment and brief technical details of the FGM onboard the Aditya-L1 spacecraft.

  4. On-board measurement of emissions from liquefied petroleum gas, gasoline and diesel powered passenger cars in Algeria

    OpenAIRE

    Chikhi , Saâdane; Boughedaoui , Ménouèr; Kerbachi , Rabah; Joumard , Robert

    2014-01-01

    On-board measurements of unit emissions of CO, HC, NOx and CO2 were conducted on 17 private cars powered by different types of fuels, including gasoline, dual gasoline-LPG, and diesel. The tests performed revealed the effect of LPG injection technology on unit emissions and made it possible to compare the measured emissions to the European Artemis emission model. A sequential multipoint injection LPG kit with no catalyst installed was found to be the most eff...

  5. Spatially enabled land administration

    DEFF Research Database (Denmark)

    Enemark, Stig

    2006-01-01

    Spatial enabling of land administration systems managing tenure, valuation, planning, and development will allow the information generated by these activities to be much more useful. Also, the services available to private and public sectors and to community organisations should commensurably improve. Knowledge ... In other words: good governance and sustainable development are not attainable without sound land administration or, more broadly, sound land management. The paper presents a land management vision that incorporates the benefits of ICT-enabled land administration functions. The idea is that spatial ... the communication between administrative systems and also establish more reliable data due to the use of the original data instead of copies. In Denmark, such governmental guidelines for a service-oriented IT architecture in support of e-government have recently been adopted. Finally, the paper presents the role of FIG...

  6. Nordic Housing Enabler

    DEFF Research Database (Denmark)

    Helle, Tina; Brandt, Åse

    Development and reliability testing of the Nordic Housing Enabler – an instrument for accessibility assessment of the physical housing. Tina Helle & Åse Brandt, University of Lund, Health Sciences, Faculty of Medicine (SE) and University College Northern Jutland, Occupational Therapy department (DK). Danish Centre for Assistive Technology. For decades, accessibility of the physical housing environment for people with functional limitations has been of interest politically, professionally and for the users. Guidelines and norms on accessible housing design have gradually been developed; however, the built environment shows serious deficits when it comes to accessibility. This study addresses development of a content valid cross-Nordic version of the Housing Enabler and investigation of inter-rater reliability, when used in occupational therapy practice. The instrument was translated from

  7. Biological quarantine on international waters: an initiative for onboard protocols

    Science.gov (United States)

    Takano, Yoshinori; Yano, Hajime; Funase, Ryu; Sekine, Yasuhito; Takai, Ken

    2012-07-01

    The research vessel Chikyu is expanding new frontiers in science, technology, and international collaboration through deep-sea expeditions. The Chikyu (length: 210 m, gross tonnage: 56752 tons) has advanced and comprehensive scientific research facilities. One of the scientific purposes of the vessel is to investigate the unexplored biosphere (i.e., undescribed extremophiles) on the Earth. The onboard laboratory therefore provides systematic microbiological protocols under physical containment. In parallel, the onboard facilities provide sufficient space for fifty scientists and technical support staff. The helicopter deck also supports various logistics via a large-scale helicopter (see http://www.jamstec.go.jp/chikyu/eng/). Since the establishment of the Panel on Planetary Protection (PPP) in the Committee on Space Research (COSPAR), there has been an international consensus on the development and promulgation of planetary protection knowledge, policy, and plans to prevent the harmful effects of biological contamination of the Earth (e.g., Rummel, 2002). However, selecting a candidate location for initial quarantine at the BSL-4 level is often problematic. To address this key issue, we suggest that international waters can be a meaningful option, with several advantages, for conducting initial onboard biological quarantine investigations. Hence, the research vessel Chikyu is promising for further PPP requirements (e.g., the Enceladus sample return project: Tsou et al., 2012). Rummel, J., Seeking an international consensus in planetary protection: COSPAR's planetary protection panel. Advances in Space Research, 30, 1573-1575 (2002). Tsou, P. et al. LIFE: Life Investigation For Enceladus - A Sample Return Mission Concept in Search for Evidence of Life. Astrobiology, in press.

  8. High-Speed On-Board Data Processing for Science Instruments: HOPS

    Science.gov (United States)

    Beyon, Jeffrey

    2015-01-01

    The project called High-Speed On-Board Data Processing for Science Instruments (HOPS) has been funded by NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) program from April 2012 to April 2015. HOPS is an enabler for science missions with extremely high data processing rates. In this three-year effort of HOPS, Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) and 3-D Winds were of interest in particular. As for ASCENDS, HOPS replaces time domain data processing with frequency domain processing while making the real-time on-board data processing possible. As for 3-D Winds, HOPS offers real-time high-resolution wind profiling with 4,096-point fast Fourier transform (FFT). HOPS is adaptable with quick turn-around time. Since HOPS offers reusable user-friendly computational elements, its FPGA IP Core can be modified for a shorter development period if the algorithm changes. The FPGA and memory bandwidth of HOPS is 20 GB/sec while the typical maximum processor-to-SDRAM bandwidth of the commercial radiation tolerant high-end processors is about 130-150 MB/sec. The inter-board communication bandwidth of HOPS is 4 GB/sec while the effective processor-to-cPCI bandwidth of commercial radiation tolerant high-end boards is about 50-75 MB/sec. Also, HOPS offers VHDL cores for the easy and efficient implementation of ASCENDS and 3-D Winds, and other similar algorithms. A general overview of the 3-year development of HOPS is the goal of this presentation.
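
    To make the frequency-domain step concrete, the sketch below shows a generic 4,096-point FFT stage of the kind the abstract mentions for wind profiling: a windowed power spectrum of a return segment whose Doppler peak is mapped to a line-of-sight velocity. The sampling rate, wavelength and peak-picking logic are placeholders, and HOPS itself implements this class of processing in FPGA VHDL cores rather than Python.

```python
import numpy as np

def doppler_velocity(segment, fs=1.0e9, wavelength=2.05e-6):
    """Illustrative frequency-domain processing step: a 4096-point FFT power
    spectrum of a coherent-lidar return segment, with the Doppler peak mapped
    to a line-of-sight velocity via v = lambda * f / 2. The sampling rate and
    wavelength are placeholders, not HOPS parameters."""
    n = 4096
    spectrum = np.abs(np.fft.rfft(segment[:n] * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    f_peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
    return 0.5 * wavelength * f_peak
```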

  9. Enabling Wind Power Nationwide

    Energy Technology Data Exchange (ETDEWEB)

    Jose Zayas, Michael Derby, Patrick Gilman and Shreyas Ananthan,

    2015-05-01

    Leveraging this experience, the U.S. Department of Energy’s (DOE’s) Wind and Water Power Technologies Office has evaluated the potential for wind power to generate electricity in all 50 states. This report analyzes and quantifies the geographic expansion that could be enabled by accessing higher above ground heights for wind turbines and considers the means by which this new potential could be responsibly developed.

  10. A SmallSat Approach for Global Imaging Spectroscopy of the Earth SYSTEM Enabled by Advanced Technology

    Science.gov (United States)

    Green, R. O.; Asner, G. P.; Thompson, D. R.; Mouroulis, P.; Eastwood, M. L.; Chien, S.

    2017-12-01

    Global coverage imaging spectroscopy in the solar reflected energy portion of the spectrum has been identified by the Earth Decadal Survey as an important measurement that enables a diverse set of new and time-critical science objectives/targets for the Earth system. These science objectives include biodiversity; ecosystem function; ecosystem biogeochemistry; initialization and constraint of global ecosystem models; fire fuel, combustion, burn severity, and recovery; surface mineralogy, geochemistry, geologic processes, soils, and hazards; global mineral dust source composition; cryospheric albedo, energy balance, and melting; coastal and inland water habitats; coral reefs; point source gas emission; cloud thermodynamic phase; urban system properties; and more. Traceability of these science objectives to spectroscopic measurement in the visible to short wavelength infrared portion of the spectrum is summarized. New approaches to acquire these global imaging spectroscopy measurements, including satellite constellations, are presented, drawing on recent advances in optical design, detector technology, instrument architecture, thermal control, on-board processing, data storage, and downlink.

  11. On-board cryogenic system for magnetic levitation of trains

    Energy Technology Data Exchange (ETDEWEB)

    Baldus, S A.W.; Kneuer, R; Stephan, A

    1975-02-01

    An experimental car based on electrodynamic levitation with superconducting magnets was developed and manufactured with an on-board cryogenic system. This system has to cope with new conditions and cryogenic tasks. It can be characterized in principle by liquid helium heat exchanger units, compressors, transfer lines, rotatable and movable couplings and junctions. All transfer lines and couplings consist of three coaxial ducts for three different streams. Processes and components are discussed, and a brief description of the first results for the whole system under simulation conditions is given.

  12. On-board image compression for the RAE lunar mission

    Science.gov (United States)

    Miller, W. H.; Lynch, T. J.

    1976-01-01

    The requirements, design, implementation, and flight performance of an on-board image compression system for the lunar orbiting Radio Astronomy Explorer-2 (RAE-2) spacecraft are described. The image to be compressed is a panoramic camera view of the long radio astronomy antenna booms used for gravity-gradient stabilization of the spacecraft. A compression ratio of 32 to 1 is obtained by a combination of scan line skipping and adaptive run-length coding. The compressed imagery data are convolutionally encoded for error protection. This image compression system occupies about 1000 cu cm and consumes 0.4 W.
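
    The compression scheme is described only at the level of "scan line skipping and adaptive run-length coding". The sketch below illustrates the principle with a plain (non-adaptive) run-length encoder and a fixed line-skipping step; the actual RAE-2 coder adapted its run-length format and ran in flight hardware, so this is purely illustrative.

```python
def run_length_encode(line, max_run=255):
    """Simple run-length encoder for a single scan line of pixel values.
    Returns (value, run_length) pairs; the on-board RAE-2 coder was adaptive
    and combined with scan-line skipping, so this only shows the principle."""
    out = []
    run_value, run_length = line[0], 1
    for pixel in line[1:]:
        if pixel == run_value and run_length < max_run:
            run_length += 1
        else:
            out.append((run_value, run_length))
            run_value, run_length = pixel, 1
    out.append((run_value, run_length))
    return out

def skip_lines(image, k=4):
    """Scan-line skipping: keep every k-th line before run-length coding."""
    return image[::k]
```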

  13. MARES: Navigation, Control and On-board Software

    OpenAIRE

    Aníbal Matos; Nuno Cruz

    2009-01-01

    MARES, or Modular Autonomous Robot for Environment Sampling, is a 1.5m long AUV, designed and built by the Ocean Systems Group. The vehicle can be programmed to follow predefined trajectories, while collecting relevant data with the onboard sensors. MARES can dive up to 100m deep, and unlike similar-sized systems, has vertical thrusters to allow for purely vertical motion in the water column. Forward velocity can be independently defined, from 0 to 2 m/s. Major application areas include pollu...

  14. On-board cryogenic system for magnetic levitation of trains

    International Nuclear Information System (INIS)

    Asztalos, St.; Baldus, W.; Kneuer, R.; Stephan, A.

    1974-01-01

    An experimental car based on electrodynamic levitation with superconducting magnets has been developed and manufactured by AEG, BBC, Siemens and other partners, together with Linde AG as the firm responsible for the on-board cryogenic system. This system has to cope with new conditions and cryogenic tasks. It can be characterized in principle by liquid helium heat exchanger units, compressors, transfer lines, rotatable and movable couplings and junctions. All transfer lines and couplings consist of three coaxial ducts for three different streams. This paper reports on processes and components. A brief description of the first results for the whole system under simulation conditions is given. (author)

  15. Spectrally and Radiometrically Stable, Wideband, Onboard Calibration Source

    Science.gov (United States)

    Coles, James B.; Richardson, Brandon S.; Eastwood, Michael L.; Sarture, Charles M.; Quetin, Gregory R.; Porter, Michael D.; Green, Robert O.; Nolte, Scott H.; Hernandez, Marco A.; Knoll, Linley A.

    2013-01-01

    The Onboard Calibration (OBC) source incorporates a medical/scientific-grade halogen source with a precisely designed fiber coupling system, and a fiber-based intensity-monitoring feedback loop that results in radiometric and spectral stabilities to better than 0.3 percent over a 15-hour period. The airborne imaging spectrometer systems developed at the Jet Propulsion Laboratory incorporate OBC sources to provide auxiliary in-use system calibration data. The use of the OBC source will provide a significant increase in the quantitative accuracy, reliability, and resulting utility of the spectral data collected from current and future imaging spectrometer instruments.

  16. On-board system for physical and microphysical measurements

    International Nuclear Information System (INIS)

    Ravaut, M.; Allet, C.; Dole, B.; Gribkoff, A.; Schibler, P.; Charpentier, C.

    1981-10-01

    This report presents the system of physical and microphysical measurement instrumentation on board the HUREL-DUBOIS HD 34 aircraft, built in cooperation with the Institut National d'Astronomie et de Geophysique (I.N.A.G.) and the Institut Geographique National (I.G.N.). The feasibility study of the system was carried out in the first half of 1978 and took shape in an on-site proving campaign in November 1979. As a result, the on-board system was able to participate in the BUGEY experimental campaign of March 1980, a glimpse of which is given in this report.

  17. Assessment of an Onboard EO Sensor to Enable Detect-and-Sense Capability for UAVs Operating in a Cluttered Environment

    Science.gov (United States)

    2017-09-01

    rates of accidents. To ensure safe operation in such a complex environment, unmanned systems have to perform accurate and timely detection and ... security. To ensure the safe operation of unmanned systems in a modern, complex environment, this thesis strives to answer two ... computer vision algorithm work in a complex operating environment with multiple moving objects? This thesis examines the integration of the CV

  18. Agile deployment and code coverage testing metrics of the boot software on-board Solar Orbiter's Energetic Particle Detector

    Science.gov (United States)

    Parra, Pablo; da Silva, Antonio; Polo, Óscar R.; Sánchez, Sebastián

    2018-02-01

    In this day and age, successful embedded critical software needs agile and continuous development and testing procedures. This paper presents the overall testing and code coverage metrics obtained during the unit testing procedure carried out to verify the correctness of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on-board Solar Orbiter. The ICU boot software is a critical part of the project, so its verification should be addressed at an early development stage; any test case missed in this process may affect the quality of the overall on-board software. According to the European Cooperation for Space Standardization (ECSS) standards, testing this kind of critical software must cover 100% of the source code statements and decision paths. This leads to complete testing of the fault tolerance and recovery mechanisms that have to resolve every possible memory corruption or communication error brought about by the space environment. The introduced procedure enables fault injection from the beginning of the development process and makes it possible to fulfill the exacting code coverage demands on the boot software.
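
    The abstract does not show what a decision-coverage test with fault injection looks like, so the toy example below illustrates the idea: both outcomes of a recovery decision are exercised, one of them by injecting a simulated checksum failure. The routine and its behaviour are invented for illustration; the EPD boot software itself is not written in Python.

```python
# Illustrative only: a unit test exercising both branches of a hypothetical
# image-selection routine, the kind of 100% statement and decision coverage
# the cited standards require. Names and behaviour are invented for the sketch.

def load_image(primary, backup, checksum_ok):
    """Return the primary image if its checksum verifies, otherwise the backup."""
    if checksum_ok(primary):
        return primary
    return backup                      # recovery path for a corrupted image

def test_load_image_covers_both_decisions():
    primary, backup = b"\x01\x02", b"\x03\x04"
    # Decision outcome 1: checksum passes, primary image selected.
    assert load_image(primary, backup, lambda img: True) == primary
    # Decision outcome 2: injected fault (checksum failure), backup selected.
    assert load_image(primary, backup, lambda img: False) == backup
```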

  19. MERTIS: the thermal infrared imaging spectrometer onboard of the Mercury Planetary Orbiter

    Science.gov (United States)

    Zeh, T.; Peter, G.; Walter, I.; Kopp, E.; Knollenberg, J.; Helbert, J.; Gebhardt, A.; Weber, I.; Hiesinger, Harry

    2017-11-01

    The MERTIS instrument is a thermal infrared imaging spectrometer onboard ESA's cornerstone mission BepiColombo to Mercury. MERTIS has four goals: the study of Mercury's surface composition, identification of rock-forming minerals, mapping of the surface mineralogy, and the study of surface temperature variations and thermal inertia. MERTIS will provide detailed information about the mineralogical composition of Mercury's surface layer by measuring the spectral emittance in the spectral range from 7-14 μm at high spatial and spectral resolution. Furthermore, MERTIS will obtain radiometric measurements in the spectral range from 7-40 μm to study the thermo-physical properties of the surface material. The MERTIS detector is based on an uncooled micro-bolometer array providing spectral separation and spatial resolution according to its 2-dimensional shape. The operation principle is characterized by intermediate scanning of the planet surface and three different calibration targets: a free space view and two on-board black body sources. In the current project phase, the MERTIS Qualification Model (QM) is undergoing a rigorous testing program. Besides a general overview of the instrument principles, the paper addresses major aspects of the instrument design, manufacturing and verification.

  20. Onboard Processing on PWE OFA/WFC (Onboard Frequency Analyzer/Waveform Capture) aboard the ERG (ARASE) Satellite

    Science.gov (United States)

    Matsuda, S.; Kasahara, Y.; Kojima, H.; Kasaba, Y.; Yagitani, S.; Ozaki, M.; Imachi, T.; Ishisaka, K.; Kurita, S.; Ota, M.; Kumamoto, A.; Tsuchiya, F.; Yoshizumi, M.; Matsuoka, A.; Teramoto, M.; Shinohara, I.

    2017-12-01

    Exploration of energization and Radiation in Geospace (ERG) is a mission for understanding particle acceleration, loss mechanisms, and the dynamic evolution of space storms in the context of cross-energy and cross-regional coupling [Miyoshi et al., 2012]. The ERG (ARASE) satellite was launched on December 20, 2016, and successfully inserted into orbit. The Plasma Wave Experiment (PWE) is one of the science instruments on board the ERG satellite to measure the electric and magnetic fields in the inner magnetosphere. PWE consists of three sub-components: EFD (Electric Field Detector), OFA/WFC (Onboard Frequency Analyzer and Waveform Capture), and HFA (High Frequency Analyzer). In particular, OFA/WFC measures electric and magnetic field spectra and waveforms from a few Hz to 20 kHz. OFA/WFC processes signals detected by a pair of dipole wire-probe antennas (WPT) and tri-axial magnetic search coils (MSC) installed onboard the satellite. The PWE-OFA subsystem calculates and produces three kinds of data: OFA-SPEC (power spectrum), OFA-MATRIX (spectrum matrix), and OFA-COMPLEX (complex spectrum). They are continuously processed 24 hours per day and all data are sent to the ground. OFA-MATRIX and OFA-COMPLEX are used for polarization analyses and direction finding of the plasma waves. The PWE-WFC subsystem measures raw (64 kHz sampled) and down-sampled (1 kHz sampled) burst waveforms detected by the WPT and the MSC sensors. It is activated by command, automatic triggering, or scheduling. The initial check-out process of the PWE was successfully completed, and initial data have been obtained. In this presentation, we introduce the onboard processing technique of PWE OFA/WFC and its initial results.
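
    The OFA-MATRIX product mentioned above is a spectral matrix, i.e. the cross-spectra of the field components, which is the starting point for polarization and direction-finding analyses. The numpy sketch below shows an on-ground analogue of building such a matrix from tri-axial search-coil waveforms; the sampling rate, segment length and windowing are assumptions, and the flight implementation runs on the onboard processor, not in Python.

```python
import numpy as np

def spectral_matrix(bx, by, bz, fs=65536.0, nfft=1024):
    """Illustrative analogue of a spectrum-matrix product: the 3x3
    cross-spectral matrix of tri-axial search-coil waveforms, averaged over
    windowed FFT segments. Sampling rate and segment length are placeholders."""
    channels = np.vstack([bx, by, bz])
    nseg = channels.shape[1] // nfft
    window = np.hanning(nfft)
    spectra = np.array([
        [np.fft.rfft(channels[i, k * nfft:(k + 1) * nfft] * window) for i in range(3)]
        for k in range(nseg)
    ])                                               # shape (nseg, 3, nfreq)
    # Average the outer products over segments: S_ij(f) = <B_i(f) B_j*(f)>
    s = np.einsum("kif,kjf->ijf", spectra, np.conj(spectra)) / nseg
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    return freqs, s
```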

  1. Onboard calibration and monitoring for the SWIFT instrument

    International Nuclear Information System (INIS)

    Rahnama, P; McDade, I; Shepherd, G; Gault, W

    2012-01-01

    The SWIFT (Stratospheric Wind Interferometer for Transport studies) instrument is a proposed space-based field-widened Doppler Michelson interferometer designed to measure stratospheric winds and ozone densities using a passive optical technique called Doppler Michelson imaging interferometry. The onboard calibration and monitoring procedures for the SWIFT instrument are described in this paper. Sample results of the simulations of onboard calibration measurements are presented and discussed. This paper also discusses the results of the derivation of the calibration and monitoring requirements for the SWIFT instrument. SWIFT's measurement technique and viewing geometry are briefly described. The reference phase calibration and filter monitoring for the SWIFT instrument are two of the main critical design issues. In this paper it is shown that in order to meet SWIFT's science requirements, Michelson interferometer optical path difference monitoring corresponding to a phase calibration accuracy of ~10^-3 radians, filter passband monitoring corresponding to a phase accuracy of ~5 × 10^-3 radians and a thermal stability of 10^-3 K s^-1 are required. (paper)
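
    For context on why milliradian-level phase knowledge is required, the standard Doppler Michelson relation used by instruments of this type is sketched below; the symbols are generic and no SWIFT-specific optical path difference or wavenumber values are implied.

```latex
% Fringe phase of a field-widened Michelson at optical path difference \Delta
% and emission wavenumber \sigma; a line-of-sight wind v Doppler-shifts the
% line and hence the phase, so the wind error scales with the phase error.
\phi = 2\pi \sigma \Delta, \qquad
\delta\phi = 2\pi \sigma \Delta \,\frac{v}{c}
\;\;\Longrightarrow\;\;
\delta v = \frac{c\,\delta\phi}{2\pi \sigma \Delta}
```

    Because the wind error scales linearly with the phase error at fixed optical path difference and wavenumber, phase calibration at the ~10^-3 radian level feeds directly into the wind accuracy budget.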

  2. Tuning the Solar Dynamics Observatory Onboard Kalman Filter

    Science.gov (United States)

    Halverson, Julie Kay; Harman, Rick; Carpenter, Russell; Poland, Devin

    2017-01-01

    The Solar Dynamics Observatory (SDO) was launched in 2010. SDO is a sun-pointing, semi-autonomous spacecraft in a geosynchronous orbit that allows nearly continuous observations of the sun. SDO is equipped with coarse sun sensors, two star trackers, a digital sun sensor, and three two-axis inertial reference units (IRUs). The IRUs are temperature sensitive and were designed to operate in a stable thermal environment. Due to battery degradation concerns, the IRU heaters were not used on SDO, and the onboard filter was tuned to accommodate the noisier IRU data. Since launch, currents have increased on two IRUs, and one eventually had to be powered off. Recent ground tests on a battery similar to SDO's indicated the heaters would have a negligible impact on battery degradation, so in 2016 a decision was made to turn the heaters on. This paper presents the analysis and results of updating the filter tuning parameters onboard SDO with the IRUs now operating in their intended thermal environment.
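
    The abstract does not detail what "tuning" the onboard filter means in practice; in a conventional Kalman filter it amounts to adjusting the noise covariances so the gain reflects how much the gyro data are trusted. The sketch below shows the standard measurement update and where that knob sits; the matrices are generic placeholders, not the SDO filter design.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Single measurement update of a linear Kalman filter. Raising the
    measurement-noise covariance R (as was done to accommodate the noisier
    unheated IRU data) lowers the gain K, so the estimate leans more on the
    propagated state; lowering R does the opposite. Generic placeholders only."""
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (z - H @ x)                # state update
    P = (np.eye(len(x)) - K @ H) @ P       # covariance update
    return x, P
```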

  3. EnableATIS strategy assessment.

    Science.gov (United States)

    2014-02-01

    Enabling Advanced Traveler Information Systems (EnableATIS) is the traveler information component of the Dynamic Mobility Application (DMA) program. The objective of the EnableATIS effort is to foster transformative traveler information application...

  4. Enabling Digital Literacy

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Georgsen, Marianne

    2010-01-01

    There are some tensions between high-level policy definitions of “digital literacy” and actual teaching practice. We need to find workable definitions of digital literacy; obtain a better understanding of what digital literacy might look like in practice; and identify pedagogical approaches which support teachers in designing digital literacy learning. We suggest that frameworks such as Problem Based Learning (PBL) are approaches that enable digital literacy learning because they provide good settings for engaging with digital literacy. We illustrate this through analysis of a case. Furthermore, these operate on a meso-level, mediating between high-level concepts of digital literacy and classroom practice.

  5. CtOS Enabler

    OpenAIRE

    Crespo Cepeda, Rodrigo; El Yamri El Khatibi, Meriem; Carrera García, Juan Manuel

    2015-01-01

    Smart Cities are, without a doubt, the near future of technology that we move closer to every day, as can be seen in the abundance of mobile devices among the population, which computerize everyday life through the use of geolocation and information. We intend to bring these two fields together with CtOS Enabler in order to create a usage standard that encompasses all Smart City systems and makes it easier for developers of such software to create new tools. ...

  6. Examining the Diet of Post-Migrant Hispanic Males Using the Precede-Proceed Model: Predisposing, Reinforcing, and Enabling Dietary Factors

    Science.gov (United States)

    Castellanos, Diana Cuy; Downey, Laura; Graham-Kresge, Susan; Yadrick, Kathleen; Zoellner, Jamie; Connell, Carol L.

    2013-01-01

    Objective: To examine socio-environmental, behavioral, and predisposing, reinforcing, and enabling (PRE) factors contributing to post-migration dietary behavior change among a sample of traditional Hispanic males. Design: In this descriptive study, semistructured interviews, a group interview, and photovoice, followed by group interviews, were…

  7. Fatigue crack growth spectrum simplification: Facilitation of on-board damage prognosis systems

    Science.gov (United States)

    Adler, Matthew Adam

    2009-12-01

    Better lifetime predictions of systems subjected to fatigue loading are needed in support of the optimization of the costs of life-cycle engineering. In particular, the climate is especially encouraging for the development of safer aircraft. One issue is that aircraft experience complex fatigue loading and current methods for the prediction of fatigue damage accumulation rely on intensive computational tools that are not currently carried onboard during flight. These tools rely on complex models that are made more difficult by the complicated load spectra themselves. This presents an overhead burden as offline analysis must be performed at an offsite facility. This architecture is thus unable to provide online, timely information for on-board use. The direct objective of this research was to facilitate the real-time fatigue damage assessments of on-board systems with a particular emphasis on aging aircraft. To achieve the objective, the goal of this research was to simplify flight spectra. Variable-amplitude spectra, in which the load changes on a cycle-by-cycle basis, cannot readily be supported by an onboard system because the models required to predict fatigue crack growth during variable-amplitude loading are too complicated. They are too complicated because variable-amplitude fatigue crack growth analysis must be performed on a cycle-by-cycle basis as no closed-form solution exists. This makes these calculations too time-consuming and requires impractical, heavy onboard systems or offsite facilities. The hypothesis is to replace a variable-amplitude spectrum with an equivalent constant-amplitude spectrum. The advantage is a dramatic reduction in the complexity of the problem so that damage predictions can be made onboard by simple, fast calculations in real-time without the need to add additional weight to the aircraft. The intent is to reduce the computational burden and facilitate on-board projection of damage evolution and prediction for the accurate
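
    One common way to construct the "equivalent constant-amplitude spectrum" that the hypothesis refers to is a Paris-law-weighted mean of the variable-amplitude stress ranges; the dissertation's specific construction may differ, so the relation below is only an illustration.

```latex
% Root-mean-m equivalent stress range for a block of n_i cycles at range
% \Delta S_i, assuming Paris-law growth da/dN = C (\Delta K)^m (illustrative).
\Delta S_{\mathrm{eq}} =
\left( \frac{\sum_i n_i\, \Delta S_i^{\,m}}{\sum_i n_i} \right)^{1/m}
```

    Applied over the same total number of cycles, such an equivalent range lets a simple closed-form constant-amplitude crack-growth calculation stand in for the cycle-by-cycle analysis, which is what makes lightweight onboard damage projection feasible.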

  8. Smart Grid Enabled EVSE

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2015-01-12

    The combined team of GE Global Research, Federal Express, National Renewable Energy Laboratory, and Consolidated Edison has successfully achieved the established goals contained within the Department of Energy’s Smart Grid Capable Electric Vehicle Supply Equipment funding opportunity. The final program product, shown charging two vehicles in Figure 1, reduces by nearly 50% the total installed system cost of the electric vehicle supply equipment (EVSE) as well as enabling a host of new Smart Grid enabled features. These include bi-directional communications, load control, utility message exchange and transaction management information. Using the new charging system, Utilities or energy service providers will now be able to monitor transportation related electrical loads on their distribution networks, send load control commands or preferences to individual systems, and then see measured responses. Installation owners will be able to authorize usage of the stations, monitor operations, and optimally control their electricity consumption. These features and cost reductions have been developed through a total system design solution.

  9. Flight Hardware Virtualization for On-Board Science Data Processing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Utilize Hardware Virtualization technology to benefit on-board science data processing by investigating new real time embedded Hardware Virtualization solutions and...

  10. Possibilities of reduction of the on-board energy for an innovative subway

    OpenAIRE

    Allègre, A-L.; Barrade, P.; Delarue, P.; Bouscayrol, A.; Chattot, E.; El-Fassi, S.

    2009-01-01

    An innovative subway has been proposed using supercapacitors as the energy source. In this paper, different possibilities to reduce the on-board stored energy are presented, in order to downsize the on-board energy storage subsystem. Special attention is paid to the influence of a feeding rail extension or a downward slope at the beginning of the interstation on the on-board stored energy. A map is built to facilitate the selection of the solution that leads to a reduction of the on-board energy.

  11. Water: A Critical Material Enabling Space Exploration

    Science.gov (United States)

    Pickering, Karen D.

    2014-01-01

    Water is one of the most critical materials in human spaceflight. The availability of water defines the duration of a space mission; the volume of water required for a long-duration space mission becomes too large, heavy, and expensive for launch vehicles to carry. Since the mission duration is limited by the amount of water a space vehicle can carry, the capability to recycle water enables space exploration. In addition, water management in microgravity impacts spaceflight in other respects, such as the recent emergency termination of a spacewalk caused by free water in an astronaut's spacesuit helmet. A variety of separation technologies are used onboard spacecraft to ensure that water is always available for use, and meets the stringent water quality required for human space exploration. These separation technologies are often adapted for use in a microgravity environment, where water behaves in unique ways. The use of distillation, membrane processes, ion exchange and granular activated carbon will be reviewed. Examples of microgravity effects on operations will also be presented. A roadmap for future technologies, needed to supply water resources for the exploration of Mars, will also be reviewed.

  12. Enabling graphene nanoelectronics.

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Wei; Ohta, Taisuke; Biedermann, Laura Butler; Gutierrez, Carlos; Nolen, C. M.; Howell, Stephen Wayne; Beechem Iii, Thomas Edwin; McCarty, Kevin F.; Ross, Anthony Joseph, III

    2011-09-01

    Recent work has shown that graphene, a 2D electronic material amenable to the planar semiconductor fabrication processing, possesses tunable electronic material properties potentially far superior to metals and other standard semiconductors. Despite its phenomenal electronic properties, focused research is still required to develop techniques for depositing and synthesizing graphene over large areas, thereby enabling the reproducible mass-fabrication of graphene-based devices. To address these issues, we combined an array of growth approaches and characterization resources to investigate several innovative and synergistic approaches for the synthesis of high quality graphene films on technologically relevant substrate (SiC and metals). Our work focused on developing the fundamental scientific understanding necessary to generate large-area graphene films that exhibit highly uniform electronic properties and record carrier mobility, as well as developing techniques to transfer graphene onto other substrates.

  13. The Thermal Infrared Sensor onboard NASA's Mars 2020 Mission

    Science.gov (United States)

    Martinez, G.; Perez-Izquierdo, J.; Sebastian, E.; Ramos, M.; Bravo, A.; Mazo, M.; Rodriguez-Manfredi, J. A.

    2017-12-01

    NASA's Mars 2020 rover mission is scheduled for launch in July/August 2020 and will address key questions about the potential for life on Mars. The Mars Environmental Dynamics Analyzer (MEDA) is one of the seven instruments onboard the rover [1] and has been designed to assess the environmental conditions across the rover traverse. MEDA will extend the current record of in-situ meteorological measurements at the surface [2] to other locations on Mars. The Thermal InfraRed Sensor (TIRS) [3] is one of the six sensors comprising MEDA. TIRS will use three downward-looking channels to measure (1) the surface skin temperature (with high heritage from the Rover Environmental Monitoring Station onboard the Mars Science Laboratory mission [4]), (2) the upwelling thermal infrared radiation from the surface and (3) the reflected solar radiation at the surface, and two upward-looking channels to measure the (4) downwelling thermal infrared radiation at the surface and (5) the atmospheric temperature. In combination with other MEDA's sensors, TIRS will allow the quantification of the surface energy budget [5] and the determination of key geophysical properties of the terrain such as the albedo and thermal inertia with an unprecedented spatial resolution. Here we present a general description of the TIRS, with focus on its scientific requirements and results from field campaigns showing the performance of the different channels. References:[1] Rodríguez-Manfredi, J. A. et al. (2014), MEDA: An environmental and meteorological package for Mars 2020, LPSC, 45, 2837. [2] Martínez, G.M. et al. (2017), The Modern Near-Surface Martian Climate: A Review of In-situ Meteorological Data from Viking to Curiosity, Space Science Reviews, 1-44. [3] Pérez-Izquierdo, J. et al. (2017), The Thermal Infrared Sensor (TIRS) of the Mars Environmental Dynamics Analyzer (MEDA) Instrument onboard Mars 2020, IEEE. [4] Sebastián, E. et al. (2010), The Rover Environmental Monitoring Station Ground

  14. A One-Step Cone-Beam CT-Enabled Planning-to-Treatment Model for Palliative Radiotherapy-From Development to Implementation

    International Nuclear Information System (INIS)

    Wong, Rebecca K.S.; Letourneau, Daniel; Varma, Anita; Bissonnette, Jean Pierre; Fitzpatrick, David; Grabarz, Daniel; Elder, Christine; Martin, Melanie; Bezjak, Andrea; Panzarella, Tony; Gospodarowicz, Mary; Jaffray, David A.

    2012-01-01

    Purpose: To develop a cone-beam computed tomography (CT)–enabled one-step simulation-to-treatment process for the treatment of bone metastases. Methods and Materials: A three-phase prospective study was conducted. Patients requiring palliative radiotherapy to the spine, mediastinum, or abdomen/pelvis suitable for treatment with simple beam geometry (≤2 beams) were accrued. Phase A established the accuracy of cone-beam CT images for the purpose of gross tumor target volume (GTV) definition. Phase B evaluated the feasibility of implementing the cone-beam CT–enabled planning process at the treatment unit. Phase C evaluated the online cone-beam CT–enabled process for the planning and treatment of patients requiring radiotherapy for bone metastases. Results: Eighty-four patients participated in this study. Phase A (n = 9) established the adequacy of cone-beam CT images for target definition. Phase B (n = 45) established the quality of treatment plans to be adequate for clinical implementation for bone metastases. When the process was applied clinically in bone metastases (Phase C), the degree of overlap between planning computed tomography (PCT) and cone-beam CT for GTV and between PCT and cone-beam CT for treatment field was 82% ± 11% and 97% ± 4%, respectively. The oncologist’s decision to accept the plan under a time-pressured environment remained of high quality, with the cone-beam CT–generated treatment plan delivering at least 90% of the prescribed dose to 100% ± 0% of the cone-beam CT planning target volume (PTV). With the assumption that the PCT PTV is the gold-standard target, the cone-beam CT–generated treatment plan delivered at least 90% and at least 95% of dose to 98% ± 2% and 97% ± 5% of the PCT PTV, respectively. The mean time for the online planning and treatment process was 32.7 ± 4.0 minutes. Patient satisfaction was high, with a trend for superior satisfaction with the cone-beam CT–enabled process. Conclusions: The cone-beam CT–enabled

  15. A One-Step Cone-Beam CT-Enabled Planning-to-Treatment Model for Palliative Radiotherapy-From Development to Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Wong, Rebecca K.S., E-mail: rebecca.wong@rmp.uhn.on.ca [Radiation Medicine Program, Princess Margaret Hospital, Toronto, Ontario (Canada); Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); Letourneau, Daniel; Varma, Anita [Radiation Medicine Program, Princess Margaret Hospital, Toronto, Ontario (Canada); Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); Bissonnette, Jean Pierre; Fitzpatrick, David; Grabarz, Daniel; Elder, Christine [Radiation Medicine Program, Princess Margaret Hospital, Toronto, Ontario (Canada); Martin, Melanie; Bezjak, Andrea [Radiation Medicine Program, Princess Margaret Hospital, Toronto, Ontario (Canada); Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); Panzarella, Tony [Department of Biostatistics, Princess Margaret Hospital, Toronto, Ontario (Canada); Gospodarowicz, Mary [Radiation Medicine Program, Princess Margaret Hospital, Toronto, Ontario (Canada); Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); Jaffray, David A. [Radiation Medicine Program, Princess Margaret Hospital, Toronto, Ontario (Canada); Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); Department of Medical Biophysics, University of Toronto, Toronto, Ontario (Canada)

    2012-11-01

    Purpose: To develop a cone-beam computed tomography (CT)-enabled one-step simulation-to-treatment process for the treatment of bone metastases. Methods and Materials: A three-phase prospective study was conducted. Patients requiring palliative radiotherapy to the spine, mediastinum, or abdomen/pelvis suitable for treatment with simple beam geometry (≤2 beams) were accrued. Phase A established the accuracy of cone-beam CT images for the purpose of gross tumor target volume (GTV) definition. Phase B evaluated the feasibility of implementing the cone-beam CT-enabled planning process at the treatment unit. Phase C evaluated the online cone-beam CT-enabled process for the planning and treatment of patients requiring radiotherapy for bone metastases. Results: Eighty-four patients participated in this study. Phase A (n = 9) established the adequacy of cone-beam CT images for target definition. Phase B (n = 45) established the quality of treatment plans to be adequate for clinical implementation for bone metastases. When the process was applied clinically in bone metastases (Phase C), the degree of overlap between planning computed tomography (PCT) and cone-beam CT for GTV and between PCT and cone-beam CT for treatment field was 82% ± 11% and 97% ± 4%, respectively. The oncologist's decision to accept the plan under a time-pressured environment remained of high quality, with the cone-beam CT-generated treatment plan delivering at least 90% of the prescribed dose to 100% ± 0% of the cone-beam CT planning target volume (PTV). With the assumption that the PCT PTV is the gold-standard target, the cone-beam CT-generated treatment plan delivered at least 90% and at least 95% of dose to 98% ± 2% and 97% ± 5% of the PCT PTV, respectively. The mean time for the online planning and treatment process was 32.7 ± 4.0 minutes. Patient satisfaction was high, with a trend for superior satisfaction with the cone-beam CT-enabled process. Conclusions: The cone

  16. Soft x-ray imager (SXI) onboard the NeXT satellite

    Science.gov (United States)

    Tsuru, Takeshi Go; Takagi, Shin-Ichiro; Matsumoto, Hironori; Inui, Tatsuya; Ozawa, Midori; Koyama, Katsuji; Tsunemi, Hiroshi; Hayashida, Kiyoshi; Miyata, Emi; Ozawa, Hideki; Touhiguchi, Masakuni; Matsuura, Daisuke; Dotani, Tadayasu; Ozaki, Masanobu; Murakami, Hiroshi; Kohmura, Takayoshi; Kitamoto, Shunji; Awaki, Hisamitsu

    2006-06-01

    We give an overview and the current status of the development of the Soft X-ray Imager (SXI) onboard the NeXT satellite. SXI is an X-ray CCD camera placed at the focal plane of the Soft X-ray Telescopes for Imaging (SXT-I) onboard NeXT. The pixel size and format of the CCD are 24 x 24 μm (IA) and 2048 x 2048 x 2 (IA+FS). We are currently developing two types of CCD in parallel as candidates for the SXI. The first, the baseline plan, is a front-illuminated CCD with a moderately thick depletion layer (70-100 μm). The second, the goal plan, is a back-illuminated CCD with a thick depletion layer (200-300 μm). For the baseline plan, we successfully developed the prototype model 'CCD-NeXT1' with a pixel size of 12 μm x 12 μm and a CCD size of 24 mm x 48 mm. The depletion layer of this CCD has reached 75-85 μm. The goal plan relies on the introduction of a new type of CCD, the 'P-channel CCD', which collects holes instead of the electrons collected by the common 'N-channel CCD'. By processing a test model of the P-channel CCD we have confirmed high quantum efficiency above 10 keV with an equivalent depletion layer of 300 μm. A back-illuminated P-channel CCD with a 200 μm depletion layer and an aluminum coating for optical blocking has also been successfully developed. We have also been developing a thermo-electric cooler (TEC) that mechanically supports the CCD wafer without standoff insulators, in order to reduce the thermal input to the CCD through such insulators. We are also considering the sensor housing and the onboard electronics for CCD clocking, readout, and digital processing of the frame data.

  17. Safe Onboard Guidance and Control Under Probabilistic Uncertainty

    Science.gov (United States)

    Blackmore, Lars James

    2011-01-01

    An algorithm was developed that determines the fuel-optimal spacecraft guidance trajectory that takes into account uncertainty, in order to guarantee that mission safety constraints are satisfied with the required probability. The algorithm uses convex optimization to solve for the optimal trajectory. Convex optimization is amenable to onboard solution due to its excellent convergence properties. The algorithm is novel because, unlike prior approaches, it does not require time-consuming evaluation of multivariate probability densities. Instead, it uses a new mathematical bounding approach to ensure that probability constraints are satisfied, and it is shown that the resulting optimization is convex. Empirical results show that the approach is many orders of magnitude less conservative than existing set conversion techniques, for a small penalty in computation time.
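
    The abstract describes the general approach (probabilistic constraints bounded so that the optimization stays convex) rather than the exact formulation. The sketch below shows one common way to realize such a chance-constrained guidance problem, using uniform risk allocation and a Gaussian quantile bound; the dynamics, noise levels, constraint values, and the use of cvxpy are all assumptions made for illustration, not the JPL algorithm itself.

        import numpy as np
        import cvxpy as cp
        from scipy.stats import norm

        # Hypothetical 1D double-integrator dynamics (position, velocity), horizon T.
        T, dt = 20, 1.0
        A = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([[0.5 * dt**2], [dt]])
        Q = np.diag([1e-4, 1e-4])           # process-noise covariance (assumed)
        x0 = np.array([10.0, 0.0])          # initial mean state
        a, b = np.array([1.0, 0.0]), 12.0   # safety constraint: position <= 12 m (assumed)
        eps = 0.05                          # total allowed violation probability

        # Propagate the state covariance offline (it does not depend on the controls).
        Sigmas = [np.zeros((2, 2))]
        for _ in range(T):
            Sigmas.append(A @ Sigmas[-1] @ A.T + Q)

        # Allocate risk uniformly over the horizon (Boole's inequality) and tighten each
        # constraint by a Gaussian quantile margin -- this keeps the problem convex.
        eps_t = eps / T
        x = cp.Variable((2, T + 1))
        u = cp.Variable((1, T))
        cons = [x[:, 0] == x0]
        for t in range(T):
            cons += [x[:, t + 1] == A @ x[:, t] + B @ u[:, t]]
            margin = norm.ppf(1 - eps_t) * np.sqrt(a @ Sigmas[t + 1] @ a)
            cons += [a @ x[:, t + 1] <= b - margin]
        cons += [x[:, T] == np.zeros(2)]    # drive the mean state to the target
        prob = cp.Problem(cp.Minimize(cp.sum(cp.abs(u))), cons)   # fuel proxy: sum |u|
        prob.solve()
        print("fuel proxy:", prob.value)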

  18. Onboard monitoring of fatigue damage rates in the hull girder

    DEFF Research Database (Denmark)

    Nielsen, Ulrik Dam; Jensen, Jørgen Juncher; Pedersen, Preben Terndrup

    2011-01-01

    Most new advanced ships have extensive data collection systems to be used for continuous monitoring of engine and hull performance, for voyage performance evaluation, etc. Such systems could be expanded to also include procedures for stress monitoring and for decision support, where the most critical wave-induced ship extreme responses and the fatigue damage accumulation can be estimated for hypothetical changes in ship course and speed in the automatically estimated wave environment. The aim of this paper is to outline a calculation procedure for fatigue damage rate prediction in hull girders taking into account whipping stresses. It is conceptually shown how such a method, which integrates onboard estimation of sea states, can be used to deduce decision support with respect to the accumulated fatigue damage in the hull girder. The paper firstly presents a set of measured full-scale wave...
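
    For orientation, a closed-form narrow-band (Rayleigh) fatigue damage rate is often used as the building block in such procedures; the paper's actual method additionally accounts for whipping stresses and the estimated sea state. The snippet below is a generic sketch with illustrative S-N parameters, not the authors' algorithm.

        import math

        def narrowband_fatigue_damage_rate(sigma_stress, zero_cross_rate, m=3.0, K=1.0e12):
            # Closed-form Miner damage rate for a narrow-band Gaussian stress process.
            #   sigma_stress    : standard deviation of the stress process [MPa]
            #   zero_cross_rate : mean zero up-crossing rate nu0 [1/s]
            #   m, K            : S-N curve parameters, N = K * S**(-m) (illustrative values)
            # Returns damage per second.
            return (zero_cross_rate / K) * (2.0 * math.sqrt(2.0) * sigma_stress) ** m \
                * math.gamma(1.0 + m / 2.0)

        # e.g. a sea-state-dependent rate, accumulated over a 30-minute interval:
        d_rate = narrowband_fatigue_damage_rate(sigma_stress=20.0, zero_cross_rate=0.15)
        print(d_rate * 1800.0)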

  19. Reconfigurable On-Board Vision Processing for Small Autonomous Vehicles

    Directory of Open Access Journals (Sweden)

    James K. Archibald

    2006-12-01

    Full Text Available This paper addresses the challenge of supporting real-time vision processing on-board small autonomous vehicles. Local vision gives increased autonomous capability, but it requires substantial computing power that is difficult to provide given the severe constraints of small size and battery-powered operation. We describe a custom FPGA-based circuit board designed to support research in the development of algorithms for image-directed navigation and control. We show that the FPGA approach supports real-time vision algorithms by describing the implementation of an algorithm to construct a three-dimensional (3D map of the environment surrounding a small mobile robot. We show that FPGAs are well suited for systems that must be flexible and deliver high levels of performance, especially in embedded settings where space and power are significant concerns.

  20. Reconfigurable On-Board Vision Processing for Small Autonomous Vehicles

    Directory of Open Access Journals (Sweden)

    Fife WadeS

    2007-01-01

    Full Text Available This paper addresses the challenge of supporting real-time vision processing on-board small autonomous vehicles. Local vision gives increased autonomous capability, but it requires substantial computing power that is difficult to provide given the severe constraints of small size and battery-powered operation. We describe a custom FPGA-based circuit board designed to support research in the development of algorithms for image-directed navigation and control. We show that the FPGA approach supports real-time vision algorithms by describing the implementation of an algorithm to construct a three-dimensional (3D map of the environment surrounding a small mobile robot. We show that FPGAs are well suited for systems that must be flexible and deliver high levels of performance, especially in embedded settings where space and power are significant concerns.

  1. Spatial distribution of absorbed dose onboard of International Space Station

    International Nuclear Information System (INIS)

    Jadrnickova, I.; Spumy, F.; Tateyama, R.; Yasuda, N.; Kawashima, H.; Kurano, M.; Uchihori, Y.; Kitamura, H.; Akatov, Yu.; Shurshakov, V.; Kobayashi, I.; Ohguchi, H.; Koguchi, Y.

    2009-01-01

    The passive detectors (LD and PNTD) were exposed onboard the Russian Service Module on the International Space Station (ISS) from August 2004 to October 2005 (425 days). The detectors were located at 6 different positions inside the Service Module and also in 32 pockets on the surface of the spherical tissue-equivalent phantom located in the crew cabin. The distributions of absorbed doses and dose equivalents measured with the passive detectors, as well as the LET spectra of fluences of registered particles, are presented as a function of detector location. The variation of the dose characteristics between locations can be up to a factor of 2. In some cases, the data measured with passive detectors are also compared with data obtained by means of active instruments. (authors)

  2. Comparison of MODIS and VIIRS On-board Blackbody Performance

    Science.gov (United States)

    Xiong, Jack; Butler, Jim; Wu, Aisheng; Chiang, Vincent; McIntire, Jeff; Oudari, Hassan

    2012-01-01

    MODIS has 16 thermal emissive bands (TEBs), covering wavelengths from 3.7 to 14.4 microns. MODIS TEBs are calibrated on-orbit by a v-grooved blackbody (BB) on a scan-by-scan basis. The BB temperatures are measured by a set of 12 thermistors. As expected, the BB temperature uncertainty and stability have a direct impact on the quality of TEB calibration and, therefore, the quality of the science products derived from TEB observations. Since launch, Terra and Aqua MODIS have successfully operated for more than 12 and 10 years, respectively. Their on-board BB performance has been satisfactory in meeting the TEB calibration requirements. The first VIIRS, launched on-board the Suomi NPP spacecraft on October 28, 2011, has successfully completed its initial Intensive Calibration and Validation (ICV) phase. VIIRS has 7 thermal emissive bands (TEBs), covering wavelengths from 3.7 to 12.4 microns. Designed with strong MODIS heritage, VIIRS uses a similar BB for its TEB calibration. Like MODIS, the VIIRS BB is nominally controlled at a pre-determined temperature (set point). Periodically, a BB Warm-Up and Cool-Down (WUCD) operation is performed, during which the BB temperatures vary from instrument ambient temperature to 315 K. This paper examines NPP VIIRS BB on-orbit performance. It focuses on its BB temperature scan-to-scan variations at the nominally controlled temperature as well as during its WUCD operation and their impact on TEB calibration uncertainty. Comparisons of VIIRS (NPP) and MODIS (Terra and Aqua) BB on-orbit performance and lessons learned for future improvements are also presented in this paper.
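
    To see why blackbody temperature knowledge matters for TEB calibration, one can propagate a small temperature error through the Planck function at a representative band. The sketch below does this with an assumed 50 mK error at 11 microns and 300 K; the numbers are illustrative and are not taken from the paper.

        import numpy as np

        H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

        def planck_radiance(wavelength_m, temp_k):
            # Spectral radiance B(lambda, T) in W / (m^2 sr m).
            return (2.0 * H * C**2 / wavelength_m**5) / \
                   (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)

        # Sensitivity of the band radiance to a BB temperature error at 11 microns, 300 K:
        wl, T = 11e-6, 300.0
        dT = 0.05                     # assumed 50 mK temperature knowledge error
        dL = planck_radiance(wl, T + dT) - planck_radiance(wl, T)
        print("relative radiance error: %.4f%%" % (100.0 * dL / planck_radiance(wl, T)))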

  3. Calibration of the radiation monitor onboard Akebono using Geant4

    Science.gov (United States)

    Asai, Keiko; Takashima, Takeshi; Koi, Tatsumi; Nagai, Tsugunobu

    Natural high-energy electrons and protons (keV-MeV) in space contaminate each other's data. In order to calibrate the energy ranges and to remove data contamination in the radiation monitor (RDM) onboard the Japanese satellite Akebono (EXOS-D), the detector is investigated using the Geant4 particle-tracing simulation toolkit. The semi-polar orbiting Akebono, launched in February 1989, is still active and has observed the space environment at altitudes of several thousand km. The RDM instrument onboard Akebono monitors energetic particles in the Earth's radiation belt and has accumulated important data over about two solar cycles. The data from the RDM cover electrons in three energy channels above 0.3 MeV, protons in three energy channels above 30 MeV, and alpha particles in one energy channel of 15-45 MeV. The energy ranges are, however, based on information from about 20 years ago, so the data appear to include some errors. In addition, the electron and proton data contaminate each other; in particular, it is known that the electron data are contaminated by solar protons, but the amount of contamination has not been quantified. Therefore, data calibration is needed to correct the energy ranges and to remove the contamination. The Geant4 simulation provides the trajectories of incident and secondary particles as they interact with the instrument materials. We examine the RDM using the Geant4 simulation and find that MeV relativistic electrons behave in a quite complicated way because of particle-material interactions inside the instrument. The results indicate that the detection and contamination efficiencies are energy dependent. This study compares the electron data from the Akebono RDM with simultaneous CRRES observations and derives correction values for each of the energy channels.

  4. WE-G-BRD-06: Volumetric Cine MRI (VC-MRI) Estimated Based On Prior Knowledge for On-Board Target Localization

    International Nuclear Information System (INIS)

    Harris, W; Yin, F; Cai, J; Zhang, Y; Ren, L

    2015-01-01

    Purpose: To develop a technique to generate on-board VC-MRI using patient prior 4D-MRI, motion modeling and on-board 2D-cine MRI for real-time 3D target verification of liver and lung radiotherapy. Methods: The end-expiration phase images of a 4D-MRI acquired during patient simulation are used as patient prior images. Principal component analysis (PCA) is used to extract 3 major respiratory deformation patterns from the Deformation Field Maps (DFMs) generated between end-expiration phase and all other phases. On-board 2D-cine MRI images are acquired in the axial view. The on-board VC-MRI at any instant is considered as a deformation of the prior MRI at the end-expiration phase. The DFM is represented as a linear combination of the 3 major deformation patterns. The coefficients of the deformation patterns are solved by matching the corresponding 2D slice of the estimated VC-MRI with the acquired single 2D-cine MRI. The method was evaluated using both XCAT (a computerized patient model) simulation of lung cancer patients and MRI data from a real liver cancer patient. The 3D-MRI at every phase except end-expiration phase was used to simulate the ground-truth on-board VC-MRI at different instances, and the center-tumor slice was selected to simulate the on-board 2D-cine images. Results: Image subtraction of ground truth with estimated on-board VC-MRI shows fewer differences than image subtraction of ground truth with prior image. Excellent agreement between profiles was achieved. The normalized cross correlation coefficients between the estimated and ground-truth in the axial, coronal and sagittal views for each time step were >= 0.982, 0.905, 0.961 for XCAT data and >= 0.998, 0.911, 0.9541 for patient data. For XCAT data, the maximum-Volume-Percent-Difference between ground-truth and estimated tumor volumes was 1.6% and the maximum-Center-of-Mass-Shift was 0.9 mm. Conclusion: Preliminary studies demonstrated the feasibility to estimate real-time VC-MRI for on-board
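
    A minimal sketch of the two numerical steps described above (PCA of the deformation field maps, then fitting the mode weights against the acquired 2D cine slice) is given below. It assumes a user-supplied warp_slice helper that applies a flattened DFM to the prior volume and returns the relevant axial slice, and it linearizes the slice response to each mode, which is a simplification of the matching step rather than the authors' exact solver.

        import numpy as np

        def build_motion_model(dfms, n_modes=3):
            # PCA of deformation field maps (each DFM flattened to one row).
            X = np.stack([d.ravel() for d in dfms])          # (n_phases, n_voxels * 3)
            mean = X.mean(axis=0)
            U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
            return mean, Vt[:n_modes]                        # mean DFM and principal modes

        def fit_coefficients(modes, mean, cine_slice, warp_slice):
            # Solve for mode weights so the warped prior slice matches the 2D cine image.
            # warp_slice(dfm_flat) is an assumed helper that applies the flattened DFM to
            # the prior volume and returns the axial slice at the cine location.
            base = warp_slice(mean)
            cols = [(warp_slice(mean + m) - base).ravel() for m in modes]
            A = np.stack(cols, axis=1)                       # linearized slice response
            w, *_ = np.linalg.lstsq(A, (cine_slice - base).ravel(), rcond=None)
            return w                                         # DFM estimate: mean + w @ modes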

  5. WE-G-BRD-06: Volumetric Cine MRI (VC-MRI) Estimated Based On Prior Knowledge for On-Board Target Localization

    Energy Technology Data Exchange (ETDEWEB)

    Harris, W; Yin, F; Cai, J; Zhang, Y; Ren, L [Duke University Medical Center, Durham, NC (United States)

    2015-06-15

    Purpose: To develop a technique to generate on-board VC-MRI using patient prior 4D-MRI, motion modeling and on-board 2D-cine MRI for real-time 3D target verification of liver and lung radiotherapy. Methods: The end-expiration phase images of a 4D-MRI acquired during patient simulation are used as patient prior images. Principal component analysis (PCA) is used to extract 3 major respiratory deformation patterns from the Deformation Field Maps (DFMs) generated between end-expiration phase and all other phases. On-board 2D-cine MRI images are acquired in the axial view. The on-board VC-MRI at any instant is considered as a deformation of the prior MRI at the end-expiration phase. The DFM is represented as a linear combination of the 3 major deformation patterns. The coefficients of the deformation patterns are solved by matching the corresponding 2D slice of the estimated VC-MRI with the acquired single 2D-cine MRI. The method was evaluated using both XCAT (a computerized patient model) simulation of lung cancer patients and MRI data from a real liver cancer patient. The 3D-MRI at every phase except end-expiration phase was used to simulate the ground-truth on-board VC-MRI at different instances, and the center-tumor slice was selected to simulate the on-board 2D-cine images. Results: Image subtraction of ground truth with estimated on-board VC-MRI shows fewer differences than image subtraction of ground truth with prior image. Excellent agreement between profiles was achieved. The normalized cross correlation coefficients between the estimated and ground-truth in the axial, coronal and sagittal views for each time step were >= 0.982, 0.905, 0.961 for XCAT data and >= 0.998, 0.911, 0.9541 for patient data. For XCAT data, the maximum-Volume-Percent-Difference between ground-truth and estimated tumor volumes was 1.6% and the maximum-Center-of-Mass-Shift was 0.9 mm. Conclusion: Preliminary studies demonstrated the feasibility to estimate real-time VC-MRI for on-board

  6. Spaceflight of HUVEC: An Integrated eXperiment- SPHINX Onboard the ISS

    Science.gov (United States)

    Versari, S.; Maier, J. A. M.; Norfini, A.; Zolesi, V.; Bradamante, S.

    2013-02-01

    The orthostatic challenge of spaceflight can promote inadequate cardiovascular responses in astronauts, defined as cardiovascular deconditioning. In particular, disturbances of endothelial function are known to lead to altered vascular performance, since endothelial cells are crucial to maintaining the functional integrity of the vascular wall. In order to evaluate whether weightlessness affects endothelial functions, we designed, developed, and performed the experiment SPHINX - SPaceflight of HUVEC: an INtegrated eXperiment - in which HUVEC (Human Umbilical Vein Endothelial Cells) were selected as a macrovascular cell model system. SPHINX arrived at the International Space Station (ISS) onboard Progress 40P and was processed inside the Kubik 6 incubator for 7 days. At the end, all of the samples were suitably fixed and preserved at 6°C until their return to Earth on Soyuz 23S.

  7. Characterization and selection of CZT detector modules for HEX experiment onboard Chandrayaan-1

    International Nuclear Information System (INIS)

    Vadawale, S.V.; Purohit, S.; Shanmugam, M.; Acharya, Y.B.; Goswami, J.N.; Sudhakar, M.; Sreekumar, P.

    2009-01-01

    We present the results of characterization of a large sample of Cadmium Zinc Telluride (CZT) detector modules planned to be used for the HEX (High Energy X-ray spectrometer) experiment onboard India's first mission to the Moon, Chandrayaan-1. We procured forty modules from Orbotech Medical Solutions Ltd. and carried out a detailed characterization of each module at various temperatures and selected final nine detector modules for the flight model of HEX. Here we present the results of the characterization of all modules and the selection procedure for the HEX flight detector modules. These modules show 5-6% energy resolution (at 122 keV, for best 90% of pixels) at room temperature which is improved to ∼4% when these modules are cooled to sub-0 deg. C temperature. The gain and energy resolution were stable during the long duration tests.

  8. [Construction and application of an onboard absorption analyzer device for CDOM].

    Science.gov (United States)

    Lin, Jun-Fang; Sun, Zhao-Hua; Cao, Wen-Xi; Hu, Shui-Bo; Xu, Zhan-Tang

    2013-04-01

    Colored dissolved organic matter (CDOM) plays an important role in marine ecosystems. In order to address the current problems in the measurement of CDOM absorption, an automated onboard analyzer based on liquid core waveguides (Teflon AF LWCC/LCW) was constructed. This analyzer has notable characteristics including an adjustable optical pathlength, a wide measurement range, and high sensitivity. The filtration and injection module implements automated filtration, sample injection, and LWCC cleaning. The LabVIEW software platform efficiently controls the running state of the analyzer and acquires real-time data including light absorption spectra, GPS data, and CTW data. Comparison experiments and shipboard measurements showed the analyzer to be reliable and robust.
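
    For reference, converting a liquid-core-waveguide transmission measurement into a CDOM absorption coefficient is a one-line calculation once the optical pathlength is known; the sketch below shows the standard Napierian form. Corrections for pure water, scattering, and temperature/salinity effects are omitted, and the example values are made up.

        import numpy as np

        def cdom_absorption(intensity_sample, intensity_reference, pathlength_m):
            # Napierian absorption coefficient a_CDOM(lambda) [1/m] from LWCC intensities:
            #   a(lambda) = (2.303 / L) * log10(I_ref / I_sample)
            # intensity_* are spectra (arrays over wavelength) or scalars;
            # pathlength_m is the liquid-core waveguide optical pathlength in metres.
            absorbance = np.log10(np.asarray(intensity_reference) /
                                  np.asarray(intensity_sample))
            return 2.303 * absorbance / pathlength_m

        # A long (e.g. 2 m) waveguide gives the high sensitivity needed for weak CDOM signals:
        a440 = cdom_absorption(intensity_sample=0.95, intensity_reference=1.00, pathlength_m=2.0)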

  9. Design of on-board Bluetooth wireless network system based on fault-tolerant technology

    Science.gov (United States)

    You, Zheng; Zhang, Xiangqi; Yu, Shijie; Tian, Hexiang

    2007-11-01

    In this paper, Bluetooth wireless data transmission technology is applied to an on-board computer system to realize wireless data transmission between the peripherals of the micro-satellite integrated electronic system. In view of the high reliability demanded of a micro-satellite, a design of the Bluetooth wireless network based on fault-tolerant technology is introduced. The reliability of two fault-tolerant systems is first estimated using a Markov model, and the structural design of the fault-tolerant system is then introduced; several protocols are established to make the system operate correctly, and some related problems are listed and analyzed, with emphasis on the fault auto-diagnosis system, the active-standby switch design, and the data-integrity process.
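
    As an illustration of the kind of Markov reliability estimate mentioned above, the sketch below models a single active/standby pair with an imperfect switch-over (coverage) and integrates the state equations with a matrix exponential. The failure rate and coverage values are assumed, not taken from the paper.

        import numpy as np
        from scipy.linalg import expm

        def standby_pair_reliability(t_hours, lam=1e-4, coverage=0.98):
            # Reliability of an active/standby pair from a 3-state Markov model.
            # States: 0 = active unit running with healthy standby,
            #         1 = single unit running (after a covered switch-over),
            #         2 = system failed (absorbing).
            # lam      : failure rate of the running unit [1/h] (assumed value)
            # coverage : probability the fault is detected and the switch-over succeeds
            # The dormant standby is assumed not to fail while idle.
            Q = np.array([
                [-lam,  coverage * lam, (1 - coverage) * lam],
                [0.0,  -lam,            lam                 ],
                [0.0,   0.0,            0.0                 ],
            ])
            p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t_hours)
            return p[0] + p[1]            # probability of not being in the failed state

        print(standby_pair_reliability(8760.0))   # one year of continuous operation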

  10. Grid-Enabled Measures

    Science.gov (United States)

    Moser, Richard P.; Hesse, Bradford W.; Shaikh, Abdul R.; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-01-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment, a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute with two overarching goals: (1) Promote the use of standardized measures, which are tied to theoretically based constructs; and (2) Facilitate the ability to share harmonized data resulting from the use of standardized measures. This is done by creating an online venue connected to the Cancer Biomedical Informatics Grid (caBIG®) where a virtual community of researchers can collaborate and come to consensus on measures by rating, commenting and viewing meta-data about the measures and associated constructs. This paper will describe the web 2.0 principles on which the GEM database is based, describe its functionality, and discuss some of the important issues involved with creating the GEM database, such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories) for data sharing. PMID:21521586

  11. Enabling distributed petascale science

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Bharathi, Shishir; Bresnahan, John

    2007-01-01

    Petascale science is an end-to-end endeavour, involving not only the creation of massive datasets at supercomputers or experimental facilities, but the subsequent analysis of that data by a user community that may be distributed across many laboratories and universities. The new SciDAC Center for Enabling Distributed Petascale Science (CEDPS) is developing tools to support this end-to-end process. These tools include data placement services for the reliable, high-performance, secure, and policy-driven placement of data within a distributed science environment; tools and techniques for the construction, operation, and provisioning of scalable science services; and tools for the detection and diagnosis of failures in end-to-end data placement and distributed application hosting configurations. In each area, we build on a strong base of existing technology and have made useful progress in the first year of the project. For example, we have recently achieved order-of-magnitude improvements in transfer times (for lots of small files) and implemented asynchronous data staging capabilities; demonstrated dynamic deployment of complex application stacks for the STAR experiment; and designed and deployed end-to-end troubleshooting services. We look forward to working with SciDAC application and technology projects to realize the promise of petascale science

  12. Displays enabling mobile multimedia

    Science.gov (United States)

    Kimmel, Jyrki

    2007-02-01

    With the rapid advances in telecommunications networks, mobile multimedia delivery to handsets is now a reality. While a truly immersive multimedia experience is still far ahead in the mobile world, significant advances have been made in the constituent audio-visual technologies to make this become possible. One of the critical components in multimedia delivery is the mobile handset display. While such alternatives as headset-style near-to-eye displays, autostereoscopic displays, mini-projectors, and roll-out flexible displays can deliver either a larger virtual screen size than the pocketable dimensions of the mobile device can offer, or an added degree of immersion by adding the illusion of the third dimension in the viewing experience, there are still challenges in the full deployment of such displays in real-life mobile communication terminals. Meanwhile, direct-view display technologies have developed steadily, and can provide a development platform for an even better viewing experience for multimedia in the near future. The paper presents an overview of the mobile display technology space with an emphasis on the advances and potential in developing direct-view displays further to meet the goal of enabling multimedia in the mobile domain.

  13. Hydrogen production by onboard gasoline processing – Process simulation and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Bisaria, Vega; Smith, R.J. Byron

    2013-12-15

    Highlights: • Process flow sheet for an onboard fuel processor for 100 kW fuel cell output was simulated. • Gasoline fuel requirement was found to be 30.55 kg/hr. • The fuel processor efficiency was found to be 95.98%. • A heat-integrated optimum flow sheet was developed. - Abstract: Fuel cell vehicles have reached the commercialization stage and hybrid vehicles are already on the road. While hydrogen storage and infrastructure remain critical issues in stand-alone commercialization of the technology, researchers are developing onboard fuel processors, which can convert a variety of fuels into hydrogen to power these fuel cell vehicles. The feasibility study of a 100 kW on board fuel processor based on gasoline fuel is carried out using process simulation. The steady state model has been developed with the help of Aspen HYSYS to analyze the fuel processor and total system performance. The components of the fuel processor are the fuel reforming unit, CO clean-up unit and auxiliary units. Optimization studies were carried out by analyzing the influence of various operating parameters such as oxygen to carbon ratio, steam to carbon ratio, temperature and pressure on the process equipments. From the steady state model optimization using Aspen HYSYS, an optimized reaction composition in terms of hydrogen production and carbon monoxide concentration corresponds to: oxygen to carbon ratio of 0.5 and steam to carbon ratio of 0.5. The fuel processor efficiency of 95.98% is obtained under these optimized conditions. The heat integration of the system was studied using the composite curve, grand composite curve and utility composite curve. The most appropriate heat exchanger network from the generated ones was chosen and incorporated into the optimized flow sheet of the 100 kW fuel processor. A completely heat integrated 100 kW fuel processor flow sheet using gasoline as fuel was thus successfully simulated and optimized.
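
    The reported fuel-processor efficiency is essentially the ratio of hydrogen energy out to gasoline energy in on a lower-heating-value basis. The sketch below shows that definition with typical textbook heating values (not values from the paper) and a purely illustrative back-calculation of the hydrogen yield implied by the reported figures.

        def fuel_processor_efficiency(m_h2_kg_h, m_fuel_kg_h,
                                      lhv_h2_mj_kg=120.0, lhv_gasoline_mj_kg=43.5):
            # Fuel-processor efficiency = H2 energy out / gasoline energy in (LHV basis).
            # The heating values are typical textbook numbers, not taken from the paper.
            return (m_h2_kg_h * lhv_h2_mj_kg) / (m_fuel_kg_h * lhv_gasoline_mj_kg)

        # Illustrative back-calculation: the hydrogen yield that would correspond to the
        # reported ~96% efficiency at the reported 30.55 kg/h gasoline feed.
        m_h2 = 0.9598 * 30.55 * 43.5 / 120.0      # roughly 10.6 kg/h (illustrative only)
        print(m_h2, fuel_processor_efficiency(m_h2, 30.55))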

  14. Hydrogen production by onboard gasoline processing – Process simulation and optimization

    International Nuclear Information System (INIS)

    Bisaria, Vega; Smith, R.J. Byron

    2013-01-01

    Highlights: • Process flow sheet for an onboard fuel processor for 100 kW fuel cell output was simulated. • Gasoline fuel requirement was found to be 30.55 kg/hr. • The fuel processor efficiency was found to be 95.98%. • A heat-integrated optimum flow sheet was developed. - Abstract: Fuel cell vehicles have reached the commercialization stage and hybrid vehicles are already on the road. While hydrogen storage and infrastructure remain critical issues in stand-alone commercialization of the technology, researchers are developing onboard fuel processors, which can convert a variety of fuels into hydrogen to power these fuel cell vehicles. The feasibility study of a 100 kW on board fuel processor based on gasoline fuel is carried out using process simulation. The steady state model has been developed with the help of Aspen HYSYS to analyze the fuel processor and total system performance. The components of the fuel processor are the fuel reforming unit, CO clean-up unit and auxiliary units. Optimization studies were carried out by analyzing the influence of various operating parameters such as oxygen to carbon ratio, steam to carbon ratio, temperature and pressure on the process equipments. From the steady state model optimization using Aspen HYSYS, an optimized reaction composition in terms of hydrogen production and carbon monoxide concentration corresponds to: oxygen to carbon ratio of 0.5 and steam to carbon ratio of 0.5. The fuel processor efficiency of 95.98% is obtained under these optimized conditions. The heat integration of the system was studied using the composite curve, grand composite curve and utility composite curve. The most appropriate heat exchanger network from the generated ones was chosen and incorporated into the optimized flow sheet of the 100 kW fuel processor. A completely heat integrated 100 kW fuel processor flow sheet using gasoline as fuel was thus successfully simulated and optimized.

  15. 49 CFR 395.15 - Automatic on-board recording devices.

    Science.gov (United States)

    2010-10-01

    ... information concerning on-board system sensor failures and identification of edited data. Such support systems... driving today; (iv) Total hours on duty for the 7 consecutive day period, including today; (v) Total hours...-driver operation; (7) The on-board recording device/system identifies sensor failures and edited data...

  16. An overview of CAFE credits and incorporation of the benefits of on-board carbon capture.

    Science.gov (United States)

    2014-05-01

    This report discusses the application of Corporate Average Fuel Economy (CAFE) credits that are currently available to vehicle manufacturers in the U.S., and the implications of on-board carbon capture and sequestration (on-board CCS) on fu...

  17. Enabling cleanup technology transfer

    International Nuclear Information System (INIS)

    Ditmars, J. D.

    2002-01-01

    Technology transfer in the environmental restoration, or cleanup, area has been challenging. While there is little doubt that innovative technologies are needed to reduce the times, risks, and costs associated with the cleanup of federal sites, particularly those of the Departments of Energy (DOE) and Defense, the use of such technologies in actual cleanups has been relatively limited. There are, of course, many reasons why technologies do not reach the implementation phase or do not get transferred from developing entities to the user community. For example, many past cleanup contracts provided few incentives for performance that would compel a contractor to seek improvement via technology applications. While performance-based contracts are becoming more common, they alone will not drive increased technology applications. This paper focuses on some applications of cleanup methodologies and technologies that have been successful and are illustrative of a more general principle. The principle is at once obvious and not widely practiced. It is that, with few exceptions, innovative cleanup technologies are rarely implemented successfully alone but rather are implemented in the context of enabling processes and methodologies. And, since cleanup is conducted in a regulatory environment, the stage is better set for technology transfer when the context includes substantive interactions with the relevant stakeholders. Examples of this principle are drawn from Argonne National Laboratory's experiences in Adaptive Sampling and Analysis Programs (ASAPs), Precise Excavation, and the DOE Technology Connection (TechCon) Program. The lessons learned may be applicable to the continuing challenges posed by the cleanup and long-term stewardship of radioactive contaminants and unexploded ordnance (UXO) at federal sites

  18. Comparison of onboard low-field magnetic resonance imaging versus onboard computed tomography for anatomy visualization in radiotherapy.

    Science.gov (United States)

    Noel, Camille E; Parikh, Parag J; Spencer, Christopher R; Green, Olga L; Hu, Yanle; Mutic, Sasa; Olsen, Jeffrey R

    2015-01-01

    Onboard magnetic resonance imaging (OB-MRI) for daily localization and adaptive radiotherapy has been under development by several groups. However, no clinical studies have evaluated whether OB-MRI improves visualization of the target and organs at risk (OARs) compared to standard onboard computed tomography (OB-CT). This study compared visualization of patient anatomy on images acquired on the MRI-(60)Co ViewRay system to those acquired with OB-CT. Fourteen patients enrolled on a protocol approved by the Institutional Review Board (IRB) and undergoing image-guided radiotherapy for cancer in the thorax (n = 2), pelvis (n = 6), abdomen (n = 3) or head and neck (n = 3) were imaged with OB-MRI and OB-CT. For each of the 14 patients, the OB-MRI and OB-CT datasets were displayed side-by-side and independently reviewed by three radiation oncologists. Each physician was asked to evaluate which dataset offered better visualization of the target and OARs. A quantitative contouring study was performed on two abdominal patients to assess if OB-MRI could offer improved inter-observer segmentation agreement for adaptive planning. In total 221 OARs and 10 targets were compared for visualization on OB-MRI and OB-CT by each of the three physicians. The majority of physicians (two or more) evaluated visualization on MRI as better for 71% of structures, worse for 10% of structures, and equivalent for 14% of structures. 5% of structures were not visible on either. Physicians agreed unanimously for 74% and in majority for > 99% of structures. Targets were better visualized on MRI in 4/10 cases, and never on OB-CT. Low-field MR provides better anatomic visualization of many radiotherapy targets and most OARs as compared to OB-CT. Further studies with OB-MRI should be pursued.

  19. Multi-gigabit optical interconnects for next-generation on-board digital equipment

    Science.gov (United States)

    Venet, Norbert; Favaro, Henri; Sotom, Michel; Maignan, Michel; Berthon, Jacques

    2017-11-01

    Parallel optical interconnects are experimentally assessed as a technology that may offer the high-throughput data communication capabilities required by next-generation on-board digital processing units. An optical backplane interconnect was breadboarded on the basis of a digital transparent processor that provides flexible connectivity and variable bandwidth in telecom missions with multi-beam antenna coverage. The unit selected for the demonstration required the backplane to support more than tens of Gbit/s. The demonstration made use of commercial parallel optical link modules at 850 nm wavelength, with 12 channels running at up to 2.5 Gbit/s. A flexible optical fibre circuit was developed to route board-to-board connections; it was plugged into the optical transmitter and receiver modules through 12-fibre MPO connectors. A BER below 10^-14 and optical link budgets in excess of 12 dB were measured, which would allow broadcasting to be integrated. Integration of the optical backplane interconnect was successfully demonstrated by validating the overall digital processor functionality.

  20. Database Organisation in a Web-Enabled Free and Open-Source Software (foss) Environment for Spatio-Temporal Landslide Modelling

    Science.gov (United States)

    Das, I.; Oberai, K.; Sarathi Roy, P.

    2012-07-01

    Landslides exhibit themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the earth surface. Making the landslide database available online via the WWW (World Wide Web) promotes the spread of landslide information to all the stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historic records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free & Open Source Software (FOSS). FOSS reduces the cost of the project drastically, as proprietary software is very costly. Landslide data generated for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets along with the landslide susceptibility map were served through a WebGIS client interface. The open-source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front end, and PostgreSQL with the PostGIS extension serves as the back end for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings the understanding of the landslides and the resulting damage closer to the affected people and the user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open-source or proprietary GIS software.
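
    Because the susceptibility layer is exposed as an OGC WFS, any client can pull features with a standard GetFeature request. The sketch below builds such a request with the requests library; the endpoint URL, layer name, and bounding box are hypothetical placeholders, not the project's actual service.

        import requests

        # Hypothetical MapServer WFS endpoint and layer name -- substitute the actual ones.
        WFS_URL = "http://example.org/cgi-bin/mapserv?map=landslide.map"

        params = {
            "SERVICE": "WFS",
            "VERSION": "1.1.0",
            "REQUEST": "GetFeature",
            "TYPENAME": "landslide_susceptibility",   # assumed layer name
            "OUTPUTFORMAT": "GML2",
            "BBOX": "78.0,30.3,78.2,30.5",            # assumed lon/lat window along the corridor
        }
        resp = requests.get(WFS_URL, params=params, timeout=60)
        resp.raise_for_status()
        with open("susceptibility.gml", "wb") as f:
            f.write(resp.content)                     # features in GML for any OGC client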

  1. FOILFEST :community enabled security.

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Judy Hennessey; Johnson, Curtis Martin; Whitley, John B.; Drayer, Darryl Donald; Cummings, John C., Jr.

    2005-09-01

    The Advanced Concepts Group of Sandia National Laboratories hosted a workshop, "FOILFest: Community Enabled Security", on July 18-21, 2005, in Albuquerque, NM. This was a far-reaching look into the future of physical protection consisting of a series of structured brainstorming sessions focused on preventing and foiling attacks on public places and soft targets such as airports, shopping malls, hotels, and public events. These facilities are difficult to protect using traditional security devices since they could easily be pushed out of business through the addition of arduous and expensive security measures. The idea behind this Fest was to explore how the public, which is vital to the function of these institutions, can be leveraged as part of a physical protection system. The workshop considered procedures, space design, and approaches for building community through technology. The workshop explored ways to make the "good guys" in public places feel safe and be vigilant while making potential perpetrators of harm feel exposed and convinced that they will not succeed. Participants in the Fest included operators of public places, social scientists, technology experts, representatives of government agencies including DHS and the intelligence community, writers and media experts. Many innovative ideas were explored during the fest with most of the time spent on airports, including consideration of the local airport, the Albuquerque Sunport. Some provocative ideas included: (1) sniffers installed in passage areas like revolving door, escalators, (2) a "jumbotron" showing current camera shots in the public space, (3) transparent portal screeners allowing viewing of the screening, (4) a layered open/funnel/open/funnel design where open spaces are used to encourage a sense of "communitas" and take advantage of citizen "sensing" and funnels are technological

  2. Combining Hydrological Modeling and Remote Sensing Observations to Enable Data-Driven Decision Making for Devils Lake Flood Mitigation in a Changing Climate

    Science.gov (United States)

    Zhang, Xiaodong; Kirilenko, Andrei; Lim, Howe; Teng, Williams

    2010-01-01

    This slide presentation reviews work combining hydrological models and remote sensing observations to monitor Devils Lake in North Dakota, to assist in flood damage mitigation. It reports on the use of a distributed rainfall-runoff model, HEC-HMS, to simulate the hydro-dynamics of the lake watershed, with NASA remote sensing data, including the TRMM Multi-Satellite Precipitation Analysis (TMPA) and AIRS surface air temperature, used to drive the model.

  3. Optimization of an on-board imaging system for extremely rapid radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Cherry Kemmerling, Erica M.; Wu, Meng, E-mail: mengwu@stanford.edu; Yang, He; Fahrig, Rebecca [Department of Radiology, Stanford University, Stanford, California 94305 (United States); Maxim, Peter G.; Loo, Billy W. [Department of Radiation Oncology, Stanford University, Stanford, California 94305 and Stanford Cancer Institute, Stanford University School of Medicine, Stanford, California 94305 (United States)

    2015-11-15

    Purpose: Next-generation extremely rapid radiation therapy systems could mitigate the need for motion management, improve patient comfort during the treatment, and increase patient throughput for cost effectiveness. Such systems require an on-board imaging system that is competitively priced, fast, and of sufficiently high quality to allow good registration between the image taken on the day of treatment and the image taken the day of treatment planning. In this study, three different detectors for a custom on-board CT system were investigated to select the best design for integration with an extremely rapid radiation therapy system. Methods: Three different CT detectors are proposed: low-resolution (all 4 × 4 mm pixels), medium-resolution (a combination of 4 × 4 mm pixels and 2 × 2 mm pixels), and high-resolution (all 1 × 1 mm pixels). An in-house program was used to generate projection images of a numerical anthropomorphic phantom and to reconstruct the projections into CT datasets, henceforth called “realistic” images. Scatter was calculated using a separate Monte Carlo simulation, and the model included an antiscatter grid and bowtie filter. Diagnostic-quality images of the phantom were generated to represent the patient scan at the time of treatment planning. Commercial deformable registration software was used to register the diagnostic-quality scan to images produced by the various on-board detector configurations. The deformation fields were compared against a “gold standard” deformation field generated by registering initial and deformed images of the numerical phantoms that were used to make the diagnostic and treatment-day images. Registrations of on-board imaging system data were judged by the amount their deformation fields differed from the corresponding gold standard deformation fields—the smaller the difference, the better the system. To evaluate the registrations, the pointwise distance between gold standard and realistic registration
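
    The evaluation metric described above reduces to a pointwise comparison of two deformation vector fields against the gold standard. A small sketch of that computation is given below; the array layout and voxel spacing are assumptions made for illustration.

        import numpy as np

        def registration_error(dvf_test, dvf_gold, voxel_spacing=(1.0, 1.0, 1.0)):
            # Pointwise Euclidean distance between two deformation vector fields.
            # dvf_* are assumed to have shape (nx, ny, nz, 3) in voxel units;
            # voxel_spacing converts the components to mm.
            # Returns the mean and maximum distance, the kind of summary used to rank
            # detector configurations against the gold-standard registration.
            diff = (np.asarray(dvf_test) - np.asarray(dvf_gold)) * np.asarray(voxel_spacing)
            dist = np.sqrt((diff ** 2).sum(axis=-1))
            return float(dist.mean()), float(dist.max())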

  4. Optimization of an on-board imaging system for extremely rapid radiation therapy

    International Nuclear Information System (INIS)

    Cherry Kemmerling, Erica M.; Wu, Meng; Yang, He; Fahrig, Rebecca; Maxim, Peter G.; Loo, Billy W.

    2015-01-01

    Purpose: Next-generation extremely rapid radiation therapy systems could mitigate the need for motion management, improve patient comfort during the treatment, and increase patient throughput for cost effectiveness. Such systems require an on-board imaging system that is competitively priced, fast, and of sufficiently high quality to allow good registration between the image taken on the day of treatment and the image taken the day of treatment planning. In this study, three different detectors for a custom on-board CT system were investigated to select the best design for integration with an extremely rapid radiation therapy system. Methods: Three different CT detectors are proposed: low-resolution (all 4 × 4 mm pixels), medium-resolution (a combination of 4 × 4 mm pixels and 2 × 2 mm pixels), and high-resolution (all 1 × 1 mm pixels). An in-house program was used to generate projection images of a numerical anthropomorphic phantom and to reconstruct the projections into CT datasets, henceforth called “realistic” images. Scatter was calculated using a separate Monte Carlo simulation, and the model included an antiscatter grid and bowtie filter. Diagnostic-quality images of the phantom were generated to represent the patient scan at the time of treatment planning. Commercial deformable registration software was used to register the diagnostic-quality scan to images produced by the various on-board detector configurations. The deformation fields were compared against a “gold standard” deformation field generated by registering initial and deformed images of the numerical phantoms that were used to make the diagnostic and treatment-day images. Registrations of on-board imaging system data were judged by the amount their deformation fields differed from the corresponding gold standard deformation fields—the smaller the difference, the better the system. To evaluate the registrations, the pointwise distance between gold standard and realistic registration

  5. Hypersonic wind-tunnel free-flying experiments with onboard instrumentation

    KAUST Repository

    Mudford, Neil R.; O'Byrne, Sean B.; Neely, Andrew J.; Buttsworth, David R.; Balage, Sudantha

    2015-01-01

    Hypersonic wind-tunnel testing with "free-flight" models unconnected to a sting ensures that sting/wake flow interactions do not compromise aerodynamic coefficient measurements. The development of miniaturized electronics has allowed the demonstration of a variant of a new method for the acquisition of hypersonic model motion data using onboard accelerometers, gyroscopes, and a microcontroller. This method is demonstrated in a Mach 6 wind-tunnel flow, whose duration and pitot pressure are sufficient for the model to move a body length or more and turn through a significant angle. The results are compared with those obtained from video analysis of the model motion, the existing method favored for obtaining aerodynamic coefficients in similar hypersonic wind-tunnel facilities. The results from the two methods are in good agreement. The new method shows considerable promise for reliable measurement of aerodynamic coefficients, particularly because the data obtained are in more directly applicable forms of accelerations and rates of turn, rather than the model position and attitude obtained from the earlier visualization method. The ideal may be to have both methods operating together.

  6. Neon dewar for the X-ray spectrometer onboard Suzaku

    Energy Technology Data Exchange (ETDEWEB)

    Fujimoto, R. [Institute of Space and Astronautical Science (ISAS), JAXA, 3-1-1 Yoshinodai, Sagamihara 229-8510 (Japan)]. E-mail: fujimoto@isas.jaxa.jp; Mitsuda, K. [Institute of Space and Astronautical Science (ISAS), JAXA, 3-1-1 Yoshinodai, Sagamihara 229-8510 (Japan); Hirabayashi, M. [Sumitomo Heavy Industries, Ltd. (SHI), 5-2 Sobiraki-cho, Niihama 792-8588 (Japan); Narasaki, K. [Sumitomo Heavy Industries, Ltd. (SHI), 5-2 Sobiraki-cho, Niihama 792-8588 (Japan); Breon, S. [NASA Goddard Space Flight Center (GSFC), Greenbelt, MD 20771 (United States); Boyle, R. [NASA Goddard Space Flight Center (GSFC), Greenbelt, MD 20771 (United States); Di Pirro, M. [NASA Goddard Space Flight Center (GSFC), Greenbelt, MD 20771 (United States); Volz, S.M. [NASA Headquarters, Washington, DC 20546-0001 (United States); Kelley, R.L. [NASA Goddard Space Flight Center (GSFC), Greenbelt, MD 20771 (United States)

    2006-04-15

    The X-ray spectrometer (XRS) onboard Suzaku is the first X-ray microcalorimeter array in orbit. The sensor array is operated at 60 mK, which is attained by an adiabatic demagnetization refrigerator and superfluid liquid helium. The neon dewar is a vacuum-insulated container for the XRS. The requirements for the XRS dewar are to maintain the detector and the cryogenic system under the mechanical environment at launch (~15 G), and to attain a lifetime of 3 years in a near-earth orbit. It is characterized by the adoption of solid neon as the second cryogen and a mechanical cooler, by design optimization of the support straps for the neon tank to reduce the heat load as much as possible, and by shock absorbers to mitigate the mechanical environment at launch. Microphonics from the mechanical cooler were one of the concerns for detector performance, but the ground test results proved that they do not interfere with the detector. After about 1 month in orbit, the thermal performance showed that the dewar potentially achieves its design goals.

  7. Cosmic radiation dosimetry onboard aircrafts at the brazilian airspace

    International Nuclear Information System (INIS)

    Federico, Claudio Antonio

    2011-01-01

    The objective of this work is the establishment of a dosimetric system for aircrew in the domestic territory. A technique to perform measurements of ambient dose equivalent in aircraft was developed. An active detector was evaluated for onboard aircraft use, testing its adequacy for this specific type of measurement as well as its susceptibility to magnetic and electromagnetic interference. The equipment was calibrated in standard radiation beams and in a special field at the European laboratory CERN that closely reproduces the real spectrum at aircraft flight altitudes; it was also tested on several flights in a Brazilian Air Force aircraft. The results were evaluated and compared with those obtained from several computational programs for cosmic radiation estimation, with respect to their adequacy for use in the South American region. The program CARI-6 was selected to evaluate the average effective doses for the aircrew who operate in this region. A statistical distribution of aircrew effective doses in South America and the Caribbean was made, and the results show that a large fraction of these aircrew members are subjected to annual effective doses that exceed the dose limits for members of the public. Additionally, a preliminary passive dosemeter based on thermoluminescent detectors was proposed; international collaborations with the United Kingdom and Italy were established for joint measurements of ambient dose equivalent in aircraft. (author)

  8. Design of an onboard battery charger for an electric vehicle

    Energy Technology Data Exchange (ETDEWEB)

    Heckford, Simon

    2001-07-01

    This report describes the design of an on-board battery charger for an electric car. There are already various battery charger units on the market; however, these are not specifically designed for this application and consequently do not provide an ideal solution. Because these products cover a variety of briefs rather than one application, they also tend to be heavier and more expensive than a charger built specifically for one purpose. The main design considerations were that the charger should be compact and lightweight. It was also specified that the design should be able to operate from either a single-phase or a three-phase AC supply. Before the design process for the battery charger could commence, it was necessary for the author to gain an appreciation of power electronics, since he had no previous experience in the subject. The author focused on the areas of the subject most valuable to the project, including becoming familiar with the principles behind battery chargers. Once the required knowledge was obtained, the author could begin designing the charger. The majority of the design was undertaken using the software packages MATLAB and Simulink, together with the knowledge acquired. Regular discussions were held with the project team to ensure that the correct methodology was being used, and a suitable design was duly developed. Possible further work was identified which could not be carried out within the time constraints of this project.

  9. Research on lettuce growth technology onboard Chinese Tiangong II Spacelab

    Science.gov (United States)

    Shen, Yunze; Guo, Shuangsheng; Zhao, Pisheng; Wang, Longji; Wang, Xiaoxia; Li, Jian; Bian, Qiang

    2018-03-01

    Lettuce was grown in a space vegetable cultivation facility onboard the Tiangong II Spacelab from October 18 to November 15, 2016, in order to verify key cultivation technologies for CELSS under spaceflight microgravity conditions. Potable water was used for irrigation of the rooting substrate, and slow-release fertilizer (SRF) provided mineral nutrition for plant growth. The water content and electrical conductivity in the rooting substrate were measured based on the frequency domain reflectometry (FDR) principle, applied in spaceflight for the first time. The lettuce germinated with growth vigor comparable to the ground control, showing that the plants did not appear to be stressed by the spaceflight environment. Under microgravity, the lettuce grew taller and showed a deeper green color than the ground control; in addition, the phototropism of the on-orbit plants was more pronounced. The nearly 30-day spaceflight test verified the seed fixation technology and the water and nutrition management technology, and demonstrated the feasibility of using FDR to measure moisture content and electrical conductivity in the rooting zone under microgravity. Furthermore, the edibility of the space-grown vegetable was proved, providing support for astronauts to consume space-grown vegetables in future manned spaceflights.

  10. Autonomous Onboard Science Data Analysis for Comet Missions

    Science.gov (United States)

    Thompson, David R.; Tran, Daniel Q.; McLaren, David; Chien, Steve A.; Bergman, Larry; Castano, Rebecca; Doyle, Richard; Estlin, Tara; Lenda, Matthew

    2012-01-01

    Coming years will bring several comet rendezvous missions. The Rosetta spacecraft arrives at Comet 67P/Churyumov-Gerasimenko in 2014. Subsequent rendezvous might include a mission such as the proposed Comet Hopper with multiple surface landings, as well as Comet Nucleus Sample Return (CNSR) and Coma Rendezvous and Sample Return (CRSR). These encounters will begin to shed light on a population that, despite several previous flybys, remains mysterious and poorly understood. Scientists still have little direct knowledge of interactions between the nucleus and coma, their variation across different comets, or their evolution over time. Activity may change on short timescales, so it is challenging to characterize with scripted data acquisition. Here we investigate automatic onboard image analysis that could act faster than round-trip light time to capture unexpected outbursts and plume activity. We describe one edge-based method for detecting comet nuclei and plumes, and test the approach on an existing catalog of comet images. Finally, we quantify benefits to specific measurement objectives by simulating a basic plume monitoring campaign.
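
    The abstract names an edge-based method without giving details, so the following is only a generic sketch of edge detection applied to a synthetic comet frame; the toy image, helper function and threshold are assumptions, not the authors' onboard algorithm.

```python
import numpy as np
from scipy import ndimage

def detect_bright_edges(image: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Boolean mask of strong intensity edges (candidate nucleus limb / plume boundaries)."""
    img = image.astype(float)
    img = (img - img.min()) / (np.ptp(img) + 1e-12)   # normalise to [0, 1]
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    grad = np.hypot(gx, gy)                           # gradient magnitude
    return grad > threshold * grad.max()

# Toy frame: a bright disc (nucleus) with a faint streak (plume) on a dark sky.
yy, xx = np.mgrid[0:128, 0:128]
frame = 1.0 * ((xx - 64) ** 2 + (yy - 64) ** 2 < 20 ** 2)
frame[20:64, 60:68] += 0.3
edges = detect_bright_edges(frame)
print("edge pixels flagged:", int(edges.sum()))
```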

  11. The hard x-ray imager onboard IXO

    Science.gov (United States)

    Nakazawa, Kazuhiro; Takahashi, Tadayuki; Limousin, Olivier; Kokubun, Motohide; Watanabe, Shin; Laurent, Philippe; Arnaud, Monique; Tajima, Hiroyasu

    2010-07-01

    The Hard X-ray Imager (HXI) is one of the instruments onboard the International X-ray Observatory (IXO), to be launched into orbit in the 2020s. It covers the energy band of 10-40 keV, providing imaging spectroscopy with a field of view of 8 x 8 arcmin². The HXI is attached beneath the Wide Field Imager (WFI), which covers 0.1-15 keV. Combined with the super-mirror coating on the mirror assembly, this configuration provides simultaneous observation of X-ray sources over a wide energy band (0.1-40.0 keV), which is especially important for varying sources. The HXI sensor consists of a semiconductor imaging spectrometer, using Si for the medium-energy detector and CdTe for the high-energy detector, and an active shield covering its back to reduce background in orbit. The HXI technology is based on that of the Japanese-led new-generation X-ray observatory ASTRO-H, and partly on that developed for Simbol-X; the technological development is therefore in good progress. In the IXO mission, the HXI will provide a major asset for identifying the nature of objects by penetrating thick absorbing material and determining the inherent spectral shape in the energy band well above the structures around the Fe-K lines and edges.

  12. Experiment in Onboard Synthetic Aperture Radar Data Processing

    Science.gov (United States)

    Holland, Matthew

    2011-01-01

    Single event upsets (SEUs) are a threat to any computing system running on hardware that has not been physically radiation hardened. In addition to mandating the use of performance-limited, hardened heritage equipment, prior techniques for dealing with the SEU problem often involved hardware-based error detection and correction (EDAC). With limited computing resources, software-based EDAC, or any more elaborate recovery methods, were often not feasible. Synthetic aperture radars (SARs), when operated in the space environment, are interesting due to their relevance to NASA's objectives, but problematic in the sense of producing prodigious amounts of raw data. Prior implementations of the SAR data processing algorithm have been too slow, too computationally intensive, and require too much application memory for onboard execution to be a realistic option when using the type of heritage processing technology described above. This standard C-language implementation of SAR data processing is distributed over many cores of a Tilera Multicore Processor, and employs novel Radiation Hardening by Software (RHBS) techniques designed to protect the component processes (one per core) and their shared application memory from the sort of SEUs expected in the space environment. The source code includes calls to Tilera APIs, and a specialized Tilera compiler is required to produce a Tilera executable. The compiled application reads input data describing the position and orientation of a radar platform, as well as its radar-burst data, over time and writes out processed data in a form that is useful for analysis of the radar observations.
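
    The record does not describe the algorithm itself, but the computational core of most SAR processors is FFT-based matched filtering. The hedged sketch below shows only the range-compression step on synthetic data and is unrelated to the Tilera/RHBS flight implementation.

```python
import numpy as np

def range_compress(raw: np.ndarray, chirp: np.ndarray) -> np.ndarray:
    """Correlate each range line (row) of `raw` with the reference chirp via FFTs."""
    n = raw.shape[1]
    chirp_f = np.conj(np.fft.fft(chirp, n))          # matched filter in the frequency domain
    return np.fft.ifft(np.fft.fft(raw, n, axis=1) * chirp_f, axis=1)

# Toy data: a single point-target echo embedded in noise, 64 range lines of 1024 samples.
pulse_len = 128
t = np.arange(pulse_len)
chirp = np.exp(1j * np.pi * 0.002 * t ** 2)          # linear-FM reference pulse
rng = np.random.default_rng(0)
raw = 0.01 * (rng.standard_normal((64, 1024)) + 1j * rng.standard_normal((64, 1024)))
raw[:, 300:300 + pulse_len] += chirp                 # echo starting at range bin 300
compressed = range_compress(raw, chirp)
print("peak range bin:", int(np.abs(compressed[0]).argmax()))   # expected: 300
```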

  13. On-board aircrew dosimetry using a semiconductor spectrometer

    CERN Document Server

    Spurny, F

    2002-01-01

    Radiation fields on board aircraft contain particles with energies up to a few hundred MeV. Many instruments have been tested to characterise these fields. This paper presents the results of studies on the use of an Si diode spectrometer to characterise these fields. The spectrometer has been in use since spring 2000 on more than 130 return flights to monitor and characterise the on-board field. During a Czech Airlines flight from Prague to New York it was possible to register the effects of an intense solar flare, (ground level event, GLE 60), which occurred on 15 April 2001. It was found that the number of deposition events registered was increased by about 70% and the dose in Si by a factor of 2.0 when compared with the presence of galactic cosmic rays alone. Directly measured data are interpreted with respect to on-earth reference field calibration (photons, CERN high-energy particles); it was found that this approach leads to encouraging results and should be followed up. (7 refs).
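
    The dose-in-Si figures quoted above follow from summing the energies deposited in the diode and dividing by the detector mass. A back-of-the-envelope sketch, with hypothetical detector dimensions rather than those of the actual instrument, is:

```python
# Absorbed dose in the Si diode from a list of energy-deposition events,
# D = (sum of deposited energy) / detector mass. Dimensions below are hypothetical.
MEV_TO_J = 1.602e-13   # 1 MeV in joules

def dose_in_si(deposits_mev, volume_cm3=0.03, density_g_cm3=2.33):
    """Absorbed dose in Si (Gy) for deposited energies given in MeV."""
    mass_kg = volume_cm3 * density_g_cm3 * 1e-3
    return sum(deposits_mev) * MEV_TO_J / mass_kg

events_mev = [0.3, 1.2, 0.8, 5.4, 0.1]        # example deposition-spectrum entries
print(f"D(Si) ~ {dose_in_si(events_mev):.3e} Gy")
```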

  14. Collaborative Project: The problem of bias in defining uncertainty in computationally enabled strategies for data-driven climate model development. Final Technical Report.

    Energy Technology Data Exchange (ETDEWEB)

    Huerta, Gabriel [Univ. of New Mexico, Albuquerque, NM (United States)

    2016-05-10

    The objective of the project is to develop strategies for better representing scientific sensibilities within statistical measures of model skill that then can be used within a Bayesian statistical framework for data-driven climate model development and improved measures of model scientific uncertainty. One of the thorny issues in model evaluation is quantifying the effect of biases on climate projections. While any bias is not desirable, only those biases that affect feedbacks affect scatter in climate projections. The effort at the University of Texas is to analyze previously calculated ensembles of CAM3.1 with perturbed parameters to discover how biases affect projections of global warming. The hypothesis is that compensating errors in the control model can be identified by their effect on a combination of processes and that developing metrics that are sensitive to dependencies among state variables would provide a way to select versions of climate models that may reduce scatter in climate projections. Gabriel Huerta at the University of New Mexico is responsible for developing statistical methods for evaluating these field dependencies. The UT effort will incorporate these developments into MECS, which is a set of Python scripts being developed at the University of Texas for managing the workflow associated with data-driven climate model development over HPC resources. This report reflects the main activities at the University of New Mexico, where the PI (Huerta) and the postdocs (Nosedal, Hattab and Karki) worked on the project.

  15. Advanced Physical Models and Numerical Algorithms to Enable High-Fidelity Aerothermodynamic Simulations of Planetary Entry Vehicles on Emerging Distributed Heterogeneous Computing Architectures

    Data.gov (United States)

    National Aeronautics and Space Administration — The design and qualification of entry systems for planetary exploration largely rely on computational simulations. However, state-of-the-art modeling capabilities...

  16. Double-plating of ovine critical sized defects of the tibia: a low morbidity model enabling continuous in vivo monitoring of bone healing

    Directory of Open Access Journals (Sweden)

    Pearce Alexandra

    2011-09-01

    Background: Recent studies using sheep critical-sized defect models to test tissue-engineered products report high morbidity and complication rates. This study evaluates a large bone defect model in the sheep tibia, stabilized with two plates, a novel carbon-fibre polyetheretherketone (CF-PEEK) plate and a locking compression plate (LCP), which could be sustained for up to 6 months with an acceptably low complication rate. Methods: A large bone defect of 3 cm was created in the mid-diaphysis of the right tibia in 33 sheep. The defect was stabilised with the CF-PEEK plate and an LCP. All sheep were supported with slings for 8 weeks after surgery. The study was carried out for 3 months in 6 animals and for 6 months in 27 animals. Results: The surgical procedure could easily be performed in all sheep, and continuous in vivo radiographic evaluation of the defect was possible. This long-bone critical-sized defect model shows, at 6.1%, a low complication rate compared with numbers reported in the literature. Conclusions: This experimental animal model could serve as a standard model in comparative research. A well-defined standard model would reduce the number of experimental animals needed in future studies and would therefore add to ethical considerations.

  17. Development of Multiorgan Finite Element-Based Prostate Deformation Model Enabling Registration of Endorectal Coil Magnetic Resonance Imaging for Radiotherapy Planning

    International Nuclear Information System (INIS)

    Hensel, Jennifer M.; Menard, Cynthia; Chung, Peter W.M.; Milosevic, Michael F.; Kirilova, Anna; Moseley, Joanne L.; Haider, Masoom A.; Brock, Kristy K.

    2007-01-01

    Purpose: Endorectal coil (ERC) magnetic resonance imaging (MRI) provides superior visualization of the prostate compared with computed tomography, at the expense of deformation. This study aimed to develop a multiorgan finite element deformable method, Morfeus, to accurately co-register these images for radiotherapy planning. Methods: Patients with prostate cancer underwent fiducial marker implantation and computed tomography simulation for radiotherapy planning. A series of axial MRI scans were acquired with and without an ERC. The prostate, bladder, rectum, and pubic bones were manually segmented and assigned linear elastic material properties. Morfeus mapped the surface of the bladder and rectum between two imaged states, calculating the deformation of the prostate through biomechanical properties. The accuracy of deformation was measured as fiducial marker error and residual surface deformation between the inferred and actual prostate. The deformation map was inverted to deform from the 100-cm³ coil to no coil. Results: The data from 19 patients were analyzed. Significant prostate deformation occurred with the ERC (mean intrapatient range, 0.88 ± 0.25 cm). The mean vector error in fiducial marker position (n = 57) was 0.22 ± 0.09 cm, and the mean vector residual surface deformation (n = 19) was 0.15 ± 0.06 cm for deformation from no coil to the 100-cm³ ERC, with an image vector resolution of 0.22 cm. Accurately deformed MRI scans improved soft-tissue resolution of the anatomy for radiotherapy planning. Conclusions: This method of multiorgan deformable registration enabled accurate co-registration of ERC-MRI scans with computed tomography treatment planning images. Superior structural detail was visible on ERC-MRI, which has potential for improving target delineation.

  18. The Large-scale Coronal Structure of the 2017 August 21 Great American Eclipse: An Assessment of Solar Surface Flux Transport Model Enabled Predictions and Observations

    Science.gov (United States)

    Nandy, Dibyendu; Bhowmik, Prantika; Yeates, Anthony R.; Panda, Suman; Tarafder, Rajashik; Dash, Soumyaranjan

    2018-01-01

    On 2017 August 21, a total solar eclipse swept across the contiguous United States, providing excellent opportunities for diagnostics of the Sun’s corona. The Sun’s coronal structure is notoriously difficult to observe except during solar eclipses; thus, theoretical models must be relied upon for inferring the underlying magnetic structure of the Sun’s outer atmosphere. These models are necessary for understanding the role of magnetic fields in the heating of the corona to a million degrees and the generation of severe space weather. Here we present a methodology for predicting the structure of the coronal field based on model forward runs of a solar surface flux transport model, whose predicted surface field is utilized to extrapolate future coronal magnetic field structures. This prescription was applied to the 2017 August 21 solar eclipse. A post-eclipse analysis shows good agreement between model-simulated and observed coronal structures and their locations on the limb. We demonstrate that slow changes in the Sun’s surface magnetic field distribution driven by long-term flux emergence and its evolution govern large-scale coronal structures with a (plausibly cycle-phase dependent) dynamical memory timescale on the order of a few solar rotations, opening up the possibility for large-scale, global corona predictions at least a month in advance.

  19. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    Science.gov (United States)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increasing model resolution in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation output that poses significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for the Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. At the

  20. AMO EXPRESS: A Command and Control Experiment for Crew Autonomy Onboard the International Space Station

    Science.gov (United States)

    Stetson, Howard K.; Haddock, Angie T.; Frank, Jeremy; Cornelius, Randy; Wang, Lui; Garner, Larry

    2015-01-01

    NASA is investigating a range of future human spaceflight missions, including both Mars-distance and Near Earth Object (NEO) targets. Of significant importance for these missions is the balance between crew autonomy and vehicle automation. As distance from Earth results in increasing communication delays, future crews need both the capability and authority to independently make decisions. However, small crews cannot take on all functions performed by ground today, and so vehicles must be more automated to reduce the crew workload for such missions. The Autonomous Mission Operations (AMO) project, funded by NASA's Advanced Exploration Systems Program, conducted an autonomous command and control experiment on board the International Space Station that demonstrated single-action intelligent procedures for crew command and control. The target problem was to enable crew initialization of a facility-class rack with power and thermal interfaces, involving core and payload command and telemetry processing, without support from ground controllers. This autonomous operations capability is enabling in scenarios such as initialization of a medical facility to respond to a crew medical emergency, and is representative of other spacecraft autonomy challenges. The experiment was conducted using the Expedite the Processing of Experiments for Space Station (EXPRESS) rack 7, which was located in the Port 2 location within the U.S. Laboratory onboard the International Space Station (ISS). Activation and deactivation of this facility is time consuming and operationally intensive, requiring coordination of three flight control positions, 47 nominal steps, 57 commands, 276 telemetry checks, and coordination of multiple ISS systems (both core and payload). Utilization of Draper Laboratory's Timeliner software, deployed on board the ISS within the Command and Control (C&C) computers and the Payload computers, allowed development of the automated procedures specific to ISS without having to certify

  1. Evaluation of genome-enabled selection for bacterial cold water disease resistance using progeny performance data in Rainbow Trout: Insights on genotyping methods and genomic prediction models

    Science.gov (United States)

    Bacterial cold water disease (BCWD) causes significant economic losses in salmonid aquaculture, and traditional family-based breeding programs aimed at improving BCWD resistance have been limited to exploiting only between-family variation. We used genomic selection (GS) models to predict genomic br...
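
    The record is truncated before the modelling details, so as a generic, hedged illustration of genomic prediction the sketch below fits a ridge-regression (rrBLUP-style) estimator of marker effects on simulated genotypes; it is not one of the specific GS models or genotyping methods evaluated in the study.

```python
import numpy as np

def rrblup_predict(X_train, y_train, X_test, lam=1.0):
    """Ridge solution for marker effects, then predicted genomic values for test animals."""
    p = X_train.shape[1]
    beta = np.linalg.solve(X_train.T @ X_train + lam * np.eye(p), X_train.T @ y_train)
    return X_test @ beta

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 500)).astype(float)    # SNP genotypes coded 0/1/2
true_effects = rng.normal(0, 0.05, size=500)
y = X @ true_effects + rng.normal(0, 1.0, size=200)       # simulated phenotypes
pred = rrblup_predict(X[:150], y[:150], X[150:], lam=50.0)
print("predictive correlation:", round(np.corrcoef(pred, y[150:])[0, 1], 2))
```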

  2. Collaborative Business Models for Exploration: - The Expansion of Public-Private Partnerships to Enable Exploration and Improve the Quality of Life on Earth

    Science.gov (United States)

    Davis, Jeffrey R.

    2012-01-01

    In May of 2007, The Space Life Sciences Strategy was published, launching a series of efforts aimed at driving human health and performance innovations that both meet space flight needs and benefit life on Earth. These efforts, led by the Space Life Science Directorate (SLSD) at the NASA Johnson Space Center, led to the development and implementation of the NASA Human Health and Performance Center (NHHPC) in October 2010. The NHHPC now has over 100 members including seven NASA centers; other federal agencies; some of the International Space Station partners; industry; academia and non-profits. The NHHPC seeks to share best practices, develop collaborative projects and experiment with open collaboration techniques such as crowdsourcing. Using this approach, the NHHPC collaborative projects are anticipated to be at the earliest possible stage of development utilizing the many possible public-private partnerships in this center. Two workshops have been successfully conducted in 2011 (January and October) with a third workshop planned for the spring of 2012. The challenges of space flight are similar in many respects to providing health care and environmental monitoring in challenging settings on the earth. These challenges to technology development include the need for low power consumption, low weight, in-situ analysis, operator independence (i.e., minimal training), robustness, and limited resupply or maintenance. When similar technology challenges are identified (such as the need to provide and monitor a safe water supply or develop a portable medical diagnostic device for remote use), opportunities arise for public-private partnerships to engage in co-creation of novel approaches for space exploration and health and environmental applications on earth. This approach can enable the use of shared resources to reduce costs, engage other organizations and the public in participatory exploration (solving real-world problems), and provide technologies with multiple uses

  3. The Invasive Species Forecasting System (ISFS): An iRODS-Based, Cloud-Enabled Decision Support System for Invasive Species Habitat Suitability Modeling

    Science.gov (United States)

    Gill, Roger; Schnase, John L.

    2012-01-01

    The Invasive Species Forecasting System (ISFS) is an online decision support system that allows users to load point occurrence field sample data for a plant species of interest and quickly generate habitat suitability maps for geographic regions of interest, such as a national park, monument, forest, or refuge. Target customers for ISFS are natural resource managers and decision makers who have a need for scientifically valid, model-based predictions of the habitat suitability of plant species of management concern. In a joint project involving NASA and the Maryland Department of Natural Resources, ISFS has been used to model the potential distribution of Wavyleaf Basketgrass in Maryland's Chesapeake Bay Watershed. Maximum entropy techniques are used to generate predictive maps using predictor datasets derived from remotely sensed data and climate simulation outputs. The workflow to run a model is implemented in an iRODS microservice using a custom ISFS file driver that clips and re-projects data to geographic regions of interest, then shells out to perform MaxEnt processing on the input data. When the model completes, all output files and maps from the model run are registered in iRODS and made accessible to the user. The ISFS user interface is a web browser that uses the iRODS PHP client to interact with the ISFS/iRODS server. ISFS is designed to reside in a VMware virtual machine running SLES 11 and iRODS 3.0. The ISFS virtual machine is hosted in a VMware vSphere private cloud infrastructure to deliver the online service.

  4. Observation sequences and onboard data processing of Planet-C

    Science.gov (United States)

    Suzuki, M.; Imamura, T.; Nakamura, M.; Ishi, N.; Ueno, M.; Hihara, H.; Abe, T.; Yamada, T.

    Planet-C, or VCO (Venus Climate Orbiter), will carry 5 cameras in the UV-IR region to investigate the atmospheric dynamics of Venus: IR1 (1-micrometer IR camera), IR2 (2-micrometer IR camera), UVI (UV Imager), LIR (long-IR camera) and LAC (Lightning and Airglow Camera). During the 30-hr orbit, designed to quasi-synchronize with the super-rotation of the Venus atmosphere, 3 groups of scientific observations will be carried out: (i) image acquisition with 4 cameras (IR1, IR2, UVI, LIR) for 20 min every 2 hrs; (ii) LAC operation, only when VCO is within the Venus shadow; and (iii) radio occultation. These observation sequences will define the scientific outputs of the VCO program, but the sequences must be reconciled with command, telemetry downlink, thermal and power conditions. To maximize the science data downlink, the data must be well compressed, and the compression efficiency and image quality are of significant scientific importance to the VCO program. Images from the 4 cameras (IR1, IR2 and UVI: 1K x 1K; LIR: 240 x 240) will be compressed using the JPEG2000 (J2K) standard. J2K was selected because of (a) the absence of block noise, (b) efficiency, (c) both reversible and irreversible modes, (d) patent/royalty-free status, and (e) existing implementations as academic and commercial software, ICs and ASIC logic designs. Data compression efficiencies of J2K are about 0.3 (reversible) and 0.1 to 0.01 (irreversible). The DE (Digital Electronics) unit, which controls the 4 cameras and handles onboard data processing and compression, is at the concept design stage. It is concluded that the J2K data compression logic circuits using space
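
    The quoted compression efficiencies can be explored qualitatively on the ground with any JPEG 2000 codec. The sketch below uses Pillow (which requires the OpenJPEG codec to be available) on a synthetic 1K x 1K frame; it is only an illustration of reversible J2K coding, not the flight DE-unit implementation.

```python
import io
import numpy as np
from PIL import Image

# Synthetic, smoothly varying 1K x 1K 8-bit frame standing in for an IR1/IR2/UVI image.
yy, xx = np.mgrid[0:1024, 0:1024]
frame = (127 + 100 * np.sin(xx / 40.0) * np.cos(yy / 60.0)).astype(np.uint8)

buf = io.BytesIO()
# Default JPEG 2000 settings in Pillow are reversible (lossless); quality layers or
# irreversible coding would be tuned for the actual flight trade-off.
Image.fromarray(frame).save(buf, format="JPEG2000")
print(f"compressed/raw size ratio ~ {buf.getbuffer().nbytes / frame.nbytes:.2f}")
```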

  5. Satellite on-board real-time SAR processor prototype

    Science.gov (United States)

    Bergeron, Alain; Doucet, Michel; Harnisch, Bernd; Suess, Martin; Marchese, Linda; Bourqui, Pascal; Desnoyers, Nicholas; Legros, Mathieu; Guillot, Ludovic; Mercier, Luc; Châteauneuf, François

    2017-11-01

    A Compact Real-Time Optronic SAR Processor has been successfully developed and tested up to a Technology Readiness Level of 4 (TRL4), i.e. breadboard validation in a laboratory environment. SAR, or Synthetic Aperture Radar, is an active system allowing day and night imaging independent of the cloud coverage of the planet. The SAR raw data is a set of complex data for range and azimuth, which cannot be compressed. Specifically, for planetary missions and unmanned aerial vehicle (UAV) systems with limited communication data rates, this is a clear disadvantage. SAR images are typically processed electronically by applying dedicated Fourier transformations. This, however, can also be performed optically in real time. Originally, the first SAR images were optically processed. The optical Fourier processor architecture provides inherent parallel computing capabilities, allowing real-time SAR data processing and thus the ability for compression and strongly reduced communication bandwidth requirements for the satellite. SAR signal return data are in general complex data. Both amplitude and phase must be combined optically in the SAR processor for each range and azimuth pixel. Amplitude and phase are generated by dedicated spatial light modulators and superimposed by an optical relay set-up. The spatial light modulators display the full complex raw data information over a two-dimensional format, one for the azimuth and one for the range. Since the entire signal history is displayed at once, the processor operates in parallel, yielding real-time performance, i.e. without a resulting bottleneck. Processing of both azimuth and range information is performed in a single pass. This paper focuses on the onboard capabilities of the compact optical SAR processor prototype that allows in-orbit processing of SAR images. Examples of processed ENVISAT ASAR images are presented. Various SAR processor parameters such as processing capabilities, image quality (point target analysis), weight and

  6. Event processing in X-IFU detector onboard Athena.

    Science.gov (United States)

    Ceballos, M. T.; Cobos, B.; van der Kuurs, J.; Fraga-Encinas, R.

    2015-05-01

    The X-ray Observatory ATHENA was proposed in April 2014 as the mission to implement the science theme "The Hot and Energetic Universe" selected by ESA for L2 (the second Large-class mission in ESA's Cosmic Vision science programme). One of the two X-ray detectors designed to be onboard ATHENA is X-IFU, a cryogenic microcalorimeter based on Transition Edge Sensor (TES) technology that will provide spatially resolved high-resolution spectroscopy. X-IFU will be developed by a consortium of European research institutions, currently from France (leadership), Italy, The Netherlands, Belgium, UK, Germany and Spain. From Spain, IFCA (CSIC-UC) is involved in the Digital Readout Electronics (DRE) unit of the X-IFU detector, in particular in the Event Processor Subsystem. We at IFCA are in charge of the development and implementation in the DRE unit of the event processing algorithms, designed to recognize, in a noisy signal, the intensity pulses generated by the absorption of X-ray photons, and then to extract their main parameters (coordinates, energy, arrival time, grade, etc.). Here we present the design and performance of the algorithms developed for event recognition (adjusted derivative) and pulse grading/qualification, as well as the progress on the algorithms designed to extract the energy content of the pulses (pulse optimal filtering). IFCA will finally be responsible for the on-board implementation in the (TBD) FPGAs or microprocessors of the DRE unit, where this event processing will take place, so as to fit into the limited telemetry of the instrument.
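
    The flight "adjusted derivative" trigger and the optimal filtering are more elaborate than can be shown here, but a minimal derivative-threshold pulse finder conveys the basic idea. Everything below (noise model, threshold, dead time) is an illustrative assumption, not the DRE algorithm.

```python
import numpy as np

def find_pulses(record, n_sigma=5.0, deadtime=50):
    """Return sample indices where a pulse rise is detected in a noisy record."""
    deriv = np.diff(record)
    thresh = n_sigma * np.std(deriv[:200])          # noise level from a pulse-free stretch
    triggers, last = [], -deadtime
    for i, d in enumerate(deriv):
        if d > thresh and i - last >= deadtime:     # simple re-trigger protection
            triggers.append(i)
            last = i
    return triggers

# Toy record: baseline noise plus two exponential pulses of different heights.
rng = np.random.default_rng(1)
t = np.arange(2000)
record = rng.normal(0.0, 1.0, t.size)
for t0, amp in ((600, 40.0), (1400, 25.0)):
    record[t0:] += amp * np.exp(-(t[t0:] - t0) / 80.0)
print("triggers at samples:", find_pulses(record))   # expected near 600 and 1400
```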

  7. Education and Outreach from the JOIDES Resolution during IODP Expedition 360 : linking onboard research and classroom activities during and after the Expedition.

    Science.gov (United States)

    Burgio, M.; Zhang, J.; Kavanagh, L.; Martinez, A. O.; Expedition 360 Scientists, I.

    2016-12-01

    The International Ocean Discovery Program (IODP) expeditions provide an excellent opportunity for onboard Education Officers (EOs) to communicate and disseminate exciting shipboard research and discoveries to students around the world. During Expedition 360, the EOs carried out 140 live webcasts, using different strategies to create an effective link between students and scientists. Below are examples of the strategies we used. -Primary school: The Beauty of Gabbro! and Life in the Rocks! During the webcasts, students could virtually tour the ship, interview scientists, and see and discuss samples of the cored gabbro and minerals in thin sections. Artistic contextualization by J. Zhang facilitated these activities. Moreover, highlighting the search for microbes in the Earth's crust was particularly successful in engaging the students. -Middle and high school: Fun and relationships in science. Students were able to email expert scientists in the scientific discipline they chose to research and interview them during a live webcast. Some students created a song about the expedition, "on the boat - cup song - IODP project" (https://www.youtube.com/watch?v=qex-w9aSV7c). -University: Travels, research and the everyday life of professors onboard. We used webcasts to connect with universities in France, Japan and Italy, creating vibrant interactions between students and scientists that enabled students to get closer to their professors and better understand the life of onboard researchers. In collaboration with the science party, we developed new strategies to keep in touch with students after completion of the cruise. We generated teaching kits consisting of pedagogical sets of pictures, exercises using onboard data, a continuously updated map "tracking geologists", and live webcasts to be organized from laboratories to schools. We have already had enthusiastic feedback from teachers who took part in our webcasts, and the challenge is to continue to foster the

  8. A method for enabling real-time structural deformation in remote handling control system by utilizing offline simulation results and 3D model morphing

    International Nuclear Information System (INIS)

    Kiviranta, Sauli; Saarinen, Hannu; Maekinen, Harri; Krassi, Boris

    2011-01-01

    A full-scale physical test facility, DTP2 (Divertor Test Platform 2), has been established in Finland for demonstrating and refining the Remote Handling (RH) equipment designs for ITER. The first prototype RH equipment at DTP2 is the Cassette Multifunctional Mover (CMM) equipped with the Second Cassette End Effector (SCEE), delivered to DTP2 in October 2008. The purpose is to prove that the CMM/SCEE prototype can be used successfully for the 2nd cassette RH operations. At the end of the F4E grant 'DTP2 test facility operation and upgrade preparation', the RH operations of the 2nd cassette were successfully demonstrated to the representatives of Fusion For Energy (F4E). Due to its design, the CMM/SCEE robot has relatively large mechanical flexibilities when it carries the nine-ton 2nd cassette on a 3.6-m-long lever. This leads to poor absolute accuracy and to a situation where the 3D model used in the control system does not reflect the actual deformed state of the CMM/SCEE robot. To improve the accuracy, a new method has been developed to handle these flexibilities within the control system's virtual environment. The effect of the load on the CMM/SCEE has been measured and minimized in the load compensation model, which is implemented in the control system software. The proposed method accounts for the structural deformations of the robot in the control system through 3D model morphing, utilizing finite element method (FEM) analysis for the morph targets. This resulted in a considerable improvement of the CMM/SCEE absolute accuracy and of the adequacy of the 3D model, which is crucially important in RH applications, where visual information about the controlled device in its surrounding environment is limited.

  9. The Influence of Volcanic Eruptions on the Climate of Tropical South America During the Last Millennium in an Isotope-Enabled General Circulation Model

    Science.gov (United States)

    Colose, Christopher M.; LeGrande, Allegra N.; Vuille, Mathias

    2016-01-01

    Currently, little is known about how volcanic eruptions impact large-scale climate phenomena such as South American paleo-intertropical Convergence Zone (ITCZ) position and summer monsoon behavior. In this paper, an analysis of observations and model simulations is employed to assess the influence of large volcanic eruptions on the climate of tropical South America. This problem is first considered for historically recent volcanic episodes for which more observations are available but where fewer events exist and the confounding effects of El Niño-Southern Oscillation (ENSO) lead to inconclusive interpretation of the impact of volcanic eruptions at the continental scale. Therefore, we also examine a greater number of reconstructed volcanic events for the period 850 CE to present that are incorporated into the NASA GISS ModelE2-R simulation of the last millennium. An advantage of this model is its ability to explicitly track water isotopologues throughout the hydrologic cycle and to simulate the isotopic imprint following a large eruption. This effectively removes a degree of uncertainty associated with error-prone conversion of isotopic signals into climate variables, and allows for a direct comparison between GISS simulations and paleoclimate proxy records. Our analysis reveals that both precipitation and oxygen isotope variability respond with a distinct seasonal and spatial structure across tropical South America following an eruption. During austral winter, the heavy oxygen isotope in precipitation is enriched, likely due to reduced moisture convergence in the ITCZ domain and reduced rainfall over northern South America. During austral summer, however, more negative values of the precipitation isotopic composition are simulated over Amazonia, despite reductions in rainfall, suggesting that the isotopic response is not a simple function of the "amount effect". During the South American monsoon season, the amplitude of the temperature response to volcanic forcing is

  10. Supplement of: The Influence of Volcanic Eruptions on the Climate of Tropical South America During the Last Millennium in an Isotope-Enabled General Circulation Model

    Science.gov (United States)

    Colose, Christopher; LeGrande, Allegra N.; Vuille, Mathias

    2016-01-01

    Currently, little is known about how volcanic eruptions impact large-scale climate phenomena such as South American paleo-intertropical Convergence Zone (ITCZ) position and summer monsoon behavior. In this paper, an analysis of observations and model simulations is employed to assess the influence of large volcanic eruptions on the climate of tropical South America. This problem is first considered for historically recent volcanic episodes for which more observations are available but where fewer events exist and the confounding effects of El Niño-Southern Oscillation (ENSO) lead to inconclusive interpretation of the impact of volcanic eruptions at the continental scale. Therefore, we also examine a greater number of reconstructed volcanic events for the period 850 CE to present that are incorporated into the NASA GISS ModelE2-R simulation of the last millennium. An advantage of this model is its ability to explicitly track water isotopologues throughout the hydrologic cycle and to simulate the isotopic imprint following a large eruption. This effectively removes a degree of uncertainty associated with error-prone conversion of isotopic signals into climate variables, and allows for a direct comparison between GISS simulations and paleoclimate proxy records. Our analysis reveals that both precipitation and oxygen isotope variability respond with a distinct seasonal and spatial structure across tropical South America following an eruption. During austral winter, the heavy oxygen isotope in precipitation is enriched, likely due to reduced moisture convergence in the ITCZ domain and reduced rainfall over northern South America. During austral summer, however, more negative values of the precipitation isotopic composition are simulated over Amazonia, despite reductions in rainfall, suggesting that the isotopic response is not a simple function of the "amount effect". During the South American monsoon season, the amplitude of the temperature response to volcanic forcing is larger

  11. Scaling between reanalyses and high-resolution land-surface modelling in mountainous areas - enabling better application and testing of reanalyses in heterogeneous environments

    Science.gov (United States)

    Gruber, S.; Fiddes, J.

    2013-12-01

    In mountainous topography, the difference in scale between atmospheric reanalyses (typically tens of kilometres) and the relevant processes and phenomena near the Earth's surface, such as permafrost or snow cover (metres to tens of metres), is most obvious. This contrast of scales is one of the major obstacles to using reanalysis data for the simulation of surface phenomena and to confronting reanalyses with independent observations. Using the example of modelling permafrost in mountain areas (but simple to generalise to other phenomena and heterogeneous environments), we present and test methods, against measurements, for (A) scaling atmospheric data from the reanalysis to the ground level and (B) smart sampling of the heterogeneous landscape in order to set up a lumped model simulation that represents the high-resolution land surface. TopoSCALE (Part A, see http://dx.doi.org/10.5194/gmdd-6-3381-2013) is a scheme that scales coarse-grid climate fields to fine-grid topography using pressure-level data. In addition, it applies the necessary topographic corrections, e.g. to those variables required for the computation of radiation fields. This provides the necessary driving fields to the LSM. Tested against independent ground data, this scheme has been shown to improve the scaling and distribution of meteorological parameters in complex terrain compared with conventional methods, e.g. lapse-rate based approaches. TopoSUB (Part B, see http://dx.doi.org/10.5194/gmd-5-1245-2012) is a surface pre-processor designed to sample a fine-grid domain (defined by a digital elevation model) along important topographical (or other) dimensions through a clustering scheme. This allows constructing a lumped model representing the main sources of fine-grid variability and applying a 1D LSM efficiently over large areas. Results can be processed to derive (i) summary statistics at the coarse-scale reanalysis grid resolution, and (ii) high-resolution data fields spatialized to e.g. the fine-scale digital elevation
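
    As a toy illustration of the two parts, the sketch below applies (A) the conventional lapse-rate baseline that TopoSCALE is designed to improve upon and (B) a TopoSUB-like clustering of a synthetic DEM into a few representative samples; it is not the published code, and the feature set is deliberately reduced to elevation only.

```python
import numpy as np
from sklearn.cluster import KMeans

# (A) Lapse-rate baseline: shift a coarse-grid air temperature to each DEM pixel elevation.
def lapse_rate_downscale(t_coarse_c, z_coarse_m, dem_m, lapse_k_per_m=-0.0065):
    return t_coarse_c + lapse_k_per_m * (dem_m - z_coarse_m)

rng = np.random.default_rng(0)
dem = 1500.0 + 1200.0 * rng.random((100, 100))            # synthetic DEM, 1500-2700 m
t_fine = lapse_rate_downscale(t_coarse_c=5.0, z_coarse_m=1800.0, dem_m=dem)
print(f"downscaled T range: {t_fine.min():.1f} to {t_fine.max():.1f} degC")

# (B) TopoSUB-like sampling: cluster pixels (real applications also use slope, aspect and
# sky-view factor), then run the 1D land-surface model once per cluster instead of per pixel.
features = dem.reshape(-1, 1)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)
for k in range(5):
    members = features[labels == k, 0]
    print(f"cluster {k}: {members.size:5d} pixels, mean elevation {members.mean():7.1f} m")
```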

  12. An innovative methodology for the transmission of information, using Sensor Web Enablement, from ongoing research vessels.

    Science.gov (United States)

    Sorribas, Jordi; Sinquin, Jean Marc; Diviacco, Paolo; De Cauwer, Karien; Danobeitia, Juanjo; Olive, Joan; Bermudez, Luis

    2013-04-01

    Research vessels are sophisticated laboratories with complex data acquisition systems for a variety of instruments and sensors that acquire real-time information on many different parameters and disciplines. The acquired data and metadata are commonly disseminated using well-established standards for data centers; however, the instruments and systems on board are not always well described, and significant information may be missing. Thus, important information such as instrument calibration or operational data often does not reach the data center. The OGC Sensor Web Enablement standards provide solutions for serving complex data along with a detailed description of the process used to obtain them. We present an innovative methodology for using Sensor Web Enablement standards to describe and serve information from research vessels, the data acquisition systems used onboard, and the data sets resulting from the onboard work. This methodology is designed to be used on research vessels, but it also applies to data centers, to avoid loss of information in between. The proposed solution considers (I) the difficulty of describing a multidisciplinary and complex mobile sensor system, (II) easy integration with the data acquisition systems onboard, (III) the complex and often incomplete vocabularies typical of marine disciplines, (IV) interfaces to the data and metadata services at the data centers, and (V) management of the configuration changes of the instruments over time.

  13. Performance assessment of an onboard monitoring system for CMV drivers : a field operational test : research brief.

    Science.gov (United States)

    2016-11-01

    The primary goal of an onboard monitoring system (OBMS) is to enhance driver performance and safety. OBMSs are employed with the expectation that feedback provided concurrently (via flashing feedback lights in the vehicle) and cumulatively (via coach...

  14. Advanced Hybrid On-Board Data Processor - SpaceCube 2.0

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop advanced on-board processing to meet the requirements of the Decadal Survey missions: advanced instruments (hyper-spectral, SAR, etc) require advanced...

  15. On-Board Thermal Management of Waste Heat from a High-Energy Device

    National Research Council Canada - National Science Library

    Klatt, Nathan D

    2008-01-01

    The use of on-board high-energy devices such as megawatt lasers and microwave emitters requires aircraft system integration of thermal devices to either get rid of waste heat or utilize it in other areas of the aircraft...

  16. Geo-Enabled, Mobile Services

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard

    2006-01-01

    We are witnessing the emergence of a global infrastructure that enables the widespread deployment of geo-enabled, mobile services in practice. At the same time, the research community has also paid increasing attention to data management aspects of mobile services. This paper offers me...

  17. Toward genome-enabled mycology.

    Science.gov (United States)

    Hibbett, David S; Stajich, Jason E; Spatafora, Joseph W

    2013-01-01

    Genome-enabled mycology is a rapidly expanding field that is characterized by the pervasive use of genome-scale data and associated computational tools in all aspects of fungal biology. Genome-enabled mycology is integrative and often requires teams of researchers with diverse skills in organismal mycology, bioinformatics and molecular biology. This issue of Mycologia presents the first complete fungal genomes in the history of the journal, reflecting the ongoing transformation of mycology into a genome-enabled science. Here, we consider the prospects for genome-enabled mycology and the technical and social challenges that will need to be overcome to grow the database of complete fungal genomes and enable all fungal biologists to make use of the new data.

  18. Full 3D modelling of pulse propagation enables efficient nonlinear frequency conversion with low energy laser pulses in a single-element tripler

    Science.gov (United States)

    Kardaś, Tomasz M.; Nejbauer, Michał; Wnuk, Paweł; Resan, Bojan; Radzewicz, Czesław; Wasylczyk, Piotr

    2017-02-01

    Although new optical materials continue to open up access to more and more wavelength bands where femtosecond laser pulses can be generated, light frequency conversion techniques are still indispensable in filling the gaps on the ultrafast spectral scale. With high-repetition-rate, low-pulse-energy laser sources (oscillators), tight focusing is necessary for robust wave mixing, and the efficiency of broadband nonlinear conversion is limited by diffraction as well as spatial and temporal walk-off. Here we demonstrate a miniature third harmonic generator (tripler) with conversion efficiency exceeding 30%, producing 246 fs UV pulses via cascaded second-order processes within a single laser beam focus. Designing this highly efficient and ultra-compact frequency converter was made possible by full 3-dimensional modelling of the propagation of tightly focused, broadband light fields in nonlinear and birefringent media.

  19. Smart Sensing Based on DNA-Metal Interaction Enables a Label-Free and Resettable Security Model of Electrochemical Molecular Keypad Lock.

    Science.gov (United States)

    Du, Yan; Han, Xu; Wang, Chenxu; Li, Yunhui; Li, Bingling; Duan, Hongwei

    2018-01-26

    Recently, molecular keypad locks have received increasing attention. As a new subgroup of smart biosensors, they show great potential for protecting information as molecular security data processors, rather than merely performing molecular recognition and quantitation. Herein, label-free, electrochemically transduced Ag+ and cysteine (Cys) sensors were developed. A molecular keypad lock model with a reset function was successfully realized based on the balanced interaction of a metal ion with its nucleic acid and chemical ligands. The correct input "1-2-3" (i.e., "Ag+-Cys-cDNA") is the only password of this molecular keypad lock. Moreover, resetting after either a correct or a wrong input order can easily be achieved by Cys, buffer, and DI water treatment. Therefore, our system provides an even smarter molecular keypad lock, which could prevent illegal access by unauthorized users, holding great promise for information protection at the molecular level.

  20. Lipid-Based Formulations Can Enable the Model Poorly Water-Soluble Weakly Basic Drug Cinnarizine to Precipitate in an Amorphous-Salt Form during in Vitro Digestion

    DEFF Research Database (Denmark)

    Khan, Jamal; Rades, Thomas; Boyd, Ben J

    2016-01-01

    The tendency for poorly water-soluble weakly basic drugs to precipitate in a noncrystalline form during the in vitro digestion of lipid-based formulations (LBFs) was linked to an ionic interaction between drug and fatty acid molecules produced upon lipid digestion. Cinnarizine was chosen as a model...... from the starting free base crystalline material to the hydrochloride salt, thus supporting the case that ionic interactions between weak bases and fatty acid molecules during digestion are responsible for producing amorphous-salts upon precipitation. The conclusion has wide implications...... weakly basic drug and was dissolved in a medium-chain (MC) LBF, which was subject to in vitro lipolysis experiments at various pH levels above and below the reported pKa value of cinnarizine (7.47). The solid-state form of the precipitated drug was analyzed using X-ray diffraction (XRD), Fourier...

  1. Multimodal LA-ICP-MS and nanoSIMS imaging enables copper mapping within photoreceptor megamitochondria in a zebrafish model of Menkes disease.

    Science.gov (United States)

    Ackerman, Cheri M; Weber, Peter K; Xiao, Tong; Thai, Bao; Kuo, Tiffani J; Zhang, Emily; Pett-Ridge, Jennifer; Chang, Christopher J

    2018-03-01

    Copper is essential for eukaryotic life, and animals must acquire this nutrient through the diet and distribute it to cells and organelles for proper function of biological targets. Indeed, mutations in the central copper exporter ATP7A contribute to a spectrum of diseases, including Menkes disease, with symptoms ranging from neurodegeneration to lax connective tissue. As such, a better understanding of the fundamental impacts of ATP7A mutations on in vivo copper distributions is of relevance to those affected by these diseases. Here we combine metal imaging and optical imaging techniques at a variety of spatial resolutions to identify tissues and structures with altered copper levels in the Calamity gw71 zebrafish model of Menkes disease. Rapid profiling of tissue slices with LA-ICP-MS identified reduced copper levels in the brain, neuroretina, and liver of Menkes fish compared to control specimens. High resolution nanoSIMS imaging of the neuroretina, combined with electron and confocal microscopies, identified the megamitochondria of photoreceptors as loci of copper accumulation in wildtype fish, with lower levels of megamitochondrial copper observed in Calamity gw71 zebrafish. Interestingly, this localized copper decrease does not result in impaired photoreceptor development or altered megamitochondrial morphology, suggesting the prioritization of copper at sufficient levels for maintaining essential mitochondrial functions. Together, these data establish the Calamity gw71 zebrafish as an optically transparent in vivo model for the study of neural copper misregulation, illuminate a role for the ATP7A copper exporter in trafficking copper to the neuroretina, and highlight the utility of combining multiple imaging techniques for studying metals in whole organism settings with spatial resolution.

  2. Development strategies for the satellite flight software on-board Meteosat Third Generation

    Science.gov (United States)

    Tipaldi, Massimo; Legendre, Cedric; Koopmann, Olliver; Ferraguto, Massimo; Wenker, Ralf; D'Angelo, Gianni

    2018-04-01

    Nowadays, satellites are becoming increasingly software dependent. Satellite Flight Software (FSW), that is to say, the application software running on the satellite's main On-Board Computer (OBC), plays a central role in implementing complex space mission requirements. In this paper, we examine relevant technical approaches and programmatic strategies adopted for the development of the Meteosat Third Generation (MTG) satellite FSW. To begin with, we present its layered, model-based architecture and the means for ensuring a robust and reliable interaction among the FSW components. Then, we focus on the selection of an effective software development life cycle model. In particular, by combining plan-driven and agile approaches, we can fulfil the need for preliminary SW versions, which can be used for the elicitation of complex system-level requirements as well as for the initial satellite integration and testing activities. Another important aspect concerns the testing activities; indeed, very demanding quality requirements have to be fulfilled in satellite SW applications. This manuscript proposes a test automation framework, which uses an XML-based test procedure language independent of the underlying test environment. Finally, a short overview of the MTG FSW sizing and timing budgets concludes the paper.

  3. Adaptive approach for on-board impedance parameters and voltage estimation of lithium-ion batteries in electric vehicles

    Science.gov (United States)

    Farmann, Alexander; Waag, Wladislaw; Sauer, Dirk Uwe

    2015-12-01

    Robust algorithms using reduced-order equivalent circuit models (ECMs) for accurate and reliable estimation of battery states are becoming more popular in a variety of applications. In this study, a novel adaptive, self-learning heuristic algorithm for on-board impedance parameter and voltage estimation of lithium-ion batteries (LIBs) in electric vehicles is introduced. The presented approach is verified using LIBs of different chemistries (NMC/C, NMC/LTO, LFP/C) at different aging states. An impedance-based reduced-order ECM incorporating an ohmic resistance and a combination of a constant phase element and a resistance (a so-called ZARC element) is employed. Existing algorithms in vehicles are much more limited in the complexity of their ECMs. The algorithm is validated using seven days of real vehicle data with high temperature variation, including very low temperatures (from -20 °C to +30 °C), at different depths of discharge (DoDs). Two possibilities for approximating both ZARC elements with a finite number of RC elements on board are shown, and the results of the voltage estimation are compared. Moreover, the current dependence of the charge-transfer resistance is considered by employing the Butler-Volmer equation. The achieved results indicate that both models yield almost the same grade of accuracy.
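
    A minimal discrete-time version of such a reduced-order ECM, with the ZARC element replaced by a single RC pair and purely illustrative parameter values, can be sketched as follows; the adaptive, self-learning parts of the published algorithm are not reproduced.

```python
import numpy as np

def ecm_terminal_voltage(current_a, dt_s, ocv_v=3.7, r0=0.015, r1=0.025, c1=2000.0):
    """Terminal voltage for a current profile (positive = discharge), one RC branch."""
    v_rc, v_out = 0.0, []
    alpha = np.exp(-dt_s / (r1 * c1))                 # exact discretisation of the RC branch
    for i in current_a:
        v_rc = alpha * v_rc + r1 * (1.0 - alpha) * i  # polarisation voltage update
        v_out.append(ocv_v - r0 * i - v_rc)           # OCV minus ohmic and polarisation drops
    return np.array(v_out)

# 10 s discharge pulse at 20 A followed by 30 s of relaxation, sampled at 1 Hz.
profile = np.concatenate([np.full(10, 20.0), np.zeros(30)])
print(np.round(ecm_terminal_voltage(profile, dt_s=1.0), 3))
```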

  4. Analysis of On-Board Photovoltaics for a Battery Electric Bus and Their Impact on Battery Lifespan

    Directory of Open Access Journals (Sweden)

    Kevin R. Mallon

    2017-07-01

    Heavy-duty electric powertrains provide a potential solution to the high emissions and low fuel economy of trucks, buses, and other heavy-duty vehicles. However, the cost, weight, and lifespan of electric vehicle batteries limit the implementation of such vehicles. This paper proposes supplementing the battery with on-board photovoltaic modules. In this paper, a bus model is created to analyze the impact of on-board photovoltaics on electric bus range and battery lifespan. Photovoltaic systems that cover the bus roof and bus sides are considered. The bus model is simulated on a suburban bus drive cycle on a bus route in Davis, CA, USA for a representative sample of yearly weather conditions. Roof-mounted panels increased vehicle driving range by 4.7% on average annually, while roof and side modules together increased driving range by 8.9%. However, variations in weather conditions meant that this additional range was not reliably available. For constant vehicle range, rooftop photovoltaic modules extended battery cycle life by up to 10%, while modules on both the roof and sides extended battery cycle life by up to 19%. Although side-mounted photovoltaics increased cycle life and range, they were less weight- and cost-effective than the roof-mounted panels.
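
    The headline numbers follow from a simple energy balance (daily photovoltaic energy divided by the bus's consumption per kilometre). The sketch below uses hypothetical placeholder values rather than the paper's detailed bus model and weather data.

```python
# Back-of-the-envelope range extension: extra daily range = daily PV energy / consumption per km.
# All numbers are hypothetical placeholders, not values from the paper.
def extra_range_km(pv_area_m2, panel_eff, insolation_kwh_m2_day, consumption_kwh_per_km):
    pv_energy_kwh = pv_area_m2 * panel_eff * insolation_kwh_m2_day
    return pv_energy_kwh / consumption_kwh_per_km

roof_only = extra_range_km(pv_area_m2=25, panel_eff=0.18, insolation_kwh_m2_day=5.5,
                           consumption_kwh_per_km=1.3)
print(f"roof-mounted PV adds roughly {roof_only:.0f} km of range on a sunny day")
```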

  5. In situ characterization of martian materials and detection of organic compounds with the MOMA investigation onboard the ExoMars rover

    Science.gov (United States)

    Arevalo, R. D., Jr.; Grubisic, A.; van Amerom, F. H. W.; Danell, R.; Li, X.; Kaplan, D.; Pinnick, V. T.; Brinckerhoff, W. B.; Getty, S.; Goesmann, F.

    2017-12-01

    Ground-based observations (e.g., via the NASA Infrared Telescope Facility) and in situ investigations, including flybys (e.g., Mariner Program), orbiters (most recently MAVEN and ExoMars TGO), stationary landers (i.e., Viking, Pathfinder and Phoenix), and mobile rovers (i.e., Sojourner, Spirit/Opportunity and Curiosity), have enabled the progressive exploration of the Martian surface. Evidence for liquid water, manifest as hydrated and amorphous materials representative of alteration products of primary minerals/lithologies, and geomorphological features such as recurring slope lineae (RSL), valley networks and open-basin lakes, indicates that Mars may have hosted habitable environments, at least on local scales (temporally and spatially). However, the preservation potential of molecular biosignatures in the upper meter(s) of the surface is limited by destructive cosmic radiation and oxidative chemical reactions. Moreover, the determination of indigenous versus exogenous origins, and biotic versus abiotic formation mechanisms of detected organic material, provide additional challenges for future missions to the red planet. The Mars Organic Molecule Analyzer (MOMA) onboard the ExoMars rover, set to launch in 2020, provides an unprecedented opportunity to discover unambiguous indicators of life. The MOMA instrument will investigate the compositions of materials collected during multiple vertical surveys, extending as deep as two meters below the surface, via: i) gas chromatography mass spectrometry, a method geared towards the detection of volatile organics and the determination of molecular chirality, mapping to previous in situ Mars investigations; and, ii) laser desorption mass spectrometry, a technique commonly employed in research laboratories to detect larger, more refractory organic materials, but a first for spaceflight applications. Selective ion excitation and tandem mass spectrometry (MS/MS) techniques support the isolation and disambiguation of complex

  6. Enabling Philippine Farmers to Adapt to Climate Variability Using Seasonal Climate and Weather Forecast with a Crop Simulation Model in an SMS-based Farmer Decision Support System

    Science.gov (United States)

    Ebardaloza, J. B. R.; Trogo, R.; Sabido, D. J.; Tongson, E.; Bagtasa, G.; Balderama, O. F.

    2015-12-01

    Corn farms in the Philippines are rainfed, so it is of utmost importance to choose the planting start date such that the critical, water-demanding growth stages fall on dates when rain is expected. Most farmers in the Philippines use superstitions and traditions as the basis for farming decisions such as when to start planting [1]. Before climate change, such practices, like planting after the feast day of a saint, worked for them, but with the recent progression of climate change, farmers now recognize that there is a need for technological intervention [1]. The application discussed in this paper presents a solution that makes use of meteorological station sensors, a localized seasonal climate forecast, a localized weather forecast and a crop simulation model to provide recommendations to farmers based on the crop cultivar, soil type and fertilizer type they use. It is critical that the recommendations given to farmers are not generic, as each farmer has different needs based on their cultivar, soil, fertilizer, planting schedule and even location [2]. The application allows the farmer to inquire whether it will rain in the next seven days, the best date to start planting based on the potential yield at harvest, when to apply fertilizer and how much, and when to water and how much. Short messaging service (SMS) is the medium chosen for this application because, while mobile penetration in the Philippines is as high as 101%, smartphone penetration is only 15% [3]. SMS has been selected as it has been identified as the most effective way of reaching farmers with timely agricultural information and knowledge [4,5]. The recommendations, while derived from Automated Weather Station (AWS) sensor data, Weather Research and Forecasting (WRF) models and DSSAT 4.5 [9], are translated into the local language of the farmers and into a format that is easily understood, as recommended in [6,7,8]. A pilot study has been started
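    The core recommendation logic described (pick the planting start date that maximizes simulated yield under the forecast weather) can be sketched in a few lines. In the hedged Python sketch below, simulate_yield is a hypothetical stand-in for a DSSAT run, and the 30-day critical window is an assumption made for illustration, not the project's actual crop model.

    from datetime import date, timedelta

    def simulate_yield(start, rainfall_mm_by_day):
        """Toy yield proxy: rainfall accumulated over an assumed 30-day critical window."""
        return sum(rainfall_mm_by_day.get(start + timedelta(days=d), 0.0) for d in range(30))

    def recommend_planting_date(candidate_dates, rainfall_forecast):
        """Return the candidate start date with the highest simulated yield proxy."""
        return max(candidate_dates, key=lambda d: simulate_yield(d, rainfall_forecast))

    # Example: three candidate dates scored against a hypothetical daily rainfall forecast.
    forecast = {date(2015, 6, 1) + timedelta(days=i): (5.0 if 10 <= i <= 40 else 0.5) for i in range(90)}
    candidates = [date(2015, 6, 1), date(2015, 6, 15), date(2015, 7, 1)]
    print(recommend_planting_date(candidates, forecast))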

  7. Enabling DRM-preserving Digital content Redistribution

    NARCIS (Netherlands)

    Krishnan Nair, S.; Popescu, B.C.; Gamage, C.D.; Crispo, B.; Tanenbaum, A.S.

    2005-01-01

    Traditionally, the process of online digital content distribution has involved a limited number of centralised distributors selling protected contents and licenses authorising the use of these contents to consumers. In this paper, we extend this model by introducing a security scheme that enables

  8. Use of prototype two-channel endoscope with elevator enables larger lift-and-snare endoscopic mucosal resection in a porcine model.

    Science.gov (United States)

    Atkinson, Matthew; Chukwumah, Chike; Marks, Jeffrey; Chak, Amitabh

    2014-02-01

    Flat and depressed lesions are becoming increasingly recognized in the esophagus, stomach, and colon. Various techniques have been described for endoscopic mucosal resection (EMR) of these lesions. To evaluate the efficacy of lift-grasp-cut EMR using a prototype dual-channel forward-viewing endoscope with an instrument elevator in one accessory channel (dual-channel elevator scope) as compared to standard dual-channel endoscopes. EMR was performed using a lift-grasp-cut technique on normal flat rectosigmoid or gastric mucosa in live porcine models after submucosal injection of 4 mL of saline using a dual-channel elevator scope or a standard dual-channel endoscope. With the dual-channel elevator scope, the elevator was used to attain further lifting of the mucosa. The primary endpoint was size of the EMR specimen and the secondary endpoint was number of complications. Twelve experiments were performed (six gastric and six colonic). Mean specimen diameter was 2.27 cm with the dual-channel elevator scope and 1.34 cm with the dual-channel endoscope (P = 0.018). Two colonic perforations occurred with the dual-channel endoscope, vs no complications with the dual-channel elevator scope. The increased lift of the mucosal epithelium, through use of the dual-channel elevator scope, allows for larger EMR when using a lift-grasp-cut technique. Noting the thin nature of the porcine colonic wall, use of the elevator may also make this technique safer.

  9. An in vitro model of the horse gut microbiome enables identification of lactate-utilizing bacteria that differentially respond to starch induction.

    Science.gov (United States)

    Biddle, Amy S; Black, Samuel J; Blanchard, Jeffrey L

    2013-01-01

    Laminitis is a chronic, crippling disease triggered by the sudden influx of dietary starch. Starch reaches the hindgut resulting in enrichment of lactic acid bacteria, lactate accumulation, and acidification of the gut contents. Bacterial products enter the bloodstream and precipitate systemic inflammation. Hindgut lactate levels are normally low because specific bacterial groups convert lactate to short chain fatty acids. Why this mechanism fails when lactate levels rapidly rise, and why some hindgut communities can recover, is unknown. Fecal samples from three adult horses eating identical diets provided bacterial communities for this in vitro study. Triplicate microcosms of fecal slurries were enriched with lactate and/or starch. Metabolic products (short chain fatty acids, headspace gases, and hydrogen sulfide) were measured and microbial community compositions determined using Illumina 16S rRNA sequencing over 12-hour intervals. We report that patterns of change in short chain fatty acid levels and pH in our in vitro system are similar to those seen in in vivo laminitis induction models. Community differences between microcosms with disparate abilities to clear excess lactate suggest profiles conferring resistance to starch-induction conditions. Where lactate levels recover following starch induction conditions, propionate and acetate levels rise correspondingly and taxa related to Megasphaera elsdenii reach levels exceeding 70% relative abundance. In lactate and control cultures, taxa related to Veillonella montpellierensis are enriched as lactate levels fall. Understanding these community differences and factors promoting the growth of specific lactate-utilizing taxa may be useful to prevent acidosis under starch-induction conditions.

  10. A framework to promote collective action within the One Health community of practice: Using participatory modelling to enable interdisciplinary, cross-sectoral and multi-level integration

    Directory of Open Access Journals (Sweden)

    Aurelie Binot

    2015-12-01

    The implementation of a One Health (OH) approach in this context calls for improved integration among disciplines and improved cross-sectoral collaboration, involving stakeholders at different levels. Such integration is certainly not achieved spontaneously; it implies methodological guidelines and has transaction costs. We explore pathways for implementing such collaboration in the SEA context, highlighting the main challenges to be faced by researchers and other target groups involved in OH actions. On this basis, we propose a conceptual framework of OH integration. Through 3 components (field-based data management, professional training workshops and higher education), we suggest developing a new culture of networking involving actors from various disciplines, sectors and levels (from the municipality to the Ministries) through a participatory modelling process, fostering synergies and cooperation. This framework could stimulate a long-term dialogue process, based on the combination of case study implementation and capacity building. It aims at implementing both institutional OH dynamics (multi-stakeholder and cross-sectoral) and research approaches promoting systems thinking and involving social sciences to follow up and strengthen collective action.

  11. Myelosuppressive conditioning using busulfan enables bone marrow cell accumulation in the spinal cord of a mouse model of amyotrophic lateral sclerosis.

    Directory of Open Access Journals (Sweden)

    Coral-Ann B Lewis

    Full Text Available Myeloablative preconditioning using irradiation is the most commonly used technique to generate rodents having chimeric bone marrow, employed for the study of bone marrow-derived cell accumulation in the healthy and diseased central nervous system. However, irradiation has been shown to alter the blood-brain barrier, potentially creating confounding artefacts. To better study the potential of bone marrow-derived cells to function as treatment vehicles for neurodegenerative diseases, alternative preconditioning regimens must be developed. We treated transgenic mice that over-express human mutant superoxide dismutase 1, a model of amyotrophic lateral sclerosis, with busulfan to determine whether this commonly used chemotherapeutic leads to stable chimerism and promotes the entry of bone marrow-derived cells into the spinal cord. Intraperitoneal treatment with busulfan at 60 mg/kg or 80 mg/kg followed by intravenous injection of green fluorescent protein-expressing bone marrow resulted in sustained levels of chimerism (~80%). Bone marrow-derived cells accumulated in the lumbar spinal cord of diseased mice at advanced stages of pathology at both doses, with limited numbers of bone marrow-derived cells observed in the spinal cords of similarly treated, age-matched controls; the majority of bone marrow-derived cells in the spinal cord immunolabelled for macrophage antigens. Comparatively, significantly greater numbers of bone marrow-derived cells were observed in the lumbar spinal cord following irradiative myeloablation. These results demonstrate that bone marrow-derived cell accumulation in the diseased spinal cord is possible without irradiative preconditioning.

  12. Operational challenges in conducting a community-based technology-enabled mental health services delivery model for rural India: Experiences from the SMART Mental Health Project.

    Science.gov (United States)

    Maulik, Pallab K; Kallakuri, Sudha; Devarapalli, Siddhardha

    2018-01-01

    Background: There are large gaps in the delivery of mental health care in low- and middle-income countries such as India, and the problems are even more acute in rural settings due to lack of resources, remoteness, and lack of infrastructure, amongst other factors. The Systematic Medical Appraisal Referral and Treatment (SMART) Mental Health Project was conceived as a mental health services delivery model using technology-based solutions for rural India. This paper reports on the operational strategies used to facilitate the implementation of the intervention. Method: Key components of the SMART Mental Health Project included delivering an anti-stigma campaign; training primary health workers in screening, diagnosing and managing stress, depression and increased suicide risk; task sharing of responsibilities in delivering care; and using mobile technology-based electronic decision support systems to support delivery of algorithm-based care for such disorders. The intervention was conducted in 42 villages across two sites in the state of Andhra Pradesh in south India. A pre-post mixed methods evaluation was done, and in this paper operational challenges are reported. Results: Both quantitative and qualitative results from the evaluation at one site covering about 5000 adults showed that the intervention was feasible and acceptable, and initial results indicated that it was beneficial in increasing access to mental health care and reducing depression and anxiety symptoms. A number of strategies were initiated in response to operational challenges to ensure smoother conduct of the project and facilitated delivery of the project as envisaged. Conclusions: The operational strategies initiated for this project were successful in ensuring the delivery of the intervention. These, coupled with other more systematic processes, have helped the researchers understand the key processes that need to be in place to develop a more robust study, that could eventually be

  13. A framework to promote collective action within the One Health community of practice: Using participatory modelling to enable interdisciplinary, cross-sectoral and multi-level integration.

    Science.gov (United States)

    Binot, Aurelie; Duboz, Raphaël; Promburom, Panomsak; Phimpraphai, Waraphon; Cappelle, Julien; Lajaunie, Claire; Goutard, Flavie Luce; Pinyopummintr, Tanu; Figuié, Muriel; Roger, François Louis

    2015-12-01

    As Southeast Asia (SEA) is characterized by high human and domestic animal densities, growing intensification of trade, drastic land use changes and biodiversity erosion, this region appears to be a hotspot for studying the complex dynamics of zoonosis emergence and health issues at the Animal-Human-Environment interface. Zoonotic diseases and environmental health issues can have devastating socioeconomic and wellbeing impacts. Assessing and managing the related risks implies taking into account the ecological and social dynamics at play, in connection with epidemiological patterns. The implementation of a One Health (OH) approach in this context calls for improved integration among disciplines and improved cross-sectoral collaboration, involving stakeholders at different levels. Such integration is certainly not achieved spontaneously; it implies methodological guidelines and has transaction costs. We explore pathways for implementing such collaboration in the SEA context, highlighting the main challenges to be faced by researchers and other target groups involved in OH actions. On this basis, we propose a conceptual framework of OH integration. Through 3 components (field-based data management, professional training workshops and higher education), we suggest developing a new culture of networking involving actors from various disciplines, sectors and levels (from the municipality to the Ministries) through a participatory modelling process, fostering synergies and cooperation. This framework could stimulate a long-term dialogue process, based on the combination of case study implementation and capacity building. It aims at implementing both institutional OH dynamics (multi-stakeholder and cross-sectoral) and research approaches promoting systems thinking and involving social sciences to follow up and strengthen collective action.

  14. An in vitro model of the horse gut microbiome enables identification of lactate-utilizing bacteria that differentially respond to starch induction.

    Directory of Open Access Journals (Sweden)

    Amy S Biddle

    Full Text Available Laminitis is a chronic, crippling disease triggered by the sudden influx of dietary starch. Starch reaches the hindgut resulting in enrichment of lactic acid bacteria, lactate accumulation, and acidification of the gut contents. Bacterial products enter the bloodstream and precipitate systemic inflammation. Hindgut lactate levels are normally low because specific bacterial groups convert lactate to short chain fatty acids. Why this mechanism fails when lactate levels rapidly rise, and why some hindgut communities can recover, is unknown. Fecal samples from three adult horses eating identical diets provided bacterial communities for this in vitro study. Triplicate microcosms of fecal slurries were enriched with lactate and/or starch. Metabolic products (short chain fatty acids, headspace gases, and hydrogen sulfide) were measured and microbial community compositions determined using Illumina 16S rRNA sequencing over 12-hour intervals. We report that patterns of change in short chain fatty acid levels and pH in our in vitro system are similar to those seen in in vivo laminitis induction models. Community differences between microcosms with disparate abilities to clear excess lactate suggest profiles conferring resistance to starch-induction conditions. Where lactate levels recover following starch induction conditions, propionate and acetate levels rise correspondingly and taxa related to Megasphaera elsdenii reach levels exceeding 70% relative abundance. In lactate and control cultures, taxa related to Veillonella montpellierensis are enriched as lactate levels fall. Understanding these community differences and factors promoting the growth of specific lactate-utilizing taxa may be useful to prevent acidosis under starch-induction conditions.

  15. Automated Miniaturized Instrument for Space Biology Applications and the Monitoring of the Astronauts Health Onboard the ISS

    Science.gov (United States)

    Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew

    2011-01-01

    substantially by combining it with other technologies for automated, miniaturized, high-throughput biological measurements, such as fast sequencing, protein identification (proteomics) and metabolite profiling (metabolomics). Thus, the system can be integrated with other biomedical instruments in order to support and enhance telemedicine capability onboard ISS. NASA's mission includes sustained investment in critical research leading to effective countermeasures to minimize the risks associated with human spaceflight, and the use of appropriate technology to sustain space exploration at reasonable cost. Our integrated microarray technology is expected to fulfill these two critical requirements and to enable the scientific community to better understand and monitor the effects of the space environment on microorganisms and on the astronaut, in the process leveraging current capabilities and overcoming present limitations.

  16. Fluid Physics Experiments onboard International Space Station: Through the Eyes of a Scientist.

    Science.gov (United States)

    Shevtsova, Valentina

    Fluids are present everywhere in everyday life. They are also present as fuel, in support systems or as consumables in rockets and on board satellites and space stations. Everyone experiences every day that fluids are very sensitive to gravity: on Earth liquids flow downwards and gases mostly rise. Nowadays much of the interest of the scientific community is in studying phenomena at microscales in so-called microfluidic systems. However, at smaller scales the experimental investigation of convective flows becomes increasingly difficult, as the control parameter Ra scales with gL^3 (g: acceleration level, L: length scale). A unique alternative to the difficulty of investigating systems with small length scales on the ground is to reduce the gravity level g. In systems with interfaces, buoyancy forces are proportional to the volume of the liquid, while capillary forces act solely on the liquid surface. The importance of buoyancy diminishes either at very small scales or with a reduced acceleration level. Under the weightless conditions of space, where buoyancy is virtually eliminated, other mechanisms such as capillary forces, diffusion, vibration, shear forces, electrostatic and electromagnetic forces dominate the fluid behaviour. This is why research in space represents a powerful tool for scientific research in this field. Understanding how fluids work really matters, and so does measuring their properties accurately. Presently, a number of scientific laboratories, as is usual with multi-user instruments, are involved in fluid research on the ISS. The programme of fluid physics experiments on-board deals with capillary flows, diffusion, dynamics in complex fluids (foams, emulsions and granular matter), heat transfer processes with phase change, physics and physico-chemistry near or beyond the critical point, and it also extends to combustion physics. The top-level objectives of fluid research in space are as follows: (i) to investigate fluid
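    For reference, the gL^3 scaling quoted above is just the gravity- and length-dependence of the standard Rayleigh number (a textbook relation, not something specific to the ISS programme), written here in LaTeX notation with beta the thermal expansion coefficient, Delta T the driving temperature difference, nu the kinematic viscosity and kappa the thermal diffusivity:

    \mathrm{Ra} = \frac{g\,\beta\,\Delta T\,L^{3}}{\nu\,\kappa}

    Reducing either g (weightlessness) or L (microfluidics) therefore drives Ra, and with it buoyant convection, toward negligible values, which is exactly the alternative route described above.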

  17. High-Speed On-Board Data Processing Platform for LIDAR Projects at NASA Langley Research Center

    Science.gov (United States)

    Beyon, J.; Ng, T. K.; Davis, M. J.; Adams, J. K.; Lin, B.

    2015-12-01

    The project called High-Speed On-Board Data Processing for Science Instruments (HOPS) was funded by the NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) program from April 2012 to April 2015. HOPS is an enabler for science missions with extremely high data processing rates. In this three-year effort, the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) and 3-D Winds missions were of particular interest. For ASCENDS, HOPS replaces time-domain data processing with frequency-domain processing while making real-time on-board data processing possible. For 3-D Winds, HOPS offers real-time high-resolution wind profiling with a 4,096-point fast Fourier transform (FFT). HOPS is adaptable with a quick turn-around time: since it offers reusable, user-friendly computational elements, its FPGA IP core can be modified within a shorter development period if the algorithm changes. The FPGA and memory bandwidth of HOPS is 20 GB/sec, while the typical maximum processor-to-SDRAM bandwidth of commercial radiation-tolerant high-end processors is about 130-150 MB/sec. The inter-board communication bandwidth of HOPS is 4 GB/sec, while the effective processor-to-cPCI bandwidth of commercial radiation-tolerant high-end boards is about 50-75 MB/sec. HOPS also offers VHDL cores for the easy and efficient implementation of ASCENDS, 3-D Winds, and other similar algorithms. A general overview of the 3-year development of HOPS is the goal of this presentation.
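    As a rough illustration of the frequency-domain processing mentioned for 3-D Winds, the hedged Python sketch below takes a 4,096-point FFT of a synthetic heterodyne return and converts the spectral peak to a line-of-sight speed through the standard coherent-lidar Doppler relation v = lambda * f / 2. The sample rate, wavelength and signal are illustrative assumptions, not HOPS parameters.

    import numpy as np

    FS = 500e6            # assumed digitizer sample rate [Hz]
    WAVELENGTH = 2.05e-6  # assumed lidar wavelength [m]
    N_FFT = 4096

    t = np.arange(N_FFT) / FS
    rng = np.random.default_rng(0)
    signal = np.cos(2 * np.pi * 20e6 * t) + 0.1 * rng.standard_normal(N_FFT)  # synthetic return

    spectrum = np.abs(np.fft.rfft(signal * np.hanning(N_FFT)))
    freqs = np.fft.rfftfreq(N_FFT, d=1.0 / FS)
    f_peak = freqs[np.argmax(spectrum)]

    v_los = WAVELENGTH * f_peak / 2.0  # Doppler shift to line-of-sight speed
    print(f"Peak at {f_peak / 1e6:.2f} MHz -> {v_los:.1f} m/s line-of-sight")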

  18. Computer Security Systems Enable Access.

    Science.gov (United States)

    Riggen, Gary

    1989-01-01

    A good security system enables access and protects information from damage or tampering, but the most important aspects of a security system aren't technical. A security procedures manual addresses the human element of computer security. (MLW)

  19. How GNSS Enables Precision Farming

    Science.gov (United States)

    2014-12-01

    Precision farming: feeding a growing population and enabling those who feed the world. Immediate and ongoing needs include population growth (more people to feed) and urbanization (a decrease in arable land); food production must double by 2050 to meet world demand. To meet thi...

  20. On-Board Real-Time Optimization Control for Turbo-Fan Engine Life Extending

    Science.gov (United States)

    Zheng, Qiangang; Zhang, Haibo; Miao, Lizhen; Sun, Fengyong

    2017-11-01

    A real-time optimization control method is proposed to extend turbo-fan engine service life. This real-time optimization control is based on an on-board engine model, which is devised by an MRR-LSSVR (multi-input multi-output recursive reduced least squares support vector regression) method. To solve the optimization problem, an FSQP (feasible sequential quadratic programming) algorithm is utilized. Thermal mechanical fatigue is taken into account during the optimization process. Furthermore, to describe engine life decay, a thermal mechanical fatigue model of the engine acceleration process is established. The optimization objective function not only contains a sub-item that rewards fast engine response, but also includes a sub-item for the total mechanical strain range, which has a positive relationship to engine fatigue life. Finally, simulations of both the conventional optimization control, which considers only engine acceleration performance, and the proposed optimization method have been conducted. The simulations demonstrate that the two control methods take equal time from idle to 99.5% of maximum power; however, engine life using the proposed optimization method is increased by a remarkable 36.17% compared with conventional optimization control.
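    To make the structure of such an objective concrete, the hedged Python sketch below minimizes a composite cost with one term rewarding fast thrust response and one penalizing accumulated strain range, using SciPy's SLSQP solver (an SQP method in the same family as the FSQP mentioned). The surrogate terms, weights and bounds are illustrative assumptions, not the paper's engine model.

    import numpy as np
    from scipy.optimize import minimize

    def objective(u, w_response=1.0, w_fatigue=0.5):
        """u: fuel-flow increments over the acceleration (both terms are toy surrogates)."""
        response_cost = np.sum((1.0 - np.cumsum(u)) ** 2)    # penalize distance from full thrust
        strain_range_cost = np.sum(np.abs(np.diff(u)))       # proxy for mechanical strain range
        return w_response * response_cost + w_fatigue * strain_range_cost

    u0 = np.full(20, 0.05)        # initial schedule of fuel-flow increments
    bounds = [(0.0, 0.1)] * 20    # assumed actuator/rate limits
    result = minimize(objective, u0, bounds=bounds, method="SLSQP")
    print(np.round(result.x, 3))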

  1. Measurement of dose equivalent distribution on-board commercial jet aircraft

    International Nuclear Information System (INIS)

    Kubancak, J.; Ambrozova, I.; Ploc, O.; Pachnerova Brabcova, K.; Stepan, V.; Uchihori, Y.

    2014-01-01

    The annual effective doses of aircrew members often exceed the limit of 1 mSv for the public due to the increased level of cosmic radiation at flight altitudes, and thus it is recommended to monitor them [International Commission on Radiological Protection. 1990 Recommendations of the International Commission on Radiological Protection. ICRP Publication 60. Ann. ICRP 21(1-3), (1991)]. According to Monte Carlo simulations [Battistoni, G., Ferrari, A., Pelliccioni, M. and Villari, R. Evaluation of the doses to aircrew members taking into consideration the aircraft structures. Adv. Space Res. 36, 1645-1652 (2005) and Ferrari, A., Pelliccioni, M. and Villari, R. Evaluation of the influence of aircraft shielding on the aircrew exposure through an aircraft mathematical model. Radiat. Prot. Dosim. 108(2), 91-105 (2004)], the ambient dose equivalent rate H*(10) depends on the location in the aircraft. The aim of this article is to experimentally evaluate H*(10) on-board selected types of aircraft. The authors found that H*(10) values are higher in the front and the back of the cabin and lower in the middle of the cabin. Moreover, the total dosimetric characteristics obtained in this way are in reasonable agreement with other data, in particular with the above-mentioned simulations. (authors)

  2. Onboard guidance system design for reusable launch vehicles in the terminal area energy management phase

    Science.gov (United States)

    Mu, Lingxia; Yu, Xiang; Zhang, Y. M.; Li, Ping; Wang, Xinmin

    2018-02-01

    A terminal area energy management (TAEM) guidance system for an unpowered reusable launch vehicle (RLV) is proposed in this paper. The mathematical model representing the RLV gliding motion is provided, followed by a transformation to extract the dynamics required for reference profile generation. Reference longitudinal profiles are conceived based on the maximum-dive and maximum-glide capability that an RLV can perform. The trajectory is obtained by iterating the motion equations at each altitude node, where the angle of attack and the flight-path angle are regarded as regulating variables. An onboard ground-track predictor is constructed to generate the current range-to-go and lateral commands online. Although the longitudinal profile generation requires pre-processing using the RLV aerodynamics, the ground-track prediction can be executed online. This makes the guidance scheme adaptable to abnormal conditions. Finally, the guidance law is designed to track the reference commands. Numerical simulations demonstrate that the proposed guidance scheme is capable of guiding the RLV to the desired touchdown conditions.
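    The profile-generation step (iterating the motion equations over altitude nodes with the flight-path angle as a regulating variable) can be caricatured with a quasi-steady glide relation. The hedged Python sketch below integrates dx = (h0 - h1) / tan(gamma) node by node; the altitude band and the steep-to-shallow gamma schedule are illustrative assumptions, not the paper's RLV aerodynamics.

    import numpy as np

    def ground_range(h_nodes_m, gamma_profile_deg):
        """Integrate ground range over descending altitude nodes at given flight-path angles."""
        x = 0.0
        for h0, h1, g in zip(h_nodes_m[:-1], h_nodes_m[1:], np.radians(gamma_profile_deg)):
            x += (h0 - h1) / np.tan(g)
        return x

    h = np.linspace(25000.0, 1000.0, 201)    # altitude nodes [m] (assumed TAEM band)
    gamma = np.linspace(18.0, 8.0, 200)      # assumed steep-to-shallow descent magnitudes [deg]
    print(f"Predicted range-to-go: {ground_range(h, gamma) / 1000:.1f} km")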

  3. An adaptive multi-spline refinement algorithm in simulation based sailboat trajectory optimization using onboard multi-core computer systems

    Directory of Open Access Journals (Sweden)

    Dębski Roman

    2016-06-01

    Full Text Available A new dynamic-programming-based parallel algorithm adapted to on-board heterogeneous computers for simulation-based trajectory optimization is studied in the context of “high-performance sailing”. The algorithm uses a new discrete space of continuously differentiable functions called multi-splines as its search space representation. A basic version of the algorithm is presented in detail (pseudo-code, time and space complexity, search space auto-adaptation properties). Possible extensions of the basic algorithm are also described. The presented experimental results show that contemporary heterogeneous on-board computers can be effectively used for solving simulation-based trajectory optimization problems. These computers can be considered micro high-performance computing (HPC) platforms: they offer high performance while remaining energy- and cost-efficient. The simulation-based approach can potentially give highly accurate results, since the mathematical model that the simulator is built upon may be as complex as required. The approach described is applicable to many trajectory optimization problems due to its black-box performance measure and use of OpenCL.
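    The stage-wise dynamic programming idea behind the algorithm can be shown on a toy grid: at each stage, keep the best cost-to-come for every discrete lateral position, with each leg scored by a simulator. In the hedged Python sketch below, leg_time is a hypothetical stand-in for the simulation-based performance measure; nothing here reflects the paper's multi-spline search space or its parallel implementation.

    import numpy as np

    N_STAGES, N_LANES = 50, 21
    lanes = np.linspace(-1.0, 1.0, N_LANES)

    def leg_time(stage, lane_from, lane_to):
        """Toy performance measure: crossing penalty plus a stage-dependent 'wind' term."""
        return 1.0 + 4.0 * (lane_to - lane_from) ** 2 + 0.3 * np.sin(stage / 5.0 + 3.0 * lane_to) ** 2

    cost = np.zeros(N_LANES)                      # cost-to-come at the first stage
    for stage in range(1, N_STAGES):
        new_cost = np.full(N_LANES, np.inf)
        for j, lane_to in enumerate(lanes):
            for i, lane_from in enumerate(lanes):
                new_cost[j] = min(new_cost[j], cost[i] + leg_time(stage, lane_from, lane_to))
        cost = new_cost

    print(f"Best simulated passage time: {cost.min():.2f}")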

  4. HTML 5 Displays for On-Board Flight Systems

    Science.gov (United States)

    Silva, Chandika

    2016-01-01

    During my internship at NASA in the summer of 2016, I was assigned to a project which dealt with developing a web server that would display telemetry and other system data using HTML 5, JavaScript, and CSS. By doing this, it would be possible to view the data across a variety of screen sizes, and establish a standard that could be used to simplify communication and software development between NASA and other countries. Utilizing a web-based approach allowed us to add in more functionality, as well as make the displays more aesthetically pleasing for the users. When I was assigned to this project my main task was to first establish communication with the current display server. This display server would output data from the on-board systems in XML format. Once communication was established I was then asked to create a dynamic telemetry table web page that would update its header and change as new information came in. After this was completed, certain minor functionalities were added to the table, such as a hide-column and filter-by-system option. This was mainly for the purpose of making the table more useful for the users, as they can now filter and view relevant data. Finally, my last task was to create a graphical system display for all the systems on the spacecraft. This was by far the most challenging part of my internship, as finding a JavaScript library that was both free and contained useful functions to assist me in my task was difficult. In the end I was able to use the JointJs library and accomplish the task. With the help of my mentor and the HIVE lab team, we were able to establish stable communication with the display server. We also succeeded in creating a fully dynamic telemetry table and in developing a graphical system display for the advanced modular power system. Working at JSC for this internship has taught me a lot about coding in JavaScript and HTML 5. I was also introduced to the concept of developing software as a team, and exposed to the different

  5. A new on-board imaging treatment technique for palliative and emergency treatments in radiation oncology

    International Nuclear Information System (INIS)

    Held, Mareike

    2016-01-01

    This dissertation focuses on the use of on-board imaging systems as the basis for treatment planning, presenting an additional application for on-board images. A clinical workflow is developed to simulate, plan, and deliver a simple radiation oncology treatment rapidly, using 3D patient scans. The work focuses on an on-line dose planning and delivery process based on on-board images entirely performed with the patient set up on the treatment couch of the linear accelerator. This potentially reduces the time between patient simulation and treatment to about 30 minutes. The basis for correct dose calculation is the accurate image gray scale to tissue density calibration. The gray scale, which is defined in CT Numbers, is dependent on the energy spectrum of the beam. Therefore, an understanding of the physics characteristics of each on-board system is required to evaluate the impact on image quality, especially regarding the underlying cause of image noise, contrast, and non-uniformity. Modern on-board imaging systems, including kV and megavoltage (MV) cone beam (CB) CT as well as MV CT, are characterized in terms of image quality and stability. A library of phantom and patient CT images is used to evaluate the dose calculation accuracy for the on-board images. The dose calculation objective is to stay within 5% local dose differences compared to standard kV CT dose planning. The objective is met in many treatment cases. However, dose calculation accuracy depends on the anatomical treatment site. While on-board CT-based treatments of the head and extremities are predictable within 5% on all systems, lung tissue and air cavities may create local dose discrepancies of more than 5%. The image quality varies between the tested units. Consequently, the CT number-to-density calibration is defined independently for each system. In case of some imaging systems, the CT numbers of the images are dependent on the protocol used for on-board imaging, which defines the imaging dose
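    The CT-number-to-density calibration underpinning this dose calculation is, in practice, a per-system (and, where needed, per-protocol) lookup built from phantom inserts of known density. The hedged Python sketch below shows a piecewise-linear version of such a lookup; the calibration points are illustrative assumptions, not measured values from this work.

    import numpy as np

    # (CT number [HU], physical density [g/cm^3]) pairs from an assumed calibration phantom
    calibration = np.array([
        [-1000.0, 0.001],   # air
        [ -500.0, 0.50],    # lung-like insert
        [    0.0, 1.00],    # water
        [  500.0, 1.29],    # soft-bone-like insert
        [ 1500.0, 1.82],    # dense-bone-like insert
    ])

    def ct_to_density(hu):
        """Piecewise-linear interpolation of density from CT number."""
        return np.interp(hu, calibration[:, 0], calibration[:, 1])

    print(ct_to_density(np.array([-700.0, 40.0, 900.0])))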

  6. A new on-board imaging treatment technique for palliative and emergency treatments in radiation oncology

    Energy Technology Data Exchange (ETDEWEB)

    Held, Mareike

    2016-03-23

    This dissertation focuses on the use of on-board imaging systems as the basis for treatment planning, presenting an additional application for on-board images. A clinical workflow is developed to simulate, plan, and deliver a simple radiation oncology treatment rapidly, using 3D patient scans. The work focuses on an on-line dose planning and delivery process based on on-board images entirely performed with the patient set up on the treatment couch of the linear accelerator. This potentially reduces the time between patient simulation and treatment to about 30 minutes. The basis for correct dose calculation is the accurate image gray scale to tissue density calibration. The gray scale, which is defined in CT Numbers, is dependent on the energy spectrum of the beam. Therefore, an understanding of the physics characteristics of each on-board system is required to evaluate the impact on image quality, especially regarding the underlying cause of image noise, contrast, and non-uniformity. Modern on-board imaging systems, including kV and megavoltage (MV) cone beam (CB) CT as well as MV CT, are characterized in terms of image quality and stability. A library of phantom and patient CT images is used to evaluate the dose calculation accuracy for the on-board images. The dose calculation objective is to stay within 5% local dose differences compared to standard kV CT dose planning. The objective is met in many treatment cases. However, dose calculation accuracy depends on the anatomical treatment site. While on-board CT-based treatments of the head and extremities are predictable within 5% on all systems, lung tissue and air cavities may create local dose discrepancies of more than 5%. The image quality varies between the tested units. Consequently, the CT number-to-density calibration is defined independently for each system. In case of some imaging systems, the CT numbers of the images are dependent on the protocol used for on-board imaging, which defines the imaging dose

  7. Model-Based Control of an Aircraft Engine using an Optimal Tuner Approach

    Science.gov (United States)

    Connolly, Joseph W.; Chicatelli, Amy; Garg, Sanjay

    2012-01-01

    This paper covers the development of a model-based engine control (MBEC) methodology applied to an aircraft turbofan engine. Here, a linear model extracted from the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40k) at a cruise operating point serves as the engine and the on-board model. The on-board model is updated using an optimal tuner Kalman Filter (OTKF) estimation routine, which enables the on-board model to self-tune to account for engine performance variations. The focus here is on developing a methodology for MBEC with direct control of estimated parameters of interest such as thrust and stall margins. MBEC provides the ability for a tighter control bound of thrust over the entire life cycle of the engine that is not achievable using traditional control feedback, which uses engine pressure ratio or fan speed. CMAPSS40k is capable of modeling realistic engine performance, allowing for a verification of the MBEC tighter thrust control. In addition, investigations of using the MBEC to provide a surge limit for the controller limit logic are presented that could provide benefits over a simple acceleration schedule that is currently used in engine control architectures.
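    The tuner-update step can be pictured as a plain linear Kalman filter whose state is the small set of tuner parameters and whose measurement is the sensor-minus-model residual. The hedged Python sketch below shows one predict/update cycle; the matrices are random placeholders, not CMAPSS40k sensitivities, and the optimal tuner selection itself is not reproduced.

    import numpy as np

    n_tuners, n_meas = 3, 4
    rng = np.random.default_rng(0)

    F = np.eye(n_tuners)                          # tuners assumed constant between samples
    H = rng.standard_normal((n_meas, n_tuners))   # placeholder measurement sensitivities
    Q = 1e-6 * np.eye(n_tuners)                   # process noise
    R = 1e-3 * np.eye(n_meas)                     # measurement noise

    def kalman_step(x, P, residual):
        """One predict/update cycle on the measured-minus-modeled output residual."""
        x_pred, P_pred = F @ x, F @ P @ F.T + Q
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (residual - H @ x_pred)
        P_new = (np.eye(n_tuners) - K @ H) @ P_pred
        return x_new, P_new

    x, P = np.zeros(n_tuners), np.eye(n_tuners)
    residual = 0.05 * rng.standard_normal(n_meas)   # stand-in residual for one sample
    x, P = kalman_step(x, P, residual)
    print(x)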

  8. SU-D-18A-02: Towards Real-Time On-Board Volumetric Image Reconstruction for Intrafraction Target Verification in Radiation Therapy

    International Nuclear Information System (INIS)

    Xu, X; Iliopoulos, A; Zhang, Y; Pitsianis, N; Sun, X; Yin, F; Ren, L

    2014-01-01

    Purpose: To expedite on-board volumetric image reconstruction from limited-angle kV—MV projections for intrafraction verification. Methods: A limited-angle intrafraction verification (LIVE) system has recently been developed for real-time volumetric verification of moving targets, using limited-angle kV—MV projections. Currently, it is challenged by the intensive computational load of the prior-knowledge-based reconstruction method. To accelerate LIVE, we restructure the software pipeline to make it adaptable to model and algorithm parameter changes, while enabling efficient utilization of rapidly advancing, modern computer architectures. In particular, an innovative two-level parallelization scheme has been designed: At the macroscopic level, data and operations are adaptively partitioned, taking into account algorithmic parameters and the processing capacity or constraints of underlying hardware. The control and data flows of the pipeline are scheduled in such a way as to maximize operation concurrency and minimize total processing time. At the microscopic level, the partitioned functions act as independent modules, operating on data partitions in parallel. Each module is pre-parallelized and optimized for multi-core processors (CPUs) and graphics processing units (GPUs). Results: We present results from a parallel prototype, where most of the controls and module parallelization are carried out via Matlab and its Parallel Computing Toolbox. The reconstruction is 5 times faster on a data-set of twice the size, compared to recently reported results, without compromising on algorithmic optimization control. Conclusion: The prototype implementation and its results have served to assess the efficacy of our system concept. While a production implementation will yield much higher processing rates by approaching full-capacity utilization of CPUs and GPUs, some mutual constraints between algorithmic flow and architecture specifics remain. Based on a careful analysis

  9. Statistical analysis of geomagnetic field intensity differences between ASM and VFM instruments onboard Swarm constellation

    Science.gov (United States)

    De Michelis, Paola; Tozzi, Roberta; Consolini, Giuseppe

    2017-02-01

    From the very first measurements made by the magnetometers onboard the Swarm satellites, launched by the European Space Agency (ESA) in late 2013, a discrepancy emerged between scalar and vector measurements. An accurate analysis of this phenomenon led to the construction of an empirical model of the disturbance, which is highly correlated with the Sun incidence angle, and to the correction of vector data accordingly. The empirical model adopted by ESA results in a significant decrease in the amplitude of the disturbance affecting VFM measurements, greatly improving the vector magnetic data quality. This study is focused on the characterization of the difference between the magnetic field intensity measured by the absolute scalar magnetometer (ASM) and that reconstructed using the vector field magnetometer (VFM) installed on the Swarm constellation. Applying the empirical mode decomposition method, we find the intrinsic mode functions (IMFs) associated with ASM-VFM total intensity differences obtained with data both uncorrected and corrected for the disturbance correlated with the Sun incidence angle. Surprisingly, no differences are found in the nature of the IMFs embedded in the analyzed signals, these IMFs being characterized by the same dominant periodicities before and after correction. The effect of the correction manifests as a decrease in the energy associated with some IMFs contributing to the corrected data. Some IMFs identified by analyzing the ASM-VFM intensity discrepancy are characterized by the same dominant periodicities as those obtained by analyzing the temperature fluctuations of the VFM electronic unit. Thus, the disturbance correlated with the Sun incidence angle could still be present in the corrected magnetic data. Furthermore, the ASM-VFM total intensity difference and the VFM electronic unit temperature display maximal shared information with a time delay that depends on local time. Taken together, these findings may help to relate the features of the observed VFM-ASM total intensity
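    The basic quantity examined in this study is simply the modulus of the VFM vector reading minus the scalar ASM reading, evaluated sample by sample before any empirical mode decomposition. The hedged Python sketch below computes that residual on synthetic placeholder data; real Swarm Level 1b products and the EMD step itself are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1000
    f_asm = 45000.0 + 5.0 * np.sin(np.linspace(0.0, 20.0, n))   # synthetic scalar field [nT]
    b_vfm = np.column_stack([f_asm / np.sqrt(3.0)] * 3) + rng.normal(0.0, 0.2, (n, 3))  # synthetic vector field

    delta_f = np.linalg.norm(b_vfm, axis=1) - f_asm   # VFM-ASM total intensity difference [nT]
    print(f"mean {delta_f.mean():.3f} nT, std {delta_f.std():.3f} nT")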

  10. OGC® Sensor Web Enablement Standards

    Directory of Open Access Journals (Sweden)

    George Percivall

    2006-09-01

    Full Text Available This article provides a high-level overview of and architecture for the Open Geospatial Consortium (OGC) standards activities that focus on sensors, sensor networks, and a concept called the “Sensor Web”. This OGC work area is known as Sensor Web Enablement (SWE). This article has been condensed from "OGC® Sensor Web Enablement: Overview And High Level Architecture," an OGC White Paper by Mike Botts, PhD, George Percivall, Carl Reed, PhD, and John Davidson, which can be downloaded from http://www.opengeospatial.org/pt/15540. Readers interested in greater technical and architectural detail can download and read the OGC SWE Architecture Discussion Paper titled “The OGC Sensor Web Enablement Architecture” (OGC document 06-021r1, http://www.opengeospatial.org/pt/14140).

  11. Atmospheric energy harvesting: use of Doppler Wind Lidars on UAVs to extend mission endurance and enable quiet operations

    Science.gov (United States)

    Greco, S.; Emmitt, G. D.; Wood, S. A.; Costello, M.

    2014-10-01

    The investigators are developing a system tool that utilizes both pre-flight information and continuous real-time knowledge of the state of the atmosphere and atmospheric energetics, provided by an Airborne Doppler Wind Lidar (ADWL), to provide autonomous guidance for detailed and adaptive flight path planning by UAS and small manned aircraft. This flight planning and control has the potential to reduce mission dependence upon preflight assumptions, extend flight duration and endurance, enable long periods of quiet operations and allow for optimum self-routing of the aircraft. The ADWL wind data are used in real time to detect atmospheric energy features such as thermals, waves, wind shear and others. These detected features are then used with an onboard, weather-model-driven flight control model to adaptively plan a flight path that optimizes energy harvesting, with frequent updates on local changes in the opportunities and atmospheric flow characteristics. We have named this package AEORA, the Atmospheric Energy Opportunity Ranking Algorithm.

  12. Microwave Discharge Ion Engines onboard Hayabusa Asteroid Explorer

    International Nuclear Information System (INIS)

    Kuninaka, Hitoshi

    2008-01-01

    The Hayabusa spacecraft rendezvoused with the asteroid Itokawa in 2005 after powered flight through deep space by the μ10 cathode-less electron cyclotron resonance ion engines. Though the spacecraft was seriously damaged after the successful soft-landing and lift-off, xenon cold gas jets from the ion engines rescued it. A new attitude stabilization method using a single reaction wheel, the ion beam jets, and photon pressure was established and enabled the homeward journey from April 2007, aiming at an Earth return in 2010. The total accumulated operational time of the ion engines reached 31,400 hours at the end of 2007. One of the four thrusters achieved 13,400 hours of space operation.

  13. On-board landmark navigation and attitude reference parallel processor system

    Science.gov (United States)

    Gilbert, L. E.; Mahajan, D. T.

    1978-01-01

    An approach to autonomous navigation and attitude reference for earth observing spacecraft is described along with the landmark identification technique based on a sequential similarity detection algorithm (SSDA). Laboratory experiments undertaken to determine if better than one pixel accuracy in registration can be achieved consistent with onboard processor timing and capacity constraints are included. The SSDA is implemented using a multi-microprocessor system including synchronization logic and chip library. The data is processed in parallel stages, effectively reducing the time to match the small known image within a larger image as seen by the onboard image system. Shared memory is incorporated in the system to help communicate intermediate results among microprocessors. The functions include finding mean values and summation of absolute differences over the image search area. The hardware is a low power, compact unit suitable to onboard application with the flexibility to provide for different parameters depending upon the environment.
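    The matching step at the heart of the SSDA can be illustrated with a brute-force sum-of-absolute-differences search: slide the known landmark template over the search image and keep the offset with the smallest score (the sequential part of the real algorithm abandons poor offsets early). The Python sketch below uses synthetic images and is only a hedged illustration of the scoring, not the flight parallel-processor implementation.

    import numpy as np

    rng = np.random.default_rng(2)
    search = rng.random((64, 64))              # synthetic image from the onboard sensor
    template = search[20:28, 30:38].copy()     # landmark embedded at a known offset

    th, tw = template.shape
    best_score, best_rc = np.inf, (0, 0)
    for r in range(search.shape[0] - th + 1):
        for c in range(search.shape[1] - tw + 1):
            sad = np.abs(search[r:r + th, c:c + tw] - template).sum()
            if sad < best_score:
                best_score, best_rc = sad, (r, c)

    print(best_rc)   # expected (20, 30)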

  14. Evaluation of the use of on-board spacecraft energy storage for electric propulsion missions

    Science.gov (United States)

    Poeschel, R. L.; Palmer, F. M.

    1983-01-01

    On-board spacecraft energy storage represents an underutilized resource for some types of missions that also benefit from the relatively high specific impulse capability of electric propulsion. This resource can provide an appreciable fraction of the power required for operating the electric propulsion subsystem in some missions. The most probable mission requirement for utilization of this energy is that of geostationary satellites, which have secondary batteries for operating at high power levels during eclipse. The study summarized in this report selected four examples of missions that could benefit from the use of electric propulsion and on-board energy storage. Engineering analyses were performed to evaluate the mass saved and the economic benefit expected when electric propulsion and on-board batteries perform some propulsion maneuvers that would conventionally be provided by chemical propulsion. For a given payload mass in geosynchronous orbit, use of electric propulsion in this manner typically provides a 10% reduction in spacecraft mass.

  15. New control method of on-board ATP system of Shinkansen trains

    Energy Technology Data Exchange (ETDEWEB)

    Fukuda, N.; Watanabe, T. [Railway Technical Research Inst. (Japan)

    2000-07-01

    We studied a new control method for the on-board automatic train protection (ATP) system of Shinkansen trains, intended to shorten operation time without degrading ride comfort during changes in train deceleration, while maintaining the safety and reliability of the present ATP signal system. We propose a new on-board pattern brake control system based on the present ATP data, without changing the wayside equipment. By simulating ATP braking with the proposed control method, we succeeded in shortening the operation time by 48 seconds per station in comparison with the present ATP brake control system. This paper reports the concept of the system and the simulation results for the on-board pattern. (orig.)
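    An on-board braking "pattern" of the kind described is essentially a permitted-speed curve derived from the distance remaining to the stop point. The hedged Python sketch below uses the textbook v = sqrt(2*a*s) relation with a fixed distance margin; the deceleration and margin values are illustrative assumptions, not Shinkansen ATP parameters.

    import numpy as np

    DECEL = 0.9     # assumed service braking deceleration [m/s^2]
    MARGIN = 50.0   # assumed distance margin before the stop point [m]

    def permitted_speed(distance_to_stop_m):
        """Maximum permitted speed [m/s] at a given distance from the stopping point."""
        s = np.maximum(distance_to_stop_m - MARGIN, 0.0)
        return np.sqrt(2.0 * DECEL * s)

    d = np.array([2000.0, 1000.0, 300.0, 60.0])
    print(permitted_speed(d))   # permitted speeds along the approach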

  16. Implementing Temperature Supervision for the ALICE CRU Card Using the Onboard Microcontroller

    CERN Document Server

    Perez Bernabeu, Ruben

    2017-01-01

    We report on the first implementation of the thermal supervisory firmware for the onboard microcontroller on the ALICE CRU card. The Common Readout Unit (CRU) is a custom PCI Express FPGA card developed by the “Centre Physique des Particules de Marseille” in collaboration with LHCb and ALICE. While the main effort has been focused on the development of the FPGA firmware that implements all the communication needs, several independent design tasks have been identified to ensure the safe operation of the CRU card under all possible conditions. One such task is to implement a robust local (on-board) temperature monitoring and safeguarding subsystem based on the ATmega128 microcontroller. It will autonomously prevent thermal damage to the card even if the remote HW monitoring and controlling functions (integrated in DCS) fail for any reason. Consequently, our main goal in this project is to implement the temperature supervision using the onboard microcontroller.
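    The safeguarding behaviour such a supervisor provides can be summarized as threshold logic with hysteresis: trip the card power at a hard limit and only re-enable it once the temperature has fallen well below that limit. The hedged Python sketch below captures that logic only; the thresholds are illustrative assumptions and the real firmware runs on the ATmega128, not in Python.

    TRIP_C = 85.0    # assumed hard shutdown threshold [deg C]
    CLEAR_C = 60.0   # assumed re-enable threshold [deg C] (hysteresis)

    def supervise(temp_c, tripped):
        """One supervision cycle: return (power_enabled, tripped)."""
        if temp_c >= TRIP_C:
            return False, True            # cut card power autonomously
        if tripped and temp_c > CLEAR_C:
            return False, True            # stay off until well below the trip point
        return True, False

    print(supervise(90.0, tripped=False))   # -> (False, True)
    print(supervise(65.0, tripped=True))    # -> (False, True)
    print(supervise(55.0, tripped=True))    # -> (True, False)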

  17. Chang'E-5T Orbit Determination Using Onboard GPS Observations

    OpenAIRE

    Su, Xing; Geng, Tao; Li, Wenwen; Zhao, Qile; Xie, Xin

    2017-01-01

    In recent years, Global Navigation Satellite System (GNSS) has played an important role in Space Service Volume, the region enclosing the altitudes above 3000 km up to 36,000 km. As an in-flight test for the feasibility as well as for the performance of GNSS-based satellite orbit determination (OD), the Chinese experimental lunar mission Chang'E-5T had been equipped with an onboard high-sensitivity GNSS receiver with GPS and GLONASS tracking capability. In this contribution, the 2-h onboard G...

  18. Development and application of an emitter for research of an on-board ultraviolet polarimeter

    Science.gov (United States)

    Nevodovskyi, P. V.; Geraimchuk, M. D.; Vidmachenko, A. P.; Ivakhiv, O. V.

    2018-05-01

    In the course of this work, a layout of an on-board small-sized ultraviolet polarimeter (UVP) was created. The UVP is a device for passive remote sensing of stratospheric aerosol by polarimetry from on board an Earth microsatellite. For testing and research of polarimetric equipment, a special stand was created at the MAO of NAS of Ukraine; an ultraviolet emitter is part of this stand. The emitter is one of the main components of the stand for the study of on-board ultraviolet polarimeters.

  19. Sustainable Venture Capital Investments: An Enabler Investigation

    Directory of Open Access Journals (Sweden)

    Elena Antarciuc

    2018-04-01

    Full Text Available Investing in sustainable projects can help tackle current sustainability challenges. Venture capital investments can contribute significantly to the growth of sustainable start-ups. Sustainable venture capital (SVC) research is just emerging. This paper identifies enablers for sustainable venture capital investments in Saudi Arabia, taking into account different stakeholders and firms' tangible and intangible resources. Using perspectives from venture capital experts in Saudi Arabia and the grey-based Decision-Making Trial and Evaluation Laboratory (DEMATEL) method, this study pinpoints the most critical enablers and investigates their causal and effect interconnections. The methodological process consists of reviewing the SVC literature and consulting the experts to identify the SVC enablers, creating a questionnaire, acquiring the answers from four experts, analyzing the data with grey-based DEMATEL and performing a sensitivity analysis. The government's use of international standards, policies and regulations for sustainable investments, the commitment of venture capitalists to sustainability and their deep understanding of sustainable business models are the most influential enablers. The paper concludes with implications for different actors, limitations and prospective directions for sustainable venture capital research.

  20. The MicrOmega Investigation Onboard Hayabusa2

    Science.gov (United States)

    Bibring, J.-P.; Hamm, V.; Langevin, Y.; Pilorget, C.; Arondel, A.; Bouzit, M.; Chaigneau, M.; Crane, B.; Darié, A.; Evesque, C.; Hansotte, J.; Gardien, V.; Gonnod, L.; Leclech, J.-C.; Meslier, L.; Redon, T.; Tamiatto, C.; Tosti, S.; Thoores, N.

    2017-07-01

    MicrOmega is a near-IR hyperspectral microscope designed to characterize in situ the texture and composition of the surface materials of the Hayabusa2 target asteroid. MicrOmega is implemented within the MASCOT lander (Ho et al. in Space Sci. Rev., 2016, this issue, doi:10.1007/s11214-016-0251-6). The spectral range (0.99-3.65 μm) and the spectral sampling (20 cm^{-1}) of MicrOmega have been chosen to allow the identification of most potential constituent minerals, ices and organics, within each 25 μm pixel of the 3.2× 3.2 mm2 FOV. Such an unprecedented characterization will (1) enable the identification of most major and minor phases, including the potential organic phases, and ascribe their mineralogical context, as a critical set of clues to decipher the origin and evolution of this primitive body, and (2) provide the ground truth for the orbital measurements as well as a reference for the analyses later performed on returned samples.

  1. Model-Based Control of a Nonlinear Aircraft Engine Simulation using an Optimal Tuner Kalman Filter Approach

    Science.gov (United States)

    Connolly, Joseph W.; Csank, Jeffrey Thomas; Chicatelli, Amy; Kilver, Jacob

    2013-01-01

    This paper covers the development of a model-based engine control (MBEC) methodology featuring a self tuning on-board model applied to an aircraft turbofan engine simulation. Here, the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40k) serves as the MBEC application engine. CMAPSS40k is capable of modeling realistic engine performance, allowing for a verification of the MBEC over a wide range of operating points. The on-board model is a piece-wise linear model derived from CMAPSS40k and updated using an optimal tuner Kalman Filter (OTKF) estimation routine, which enables the on-board model to self-tune to account for engine performance variations. The focus here is on developing a methodology for MBEC with direct control of estimated parameters of interest such as thrust and stall margins. Investigations using the MBEC to provide a stall margin limit for the controller protection logic are presented that could provide benefits over a simple acceleration schedule that is currently used in traditional engine control architectures.
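    One way to picture the protection-logic investigation is a clamp on the commanded fuel-flow increment driven by the estimated stall margin, instead of a fixed acceleration schedule. The hedged Python sketch below is only that idea in miniature; the margin threshold and gain are illustrative assumptions, not the CMAPSS40k limit logic.

    def protect_fuel_flow_step(requested_dwf, est_stall_margin_pct, min_margin_pct=10.0, gain=0.02):
        """Scale back the requested fuel-flow increment as the estimated stall margin
        approaches the protection limit; hold it entirely at or below the limit."""
        if est_stall_margin_pct <= min_margin_pct:
            return 0.0
        allowed = gain * (est_stall_margin_pct - min_margin_pct)
        return min(requested_dwf, allowed)

    print(protect_fuel_flow_step(0.05, est_stall_margin_pct=12.0))   # -> 0.04 (clipped)
    print(protect_fuel_flow_step(0.05, est_stall_margin_pct=20.0))   # -> 0.05 (unclipped)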

  2. Proposal of experimental setup on boiling two-phase flow on-orbit experiments onboard Japanese experiment module "KIBO"

    Science.gov (United States)

    Baba, S.; Sakai, T.; Sawada, K.; Kubota, C.; Wada, Y.; Shinmoto, Y.; Ohta, H.; Asano, H.; Kawanami, O.; Suzuki, K.; Imai, R.; Kawasaki, H.; Fujii, K.; Takayanagi, M.; Yoda, S.

    2011-12-01

    Boiling is an efficient mode of heat transfer due to phase change, and is regarded as a promising means for thermal management systems handling a large amount of waste heat under high heat flux. However, gravity effects on two-phase flow phenomena and the corresponding heat transfer characteristics have not been clarified in detail. Experiments on boiling two-phase flow onboard the Japanese Experiment Module "KIBO" of the International Space Station are proposed to clarify both the heat transfer and the flow characteristics under microgravity conditions. To verify the feasibility of the ISS experiments on boiling two-phase flow, a Bread Board Model was assembled and its performance, along with the function of the components installed in a test loop, was examined.

  3. Enabling Rapid Naval Architecture Design Space Exploration

    Science.gov (United States)

    Mueller, Michael A.; Dufresne, Stephane; Balestrini-Robinson, Santiago; Mavris, Dimitri

    2011-01-01

    Well accepted conceptual ship design tools can be used to explore a design space, but more precise results can be found using detailed models in full-feature computer aided design programs. However, defining a detailed model can be a time intensive task and hence there is an incentive for time sensitive projects to use conceptual design tools to explore the design space. In this project, the combination of advanced aerospace systems design methods and an accepted conceptual design tool facilitates the creation of a tool that enables the user to not only visualize ship geometry but also determine design feasibility and estimate the performance of a design.

  4. Product Line Enabled Intelligent Mobile Middleware

    DEFF Research Database (Denmark)

    Zhang, Weishan; Kunz, Thomas; Hansen, Klaus Marius

    2007-01-01

    research project called PLIMM that focuses on user-centered application scenarios. PLIMM is designed based on software product line ideas which make it possible for specialized customization and optimization for different purposes and hardware/software platforms. To enable intelligence, the middleware...... needs access to a range of context models. We model these contexts with OWL, focusing on user-centered concepts. The basic building block of PLIMM is the enhanced BDI agent where OWL context ontology logic reasoning will add indirect beliefs to the belief sets. Our approach also addresses the handling...

  5. Principles for enabling deep secondary design

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Hansen, Magnus Rotvit Perlt

    2017-01-01

    design by analyzing two cases where secondary designers fundamentally change functionality, content and technology complexity level. The first case redesigns a decision model for agile development in an insurance company; the second creates a contingency model for choosing project management tools...... and techniques in a hospital. Our analysis of the two cases leads to the identification of four principles of design implementation that primary designers can apply to enable secondary design and four corresponding design implementation principles that secondary designers themselves need to apply....

  6. Organizational Enablers for Project Governance

    DEFF Research Database (Denmark)

    Müller, Ralf; Shao, Jingting; Pemsel, Sofia

    and their relationships to organizational success. Based on these results, the authors discovered that organizational enablers (including key factors such as leadership, governance, and influence of project managers) have a critical impact on how organizations operate, adapt to market fluctuations and forces, and make......While corporate culture plays a significant role in the success of any corporation, governance and “governmentality” not only determine how business should be conducted, but also define the policies and procedures organizations follow to achieve business functions and goals. In their book......, Organizational Enablers for Project Governance, Ralf Müller, Jingting Shao, and Sofia Pemsel examine the interaction of governance and governmentality in various types of companies and demonstrate how these factors drive business success and influence project work, efficiency, and profitability. The data...

  7. 'Ethos' Enabling Organisational Knowledge Creation

    Science.gov (United States)

    Matsudaira, Yoshito

    This paper examines knowledge creation in relation to improvements on the production line in the manufacturing department of Nissan Motor Company and aims to clarify the embodied knowledge observed in the actions of organisational members who enable knowledge creation. For that purpose, this study adopts an approach that adds first-, second-, and third-person viewpoints to the theory of knowledge creation. Embodied knowledge, observed in the actions of organisational members who enable knowledge creation, is the continued practice of 'ethos' (in Greek) founded in the Nissan Production Way as an ethical basis. Ethos is a knowledge (intangible) asset for knowledge-creating companies. Substantiated analysis classifies ethos into three categories: the individual, the team and the organisation. This indicates the precise actions of the organisational members in each category during the knowledge creation process. This research shows the indispensability of ethos - a new concept of knowledge assets that enables knowledge creation - for future knowledge-based management in the knowledge society.

  8. EUV high resolution imager on-board solar orbiter: optical design and detector performances

    Science.gov (United States)

    Halain, J. P.; Mazzoli, A.; Rochus, P.; Renotte, E.; Stockman, Y.; Berghmans, D.; BenMoussa, A.; Auchère, F.

    2017-11-01

    The EUV high resolution imager (HRI) channel of the Extreme Ultraviolet Imager (EUI) on-board Solar Orbiter will observe the solar atmospheric layers at 17.4 nm wavelength with a 200 km resolution. The HRI channel is based on a compact two-mirror off-axis design. The spectral selection is obtained by a multilayer coating deposited on the mirrors and by redundant aluminum filters rejecting the visible and infrared light. The detector is a 2k x 2k array back-thinned silicon CMOS-APS with 10 μm pixel pitch, sensitive in the EUV wavelength range. Due to the instrument compactness and the constraints on the optical design, the channel performance is very sensitive to manufacturing, alignment and settling errors. A trade-off between two optical layouts was therefore performed to select the final optical design and to improve the mirror mounts. The effect of diffraction by the filter mesh support and by the mirror diffusion has been included in the overall error budget. Manufacturing of the mirrors and mounts has started and will be followed by thermo-mechanical validation on the EUI instrument structural and thermal model (STM). Because of the limited channel entrance aperture and consequently the low input flux, the channel performance also relies on the detector EUV sensitivity, readout noise and dynamic range. Based on the characterization of a CMOS-APS back-side detector prototype, showing promising results, the EUI detector has been specified and is under development. These detectors will undergo a qualification program before being tested and integrated on the EUI instrument.

  9. Development of an on-board H2 storage and recovery system based on lithium borohydride.

    Science.gov (United States)

    2014-02-28

    Alkali metal borohydrides based on sodium and lithium, NaBH4 and LiBH4, have been evaluated as a potential hydrogen storage and recovery system for on-board vehicle use. The borohydride salts could be dissolved in water, followed by a hydrolytic reac...

  10. Study of X-ray transients with Scanning Sky Monitor (SSM) onboard ...

    Indian Academy of Sciences (India)

    M. C. RAMADEVI

    MS received 1 September 2017; accepted 19 December 2017; published online 10 February 2018. Abstract: Scanning Sky Monitor (SSM) onboard AstroSat is an X-ray sky monitor in the ...

  11. On-board conversion of methanol to dimethyl ether as an alternative diesel fuel

    Energy Technology Data Exchange (ETDEWEB)

    Armbruster, H; Heinzelmann, G; Struis, R; Stucki, S [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    The catalytic dehydration of methanol to dimethyl ether was investigated for application on-board a methanol fuelled vehicle. Several catalysts have been tested in a fixed bed reactor. Our approach is to develop a small and efficient reactor converting liquid MeOH under pressure and at low reaction temperatures. (author) 2 figs., 5 refs.

  12. 40 CFR 85.2222 - On-board diagnostic test procedures.

    Science.gov (United States)

    2010-07-01

    ... on-board diagnostic systems on 1996 and newer light-duty vehicles and light-duty trucks shall consist... the unset readiness code(s) in question may be issued a passing certificate without being required to... lit malfunction indicator light (MIL) must be failed, though setting the unset readiness flags in...

  13. On-Board, Real-Time Preprocessing System for Optical Remote-Sensing Imagery.

    Science.gov (United States)

    Qi, Baogui; Shi, Hao; Zhuang, Yin; Chen, He; Chen, Liang

    2018-04-25

    With the development of remote-sensing technology, optical remote-sensing imagery processing has played an important role in many application fields, such as geological exploration and natural disaster prevention. However, relative radiation correction and geometric correction are key steps in preprocessing because raw image data without preprocessing will cause poor performance during application. Traditionally, remote-sensing data are downlinked to the ground station, preprocessed, and distributed to users. This process generates long delays, which is a major bottleneck in real-time applications for remote-sensing data. Therefore, on-board, real-time image preprocessing is greatly desired. In this paper, a real-time processing architecture for on-board imagery preprocessing is proposed. First, a hierarchical optimization and mapping method is proposed to realize the preprocessing algorithm in a hardware structure, which can effectively reduce the computation burden of on-board processing. Second, a co-processing system using a field-programmable gate array (FPGA) and a digital signal processor (DSP; altogether, FPGA-DSP) based on optimization is designed to realize real-time preprocessing. The experimental results demonstrate the potential application of our system to an on-board processor, for which resources and power consumption are limited.
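
    The abstract describes the preprocessing chain only at a high level. As one concrete piece of such a chain, the sketch below illustrates a generic per-detector relative radiometric correction (per-column gain/offset normalization of a pushbroom image). The image and calibration coefficients are synthetic, and this is a simplified stand-in rather than the authors' FPGA-DSP implementation.

        import numpy as np

        # Generic relative radiometric correction for a pushbroom image (illustrative only):
        # each detector column is normalized with its own gain/offset so that column-to-column
        # striping caused by detector non-uniformity is removed.

        rng = np.random.default_rng(1)
        rows, cols = 512, 256

        # Synthetic raw image: a smooth scene distorted by per-column gain and offset.
        scene = np.linspace(100.0, 200.0, rows)[:, None] * np.ones((rows, cols))
        gain = rng.normal(1.0, 0.05, cols)       # per-detector gain (from lab/on-board calibration)
        offset = rng.normal(0.0, 5.0, cols)      # per-detector dark offset
        raw = scene * gain + offset

        # Correction: invert the per-column radiometric model.
        corrected = (raw - offset) / gain

        print("residual striping (std of column means):",
              float(np.std(corrected.mean(axis=0))))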

  14. Contribution of magnetic measurements onboard NetLander to Mars exploration

    DEFF Research Database (Denmark)

    Menvielle, M.; Musmann, G.; Kuhnke, F.

    2000-01-01

    between the environment of the planet and solar radiation, and a secondary source, the electric currents induced in the conductive planet. The continuous recording of the time variations of the magnetic field at the surface of Mars by means of three component magnetometers installed onboard Net...

  15. THE DEVELOPMENT OF METHOD AND ON-BOARD DEVICES FOR COLLISION AVOIDANCE WHEN OVERTAKING

    Directory of Open Access Journals (Sweden)

    Podryhalo, M.

    2013-06-01

    Full Text Available A method is offered for improving the safety of the overtaking maneuver by using an on-board collision avoidance system that provides more reliable assessment of the safety of overtaking vehicles moving in the same direction. The proposed system takes into account the main factors that affect the overtaking maneuver.

  16. Functional requirements for onboard management of space shuttle consumables, volume 1

    Science.gov (United States)

    Graf, P. J.; Herwig, H. A.; Neel, L. W.

    1973-01-01

    A study was conducted to determine the functional requirements for onboard management of space shuttle consumables. A generalized consumable management concept was developed for application to advanced spacecraft. The subsystems and related consumables selected for inclusion in the consumables management system are: (1) propulsion, (2) power generation, and (3) environmental and life support.

  17. A novel approach for navigational guidance of ships using onboard monitoring systems

    DEFF Research Database (Denmark)

    Nielsen, Ulrik Dam; Jensen, Jørgen Juncher

    2011-01-01

    A novel approach and conceptual ideas are outlined for risk-based navigational guidance of ships using decision support systems in combination with onboard, in-service monitoring systems. The guidance has as the main objective to advise on speed and/or course changes; in particular with focus...

  18. Evaluating the Onboarding Phase of Free-to-Play Mobile Games

    DEFF Research Database (Denmark)

    Weigert Petersen, Falko; Thomsen, Line Ebdrup; Mirza-Babaei, Pejman

    2017-01-01

    . This paper presents a study utilizing a lab-based mixed-methods approach in providing insights for evaluating the user experience of onboarding phases in mobile games. This includes an investigation into the contribution of physiological measures (Heart-Rate Variability and Galvanic Skin Conductance) as well...

  19. CALIBRATION OF MODIFIED LIULIN DETECTOR FOR COSMIC RADIATION MEASUREMENTS ON-BOARD AIRCRAFT

    Czech Academy of Sciences Publication Activity Database

    Kyselová, Dagmar; Ambrožová, Iva; Krist, Pavel; Kubančák, Ján; Uchihori, Y.; Kitamura, H.; Ploc, Ondřej

    2015-01-01

    Roč. 164, č. 4 (2015), s. 489-492 ISSN 0144-8420 R&D Projects: GA MŠk(CZ) LG13031 Institutional support: RVO:61389005 Keywords : Liulin detector * on-board aircraft * cosmic radiation measurement Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders Impact factor: 0.894, year: 2015

  20. On-Board, Real-Time Preprocessing System for Optical Remote-Sensing Imagery

    Science.gov (United States)

    Qi, Baogui; Zhuang, Yin; Chen, He; Chen, Liang

    2018-01-01

    With the development of remote-sensing technology, optical remote-sensing imagery processing has played an important role in many application fields, such as geological exploration and natural disaster prevention. However, relative radiation correction and geometric correction are key steps in preprocessing because raw image data without preprocessing will cause poor performance during application. Traditionally, remote-sensing data are downlinked to the ground station, preprocessed, and distributed to users. This process generates long delays, which is a major bottleneck in real-time applications for remote-sensing data. Therefore, on-board, real-time image preprocessing is greatly desired. In this paper, a real-time processing architecture for on-board imagery preprocessing is proposed. First, a hierarchical optimization and mapping method is proposed to realize the preprocessing algorithm in a hardware structure, which can effectively reduce the computation burden of on-board processing. Second, a co-processing system using a field-programmable gate array (FPGA) and a digital signal processor (DSP; altogether, FPGA-DSP) based on optimization is designed to realize real-time preprocessing. The experimental results demonstrate the potential application of our system to an on-board processor, for which resources and power consumption are limited. PMID:29693585

  1. Data systems and computer science space data systems: Onboard networking and testbeds

    Science.gov (United States)

    Dalton, Dan

    1991-01-01

    The technical objectives are to develop high-performance, space-qualifiable, onboard computing, storage, and networking technologies. The topics are presented in viewgraph form and include the following: justification; technology challenges; program description; and state-of-the-art assessment.

  2. Validation of multi-channel scanning microwave radiometer on-board Oceansat-1

    Digital Repository Service at National Institute of Oceanography (India)

    Muraleedharan, P.M.; Pankajakshan, T.; Harikrishnan, M.

    Sea surface temperature (SST), sea surface wind speed (WS) and columnar water vapour (WV) derived from Multi-frequency Scanning Microwave Radiometer (MSMR) sensor on-board IRS-P4 (Oceansat-1) were validated against the in situ measurements from ship...

  3. On-Board File Management and Its Application in Flight Operations

    Science.gov (United States)

    Kuo, N.

    1998-01-01

    In this paper, the author presents the minimum functions required for an on-board file management system. We explore file manipulation processes and demonstrate how the file transfer along with the file management system will be utilized to support flight operations and data delivery.

  4. Human Resources Management: Onboarding Program and Trainer's Guide for Charter School Employees

    Science.gov (United States)

    Cook, Jeannette

    2016-01-01

    The applied dissertation project focused on the development of a comprehensive onboarding program and Trainer's Guide specifically developed for charter school management employees. Charter school education has grown significantly in the last several decades with over 6,100 charter schools that are currently serving students nationwide. Formal or…

  5. Onboard Flow Sensing For Downwash Detection and Avoidance On Small Quadrotor Helicopters

    Science.gov (United States)

    2015-01-01

    onboard computers, one for flight stabilization and a Linux computer for sensor integration and control calculations. The Linux computer runs Robot...

  6. The CFRP primary structure of the MIRI instrument onboard the James Webb Space Telescope

    DEFF Research Database (Denmark)

    Jessen, Niels Christian; Nørgaard-Nielsen, Hans Ulrik; Schroll, J

    2004-01-01

    The design of the Primary Structure of the Mid Infra-Red Instrument (MIRI) onboard the NASA/ESA James Webb Space Telescope will be presented. The main design driver is the energy flow from the 35 K "hot" satellite interface to the 7 K "cold" MIRI interface. Carbon fibre reinforced plastic (CFRP...

  7. SE83-9 'Chix in Space' student experimenter monitors STS-29 onboard activity

    Science.gov (United States)

    1989-01-01

    Student experimenter John C. Vellinger watches monitor in the JSC Mission Control Center (MCC) Bldg 30 Customer Support Room (CSR) during the STS-29 mission. Crewmembers are working with his Student Experiment (SE) 83-9 Chicken Embryo Development in Space or 'Chix in Space' onboard Discovery, Orbiter Vehicle (OV) 103. The student's sponsor is Kentucky Fried Chicken (KFC).

  8. 19 CFR 122.49b - Electronic manifest requirement for crew members and non-crew members onboard commercial aircraft...

    Science.gov (United States)

    2010-04-01

    ...” means air carrier employees and their family members and persons traveling onboard a commercial aircraft...), air carrier employees, their family members, and persons onboard for the safety of the flight are...) Date of birth; (iii) Place of birth (city, state—if applicable, country); (iv) Gender (F = female; M...

  9. USING THE INFORMATION OF ON-BOARD DIAGNOSTIC SYSTEMS IN DETERMINING THE TECHNICAL STATE OF THE LOCOMOTIVE

    Directory of Open Access Journals (Sweden)

    B. Ye. Bodnar

    2008-12-01

    Full Text Available The issues of increasing the efficiency of information processing by on-board diagnostic systems of locomotives are considered. Examples of information processing by the on-board diagnostic system of the DE1 electric locomotive are presented. Suggestions for improving the systematization and processing of information by on-board diagnostic systems are given.

  10. Onboard screening dockside testing as a new means of managing paralytic shellfish poisoning risks in federally closed waters

    Science.gov (United States)

    DeGrasse, Stacey; Conrad, Stephen; DiStefano, Paul; Vanegas, Camilo; Wallace, David; Jensen, Pete; Hickey, J. Michael; Cenci, Florence; Pitt, Jaclyn; Deardorff, Dave; Rubio, Fernando; Easy, Dorothy; Donovan, Mary Anne; Laycock, Maurice; Rouse, Debbie; Mullen, John

    2014-05-01

    using the mouse bioassay (MBA) prior to its introduction into commerce. This paper presents data from the pilot study, with primary focus on the advantages and challenges of the field kits employed onboard compared to the dockside MBA, which has served as the longstanding regulatory method for PSP toxins. In 2010 alone, the successful pilot study resulted in the safe harvest of over 2.7 million worth of surfclams in an area that has otherwise been unavailable for decades. Due to the success of this pilot study, the Protocol was adopted into the National Shellfish Sanitation Program Model Ordinance as an approved marine biotoxin control strategy for use in federal waters at the 2011 ISSC Biennial Meeting. In January 2013 a portion of Georges Bank was reopened for the harvest of Atlantic surfclams and ocean quahogs to fishermen following the Protocol.

  11. Traffic Analysis for Real-Time Communication Networks onboard Ships

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Nielsen, Jens Frederik Dalsgaard; Jørgensen, N.

    1998-01-01

    The paper presents a novel method for establishing worst-case estimates of queue lengths and transmission delays in networks of interconnected segments, each of ring topology, as defined by the ATOMOS project for marine automation. A non-probabilistic model for describing traffic is introduced as well...
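
    The record does not reproduce the non-probabilistic traffic model itself. Purely as a generic illustration of worst-case bounds of this kind (and not the ATOMOS-specific method), the sketch below computes network-calculus style backlog and delay bounds for a token-bucket constrained flow served by a rate-latency segment; all parameter values are invented.

        # Illustrative worst-case bounds (network-calculus style), not the ATOMOS-specific model.
        # Arrivals constrained by a token bucket (burst sigma bytes, sustained rate rho bytes/s);
        # the network segment offers a rate-latency service curve (rate R bytes/s, latency T s).

        def worst_case_backlog(sigma: float, rho: float, R: float, T: float) -> float:
            """Maximum queue length (bytes), valid when rho <= R."""
            assert rho <= R, "queue is unstable if the arrival rate exceeds the service rate"
            return sigma + rho * T

        def worst_case_delay(sigma: float, R: float, T: float) -> float:
            """Maximum queueing plus transmission delay (seconds)."""
            return T + sigma / R

        sigma, rho = 4_000.0, 50_000.0       # 4 kB burst, 50 kB/s sustained (assumed)
        R, T = 125_000.0, 0.002              # 1 Mbit/s segment, 2 ms scheduling latency (assumed)

        print("worst-case queue :", worst_case_backlog(sigma, rho, R, T), "bytes")
        print("worst-case delay :", worst_case_delay(sigma, R, T) * 1e3, "ms")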

  12. Traffic Analysis for Real-Time Communication Networks onboard Ships

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Nielsen, Jens Frederik Dalsgaard; Jørgensen, N.

    The paper presents a novel method for establishing worst-case estimates of queue lengths and transmission delays in networks of interconnected segments, each of ring topology, as defined by the ATOMOS project for marine automation. A non-probabilistic model for describing traffic is introduced as well...

  13. On-Board Sound Intensity (OBSI) study : phase 2.

    Science.gov (United States)

    2014-05-01

    This is a continuation effort of previous research (Modeling of Quieter Pavement in Florida) and as such is a sister report to the previous final report. Both research efforts pertain to the noise created at the tire/pavement interface, which con...

  14. Online technique for detecting state of onboard fiber optic gyroscope

    International Nuclear Information System (INIS)

    Miao, Zhiyong; He, Kunpeng; Pang, Shuwan; Xu, Dingjie; Tian, Chunmiao

    2015-01-01

    Although the angle random walk (ARW) of a fiber optic gyroscope (FOG) is well modeled and identified before the gyroscope is integrated into the high-accuracy attitude control system of a satellite, aging and unexpected failures can affect the performance of the FOG after launch, resulting in variation of the ARW coefficient. Therefore, the ARW coefficient can be regarded as an indicator of the “state of health” of the FOG. The Allan variance method can be used to estimate the ARW coefficient of a FOG; however, it requires a large amount of data to be stored, and the procedure of drawing slope lines for estimation is cumbersome. To overcome these barriers, a weighted state-space model that directly models the ARW was established for the FOG, yielding a nonlinear state-space model. A neural extended Kalman filter algorithm was then implemented to estimate and track the variation of the ARW in real time. The experimental results show that the proposed approach is valid for detecting the state of the FOG. Moreover, the proposed technique effectively avoids the storage of large amounts of data.
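
    For context on the Allan-variance baseline that the authors contrast with, the sketch below computes an overlapping Allan deviation from a simulated gyro rate record and reads off the angle random walk coefficient at tau = 1 s. The data are simulated white noise, not flight FOG telemetry, and the sample rate and ARW value are assumed.

        import numpy as np

        # Allan-deviation estimate of the angle random walk (ARW) coefficient from gyro rate data.
        # A simulated white-noise rate record stands in for real FOG telemetry.

        rng = np.random.default_rng(2)
        fs = 100.0                                  # sample rate [Hz] (assumed)
        arw_true = 0.05                             # deg/sqrt(hr), assumed "true" value
        sigma_rate = arw_true / 60.0 * np.sqrt(fs)  # deg/s white noise giving that ARW
        rate = rng.normal(0.0, sigma_rate, int(3600 * fs))

        def allan_deviation(rate, fs, taus):
            theta = np.cumsum(rate) / fs            # integrated angle
            adev = []
            for tau in taus:
                m = int(tau * fs)
                d = theta[2 * m:] - 2.0 * theta[m:-m] + theta[:-2 * m]
                adev.append(np.sqrt(0.5 * np.mean(d ** 2)) / tau)
            return np.array(adev)

        taus = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
        adev = allan_deviation(rate, fs, taus)

        # For white rate noise, sigma(tau) = ARW / sqrt(tau); read ARW at tau = 1 s.
        arw_est = adev[taus == 1.0][0] * 60.0       # deg/s/sqrt(s) -> deg/sqrt(hr)
        print("estimated ARW:", round(arw_est, 3), "deg/sqrt(hr)  (true:", arw_true, ")")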

  15. Online technique for detecting state of onboard fiber optic gyroscope

    Energy Technology Data Exchange (ETDEWEB)

    Miao, Zhiyong; He, Kunpeng, E-mail: pengkhe@126.com; Pang, Shuwan [Department of Automation, Harbin Engineering University, Harbin, Heilongjiang 150000 (China); Xu, Dingjie [School of Electrical Engineering and Automation, Harbin Institute of Technology, Harbin, Heilongjiang 150000 (China); Tian, Chunmiao [Department of Information and Communication Engineering, Harbin Engineering University, Harbin, Heilongjiang 150000 (China)

    2015-02-15

    Although the angle random walk (ARW) of a fiber optic gyroscope (FOG) is well modeled and identified before the gyroscope is integrated into the high-accuracy attitude control system of a satellite, aging and unexpected failures can affect the performance of the FOG after launch, resulting in variation of the ARW coefficient. Therefore, the ARW coefficient can be regarded as an indicator of the “state of health” of the FOG. The Allan variance method can be used to estimate the ARW coefficient of a FOG; however, it requires a large amount of data to be stored, and the procedure of drawing slope lines for estimation is cumbersome. To overcome these barriers, a weighted state-space model that directly models the ARW was established for the FOG, yielding a nonlinear state-space model. A neural extended Kalman filter algorithm was then implemented to estimate and track the variation of the ARW in real time. The experimental results show that the proposed approach is valid for detecting the state of the FOG. Moreover, the proposed technique effectively avoids the storage of large amounts of data.

  16. Using Small Capacity Fuel Cells Onboard Drones for Battery Cooling: An Experimental Study

    Directory of Open Access Journals (Sweden)

    Shayok Mukhopadhyay

    2018-06-01

    Full Text Available Recently, quadrotor-based drones have attracted a lot of attention because of their versatility, which makes them an ideal medium for a variety of applications, e.g., personal photography, surveillance, and the delivery of lightweight packages. The flight duration of a drone is limited by its battery capacity. Increasing the payload capacity of a drone requires more current to be supplied by the onboard battery. Elevated currents through a Li-ion battery can increase the battery temperature, thus posing a significant risk of fire or explosion. Li-ion batteries are suited for drone applications due to their high energy density. There have been attempts to use hydrogen fuel cells onboard drones. Fuel cell stacks and fuel tank assemblies can have a high energy-to-weight ratio, so they may be able to power long-duration drone flights, but such fuel cell stacks and associated systems are usually extremely expensive. Hence, this work proposes the novel use of a less expensive, low-capacity, metal hydride fuel stick-powered fuel cell stack as an auxiliary power supply onboard a drone. A primary advantage of this is that the fuel sticks can be used to cool the batteries, and a side effect is that this slightly reduces the burden on the onboard Li-ion battery and provides a small increment in flight time. This work presents the results of an experimental study which shows the primary effect (i.e., a decrease in battery temperature) and the secondary side effect (i.e., a small increment in flight time) obtained by using a fuel cell stack. In this work, a metal hydride fuel stick-powered hydrogen fuel cell is used along with a Li-ion battery onboard a drone.

  17. Applying the COM-B model to creation of an IT-enabled health coaching and resource linkage program for low-income Latina moms with recent gestational diabetes: the STAR MAMA program.

    Science.gov (United States)

    Handley, Margaret A; Harleman, Elizabeth; Gonzalez-Mendez, Enrique; Stotland, Naomi E; Althavale, Priyanka; Fisher, Lawrence; Martinez, Diana; Ko, Jocelyn; Sausjord, Isabel; Rios, Christina

    2016-05-18

    One of the fastest growing risk groups for early onset of diabetes is women with a recent pregnancy complicated by gestational diabetes, and for this group, Latinas are the largest at-risk group in the USA. Although evidence-based interventions, such as the Diabetes Prevention Program (DPP), which focuses on low-cost changes in eating, physical activity and weight management, can lower diabetes risk and delay onset, these programs have yet to be tailored to postpartum Latina women. This study aims to tailor an IT-enabled health communication program to promote DPP-concordant behavior change among postpartum Latina women with recent gestational diabetes. The COM-B model (incorporating Capability, Opportunity, and Motivational behavioral barriers and enablers) and the Behavior Change Wheel (BCW) framework convey a theoretically based approach for intervention development. We combined a health literacy-tailored health IT tool for reaching ethnic minority patients with diabetes with a BCW-based approach to develop a health coaching intervention targeted to postpartum Latina women with recent gestational diabetes. Current evidence, four focus groups (n = 22 participants), and input from a Regional Consortium of health care providers, diabetes experts, and health literacy practitioners informed the intervention development. Thematic analysis of focus group data used the COM-B model to determine content. Relevant cultural, theoretical, and technological components that underpin the design and development of the intervention were selected using the BCW framework. STAR MAMA delivers DPP content in Spanish and English using health communication strategies to: (1) validate the emotions and experiences postpartum women struggle with; (2) encourage integration of prevention strategies into family life through mothers becoming intergenerational custodians of health; and (3) increase social and material supports through referral to social networks, health coaches, and

  18. UAV Onboard Photogrammetry and GPS Positioning for Earthworks

    Science.gov (United States)

    Daakir, M.; Pierrot-Deseilligny, M.; Bosser, P.; Pichard, F.; Thom, C.

    2015-08-01

    Over the last decade, Unmanned Airborne Vehicles (UAVs) have been widely used for civil applications. Airborne photogrammetry has found a place in these applications not only for 3D modeling but also as a measurement tool. Vinci-Construction-Terrassement is a private company specialized in the public works sector and uses airborne photogrammetry as a mapping solution and metrology investigation tool on its sites. This technology is very efficient, for instance, for the calculation of stock volumes, or for time tracking of specific areas at risk of landslides. The aim of the present work is to perform direct georeferencing of images acquired by the camera, relying on an embedded GPS receiver. The UAV, GPS receiver and camera used are low-cost models, and the data processing is therefore adapted to this particular constraint.
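
    The processing chain is not detailed in the record. A minimal sketch of one step of direct georeferencing is shown below: interpolating the onboard GPS track to each exposure timestamp and applying a lever-arm offset from the antenna to the camera centre. All coordinates, timestamps and the lever arm are invented, and a real pipeline would also handle attitude, synchronization and datum issues.

        import numpy as np

        # Minimal direct-georeferencing step: camera positions at exposure times are obtained by
        # interpolating the onboard GPS trajectory and applying an antenna-to-camera lever arm.
        # All numbers are invented.

        gps_t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])              # GPS epochs [s]
        gps_xyz = np.array([[0.0,  0.0, 50.0],                   # antenna positions [m], local frame
                            [5.0,  0.5, 50.2],
                            [10.0, 1.0, 50.1],
                            [15.0, 1.4, 49.9],
                            [20.0, 2.0, 50.0]])

        exposure_t = np.array([0.4, 1.7, 3.2])                    # camera trigger times [s]
        lever_arm = np.array([0.10, 0.00, -0.15])                 # antenna -> camera offset [m]

        # Linear interpolation of each coordinate at the exposure epochs.
        cam_xyz = np.column_stack([np.interp(exposure_t, gps_t, gps_xyz[:, i]) for i in range(3)])
        cam_xyz += lever_arm                                      # shift from antenna to projection centre

        for t, p in zip(exposure_t, cam_xyz):
            print(f"t = {t:.1f} s  camera centre = {p}")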

  19. Smart Grid enabled heat pumps

    DEFF Research Database (Denmark)

    Carmo, Carolina; Detlefsen, Nina; Nielsen, Mads Pagh

    2014-01-01

    The transition towards a 100 % fossil-free energy system, while achieving extreme penetration levels of intermittent wind and solar power in electricity generation, requires demand-side technologies that are smart (intermittency-friendly) and efficient. The integration of Smart Grid enabling...... with an empirical study in order to achieve a number of recommendations with respect to technology concepts and control strategies that would allow residential vapor-compression heat pumps to support large-scale integration of intermittent renewables. The analysis is based on data gathered over a period of up to 3...

  20. Extending Quad-Rotor UAV Autonomy with Onboard Image Processing

    Science.gov (United States)

    2015-03-01

    [Figure captions: Recognition subsystem of the Image Capture model; Remote-controlled car, used as the target in this experiment.] Unmanned vehicles are used by researchers throughout the world to study control theory, aerodynamics, guidance, and dozens of other... The algorithm is tested in an outdoor suburban environment, where the Parrot successfully tracks a variety of objects including people, cars, and

  1. Limb clouds and dust on Mars from images obtained by the Visual Monitoring Camera (VMC) onboard Mars Express

    Science.gov (United States)

    Sánchez-Lavega, A.; Chen-Chen, H.; Ordoñez-Etxeberria, I.; Hueso, R.; del Río-Gaztelurrutia, T.; Garro, A.; Cardesín-Moinelo, A.; Titov, D.; Wood, S.

    2018-01-01

    The Visual Monitoring Camera (VMC) onboard the Mars Express (MEx) spacecraft is a simple camera aimed to monitor the release of the Beagle-2 lander on Mars Express and later used for public outreach. Here, we employ VMC as a scientific instrument to study and characterize high altitude aerosols events (dust and condensates) observed at the Martian limb. More than 21,000 images taken between 2007 and 2016 have been examined to detect and characterize elevated layers of dust in the limb, dust storms and clouds. We report a total of 18 events for which we give their main properties (areographic location, maximum altitude, limb projected size, Martian solar longitude and local time of occurrence). The top altitudes of these phenomena ranged from 40 to 85 km and their horizontal extent at the limb ranged from 120 to 2000 km. They mostly occurred at Equatorial and Tropical latitudes (between ∼30°N and 30°S) at morning and afternoon local times in the southern fall and northern winter seasons. None of them are related to the orographic clouds that typically form around volcanoes. Three of these events have been studied in detail using simultaneous images taken by the MARCI instrument onboard Mars Reconnaissance Orbiter (MRO) and studying the properties of the atmosphere using the predictions from the Mars Climate Database (MCD) General Circulation Model. This has allowed us to determine the three-dimensional structure and nature of these events, with one of them being a regional dust storm and the two others water ice clouds. Analyses based on MCD and/or MARCI images for the other cases studied indicate that the rest of the events correspond most probably to water ice clouds.

  2. Development and implementation of a new onboard diagnosis method for automotive lithium-ion-batteries; Entwicklung und Implementierung einer neuen Onboard-Diagnosemethode fuer Lithium-Ionen-Fahrzeugbatterien

    Energy Technology Data Exchange (ETDEWEB)

    Brill, Michael

    2012-11-01

    The author of the contribution under consideration reports on an onboard diagnosis for lithium-ion accumulators which determines the actual state of aging of a high-voltage drive battery during the normal usage of hybrid and electrically driven vehicles. Due to the limited computing time and storage resources in the battery control unit, a combined process is shown which analyses the state of aging of the total battery as a unit and additionally the scattering of the battery cells. Furthermore, the procedure is designed to supply an optimal result with the available measurement signals.
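
    The record only outlines the combined pack-level/cell-scatter idea. The sketch below shows one simple way such quantities could be formed on a battery control unit: a capacity-based state of health for the pack as a unit, plus the relative spread of the individual cell capacity estimates. All values are invented and this is not the method developed in the thesis.

        import numpy as np

        # Illustrative pack-level aging indicator plus cell-to-cell scatter (not the author's method).
        # State of health (SOH) is taken here as remaining capacity over nominal capacity.

        nominal_capacity_ah = 40.0
        cell_capacities_ah = np.array([36.1, 35.8, 36.4, 34.9, 36.0, 35.5])   # per-cell estimates (assumed)

        pack_capacity_ah = cell_capacities_ah.min()          # weakest cell limits a series pack
        soh_pack = pack_capacity_ah / nominal_capacity_ah    # aging of the battery "as a unit"

        scatter = cell_capacities_ah.std() / cell_capacities_ah.mean()   # relative cell scatter

        print(f"pack SOH     : {100 * soh_pack:.1f} %")
        print(f"cell scatter : {100 * scatter:.2f} % (large values flag diverging cells)")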

  3. Fault-Tolerant Onboard Monitoring and Decision Support Systems

    DEFF Research Database (Denmark)

    Lajic, Zoran

    a crude and simple estimation of the actual sea state (Hs and Tz), information about the longitudinal hull girder loading, seakeeping performance of the ship, and decision support on how to operate the ship within acceptable limits. The system is able to identify critical forthcoming events and to give...... advice regarding speed and course changes to decrease the wave-induced loads. The SeaSense system is based on the combined use of a mathematical model and measurements from a set of sensors. The overall dependability of a shipboard monitoring and decision support system such as the SeaSense system can...

  4. Background simulations for the Large Area Detector onboard LOFT

    DEFF Research Database (Denmark)

    Campana, Riccardo; Feroci, Marco; Ettore, Del Monte

    2013-01-01

    and magnetic fields around compact objects and in supranuclear density conditions. Having an effective area of ∼10 m² at 8 keV, LOFT will be able to measure with high sensitivity very fast variability in the X-ray fluxes and spectra. A good knowledge of the in-orbit background environment...... is essential to assess the scientific performance of the mission and optimize the design of its main instrument, the Large Area Detector (LAD). In this paper the results of an extensive Geant-4 simulation of the instrument will be discussed, showing the main contributions to the background and the design...... an anticipated modulation of the background rate as small as 10 % over the orbital timescale. The intrinsic photonic origin of the largest background component also allows for an efficient modelling, supported by an in-flight active monitoring, allowing to predict systematic residuals significantly better than...

  5. PHM Enabled Autonomous Propellant Loading Operations

    Science.gov (United States)

    Walker, Mark; Figueroa, Fernando

    2017-01-01

    The utility of Prognostics and Health Management (PHM) software capability applied to Autonomous Operations (AO) remains an active research area within aerospace applications. The ability to gain insight into which assets and subsystems are functioning properly, along with the derivation of confident predictions concerning future ability, reliability, and availability, are important enablers for making sound mission planning decisions. When coupled with software that fully supports mission planning and execution, an integrated solution can be developed that leverages state assessment and estimation for the purposes of delivering autonomous operations. The authors have been applying this integrated, model-based approach to the autonomous loading of cryogenic spacecraft propellants at Kennedy Space Center.

  6. Robotics On-Board Trainer (ROBoT)

    Science.gov (United States)

    Johnson, Genevieve; Alexander, Greg

    2013-01-01

    ROBoT is an on-orbit version of the ground-based Dynamics Skills Trainer (DST) that astronauts use for training on a frequent basis. This software consists of two primary software groups. The first series of components is responsible for displaying the graphical scenes. The remaining components are responsible for simulating the Mobile Servicing System (MSS), the Japanese Experiment Module Remote Manipulator System (JEMRMS), and the H-II Transfer Vehicle (HTV) Free Flyer Robotics Operations. The MSS simulation software includes: Robotic Workstation (RWS) simulation, a simulation of the Space Station Remote Manipulator System (SSRMS), a simulation of the ISS Command and Control System (CCS), and a portion of the Portable Computer System (PCS) software necessary for MSS operations. These components all run under the CentOS4.5 Linux operating system. The JEMRMS simulation software includes real-time, HIL, dynamics, manipulator multi-body dynamics, and a moving object contact model with Trick's discrete time scheduling. The JEMRMS DST will be used as a functional proficiency and skills trainer for flight crews. The HTV Free Flyer Robotics Operations simulation software adds a functional simulation of HTV vehicle controllers, sensors, and data to the MSS simulation software. These components are intended to support HTV ISS visiting vehicle analysis and training. The scene generation software will use DOUG (Dynamic On-orbit Ubiquitous Graphics) to render the graphical scenes. DOUG runs on a laptop running the CentOS4.5 Linux operating system. DOUG is an OpenGL-based 3D computer graphics rendering package. It uses pre-built three-dimensional models of on-orbit ISS and space shuttle systems elements, and provides real-time views of various station and shuttle configurations.

  7. Rapid hyperspectral image classification to enable autonomous search systems

    Directory of Open Access Journals (Sweden)

    Raj Bridgelal

    2016-11-01

    Full Text Available The emergence of lightweight full-frame hyperspectral cameras is destined to enable autonomous search vehicles in the air, on the ground and in water. Self-contained and long-endurance systems will yield important new applications, for example, in emergency response and the timely identification of environmental hazards. One missing capability is rapid classification of hyperspectral scenes so that search vehicles can immediately take actions to verify potential targets. Onsite verifications minimise false positives and preclude the expense of repeat missions. Verifications will require enhanced image quality, which is achievable by either moving closer to the potential target or by adjusting the optical system. Such a solution, however, is currently impractical for small mobile platforms with finite energy sources. Rapid classifications with current methods demand large computing capacity that will quickly deplete the on-board battery or fuel. To develop the missing capability, the authors propose a low-complexity hyperspectral image classifier that approaches the performance of prevalent classifiers. This research determines that the new method will require at least 19-fold less computing capacity than the prevalent classifier. To assess relative performances, the authors developed a benchmark that compares a statistic of library endmember separability in their respective feature spaces.
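
    The low-complexity classifier itself is not reproduced in the record. As a generic example of the kind of pixel-wise labelling an onboard hyperspectral classifier performs (not the authors' method or their benchmark), the sketch below classifies synthetic pixels by spectral angle against a small library of endmembers.

        import numpy as np

        # Generic pixel-wise hyperspectral classification by spectral angle to library endmembers.
        # Synthetic data only; illustrates per-pixel labelling, not the proposed low-complexity method.

        rng = np.random.default_rng(3)
        bands, n_pixels = 50, 1000

        endmembers = rng.uniform(0.1, 1.0, (3, bands))            # 3 library spectra (assumed)
        labels_true = rng.integers(0, 3, n_pixels)
        pixels = endmembers[labels_true] * rng.uniform(0.8, 1.2, (n_pixels, 1)) \
                 + rng.normal(0.0, 0.02, (n_pixels, bands))

        # Spectral angle between each pixel and each endmember; smallest angle wins.
        p_norm = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
        e_norm = endmembers / np.linalg.norm(endmembers, axis=1, keepdims=True)
        angles = np.arccos(np.clip(p_norm @ e_norm.T, -1.0, 1.0))
        labels_pred = angles.argmin(axis=1)

        print("classification accuracy:", float((labels_pred == labels_true).mean()))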

  8. The European project Merlin on multi-gigabit, energy-efficient, ruggedized lightwave engines for advanced on-board digital processors

    Science.gov (United States)

    Stampoulidis, L.; Kehayas, E.; Karppinen, M.; Tanskanen, A.; Heikkinen, V.; Westbergh, P.; Gustavsson, J.; Larsson, A.; Grüner-Nielsen, L.; Sotom, M.; Venet, N.; Ko, M.; Micusik, D.; Kissinger, D.; Ulusoy, A. C.; King, R.; Safaisini, R.

    2017-11-01

    Modern broadband communication networks rely on satellites to complement the terrestrial telecommunication infrastructure. Satellites accommodate global reach and enable world-wide direct broadcasting by facilitating wide access to the backbone network from remote sites or areas where the installation of ground segment infrastructure is not economically viable. At the same time the new broadband applications increase the bandwidth demands in every part of the network - and satellites are no exception. Modern telecom satellites incorporate On-Board Processors (OBP) having analogue-to-digital (ADC) and digital-to-analogue converters (DAC) at their inputs/outputs and making use of digital processing to handle hundreds of signals; as the amount of information exchanged increases, so do the physical size, mass and power consumption of the interconnects required to transfer massive amounts of data through bulk electric wires.

  9. Cyber-Enabled Scientific Discovery

    International Nuclear Information System (INIS)

    Chan, Tony; Jameson, Leland

    2007-01-01

    It is often said that numerical simulation is third in the group of three ways to explore modern science: theory, experiment and simulation. Carefully executed modern numerical simulations can, however, be considered at least as relevant as experiment and theory. In comparison to physical experimentation, with numerical simulation one has the numerically simulated values of every field variable at every grid point in space and time. In comparison to theory, with numerical simulation one can explore sets of very complex non-linear equations such as the Einstein equations that are very difficult to investigate theoretically. Cyber-enabled scientific discovery is not just about numerical simulation but about every possible issue related to scientific discovery by utilizing cyberinfrastructure such as the analysis and storage of large data sets, the creation of tools that can be used by broad classes of researchers and, above all, the education and training of a cyber-literate workforce

  10. Simulation enabled safeguards assessment methodology

    International Nuclear Information System (INIS)

    Bean, Robert; Bjornard, Trond; Larson, Tom

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wire-frame construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed. (authors)

  11. Simulation Enabled Safeguards Assessment Methodology

    International Nuclear Information System (INIS)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment Methodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed

  12. Context-Enabled Business Intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Troy Hiltbrand

    2012-04-01

    To truly understand context and apply it in business intelligence, it is vital to understand what context is and how it can be applied in addressing organizational needs. Context describes the facets of the environment that impact the way that end users interact with the system. Context includes aspects of location, chronology, access method, demographics, social influence/relationships, end-user attitude/emotional state, behavior/past behavior, and presence. To be successful in making Business Intelligence context-enabled, it is important to be able to capture the context of the user. With advances in technology, there are a number of ways in which this user-based information can be gathered and exposed to enhance the overall end-user experience.

  13. Informatics enables public health surveillance

    Directory of Open Access Journals (Sweden)

    Scott J. N McNabb

    2017-01-01

    Full Text Available Over the past decade, the world has radically changed. New advances in information and communication technologies (ICT) connect the world in ways never imagined. Public health informatics (PHI), leveraged for public health surveillance (PHS), can enable, enhance, and empower essential PHS functions (i.e., detection, reporting, confirmation, analyses, feedback, response). However, the tail doesn't wag the dog; as such, ICT cannot (should not) drive public health surveillance strengthening. Rather, ICT can serve PHS to more effectively empower core functions. In this review, we explore promising ICT trends for prevention, detection, and response, laboratory reporting, push notification, analytics, predictive surveillance, and using new data sources, while recognizing that it is the people, politics, and policies that most challenge progress for implementation of solutions.

  14. Advanced light source technologies that enable high-volume manufacturing of DUV lithography extensions

    Science.gov (United States)

    Cacouris, Theodore; Rao, Rajasekhar; Rokitski, Rostislav; Jiang, Rui; Melchior, John; Burfeindt, Bernd; O'Brien, Kevin

    2012-03-01

    Deep UV (DUV) lithography is being applied to pattern increasingly finer geometries, leading to solutions like double- and multiple-patterning. Such process complexities lead to higher costs due to the increasing number of steps required to produce the desired results. One of the consequences is that the lithography equipment needs to provide higher operating efficiencies to minimize the cost increases, especially for producers of memory devices that experience a rapid decline in sales prices of these products over time. In addition to having introduced higher power 193nm light sources to enable higher throughput, we previously described technologies that also enable: higher tool availability via advanced discharge chamber gas management algorithms; improved process monitoring via enhanced on-board beam metrology; and increased depth of focus (DOF) via light source bandwidth modulation. In this paper we will report on the field performance of these technologies with data that supports the desired improvements in on-wafer performance and operational efficiencies.

  15. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad-hoc manner, however if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality then knowledge of this uncertainty enables optimal exploitation of the observations in further processes, or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user contributed weather data where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards including SWE common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation and usage within existing standards and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.

  16. Precomputing Process Noise Covariance for Onboard Sequential Filters

    Science.gov (United States)

    Olson, Corwin G.; Russell, Ryan P.; Carpenter, J. Russell

    2017-01-01

    Process noise is often used in estimation filters to account for unmodeled and mismodeled accelerations in the dynamics. The process noise covariance acts to inflate the state covariance over propagation intervals, increasing the uncertainty in the state. In scenarios where the acceleration errors change significantly over time, the standard process noise covariance approach can fail to provide effective representation of the state and its uncertainty. Consider covariance analysis techniques provide a method to precompute a process noise covariance profile along a reference trajectory using known model parameter uncertainties. The process noise covariance profile allows significantly improved state estimation and uncertainty representation over the traditional formulation. As a result, estimation performance on par with the consider filter is achieved for trajectories near the reference trajectory without the additional computational cost of the consider filter. The new formulation also has the potential to significantly reduce the trial-and-error tuning currently required of navigation analysts. A linear estimation problem as described in several previous consider covariance analysis studies is used to demonstrate the effectiveness of the precomputed process noise covariance, as well as a nonlinear descent scenario at the asteroid Bennu with optical navigation.
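
    The consider-analysis derivation behind the profile is not given in the record. The sketch below only illustrates where such a precomputed profile enters a sequential filter: each covariance time update adds a process noise matrix looked up from a stored, time-indexed profile instead of a single hand-tuned constant. The transition matrix and profile values are invented.

        import numpy as np

        # Where a precomputed process-noise profile enters a sequential filter (illustrative only):
        # at each propagation step the state covariance is inflated with Q_k taken from a profile
        # computed offline along a reference trajectory, instead of one constant, hand-tuned Q.

        n = 4
        F = np.eye(n) + 0.01 * np.diag(np.ones(n - 1), k=1)      # toy state transition matrix (assumed)

        # Offline-generated profile: Q grows where unmodeled accelerations are expected to be larger.
        steps = 50
        q_profile = [1e-8 * (1.0 + 5.0 * np.exp(-((k - 30) / 5.0) ** 2)) * np.eye(n)
                     for k in range(steps)]

        P = 1e-4 * np.eye(n)                                      # initial state covariance
        trace_history = []
        for k in range(steps):
            Q_k = q_profile[k]                                    # lookup instead of constant tuning
            P = F @ P @ F.T + Q_k                                 # standard covariance time update
            trace_history.append(np.trace(P))

        print("covariance trace at start / peak-Q / end:",
              round(trace_history[0], 6), round(trace_history[30], 6), round(trace_history[-1], 6))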

  17. Real-Time On-Board Airborne Demonstration of High-Speed On-Board Data Processing for Science Instruments (HOPS)

    Science.gov (United States)

    Beyon, Jeffrey Y.; Ng, Tak-Kwong; Davis, Mitchell J.; Adams, James K.; Bowen, Stephen C.; Fay, James J.; Hutchinson, Mark A.

    2015-01-01

    The project called High-Speed On-Board Data Processing for Science Instruments (HOPS) has been funded by the NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) program since April 2012. The HOPS team recently completed two flight campaigns during the summer of 2014 on two different aircraft with two different science instruments. The first flight campaign was in July 2014, based at NASA Langley Research Center (LaRC) in Hampton, VA, on NASA's HU-25 aircraft. The science instrument that flew with HOPS was the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) CarbonHawk Experiment Simulator (ACES), funded by NASA's Instrument Incubator Program (IIP). The second campaign was in August 2014, based at NASA Armstrong Flight Research Center (AFRC) in Palmdale, CA, on NASA's DC-8 aircraft. HOPS flew with the Multifunctional Fiber Laser Lidar (MFLL) instrument developed by Excelis Inc. The goal of the campaigns was to perform an end-to-end demonstration of the capabilities of the HOPS prototype system (HOPS COTS) while running the most computationally intensive part of the ASCENDS algorithm real-time on-board. The comparison of the two flight campaigns and the results of the functionality tests of the HOPS COTS are presented in this paper.

  18. Evaluation of Crew-Centric Onboard Mission Operations Planning and Execution Tool: Year 2

    Science.gov (United States)

    Hillenius, S.; Marquez, J.; Korth, D.; Rosenbaum, M.; Deliz, Ivy; Kanefsky, Bob; Zheng, Jimin

    2018-01-01

    Currently, mission planning for the International Space Station (ISS) is largely carried out by ground operators in mission control. The task of creating a week-long mission plan for the ISS crew takes dozens of people multiple days to complete, and the plan is often created far in advance of its execution. As such, re-planning or adapting to changing real-time constraints or emergent issues is similarly taxing. As we design future mission operations concepts for other planets or areas with limited connectivity to Earth, more of these ground-based tasks will need to be handled autonomously by the crew onboard. There is a need for a highly usable (including low training time) tool that enables efficient self-scheduling and execution within a single package. The ISS Program has identified Playbook as a potential option. It already has high crew acceptance as a plan viewer from previous analogs and can now support a crew self-scheduling assessment on ISS or on another mission. The goals of this work, a collaboration between the Human Research Program and the ISS Program, are to inform the design of systems for more autonomous crew operations and provide a platform for research on crew autonomy for future deep space missions. The second year of the research effort has included new insights into the crew self-scheduling sessions performed through use on the HERA (Human Exploration Research Analog) and NEEMO (NASA Extreme Environment Mission Operations) analogs. Use on the NEEMO analog involved two self-scheduling strategies where the crew planned and executed two days of EVAs (Extra-Vehicular Activities). On HERA, year two represented the first campaign where we were able to perform research tasks. This involved selected flexible activities that the crew could schedule, mock timelines where the crew completed more complex planning exercises, usability evaluation of the crew self-scheduling features, and more insights into the limit of plan complexity that the crew

  19. Grid Enabled Geospatial Catalogue Web Service

    Science.gov (United States)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to the replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and NASA EOS Core System and service metadata standards from ISO 19119 to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user, who belongs to an authorized Virtual Organization (VO), can securely publish and manage geospatial resources, especially query on-demand data in the virtual community and get it back through the data-related services which provide functions such as subsetting, reformatting, reprojection etc. This work facilitates geospatial resource sharing and interoperating under the Grid environment, and makes geospatial resources Grid enabled and Grid technologies geospatial enabled. It also lets researchers focus on science, and not on issues with computing ability, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  20. Enabling Data-Driven Methodologies Across the Data Lifecycle and Ecosystem

    Science.gov (United States)

    Doyle, R. J.; Crichton, D.

    2017-12-01

    NASA has unlocked unprecedented scientific knowledge through exploration of the Earth, our solar system, and the larger universe. NASA is generating enormous amounts of data that are challenging traditional approaches to capturing, managing, analyzing and ultimately gaining scientific understanding from science data. New architectures, capabilities and methodologies are needed to span the entire observing system, from spacecraft to archive, while integrating data-driven discovery and analytic capabilities. NASA data have a definable lifecycle, from remote collection point to validated accessibility in multiple archives. Data challenges must be addressed across this lifecycle, to capture opportunities and avoid decisions that may limit or compromise what is achievable once data arrives at the archive. Data triage may be necessary when the collection capacity of the sensor or instrument overwhelms data transport or storage capacity. By migrating computational and analytic capability to the point of data collection, informed decisions can be made about which data to keep; in some cases, to close observational decision loops onboard, to enable attending to unexpected or transient phenomena. Along a different dimension than the data lifecycle, scientists and other end-users must work across an increasingly complex data ecosystem, where the range of relevant data is rarely owned by a single institution. To operate effectively, scalable data architectures and community-owned information models become essential. NASA's Planetary Data System is having success with this approach. Finally, there is the difficult challenge of reproducibility and trust. While data provenance techniques will be part of the solution, future interactive analytics environments must support an ability to provide a basis for a result: relevant data source and algorithms, uncertainty tracking, etc., to assure scientific integrity and to enable confident decision making. Advances in data science offer
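    The onboard data-triage idea sketched above amounts to scoring data at the point of collection and deciding what to keep, summarize, or discard before it ever reaches the archive. The fragment below is a deliberately simplified, hypothetical decision rule in Python; the fields, thresholds and categories are placeholders rather than any NASA algorithm.

```python
def triage(observation: dict, downlink_budget_mb: float) -> str:
    """Toy onboard triage: keep transients, summarize routine data, drop the rest.

    `observation` is a hypothetical record with a transient score, size and priority.
    """
    if observation["transient_score"] > 0.9:
        return "downlink_full"        # unexpected/transient phenomenon: keep everything
    if observation["size_mb"] <= downlink_budget_mb:
        return "downlink_full"        # fits the downlink budget anyway
    if observation["science_priority"] >= 0.5:
        return "downlink_summary"     # send derived products, retain raw data onboard
    return "discard"

print(triage({"transient_score": 0.95, "size_mb": 400.0, "science_priority": 0.2}, 100.0))
```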

  1. Robust Control of Wide Bandgap Power Electronics Device Enabled Smart Grid

    Science.gov (United States)

    Yao, Tong

    In recent years, wide bandgap (WBG) devices have enabled power converters with higher power density and higher efficiency. At the same time, smart grid technologies are maturing thanks to new battery and computer technologies. In the near future, the two will combine into the next generation of smart grid enabled by WBG devices. This dissertation deals with two applications: silicon carbide (SiC) devices used for a medium-voltage-level interface (7.2 kV to 240 V) and gallium nitride (GaN) devices used for a low-voltage-level interface (240 V/120 V). A 20 kW solid state transformer (SST) is designed with a 6 kHz switching frequency SiC rectifier. Three robust control design methods are then proposed, one for each of its smart grid operation modes. In grid-connected mode, a new LCL filter design method is proposed considering grid voltage THD, grid current THD and robust stability of the current regulation loop with respect to grid impedance changes. In grid-islanded mode, a μ-synthesis method combined with variable structure control is used to design a robust controller for grid voltage regulation. For grid emergency mode, a multivariable controller designed using the H-infinity synthesis method is proposed for accurate power sharing. A controller-hardware-in-the-loop (CHIL) testbed for a 7-SST system is set up with a Real Time Digital Simulator (RTDS). A real TMS320F28335 DSP and Spartan 6 FPGA control board is used to interface a switching-model SST in RTDS, and the proposed control methods are tested. For the low-voltage-level application, a 3.3 kW smart grid hardware setup is built with 3 GaN inverters. The inverters are designed with the GaN device characterized using the proposed multi-function double pulse tester. Each inverter is controlled by an onboard TMS320F28379D dual-core DSP with a 200 kHz sampling frequency. Each inverter is tested to process 2.2 kW of power with an overall efficiency of 96.5 % at room temperature. The smart grid monitoring system and fault interrupt devices (FID
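    As a quick sanity check of the efficiency figure quoted above, treating the 2.2 kW as output power and 96.5 % as overall efficiency (an assumption about how the figures are defined), each GaN inverter dissipates on the order of 80 W:

```python
# Loss estimate for one GaN inverter (assumes 2.2 kW is output power, 96.5 % efficiency).
p_out_w = 2200.0
efficiency = 0.965

p_in_w = p_out_w / efficiency
loss_w = p_in_w - p_out_w
print(f"input ~{p_in_w:.0f} W, dissipated ~{loss_w:.0f} W")  # ~2280 W in, ~80 W lost
```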

  2. Onboard power line conditioning system for an electric or hybrid vehicle

    Science.gov (United States)

    Kajouke, Lateef A.; Perisic, Milun

    2016-06-14

    A power line quality conditioning system for a vehicle includes an onboard rechargeable direct current (DC) energy storage system and an onboard electrical system coupled to the energy storage system. The energy storage system provides DC energy to drive an electric traction motor of the vehicle. The electrical system operates in a charging mode such that alternating current (AC) energy from a power grid external to the vehicle is converted to DC energy to charge the DC energy storage system. The electrical system also operates in a vehicle-to-grid power conditioning mode such that DC energy from the DC energy storage system is converted to AC energy to condition an AC voltage of the power grid.
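    The system described above is bidirectional: the same onboard electronics rectify grid AC to charge the battery, or invert battery DC to condition the grid. A minimal way to express that duality is a mode switch, sketched below in Python; the class and method names are illustrative only and are not taken from the patent.

```python
from enum import Enum

class PowerMode(Enum):
    CHARGING = "AC grid -> DC storage"          # rectify and charge the battery
    VEHICLE_TO_GRID = "DC storage -> AC grid"   # invert and condition the grid voltage

class OnboardConditioner:
    """Illustrative model of the bidirectional onboard electrical system."""

    def __init__(self) -> None:
        self.mode = PowerMode.CHARGING

    def set_mode(self, mode: PowerMode) -> None:
        self.mode = mode

    def step(self) -> str:
        if self.mode is PowerMode.CHARGING:
            return "converting grid AC to DC, charging the energy storage system"
        return "converting stored DC to AC, conditioning the grid voltage"

unit = OnboardConditioner()
unit.set_mode(PowerMode.VEHICLE_TO_GRID)
print(unit.step())
```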

  3. The LYRA Instrument Onboard PROBA2: Description and In-Flight Performance

    Science.gov (United States)

    Dominique, M.; Hochedez, J.-F.; Schmutz, W.; Dammasch, I. E.; Shapiro, A. I.; Kretzschmar, M.; Zhukov, A. N.; Gillotay, D.; Stockman, Y.; BenMoussa, A.

    2013-08-01

    The Large Yield Radiometer (LYRA) is an XUV-EUV-MUV (soft X-ray to mid-ultraviolet) solar radiometer onboard the European Space Agency Project for On-Board Autonomy 2 (PROBA2) mission, which was launched in November 2009. LYRA acquires solar-irradiance measurements at a high cadence (nominally 20 Hz) in four broad spectral channels, from soft X-ray to MUV, which have been chosen for their relevance to solar physics, space weather, and aeronomy. We briefly review the design of the instrument, give an overview of the data products distributed through the instrument website, and describe how the data are calibrated. We also briefly present a summary of the main fields of research currently under investigation by the LYRA consortium.

  4. QMX3.3 module-based on-board vehicle charger

    Energy Technology Data Exchange (ETDEWEB)

    Evans, S. [Delta-Q Technologies, Burnaby, BC (Canada)

    2010-07-01

    Delta-Q is a tier one supplier to industrial electric vehicle manufacturers offering in-house product design and development as well as sales, marketing and customer service. This presentation discussed on-board chargers for use in electric vehicles. Electric vehicle chargers are needed due to their lower cost, lack of time for generational change, and long lifetime and safety requirements. The presentation discussed universal on-board charger requirements as well as final design requirements. Other topics that were addressed included common control; QMX prototypes; steps from prototype to production; and Delta-Q and tier one partnering. It was concluded that there is a complicated array of diverse requirements with multiple stakeholders and standards. figs.

  5. Catalytic on-board hydrogen production from methanol and ammonia for mobile application

    Energy Technology Data Exchange (ETDEWEB)

    Soerijanto, H.

    2008-08-15

    This PhD thesis deals with catalytic hydrogen production for mobile applications, for example for use in fuel cells for electric cars. Electrically powered buses with fuel cells as the driving system are well known, but secure hydrogen storage in adequate amounts for long-distance driving is still a topic of discussion. Methanol is an excellent hydrogen carrier. First of all, it has a high H:C ratio and therefore a high energy density. Secondly, the operating temperature of steam reforming of methanol is comparatively low (250 C) and there is no risk of coking since methanol has no C-C bond. Thirdly, methanol is a liquid, which means that the present gasoline infrastructure can be used. For the further development of catalysts and for the construction of a reformer it is very important to characterize the catalysts very well. For the dimensioning and control of on-board hydrogen production it is essential to have accurate thermodynamic, chemical and kinetic data for the reaction. In the first part of this work, mesoporous Cu/ZrO{sub 2}/CeO{sub 2} catalysts with various copper contents were characterized, their long-term stability and selectivity were investigated, and the kinetic data were determined. Carbon monoxide is generated by reforming of carbon-containing material. This process is undesired since CO poisons the Pt electrode of the fuel cell. The separation of hydrogen by metal membranes is technically feasible and a high purity of hydrogen can be obtained. However, due to their high density this procedure is not favourable because of its energy loss. In this study a concept is presented which enables an autothermal mode by application of a ceramic membrane and which could simultaneously help to deal with the CO problem. The search for an absolutely selective catalyst is uncertain. The production of CO can be excluded neither chemically nor thermodynamically if carbon is present in the hydrogen carrier. Since enrichment or separation are
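    For reference, the methanol steam reforming reaction discussed above and the decomposition path responsible for the unwanted CO can be written as follows (standard stoichiometry, not reproduced from the thesis):

```latex
\begin{align}
\text{Steam reforming:} \quad & \mathrm{CH_3OH + H_2O \;\rightarrow\; CO_2 + 3\,H_2} \\
\text{Decomposition (CO source):} \quad & \mathrm{CH_3OH \;\rightarrow\; CO + 2\,H_2}
\end{align}
```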

  6. Catalytic on-board hydrogen production from methanol and ammonia for mobile application

    Energy Technology Data Exchange (ETDEWEB)

    Soerijanto, H

    2008-08-15

    This PhD thesis deals with catalytic hydrogen production for mobile applications, for example for use in fuel cells for electric cars. Electrically powered buses with fuel cells as the driving system are well known, but secure hydrogen storage in adequate amounts for long-distance driving is still a topic of discussion. Methanol is an excellent hydrogen carrier. First of all, it has a high H:C ratio and therefore a high energy density. Secondly, the operating temperature of steam reforming of methanol is comparatively low (250 C) and there is no risk of coking since methanol has no C-C bond. Thirdly, methanol is a liquid, which means that the present gasoline infrastructure can be used. For the further development of catalysts and for the construction of a reformer it is very important to characterize the catalysts very well. For the dimensioning and control of on-board hydrogen production it is essential to have accurate thermodynamic, chemical and kinetic data for the reaction. In the first part of this work, mesoporous Cu/ZrO{sub 2}/CeO{sub 2} catalysts with various copper contents were characterized, their long-term stability and selectivity were investigated, and the kinetic data were determined. Carbon monoxide is generated by reforming of carbon-containing material. This process is undesired since CO poisons the Pt electrode of the fuel cell. The separation of hydrogen by metal membranes is technically feasible and a high purity of hydrogen can be obtained. However, due to their high density this procedure is not favourable because of its energy loss. In this study a concept is presented which enables an autothermal mode by application of a ceramic membrane and which could simultaneously help to deal with the CO problem. The search for an absolutely selective catalyst is uncertain. The production of CO can be excluded neither chemically nor thermodynamically if carbon is present in the hydrogen carrier. Since enrichment or separation are

  7. MICROBIOLOGICAL EFFECTS OF ON-BOARD FISHING VESSEL HANDLING IN MERLUCCIUS MERLUCCIUS

    Directory of Open Access Journals (Sweden)

    P. Serratore

    2011-01-01

    The purpose of the present study was to determine the impact of different manipulation techniques applied on board fishing vessels on the microbiological quality of the flesh of European hake (Merluccius merluccius) during storage at +3°C ± 1°C for a time (T) of 10 days after landing (T1-T10). Samples of fish were taken from a fishing vessel of the Adriatic Sea and from one of the Tyrrhenian Sea, treated on-board under different icing conditions: (1) a low ice/product weight ratio and (2) an optimal ice/product weight ratio, up to 1:3. Spoilage bacteria as Total Bacterial Count (TBC) and specific spoilage bacteria as Sulphide Producing Bacteria (SPB) were enumerated in fish flesh as Colony Forming Units (CFU/g) on Plate Count Agar and Lyngby Agar at 20°C for 3-5 days. TBC of the Adriatic fish (gutted on-board) was 10^3 CFU/g at T1-T6 and 10^4-10^5 at T10, whereas TBC of the Tyrrhenian fish (not gutted on-board) was 10-10^2 CFU/g at T2-T3, 10^3 at T6, and 10^4-10^5 at T10. SPB was 10-10^2 CFU/g at T1-T6 and 10^3-10^4 at T10, with absolute values higher in the Adriatic fish than in the Tyrrhenian fish, and under the low icing condition than under the optimal icing condition. Under the conditions tested, the lowering of the microbiological quality of fish flesh during storage appears to depend more on the gutting versus not-gutting on-board practice than on the low versus optimal icing treatment.

  8. Data processing in Software-type Wave-Particle Interaction Analyzer onboard the Arase satellite

    Science.gov (United States)

    Hikishima, Mitsuru; Kojima, Hirotsugu; Katoh, Yuto; Kasahara, Yoshiya; Kasahara, Satoshi; Mitani, Takefumi; Higashio, Nana; Matsuoka, Ayako; Miyoshi, Yoshizumi; Asamura, Kazushi; Takashima, Takeshi; Yokota, Shoichiro; Kitahara, Masahiro; Matsuda, Shoya

    2018-05-01

    The software-type wave-particle interaction analyzer (S-WPIA) is an instrument package onboard the Arase satellite, which studies the magnetosphere. The S-WPIA represents a new method for directly observing wave-particle interactions onboard a spacecraft in a space plasma environment. The main objective of the S-WPIA is to quantitatively detect wave-particle interactions associated with whistler-mode chorus emissions and electrons over a wide energy range (from several keV to several MeV). The quantity of energy exchanges between waves and particles can be represented as the inner product of the wave electric-field vector and the particle velocity vector. The S-WPIA requires accurate measurement of the phase difference between wave and particle gyration. The leading edge of the S-WPIA system allows us to collect comprehensive information, including the detection time, energy, and incoming direction of individual particles and instantaneous-wave electric and magnetic fields, at a high sampling rate. All the collected particle and waveform data are stored in the onboard large-volume data storage. The S-WPIA executes calculations asynchronously using the collected electric and magnetic wave data, data acquired from multiple particle instruments, and ambient magnetic-field data. The S-WPIA has the role of handling large amounts of raw data that are dedicated to calculations of the S-WPIA. Then, the results are transferred to the ground station. This paper describes the design of the S-WPIA and its calculations in detail, as implemented onboard Arase.
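    The energy-exchange quantity mentioned above, the inner product of the wave electric field and the particle velocity, corresponds to the instantaneous work done by the wave field on a particle of charge q; schematically (notation mine, not the paper's):

```latex
\frac{dW}{dt} \;=\; q\,\mathbf{E}_{\mathrm{wave}} \cdot \mathbf{v}_{\mathrm{particle}}
```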

  9. Lunar Penetrating Radar onboard the Chang'e-3 mission

    International Nuclear Information System (INIS)

    Fang Guang-You; Zhou Bin; Ji Yi-Cai; Zhang Qun-Ying; Shen Shao-Xiang; Li Yu-Xi; Guan Hong-Fei; Tang Chuan-Jun; Gao Yun-Ze; Lu Wei; Ye Sheng-Bo; Han Hai-Dong; Zheng Jin; Wang Shu-Zhi

    2014-01-01

    Lunar Penetrating Radar (LPR) is one of the important scientific instruments onboard the Chang'e-3 spacecraft. Its scientific goals are the mapping of lunar regolith and detection of subsurface geologic structures. This paper describes the goals of the mission, as well as the basic principles, design, composition and achievements of the LPR. Finally, experiments on a glacier and the lunar surface are analyzed

  10. Onboard functional and molecular imaging: A design investigation for robotic multipinhole SPECT

    International Nuclear Information System (INIS)

    Bowsher, James; Giles, William; Yin, Fang-Fang; Yan, Susu; Roper, Justin

    2014-01-01

    Purpose: Onboard imaging—currently performed primarily by x-ray transmission modalities—is essential in modern radiation therapy. As radiation therapy moves toward personalized medicine, molecular imaging, which views individual gene expression, may also be important onboard. Nuclear medicine methods, such as single photon emission computed tomography (SPECT), are premier modalities for molecular imaging. The purpose of this study is to investigate a robotic multipinhole approach to onboard SPECT. Methods: Computer-aided design (CAD) studies were performed to assess the feasibility of maneuvering a robotic SPECT system about a patient in position for radiation therapy. In order to obtain fast, high-quality SPECT images, a 49-pinhole SPECT camera was designed which provides high sensitivity to photons emitted from an imaging region of interest. This multipinhole system was investigated by computer-simulation studies. Seventeen hot spots 10 and 7 mm in diameter were placed in the breast region of a supine female phantom. Hot spot activity concentration was six times that of background. For the 49-pinhole camera and a reference, more conventional, broad field-of-view (FOV) SPECT system, projection data were computer simulated for 4-min scans and SPECT images were reconstructed. Hot-spot localization was evaluated using a nonprewhitening forced-choice numerical observer. Results: The CAD simulation studies found that robots could maneuver SPECT cameras about patients in position for radiation therapy. In the imaging studies, most hot spots were apparent in the 49-pinhole images. Average localization errors for 10-mm- and 7-mm-diameter hot spots were 0.4 and 1.7 mm, respectively, for the 49-pinhole system, and 3.1 and 5.7 mm, respectively, for the reference broad-FOV system. Conclusions: A robot could maneuver a multipinhole SPECT system about a patient in position for radiation therapy. The system could provide onboard functional and molecular imaging with 4-min

  11. Onboard functional and molecular imaging: A design investigation for robotic multipinhole SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Bowsher, James, E-mail: james.bowsher@duke.edu; Giles, William; Yin, Fang-Fang [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27710 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27710 (United States); Yan, Susu [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27710 (United States); Roper, Justin [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27710 (United States)

    2014-01-15

    Purpose: Onboard imaging—currently performed primarily by x-ray transmission modalities—is essential in modern radiation therapy. As radiation therapy moves toward personalized medicine, molecular imaging, which views individual gene expression, may also be important onboard. Nuclear medicine methods, such as single photon emission computed tomography (SPECT), are premier modalities for molecular imaging. The purpose of this study is to investigate a robotic multipinhole approach to onboard SPECT. Methods: Computer-aided design (CAD) studies were performed to assess the feasibility of maneuvering a robotic SPECT system about a patient in position for radiation therapy. In order to obtain fast, high-quality SPECT images, a 49-pinhole SPECT camera was designed which provides high sensitivity to photons emitted from an imaging region of interest. This multipinhole system was investigated by computer-simulation studies. Seventeen hot spots 10 and 7 mm in diameter were placed in the breast region of a supine female phantom. Hot spot activity concentration was six times that of background. For the 49-pinhole camera and a reference, more conventional, broad field-of-view (FOV) SPECT system, projection data were computer simulated for 4-min scans and SPECT images were reconstructed. Hot-spot localization was evaluated using a nonprewhitening forced-choice numerical observer. Results: The CAD simulation studies found that robots could maneuver SPECT cameras about patients in position for radiation therapy. In the imaging studies, most hot spots were apparent in the 49-pinhole images. Average localization errors for 10-mm- and 7-mm-diameter hot spots were 0.4 and 1.7 mm, respectively, for the 49-pinhole system, and 3.1 and 5.7 mm, respectively, for the reference broad-FOV system. Conclusions: A robot could maneuver a multipinhole SPECT system about a patient in position for radiation therapy. The system could provide onboard functional and molecular imaging with 4-min

  12. Spaceflight Systems Training: A Comparison and Contrasting of Techniques for Training Ground Operators and Onboard Crewmembers

    Science.gov (United States)

    Balmain, Clinton; Fleming, Mark

    2009-01-01

    When developing techniques and products for instruction on manned spaceflight systems, training organizations are often faced with two very different customers: ground operators and onboard crewmembers. Frequently, instructional development focuses on one of these customers with the assumption that the other's needs will be met by default. Experience teaches us that differing approaches are required when developing training tailored to the specific needs of each customer. As a rule, ground operators require focused instruction on specific areas of expertise. Their knowledge should be of the details of the hardware, software, and operational techniques associated with that system. They often benefit from historical knowledge of how their system has operated over its lifetime. Since several different ground operators may be interfacing with the same system, each individual operator must understand the agreed-to principles by which that system will be run. In contrast, onboard crewmembers require a broader, hands-on awareness of their operational environment. Their training should be developed with an understanding of the physical environment in which they live and work and the day-to-day tasks they are most likely to perform. Rarely do they require a deep understanding of the details of a system; it is often sufficient to teach them just enough to maintain situational awareness and perform basic tasks associated with maintenance and operation of onboard systems. Crewmembers may also develop unique onboard operational techniques that differ from those of preceding crews. They should be taught what flexibility they have in systems operations and how their specific habits can be communicated to ground support personnel. This paper will explore the techniques that can be employed when developing training for these unique customers. We will explore the history of International Space Station training development and how past efforts can guide us in creating training for users of

  13. Solid and hazardous waste management practices onboard ocean going vessels: a review.

    Science.gov (United States)

    Swamy, Yeddanapudi V R P P

    2012-01-01

    Shipping, or the carriage of goods, plays an important role in the development of human societies, and the international shipping industry, which carries 90% of world trade, is the life blood of the global economy. During a ship's operational activity, a number of solid and hazardous wastes, also referred to as garbage, are produced from galleys, crew cabins and engine/deck department stores. This review provides an overview of current onboard practices and examines the evidence that links waste management plan regulations to shipping trade. With strict compliance to the International Maritime Organization's MARPOL regulations, which prevent the pollution of the sea from ships' various discharges, well-documented solid and hazardous waste management practices are being followed onboard ships. All shipboard wastes are collected, segregated, stored and disposed of in appropriate locations, in accordance with the shipping company's environmental protection policy and its solid and hazardous waste management plan. For example, food residues are ground onboard and dropped into the sea as fish food. Cardboard and the like are burned onboard in incinerators. Glass is sorted into dark/light and deposited ashore, as are plastics, metal, tins, batteries, fluorescent tubes, etc. The residue from plastic incineration, which is still considered plastic, is brought back to shore for disposal. New targets are being set to reduce the volume of garbage generated and disposed of to shore facilities, and newer ships are using baling machines which compress cardboard and the like into bales to be taken ashore. The garbage management and control system works as a 'continual improvement' process to achieve new targets.

  14. Improving BeiDou precise orbit determination using observations of onboard MEO satellite receivers

    Science.gov (United States)

    Ge, Haibo; Li, Bofeng; Ge, Maorong; Shen, Yunzhong; Schuh, Harald

    2017-12-01

    In recent years, the precise orbit determination (POD) of the regional Chinese BeiDou Navigation Satellite System (BDS) has been a hot topic because of its special constellation, which since the end of 2012 has consisted of five geostationary earth orbit (GEO) satellites and five inclined geosynchronous satellite orbit (IGSO) satellites besides four medium earth orbit (MEO) satellites. GEO and IGSO satellites play an important role in regional BDS applications. However, this brings a great challenge to the POD, especially for the GEO satellites, due to their geostationary orbiting. Though a number of studies have been carried out to improve the POD performance of GEO satellites, the result is still much worse than that of IGSO and MEO, particularly in the along-track direction. The major reason is that the geostationary characteristic of a GEO satellite results in a bad geometry with respect to the ground tracking network. In order to improve the tracking geometry of the GEO satellites, a possible strategy is to mount global navigation satellite system (GNSS) receivers on MEO satellites to collect the signals from GEO/IGSO GNSS satellites, so that these observations can be used to improve GEO/IGSO POD. We extended our POD software package to simulate all the related observations and to assimilate the MEO-onboard GNSS observations in orbit determination. Based on the GPS and BDS constellations, simulation studies are undertaken for various tracking scenarios. The impact of the onboard GNSS observations is investigated carefully and presented in detail. The results show that MEO-onboard observations can significantly improve the orbit precision of GEO satellites from metres to decimetres, especially in the along-track direction. The POD results of IGSO satellites also benefit from the MEO-onboard data, and the precision can be improved by more than 50% in the 3D direction.

  15. Risk Mitigation for the Development of the New Ariane 5 On-Board Computer

    Science.gov (United States)

    Stransky, Arnaud; Chevalier, Laurent; Dubuc, Francois; Conde-Reis, Alain; Ledoux, Alain; Miramont, Philippe; Johansson, Leif

    2010-08-01

    In the frame of Ariane 5 production, some equipment will become obsolete and needs to be redesigned and redeveloped. This is the case for the On-Board Computer, which has to be completely redesigned and re-qualified by RUAG Space, together with all its on-board software and associated development tools by ASTRIUM ST. This paper presents this obsolescence treatment, which started in 2007 under an ESA contract in the frame of the ACEP and ARTA accompaniment programmes, and which is very critical in technical terms but also from a schedule point of view: it gives the context and overall development plan, and details the risk mitigation actions agreed with ESA, especially those related to the development of the input/output ASIC, as well as the on-board software porting and revalidation strategy. The efficiency of these risk mitigation actions has been proven by the schedule outcome; this development constitutes an up-to-date case study in good practices, including experience reports and feedback for other future developments.

  16. Functional Requirements for Onboard Management of Space Shuttle Consumables. Volume 2

    Science.gov (United States)

    Graf, P. J.; Herwig, H. A.; Neel, L. W.

    1973-01-01

    This report documents the results of the study "Functional Requirements for Onboard Management of Space Shuttle Consumables." The study was conducted for the Mission Planning and Analysis Division of the NASA Lyndon B. Johnson Space Center, Houston, Texas, between 3 July 1972 and 16 November 1973. The overall study program objective was two-fold. The first objective was to define a generalized consumable management concept which is applicable to advanced spacecraft. The second objective was to develop a specific consumables management concept for the Space Shuttle vehicle and to generate the functional requirements for the onboard portion of that concept. Consumables management is the process of controlling or influencing the usage of expendable materials involved in vehicle subsystem operation. The report consists of two volumes. Volume I presents a description of the study activities related to general approaches for developing consumable management, concepts for advanced spacecraft applications, and functional requirements for a Shuttle consumables management concept. Volume II presents a detailed description of the onboard consumables management concept proposed for use on the Space Shuttle.

  17. Onboard autonomous mission re-planning for multi-satellite system

    Science.gov (United States)

    Zheng, Zixuan; Guo, Jian; Gill, Eberhard

    2018-04-01

    This paper presents an onboard autonomous mission re-planning system for a Multi-Satellite System (MSS) to perform onboard re-planning in disruptive situations. The proposed re-planning system can deal with different potential emergency situations. The paper uses a Multi-Objective Hybrid Dynamic Mutation Genetic Algorithm (MO-HDM GA) combined with re-planning techniques as the core algorithm. The Cyclically Re-planning Method (CRM) and the Near Real-time Re-planning Method (NRRM) are developed to meet different mission requirements. Simulation results show that both methods can provide feasible re-planning sequences under unforeseen situations. The comparisons illustrate that the CRM is on average 20% faster than the NRRM in computation time. However, by using the NRRM more raw data can be observed and transmitted than by using the CRM within the same period. The usability of this onboard re-planning system is not limited to multi-satellite systems; it is also applicable to other mission planning and re-planning problems involving multiple autonomous vehicles with similar demands.
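    The core of the approach above is a genetic algorithm whose mutation rate varies dynamically as the search proceeds. The Python fragment below sketches only that single ingredient, a permutation-encoded plan with a generation-dependent mutation probability; it is an illustrative toy and not the MO-HDM GA from the paper.

```python
import random

def dynamic_mutation_rate(generation: int, max_generations: int,
                          start: float = 0.3, end: float = 0.05) -> float:
    """Mutation probability that decays linearly as the search converges (illustrative)."""
    frac = generation / max_generations
    return start + (end - start) * frac

def mutate_plan(plan: list[str], rate: float) -> list[str]:
    """Swap-mutation over a sequence of observation/downlink tasks."""
    plan = plan[:]
    for i in range(len(plan)):
        if random.random() < rate:
            j = random.randrange(len(plan))
            plan[i], plan[j] = plan[j], plan[i]
    return plan

tasks = ["obs_A", "obs_B", "downlink", "obs_C"]
print(mutate_plan(tasks, dynamic_mutation_rate(generation=10, max_generations=50)))
```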

  18. Long-term monitoring of air crew exposure onboard of Czech Airlines aircraft

    International Nuclear Information System (INIS)

    Ploc, O.; Spurny, F.; Ploc, O.

    2007-01-01

    This contribution presents new results related to aircraft crew exposure onboard the aircraft of Czech air companies. First, the results of long-term monitoring onboard an aircraft of Czech Airlines are presented. In the period May-December 2005, 494 individual flights were followed using an MDU-Liulin Si-diode-based spectrometer, together with thermoluminescent and track detectors. The results of the measurements are analyzed and compared with those of calculations performed with the CARI6 and EPCARD3.2 codes. The monitoring period represented about 4.6 times the usual annual engagement of an aircrew (600 hours). The total effective dose during these 2,755 hours was between 11 and 12 mSv, depending on the method of evaluation considered. The measurement and calculation methods correlate well, confirming that the routine method of evaluating aircraft crew exposure using the CARI6 code is adequate for this purpose. Second, the results of individual monitoring of aircrew members obtained during the last few years by this routine method are presented, and general tendencies of onboard aircraft crew exposure for Czech air companies are outlined. The contribution of aircrew exposure to total occupational exposure in the Czech Republic represents about 20%. (authors)
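    The reported figures imply an average route dose rate of roughly 4 µSv/h and about 2.5 mSv per 600 flight hours; the arithmetic below simply restates that check, using the mid-point of the reported 11-12 mSv.

```python
# Back-of-the-envelope check of the reported exposure figures.
total_dose_msv = 11.5        # mid-point of the reported 11-12 mSv
monitored_hours = 2755.0     # flight hours covered by the monitoring campaign
annual_hours = 600.0         # usual annual aircrew engagement cited in the record

rate_usv_per_h = total_dose_msv / monitored_hours * 1000.0
annual_estimate_msv = rate_usv_per_h * annual_hours / 1000.0
print(f"~{rate_usv_per_h:.1f} uSv/h, ~{annual_estimate_msv:.1f} mSv per 600 flight hours")
```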

  19. Data Products From Particle Detectors On-Board NOAA's Newest Space Weather Monitor

    Science.gov (United States)

    Kress, B. T.; Rodriguez, J. V.; Onsager, T. G.

    2017-12-01

    NOAA's newest Geostationary Operational Environmental Satellite, GOES-16, was launched on 19 November 2016. Instrumentation on-board GOES-16 includes the new Space Environment In-Situ Suite (SEISS), which has been collecting data since 8 January 2017. SEISS is composed of five magnetospheric particle sensor units: an electrostatic analyzer for measuring 30 eV - 30 keV ions and electrons (MPS-LO), a high energy particle sensor (MPS-HI) that measures keV to MeV electrons and protons, east and west facing Solar and Galactic Proton Sensor (SGPS) units with 13 differential channels between 1-500 MeV, and an Energetic Heavy Ion Sensor (EHIS) that measures 30 species of heavy ions (He-Ni) in five energy bands in the 10-200 MeV/nuc range. Measurements of low-energy magnetospheric particles by MPS-LO and of heavy ions by EHIS are new capabilities not previously flown on the GOES system. Real-time data from GOES-16 will support space weather monitoring and first-principles space weather modeling by NOAA's Space Weather Prediction Center (SWPC). Space weather level 2+ data products under development at NOAA's National Centers for Environmental Information (NCEI) include the Solar Energetic Particle (SEP) Event Detection algorithm. Legacy components of the SEP event detection algorithm (currently produced by SWPC) include the Solar Radiation Storm Scales. New components will include, e.g., event fluences. New level 2+ data products also include the SEP event Linear Energy Transfer (LET) Algorithm, for transforming energy spectra from EHIS into LET spectra, and the Density and Temperature Moments and Spacecraft Charging algorithm. The moments and charging algorithm identifies electron and ion signatures of spacecraft surface (frame) charging in the MPS-LO fluxes. Densities and temperatures from MPS-LO will also be used to support a magnetopause crossing detection algorithm. The new data products will provide real-time indicators of potential radiation hazards for the satellite

  20. Registered particles onboard identification in the various apertures of GAMMA-400 space gamma-telescope

    Science.gov (United States)

    Arkhangelskaja, Irene

    2016-07-01

    GAMMA-400 (Gamma Astronomical Multifunctional Modular Apparatus) will be a gamma-ray telescope onboard an international satellite gamma-ray observatory designed for particle registration over a wide energy band. Its parameters are optimized for the detection of gamma-quanta with energies of ~100 GeV in the main aperture. The main scientific goals of GAMMA-400 are to investigate fluxes of γ-rays and the electron-positron cosmic-ray component possibly generated by the decay or annihilation of dark matter particles; to search for and study in detail discrete γ-ray sources; to investigate the energy spectra of Galactic and extragalactic diffuse γ-rays; and to study γ-ray bursts and γ-ray emission from the active Sun. This article presents an analysis of the identification procedures for detected events and of the energy resolution in the three apertures.