WorldWideScience

Sample records for space modeling approach

  1. Unified Approach to Modeling and Simulation of Space Communication Networks and Systems

    Science.gov (United States)

    Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth

    2010-01-01

    Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution - the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks

  2. Truncated conformal space approach to scaling Lee-Yang model

    International Nuclear Information System (INIS)

    Yurov, V.P.; Zamolodchikov, Al.B.

    1989-01-01

    A numerical approach to 2D relativistic field theories is suggested. Considering a field theory model as an ultraviolet conformal field theory perturbed by a suitable relevant scalar operator, one studies it in finite volume (on a circle). The perturbed Hamiltonian acts in the conformal field theory space of states and its matrix elements can be extracted from the conformal field theory. Truncation of the space at a reasonable level results in a finite-dimensional problem for numerical analysis. The nonunitary field theory with the ultraviolet region controlled by the minimal conformal theory M(2/5) is studied in detail. 9 refs.; 17 figs
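
    As a rough illustration of the truncation idea described above, the sketch below diagonalizes a perturbed Hamiltonian H = H_CFT + λV on a finite set of basis states. The conformal weights and matrix elements are random placeholders, not the actual minimal-model data.

```python
# Illustrative truncated-space diagonalization: H = H_CFT + lambda * V on a
# finite basis. All matrix elements below are random placeholders (assumption),
# not the scaling Lee-Yang data.
import numpy as np

rng = np.random.default_rng(0)

n_states = 40                                       # states kept below the truncation level
h_cft = np.sort(rng.uniform(0.0, 5.0, n_states))    # placeholder conformal weights
H0 = np.diag(2.0 * np.pi * h_cft)                   # diagonal CFT Hamiltonian on the circle

# Placeholder matrix elements of the perturbing relevant operator.
V = rng.normal(size=(n_states, n_states))
V = 0.5 * (V + V.T)                                 # make the perturbation Hermitian

coupling = 0.1
H = H0 + coupling * V

energies = np.linalg.eigvalsh(H)
print("lowest truncated-space energy levels:", energies[:5])
```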

  3. State space model extraction of thermohydraulic systems – Part I: A linear graph approach

    International Nuclear Information System (INIS)

    Uren, K.R.; Schoor, G. van

    2013-01-01

    Thermohydraulic simulation codes are increasingly making use of graphical design interfaces. The user can quickly and easily design a thermohydraulic system by placing symbols on the screen resembling system components. These components can then be connected to form a system representation. Such system models may then be used to obtain detailed simulations of the physical system. Usually these simulation models are too complex and not ideal for control system design. Therefore, a need exists for automated techniques to extract lumped parameter models useful for control system design. The goal of this first paper, in a two-part series, is to propose a method that utilises a graphical representation of a thermohydraulic system, and a lumped parameter modelling approach, to extract state space models. In this methodology each physical domain of the thermohydraulic system is represented by a linear graph. These linear graphs capture the interaction between all components within and across energy domains – hydraulic, thermal and mechanical. These linear graphs are analysed using a graph-theoretic approach to derive reduced order state space models. These models capture the dominant dynamics of the thermohydraulic system and are ideal for control system design purposes. The proposed state space model extraction method is demonstrated by considering a U-tube system. A non-linear state space model is extracted representing both the hydraulic and thermal domain dynamics of the system. The simulated state space model is compared with a Flownex ® model of the U-tube. Flownex ® is a validated systems thermal-fluid simulation software package. - Highlights: • A state space model extraction methodology based on graph-theoretic concepts. • An energy-based approach to consider multi-domain systems in a common framework. • Allow extraction of transparent (white-box) state space models automatically. • Reduced order models containing only independent state

  4. A Declarative Design Approach to Modeling Traditional and Non-Traditional Space Systems

    Science.gov (United States)

    Hoag, Lucy M.

    The space system design process is known to be laborious, complex, and computationally demanding. It is highly multi-disciplinary, involving several interdependent subsystems that must be both highly optimized and reliable due to the high cost of launch. Satellites must also be capable of operating in harsh and unpredictable environments, so integrating high-fidelity analysis is important. To address each of these concerns, a holistic design approach is necessary. However, while the sophistication of space systems has evolved significantly in the last 60 years, improvements in the design process have been comparatively stagnant. Space systems continue to be designed using a procedural, subsystem-by-subsystem approach. This method is inadequate since it generally requires extensive iteration and limited or heuristic-based search, which can be slow, labor-intensive, and inaccurate. The use of a declarative design approach can potentially address these inadequacies. In the declarative programming style, the focus of a problem is placed on what the objective is, and not necessarily how it should be achieved. In the context of design, this entails knowledge expressed as a declaration of statements that are true about the desired artifact instead of explicit instructions on how to implement it. A well-known technique is through constraint-based reasoning, where a design problem is represented as a network of rules and constraints that are reasoned across by a solver to dynamically discover the optimal candidate(s). This enables implicit instantiation of the tradespace and allows for automatic generation of all feasible design candidates. As such, this approach also appears to be well-suited to modeling adaptable space systems, which generally have large tradespaces and possess configurations that are not well-known a priori. This research applied a declarative design approach to holistic satellite design and to tradespace exploration for adaptable space systems. The

  5. A perturbative approach to the redshift space correlation function: beyond the Standard Model

    Science.gov (United States)

    Bose, Benjamin; Koyama, Kazuya

    2017-08-01

    We extend our previous redshift space power spectrum code to the redshift space correlation function. Here we focus on the Gaussian Streaming Model (GSM). Again, the code accommodates a wide range of modified gravity and dark energy models. For the non-linear real space correlation function used in the GSM we use the Fourier transform of the RegPT 1-loop matter power spectrum. We compare predictions of the GSM for a Vainshtein screened and Chameleon screened model as well as GR. These predictions are compared to the Fourier transform of the Taruya, Nishimichi and Saito (TNS) redshift space power spectrum model which is fit to N-body data. We find very good agreement between the Fourier transform of the TNS model and the GSM predictions, with <= 6% deviations in the first two correlation function multipoles for all models for redshift space separations in 50 Mpc/h <= s <= 180 Mpc/h. Excellent agreement is found in the differences between the modified gravity and GR multipole predictions for both approaches to the redshift space correlation function, highlighting their matched ability in picking up deviations from GR. We elucidate the timeliness of such non-standard templates at the dawn of stage-IV surveys and discuss necessary preparations and extensions needed for upcoming high quality data.
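
    For readers unfamiliar with the transform step mentioned above, a minimal sketch follows: it evaluates the monopole of the real-space correlation function as ξ(r) = (1/2π²) ∫ dk k² P(k) j₀(kr), using a toy power spectrum rather than the RegPT 1-loop spectrum used by the authors.

```python
# Minimal sketch (not the authors' RegPT/GSM pipeline): the real-space
# correlation function as the Fourier transform of a power spectrum,
# xi(r) = 1/(2 pi^2) * integral dk k^2 P(k) j0(k r). P(k) is a toy spectrum.
import numpy as np
from scipy.special import spherical_jn
from scipy.integrate import trapezoid

k = np.logspace(-3, 1, 2000)                 # wavenumbers in h/Mpc
pk = 1e4 * k / (1.0 + (k / 0.02) ** 3)       # toy power spectrum, illustrative only

def xi(r):
    integrand = k**2 * pk * spherical_jn(0, k * r)
    return trapezoid(integrand, k) / (2.0 * np.pi**2)

for r in (50.0, 100.0, 150.0):               # separations in Mpc/h
    print(f"xi({r:.0f} Mpc/h) = {xi(r):.4e}")
```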

  6. International Space Station Centrifuge Rotor Models A Comparison of the Euler-Lagrange and the Bond Graph Modeling Approach

    Science.gov (United States)

    Nguyen, Louis H.; Ramakrishnan, Jayant; Granda, Jose J.

    2006-01-01

    The assembly and operation of the International Space Station (ISS) require extensive testing and engineering analysis to verify that the Space Station system of systems would work together without any adverse interactions. Since the dynamic behavior of an entire Space Station cannot be tested on earth, math models of the Space Station structures and mechanical systems have to be built and integrated in computer simulations and analysis tools to analyze and predict what will happen in space. The ISS Centrifuge Rotor (CR) is one of many mechanical systems that need to be modeled and analyzed to verify the ISS integrated system performance on-orbit. This study investigates using Bond Graph modeling techniques as quick and simplified ways to generate models of the ISS Centrifuge Rotor. This paper outlines the steps used to generate simple and more complex models of the CR using Bond Graph Computer Aided Modeling Program with Graphical Input (CAMP-G). Comparisons of the Bond Graph CR models with those derived from Euler-Lagrange equations in MATLAB and those developed using multibody dynamic simulation at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) are presented to demonstrate the usefulness of the Bond Graph modeling approach for aeronautics and space applications.

  7. A GOCE-only global gravity field model by the space-wise approach

    DEFF Research Database (Denmark)

    Migliaccio, Frederica; Reguzzoni, Mirko; Gatti, Andrea

    2011-01-01

    The global gravity field model computed by the spacewise approach is one of three official solutions delivered by ESA from the analysis of the GOCE data. The model consists of a set of spherical harmonic coefficients and the corresponding error covariance matrix. The main idea behind this approach...... the orbit to reduce the noise variance and correlation before gridding the data. In the first release of the space-wise approach, based on a period of about two months, some prior information coming from existing gravity field models entered into the solution especially at low degrees and low orders...... degrees; the second is an internally computed GOCE-only prior model to be used in place of the official quick-look model, thus removing the dependency on EIGEN5C especially in the polar gaps. Once the procedure to obtain a GOCE-only solution has been outlined, a new global gravity field model has been...

  8. A perturbative approach to the redshift space correlation function: beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Bose, Benjamin; Koyama, Kazuya, E-mail: benjamin.bose@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk [Institute of Cosmology and Gravitation, University of Portsmouth, Burnaby Road, Portsmouth, Hampshire, PO1 3FX (United Kingdom)

    2017-08-01

    We extend our previous redshift space power spectrum code to the redshift space correlation function. Here we focus on the Gaussian Streaming Model (GSM). Again, the code accommodates a wide range of modified gravity and dark energy models. For the non-linear real space correlation function used in the GSM we use the Fourier transform of the RegPT 1-loop matter power spectrum. We compare predictions of the GSM for a Vainshtein screened and Chameleon screened model as well as GR. These predictions are compared to the Fourier transform of the Taruya, Nishimichi and Saito (TNS) redshift space power spectrum model which is fit to N-body data. We find very good agreement between the Fourier transform of the TNS model and the GSM predictions, with ≤ 6% deviations in the first two correlation function multipoles for all models for redshift space separations in 50 Mpc/h ≤ s ≤ 180 Mpc/h. Excellent agreement is found in the differences between the modified gravity and GR multipole predictions for both approaches to the redshift space correlation function, highlighting their matched ability in picking up deviations from GR. We elucidate the timeliness of such non-standard templates at the dawn of stage-IV surveys and discuss necessary preparations and extensions needed for upcoming high quality data.

  9. Limitations Of The Current State Space Modelling Approach In Multistage Machining Processes Due To Operation Variations

    Science.gov (United States)

    Abellán-Nebot, J. V.; Liu, J.; Romero, F.

    2009-11-01

    The State Space modelling approach has been recently proposed as an engineering-driven technique for part quality prediction in Multistage Machining Processes (MMP). Current State Space models incorporate fixture and datum variations in the multi-stage variation propagation, without explicitly considering common operation variations such as machine-tool thermal distortions, cutting-tool wear, cutting-tool deflections, etc. This paper shows the limitations of the current State Space model through an experimental case study where the effect of the spindle thermal expansion, cutting-tool flank wear and locator errors are introduced. The paper also discusses the extension of the current State Space model to include operation variations and its potential benefits.

  10. On discourse space modeling

    OpenAIRE

    Казыдуб, Надежда

    2013-01-01

    Discourse space is a complex structure that incorporates different levels and dimensions. The paper focuses on developing a multidisciplinary approach that is congruent to the complex character of the modern discourse. Two models of discourse space are proposed here. The Integrated Model reveals the interaction of different categorical mechanisms in the construction of the discourse space. The Evolutionary Model describes the historical roots of the modern discourse. It also reveals historica...

  11. Finite frequency shear wave splitting tomography: a model space search approach

    Science.gov (United States)

    Mondal, P.; Long, M. D.

    2017-12-01

    Observations of seismic anisotropy provide key constraints on past and present mantle deformation. A common method for upper mantle anisotropy is to measure shear wave splitting parameters (delay time and fast direction). However, the interpretation is not straightforward, because splitting measurements represent an integration of structure along the ray path. A tomographic approach that allows for localization of anisotropy is desirable; however, tomographic inversion for anisotropic structure is a daunting task, since 21 parameters are needed to describe general anisotropy. Such a large parameter space does not allow a straightforward application of tomographic inversion. Building on previous work on finite frequency shear wave splitting tomography, this study aims to develop a framework for SKS splitting tomography with a new parameterization of anisotropy and a model space search approach. We reparameterize the full elastic tensor, reducing the number of parameters to three (a measure of strength based on symmetry considerations for olivine, plus the dip and azimuth of the fast symmetry axis). We compute Born-approximation finite frequency sensitivity kernels relating model perturbations to splitting intensity observations. The strong dependence of the sensitivity kernels on the starting anisotropic model, and thus the strong non-linearity of the inverse problem, makes a linearized inversion infeasible. Therefore, we implement a Markov Chain Monte Carlo technique in the inversion procedure. We have performed tests with synthetic data sets to evaluate computational costs and infer the resolving power of our algorithm for synthetic models with multiple anisotropic layers. Our technique can resolve anisotropic parameters on length scales of ˜50 km for realistic station and event configurations for dense broadband experiments. We are proceeding towards applications to real data sets, with an initial focus on the High Lava Plains of Oregon.
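
    The inversion machinery itself is not given in the abstract; the sketch below shows the generic Metropolis-Hastings step that a model-space search over the three parameters (strength, dip, azimuth of the fast axis) could use. The forward model predict() is a made-up stand-in, not the Born-approximation sensitivity kernels of the study.

```python
# Schematic Metropolis-Hastings sampler for three anisotropy parameters.
# 'predict' is a hypothetical splitting-intensity forward model (assumption),
# not the finite-frequency kernels used by the authors.
import numpy as np

rng = np.random.default_rng(1)

def predict(theta, azimuths):
    strength, dip, fast_az = theta
    # stand-in prediction of splitting intensity versus back-azimuth
    return strength * np.cos(dip) * np.sin(2.0 * (azimuths - fast_az))

azims = np.linspace(0.0, np.pi, 30)
truth = np.array([0.8, 0.3, 1.0])
sigma = 0.05
data = predict(truth, azims) + sigma * rng.normal(size=azims.size)

def log_like(theta):
    resid = data - predict(theta, azims)
    return -0.5 * np.sum((resid / sigma) ** 2)

theta = np.array([0.5, 0.5, 0.5])
samples = []
for _ in range(20000):
    prop = theta + 0.05 * rng.normal(size=3)        # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_like(prop) - log_like(theta):
        theta = prop                                # accept
    samples.append(theta.copy())

print("posterior mean (strength, dip, azimuth):", np.mean(samples[5000:], axis=0))
```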

  12. Review of the Space Mapping Approach to Engineering Optimization and Modeling

    DEFF Research Database (Denmark)

    Bakr, M. H.; Bandler, J. W.; Madsen, Kaj

    2000-01-01

    We review the Space Mapping (SM) concept and its applications in engineering optimization and modeling. The aim of SM is to avoid computationally expensive calculations encountered in simulating an engineering system. The existence of less accurate but fast physically-based models is exploited. S......-based Modeling (SMM). These include Space Derivative Mapping (SDM), Generalized Space Mapping (GSM) and Space Mapping-based Neuromodeling (SMN). Finally, we address open points for research and future development....

  13. Multiple Model-Based Synchronization Approaches for Time Delayed Slaving Data in a Space Launch Vehicle Tracking System

    Directory of Open Access Journals (Sweden)

    Haryong Song

    2016-01-01

    Full Text Available Due to the inherent characteristics of the flight mission of a space launch vehicle (SLV), which is required to fly over very large distances and have very high fault tolerances, in general, SLV tracking systems (TSs) comprise multiple heterogeneous sensors such as radars, GPS, INS, and electrooptical targeting systems installed over widespread areas. To track an SLV without interruption and to hand over the measurement coverage between TSs properly, the mission control system (MCS) transfers slaving data to each TS through mission networks. When serious network delays occur, however, the slaving data from the MCS can lead to the failure of the TS. To address this problem, in this paper, we propose multiple model-based synchronization (MMS) approaches, which take advantage of the multiple motion models of an SLV. Cubic spline extrapolation, prediction through an α-β-γ filter, and a single model Kalman filter are presented as benchmark approaches. We demonstrate the synchronization accuracy and effectiveness of the proposed MMS approaches using the Monte Carlo simulation with the nominal trajectory data of Korea Space Launch Vehicle-I.
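
    As an idea of what one of the benchmark synchronizers looks like, the following sketch filters delayed slaving positions with an α-β-γ filter and extrapolates the state forward by the network delay. The gains, trajectory and delay are illustrative values, not those of the paper.

```python
# Simplified alpha-beta-gamma filter used as a benchmark-style synchronizer:
# filter the delayed slaving positions, then extrapolate by the network delay.
# Gains, noise levels and trajectory are illustrative assumptions.
import numpy as np

def abg_filter(measurements, dt, alpha=0.5, beta=0.4, gamma=0.1):
    x, v, a = measurements[0], 0.0, 0.0
    for z in measurements:
        # predict
        x_pred = x + v * dt + 0.5 * a * dt**2
        v_pred = v + a * dt
        r = z - x_pred                       # innovation
        # update
        x = x_pred + alpha * r
        v = v_pred + beta * r / dt
        a = a + 2.0 * gamma * r / dt**2
    return x, v, a

dt, delay = 0.1, 0.5                          # seconds; delay of the slaving data
t = np.arange(0.0, 20.0, dt)
true_pos = 0.5 * 3.0 * t**2                   # toy constant-acceleration ascent
meas = true_pos + np.random.default_rng(2).normal(0.0, 5.0, t.size)

x, v, a = abg_filter(meas, dt)
synced = x + v * delay + 0.5 * a * delay**2   # extrapolate to the current time
print(f"extrapolated position: {synced:.1f} m, truth: {0.5*3.0*(t[-1]+delay)**2:.1f} m")
```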

  14. A Learning State-Space Model for Image Retrieval

    Directory of Open Access Journals (Sweden)

    Lee Greg C

    2007-01-01

    Full Text Available This paper proposes an approach based on a state-space model for learning the user concepts in image retrieval. We first design a scheme of region-based image representation based on concept units, which are integrated with different types of feature spaces and with different region scales of image segmentation. The design of the concept units aims at describing similar characteristics at a certain perspective among relevant images. We present the details of our proposed approach based on a state-space model for interactive image retrieval, including likelihood and transition models, and we also describe some experiments that show the efficacy of our proposed model. This work demonstrates the feasibility of using a state-space model to estimate the user intuition in image retrieval.

  15. Modelling airborne gravity data by means of adapted Space-Wise approach

    Science.gov (United States)

    Sampietro, Daniele; Capponi, Martina; Hamdi Mansi, Ahmed; Gatti, Andrea

    2017-04-01

    Regional gravity field modelling by means of the remove-restore procedure is nowadays widely applied to predict grids of gravity anomalies (Bouguer, free-air, isostatic, etc.) in gravimetric geoid determination as well as in exploration geophysics. Considering this last application, due to the required accuracy and resolution, airborne gravity observations are generally adopted. However, due to the relatively high acquisition velocity, presence of atmospheric turbulence, aircraft vibration, instrumental drift, etc., airborne data are contaminated by a very high observation error. For this reason, a proper procedure to filter the raw observations in both the low and high frequencies should be applied to recover valuable information. In this work, a procedure to predict a grid or a set of filtered along-track gravity anomalies, by merging a GGM and an airborne dataset, is presented. The proposed algorithm, like the Space-Wise approach developed by Politecnico di Milano in the framework of GOCE data analysis, is based on a combination of an along-track Wiener filter and a Least Squares Collocation adjustment and properly considers the different altitudes of the gravity observations. Among the main differences with respect to the satellite application of the Space-Wise approach there is the fact that, while in processing GOCE data the stochastic characteristics of the observation error can be considered a-priori well known, in airborne gravimetry, due to the complex environment in which the observations are acquired, these characteristics are unknown and should be retrieved from the dataset itself. Some innovative theoretical aspects focusing in particular on the theoretical covariance modelling are presented too. In the end, the goodness of the procedure is evaluated by means of a test on real data recovering the gravitational signal with a predicted accuracy of about 0.25 mGal.
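
    A much-reduced sketch of the along-track filtering step follows, covering only the Wiener filter part with assumed signal and noise spectra; the Least Squares Collocation adjustment and the covariance modelling discussed in the abstract are omitted.

```python
# Schematic frequency-domain Wiener filter along a simulated track: attenuate
# high-frequency observation noise using assumed signal and noise power
# spectral densities (assumptions, not values estimated from a real survey).
import numpy as np

rng = np.random.default_rng(3)
n, dx = 2048, 100.0                          # samples, along-track spacing [m]
x = np.arange(n) * dx

signal = 5.0 * np.sin(2 * np.pi * x / 40000.0) + 2.0 * np.sin(2 * np.pi * x / 15000.0)
noise = 1.5 * rng.normal(size=n)             # white observation noise [mGal]
obs = signal + noise

S = np.abs(np.fft.rfft(signal)) ** 2                            # assumed-known signal PSD
N = np.full_like(S, np.mean(np.abs(np.fft.rfft(noise)) ** 2))   # assumed white-noise PSD

H = S / (S + N)                              # Wiener gain per frequency bin
filtered = np.fft.irfft(H * np.fft.rfft(obs), n)

print(f"rms error raw: {np.std(obs - signal):.2f} mGal, "
      f"filtered: {np.std(filtered - signal):.2f} mGal")
```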

  16. Urban Multisensory Laboratory, AN Approach to Model Urban Space Human Perception

    Science.gov (United States)

    González, T.; Sol, D.; Saenz, J.; Clavijo, D.; García, H.

    2017-09-01

    An urban sensory lab (USL, or LUS by its acronym in Spanish) is a new and avant-garde approach for studying and analyzing a city. The construction of this approach allows the development of new methodologies to identify the emotional response of public space users. The laboratory combines qualitative analysis proposed by urbanists and quantitative measures managed by data analysis applications. USL is a new approach to go beyond the borders of urban knowledge. The design thinking strategy allows us to implement methods to understand the results provided by our technique. In this first approach, the interpretation is made by hand. However, our goal is to combine design thinking and machine learning in order to analyze the qualitative and quantitative data automatically. Now, the results are being used by students from the Urbanism and Architecture courses in order to get a better understanding of public spaces in Puebla, Mexico, and their interaction with people.

  17. A self-organizing state-space-model approach for parameter estimation in hodgkin-huxley-type models of single neurons.

    Directory of Open Access Journals (Sweden)

    Dimitrios V Vavoulis

    Full Text Available Traditional approaches to the problem of parameter estimation in biophysical models of neurons and neural networks usually adopt a global search algorithm (for example, an evolutionary algorithm), often in combination with a local search method (such as gradient descent) in order to minimize the value of a cost function, which measures the discrepancy between various features of the available experimental data and model output. In this study, we approach the problem of parameter estimation in conductance-based models of single neurons from a different perspective. By adopting a hidden-dynamical-systems formalism, we expressed parameter estimation as an inference problem in these systems, which can then be tackled using a range of well-established statistical inference methods. The particular method we used was Kitagawa's self-organizing state-space model, which was applied on a number of Hodgkin-Huxley-type models using simulated or actual electrophysiological data. We showed that the algorithm can be used to estimate a large number of parameters, including maximal conductances, reversal potentials, kinetics of ionic currents, measurement and intrinsic noise, based on low-dimensional experimental data and sufficiently informative priors in the form of pre-defined constraints imposed on model parameters. The algorithm remained operational even when very noisy experimental data were used. Importantly, by combining the self-organizing state-space model with an adaptive sampling algorithm akin to the Covariance Matrix Adaptation Evolution Strategy, we achieved a significant reduction in the variance of parameter estimates. The algorithm did not require the explicit formulation of a cost function and it was straightforward to apply on compartmental models and multiple data sets. Overall, the proposed methodology is particularly suitable for resolving high-dimensional inference problems based on noisy electrophysiological data and, therefore, a
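
    The abstract does not spell out the algorithm, so the toy example below illustrates the self-organizing idea on a one-parameter linear model rather than a Hodgkin-Huxley neuron: the unknown parameter is appended to the state vector and estimated with a bootstrap particle filter.

```python
# Toy illustration of the self-organizing state-space idea (assumption: a
# one-parameter linear model, not a Hodgkin-Huxley neuron): the unknown
# parameter 'a' is appended to the state and tracked by a bootstrap particle filter.
import numpy as np

rng = np.random.default_rng(4)

# simulate data from x_t = a * x_{t-1} + process noise, observed with noise
a_true, T = 0.9, 200
x = np.zeros(T)
for t in range(1, T):
    x[t] = a_true * x[t - 1] + 0.5 * rng.normal()
y = x + 0.5 * rng.normal(size=T)

n_part = 5000
xp = rng.normal(0.0, 1.0, n_part)            # state particles
ap = rng.uniform(0.0, 1.5, n_part)           # parameter particles (augmented state)

for t in range(1, T):
    ap = ap + 0.01 * rng.normal(size=n_part)         # artificial parameter evolution
    xp = ap * xp + 0.5 * rng.normal(size=n_part)     # propagate the state particles
    w = np.exp(-0.5 * ((y[t] - xp) / 0.5) ** 2)      # observation likelihood
    w /= w.sum()
    idx = rng.choice(n_part, n_part, p=w)            # resample
    xp, ap = xp[idx], ap[idx]

print(f"estimated a = {ap.mean():.3f} (true {a_true})")
```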

  18. Space-Wise approach for airborne gravity data modelling

    Science.gov (United States)

    Sampietro, D.; Capponi, M.; Mansi, A. H.; Gatti, A.; Marchetti, P.; Sansò, F.

    2017-05-01

    Regional gravity field modelling by means of remove-compute-restore procedure is nowadays widely applied in different contexts: it is the most used technique for regional gravimetric geoid determination, and it is also used in exploration geophysics to predict grids of gravity anomalies (Bouguer, free-air, isostatic, etc.), which are useful to understand and map geological structures in a specific region. Considering this last application, due to the required accuracy and resolution, airborne gravity observations are usually adopted. However, due to the relatively high acquisition velocity, presence of atmospheric turbulence, aircraft vibration, instrumental drift, etc., airborne data are usually contaminated by a very high observation error. For this reason, a proper procedure to filter the raw observations in both the low and high frequencies should be applied to recover valuable information. In this work, a software to filter and grid raw airborne observations is presented: the proposed solution consists in a combination of an along-track Wiener filter and a classical Least Squares Collocation technique. Basically, the proposed procedure is an adaptation to airborne gravimetry of the Space-Wise approach, developed by Politecnico di Milano to process data coming from the ESA satellite mission GOCE. Among the main differences with respect to the satellite application of this approach, there is the fact that, while in processing GOCE data the stochastic characteristics of the observation error can be considered a-priori well known, in airborne gravimetry, due to the complex environment in which the observations are acquired, these characteristics are unknown and should be retrieved from the dataset itself. The presented solution is suited for airborne data analysis in order to be able to quickly filter and grid gravity observations in an easy way. Some innovative theoretical aspects focusing in particular on the theoretical covariance modelling are presented too

  19. Importance-truncated shell model for multi-shell valence spaces

    Energy Technology Data Exchange (ETDEWEB)

    Stumpf, Christina; Vobig, Klaus; Roth, Robert [Institut fuer Kernphysik, TU Darmstadt (Germany)

    2016-07-01

    The valence-space shell model is one of the work horses in nuclear structure theory. In traditional applications, shell-model calculations are carried out using effective interactions constructed in a phenomenological framework for rather small valence spaces, typically spanned by one major shell. We improve on this traditional approach addressing two main aspects. First, we use new effective interactions derived in an ab initio approach and, thus, establish a connection to the underlying nuclear interaction providing access to single- and multi-shell valence spaces. Second, we extend the shell model to larger valence spaces by applying an importance-truncation scheme based on a perturbative importance measure. In this way, we reduce the model space to the relevant basis states for the description of a few target eigenstates and solve the eigenvalue problem in this physics-driven truncated model space. In particular multi-shell valence spaces are not tractable otherwise. We combine the importance-truncated shell model with refined extrapolation schemes to approximately recover the exact result. We present first results obtained in the importance-truncated shell model with the newly derived ab initio effective interactions for multi-shell valence spaces, e.g., the sdpf shell.

  20. Shell model in large spaces and statistical spectroscopy

    International Nuclear Information System (INIS)

    Kota, V.K.B.

    1996-01-01

    For many nuclear structure problems of current interest it is essential to deal with shell model in large spaces. For this, three different approaches are now in use and two of them are: (i) the conventional shell model diagonalization approach but taking into account new advances in computer technology; (ii) the shell model Monte Carlo method. A brief overview of these two methods is given. Large space shell model studies raise fundamental questions regarding the information content of the shell model spectrum of complex nuclei. This led to the third approach- the statistical spectroscopy methods. The principles of statistical spectroscopy have their basis in nuclear quantum chaos and they are described (which are substantiated by large scale shell model calculations) in some detail. (author)

  1. Simplicial models for trace spaces

    DEFF Research Database (Denmark)

    Raussen, Martin

    Directed Algebraic Topology studies topological spaces in which certain directed paths (d-paths) - in general irreversible - are singled out. The main interest concerns the spaces of directed paths between given end points - and how those vary under variation of the end points. The original...... motivation stems from certain models for concurrent computation. So far, spaces of d-paths and their topological invariants have only been determined in cases that were elementary to overlook. In this paper, we develop a systematic approach describing spaces of directed paths - up to homotopy equivalence...

  2. Allocating city space to multiple transportation modes: A new modeling approach consistent with the physics of transport

    OpenAIRE

    Gonzales, Eric J.; Geroliminis, Nikolas; Cassidy, Michael J.; Daganzo, Carlos F.

    2008-01-01

    A macroscopic modeling approach is proposed for allocating a city’s road space among competing transport modes. In this approach, a city or neighborhood street network is viewed as a reservoir with aggregated traffic. Taking the number of vehicles (accumulation) in a reservoir as input, we show how one can reliably predict system performance in terms of person and vehicle hours spent in the system and person and vehicle kilometers traveled. The approach is used here to unveil two important ...

  3. Space Sustainment: A New Approach for America in Space

    Science.gov (United States)

    2014-12-01

    [Abstract not available; the record returns only search-snippet fragments. The essay, published in Air & Space Power Journal, November–December 2014 (Schriever Essay winner, second place), argues for moving the international community toward promoting market incentives in international space law, which would open up the competitive space for new entrants.]

  4. Modeling solvation effects in real-space and real-time within density functional approaches

    Energy Technology Data Exchange (ETDEWEB)

    Delgado, Alain [Istituto Nanoscienze - CNR, Centro S3, via Campi 213/A, 41125 Modena (Italy); Centro de Aplicaciones Tecnológicas y Desarrollo Nuclear, Calle 30 # 502, 11300 La Habana (Cuba); Corni, Stefano; Pittalis, Stefano; Rozzi, Carlo Andrea [Istituto Nanoscienze - CNR, Centro S3, via Campi 213/A, 41125 Modena (Italy)

    2015-10-14

    The Polarizable Continuum Model (PCM) can be used in conjunction with Density Functional Theory (DFT) and its time-dependent extension (TDDFT) to simulate the electronic and optical properties of molecules and nanoparticles immersed in a dielectric environment, typically liquid solvents. In this contribution, we develop a methodology to account for solvation effects in real-space (and real-time) (TD)DFT calculations. The boundary elements method is used to calculate the solvent reaction potential in terms of the apparent charges that spread over the van der Waals solute surface. In a real-space representation, this potential may exhibit a Coulomb singularity at grid points that are close to the cavity surface. We propose a simple approach to regularize such singularity by using a set of spherical Gaussian functions to distribute the apparent charges. We have implemented the proposed method in the OCTOPUS code and present results for the solvation free energies and solvatochromic shifts for a representative set of organic molecules in water.
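
    The regularization can be illustrated with the closed-form potential of a Gaussian charge, V(r) = q·erf(r/(√2·σ))/r, which stays finite at grid points close to the cavity surface; the charge and width below are arbitrary example values, not parameters from the OCTOPUS implementation.

```python
# Sketch of the regularization idea: a point apparent charge gives a singular
# 1/r potential near the cavity surface, while a spherical Gaussian charge of
# width sigma gives the finite potential q*erf(r/(sqrt(2)*sigma))/r.
# Atomic units; q and sigma are illustrative assumptions.
import numpy as np
from scipy.special import erf

def potential_point(q, r):
    return q / r

def potential_gaussian(q, r, sigma):
    return q * erf(r / (np.sqrt(2.0) * sigma)) / r

q, sigma = 0.1, 0.5
for r in (0.05, 0.2, 1.0, 5.0):
    print(f"r={r:4.2f}  point: {potential_point(q, r):8.3f}  "
          f"gaussian: {potential_gaussian(q, r, sigma):8.3f}")
```

    At large distances the two potentials coincide, so the smearing only changes the behaviour near the cavity surface where the singularity would otherwise appear.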

  5. State-Space Modeling and Performance Analysis of Variable-Speed Wind Turbine Based on a Model Predictive Control Approach

    Directory of Open Access Journals (Sweden)

    H. Bassi

    2017-04-01

    Full Text Available Advancements in wind energy technologies have led wind turbines from fixed speed to variable speed operation. This paper introduces an innovative version of a variable-speed wind turbine based on a model predictive control (MPC) approach. The proposed approach provides maximum power point tracking (MPPT), whose main objective is to capture the maximum wind energy in spite of the variable nature of the wind’s speed. The proposed MPC approach also reduces the constraints of the two main functional parts of the wind turbine: the full load and partial load segments. The pitch angle for full load and the rotating force for the partial load have been fixed concurrently in order to balance power generation as well as to reduce the operations of the pitch angle. A mathematical analysis of the proposed system using state-space approach is introduced. The simulation results using MATLAB/SIMULINK show that the performance of the wind turbine with the MPC approach is improved compared to the traditional PID controller in both low and high wind speeds.
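
    A minimal sketch of the receding-horizon principle on a generic two-state linear model follows; the matrices are illustrative assumptions, not a validated turbine model, and input/state constraints are omitted.

```python
# Minimal unconstrained MPC sketch on a toy discrete-time linear model:
# at each step solve a finite-horizon quadratic problem in closed form and
# apply only the first control move. A, B, Q, R are illustrative assumptions.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 0.95]])       # toy plant
B = np.array([[0.0], [0.1]])
Q, R, N = np.diag([10.0, 1.0]), np.array([[0.1]]), 15

# prediction matrices: X = Phi x0 + Gamma U over the horizon
n, m = A.shape[0], B.shape[1]
Phi = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(N)])
Gamma = np.zeros((N * n, N * m))
for i in range(N):
    for j in range(i + 1):
        Gamma[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B

Qb = np.kron(np.eye(N), Q)
Rb = np.kron(np.eye(N), R)
# optimal input sequence U = -K x0 for cost X'Qb X + U'Rb U
K = np.linalg.solve(Gamma.T @ Qb @ Gamma + Rb, Gamma.T @ Qb @ Phi)

x = np.array([1.0, 0.0])                      # initial deviation from the setpoint
for _ in range(40):
    u = (-K @ x)[:m]                          # receding horizon: apply first move only
    x = A @ x + B @ u
print("final state deviation:", x)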

  6. A non-linear state space approach to model groundwater fluctuations

    NARCIS (Netherlands)

    Berendrecht, W.L.; Heemink, A.W.; Geer, F.C. van; Gehrels, J.C.

    2006-01-01

    A non-linear state space model is developed for describing groundwater fluctuations. Non-linearity is introduced by modeling the (unobserved) degree of water saturation of the root zone. The non-linear relations are based on physical concepts describing the dependence of both the actual

  7. State-space approach for evaluating the soil-plant-atmosphere system

    International Nuclear Information System (INIS)

    Timm, L.C.; Reichardt, K.; Cassaro, F.A.M.; Tominaga, T.T.; Bacchi, O.O.S.; Oliveira, J.C.M.; Dourado-Neto, D.

    2004-01-01

    Using as examples one sugarcane and one forage oat experiment, both carried out in the State of Sao Paulo, Brazil, this chapter presents recent state-space approaches used to evaluate the relation between soil and plant properties. A contrast is made between classical statistics methodologies that do not take into account the sampling position coordinates, and the more recently used methodologies which include the position coordinates, and allow a better interpretation of the field-sampled data. Classical concepts are first introduced, followed by spatially referenced methodologies like the autocorrelation function, the cross-correlation function, and the state-space approach. Two variations of the state-space approach are given: one emphasizes the evolution of the state system while the other, based on the Bayesian formulation, emphasizes the evolution of the estimated observations. It is concluded that these state-space analyses using dynamic regression models improve data analyses and are therefore recommended for analyzing time and space data series related to the performance of a given soil-plant-atmosphere system. (author)
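
    A small sketch of the spatially referenced tools named above, computing sample autocorrelation and cross-correlation along a transect for two synthetic series standing in for a soil and a plant property (the data are simulated, not the experiments of the chapter).

```python
# Sample autocorrelation and cross-correlation as a function of spatial lag
# for two synthetic transect series (illustrative data, not the field data).
import numpy as np

rng = np.random.default_rng(6)
n = 100
soil = np.cumsum(rng.normal(size=n))           # synthetic soil property along a transect
plant = 0.8 * soil + rng.normal(size=n)        # related plant property

def correlogram(a, b, max_lag=10):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    m = len(a)
    return [float(np.mean(a[:m - k] * b[k:])) for k in range(max_lag + 1)]

print("autocorrelation of soil:       ", np.round(correlogram(soil, soil), 2))
print("cross-correlation soil -> plant:", np.round(correlogram(soil, plant), 2))
```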

  8. Nuclear spectroscopy in large shell model spaces: recent advances

    International Nuclear Information System (INIS)

    Kota, V.K.B.

    1995-01-01

    Three different approaches are now available for carrying out nuclear spectroscopy studies in large shell model spaces and they are: (i) the conventional shell model diagonalization approach but taking into account new advances in computer technology; (ii) the recently introduced Monte Carlo method for the shell model; (iii) the spectral averaging theory, based on central limit theorems, in indefinitely large shell model spaces. The various principles, recent applications and possibilities of these three methods are described and the similarity between the Monte Carlo method and the spectral averaging theory is emphasized. (author). 28 refs., 1 fig., 5 tabs

  9. Simplicial models of trace spaces

    DEFF Research Database (Denmark)

    Raussen, Martin

    2010-01-01

    variation of the end points. The original motivation stems from certain models for concurrent computation. So far, homotopy types of spaces of d-paths and their topological invariants have only been determined in cases that were elementary to overlook. In this paper, we develop a systematic approach...

  10. Solar chimney: A sustainable approach for ventilation and building space conditioning

    Directory of Open Access Journals (Sweden)

    Lal, S.,

    2013-03-01

    Full Text Available The demand for residential and commercial buildings increases with a rapidly growing population. This leads to the vertical growth of buildings, which need proper ventilation and day-lighting. Natural air ventilation does not work effectively in conventional structures, so fans and air conditioners are required to provide proper ventilation and space conditioning. Globally, the building sector is the largest consumer of energy, most of it used for heating, ventilation and space conditioning. This load can be reduced by applying solar chimneys and integrated approaches in buildings for heating, ventilation and space conditioning. It is a sustainable approach for these applications in buildings. The authors review the concept, the various methods of evaluation, the modelling and performance of solar chimney variables, applications and integrated approaches.

  11. Habitability Concept Models for Living in Space

    Science.gov (United States)

    Ferrino, M.

    2002-01-01

    As growing trends show, living in "space" has acquired new meanings, especially considering the utilization of the International Space Station (ISS) with regard to group interaction as well as individual needs in terms of time, space and crew accommodations. In fact, for the crew, the Space Station is a combined Laboratory-Office/Home and embodies ethical, social, and cultural aspects as additional parameters to be assessed to achieve a user-centered architectural design of crew workspace. Habitability Concept Models can improve the methods and techniques used to support the interior design and layout of space architectures and at the same time guarantee a human-focused approach. This paper discusses and illustrates some of the results obtained for the interior design of a Habitation Module for the ISS. In this work, two different but complementary approaches are followed. The first is "object oriented" and based on Video Data (American and Russian) supported by Proxemic methods (Edward T. Hall, 1963 and Francesca Pregnolato, 1998). This approach offers flexible and adaptive design solutions. The second is "subject oriented" and based on a Virtual Reality environment. With this approach human perception and cognitive aspects related to a specific crew task are considered. Data obtained from these two approaches are used to verify requirements and advance the design of the Habitation Module for aspects related to man-machine interfaces (MMI), ergonomics, work and free-time. It is expected that the results achieved can be applied to future space-related projects.

  12. A practical approach for exploration and modeling of the design space of a bacterial vaccine cultivation process.

    Science.gov (United States)

    Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M

    2009-10-15

    A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for processes involving biological products, usually require the manufacturer to demonstrate that the safety and efficacy of the product remains unchanged by new or additional clinical testing. Recent changes in the regulations for pharmaceutical processing allow broader ranges of process settings to be submitted for regulatory approval, the so-called process design space, which means that a manufacturer can optimize his process within the submitted ranges after the product has entered the market, which allows flexible processes. In this article, the applicability of this concept of the process design space is investigated for the cultivation process step for a vaccine against whooping cough disease. An experimental design (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near infrared spectroscopy, are used to build a descriptive model of the processes used in the experimental design. Finally, the data of all processes are integrated in a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and process design space can be applied for an undefined biological product such as a whole cell vaccine. The approach chosen for model development described here, allows on line monitoring and control of cultivation batches in order to assure in real time that a process is running within the process design space.

  13. Robust mode space approach for atomistic modeling of realistically large nanowire transistors

    Science.gov (United States)

    Huang, Jun Z.; Ilatikhameneh, Hesameddin; Povolotskyi, Michael; Klimeck, Gerhard

    2018-01-01

    Nanoelectronic transistors have reached 3D length scales in which the number of atoms is countable. Truly atomistic device representations are needed to capture the essential functionalities of the devices. Atomistic quantum transport simulations of realistically extended devices are, however, computationally very demanding. The widely used mode space (MS) approach can significantly reduce the numerical cost, but a good MS basis is usually very hard to obtain for atomistic full-band models. In this work, a robust and parallel algorithm is developed to optimize the MS basis for atomistic nanowires. This enables engineering-level, reliable tight binding non-equilibrium Green's function simulation of nanowire metal-oxide-semiconductor field-effect transistor (MOSFET) with a realistic cross section of 10 nm × 10 nm using a small computer cluster. This approach is applied to compare the performance of InGaAs and Si nanowire n-type MOSFETs (nMOSFETs) with various channel lengths and cross sections. Simulation results with full-band accuracy indicate that InGaAs nanowire nMOSFETs have no drive current advantage over their Si counterparts for cross sections up to about 10 nm × 10 nm.

  14. Effective hamiltonian calculations using incomplete model spaces

    International Nuclear Information System (INIS)

    Koch, S.; Mukherjee, D.

    1987-01-01

    It appears that the danger of encountering "intruder states" is substantially reduced if an effective Hamiltonian formalism is developed for incomplete model spaces (IMS). In a Fock-space approach, the proof of a "connected diagram theorem" is fairly straightforward with exponential-type ansätze for the wave-operator W, provided the normalization chosen for W is separable. Operationally, one just needs a suitable categorization of the Fock-space operators into "diagonal" and "non-diagonal" parts, which is a generalization of the corresponding procedure for the complete model space. The formalism is applied to prototypical 2-electron systems. The calculations have been performed on the Cyber 205 super-computer. The authors paid special attention to an efficient vectorization for the construction and solution of the resulting coupled non-linear equations.

  15. Learning the Task Management Space of an Aircraft Approach Model

    Science.gov (United States)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.

  16. Approach to Integrate Global-Sun Models of Magnetic Flux Emergence and Transport for Space Weather Studies

    Science.gov (United States)

    Mansour, Nagi N.; Wray, Alan A.; Mehrotra, Piyush; Henney, Carl; Arge, Nick; Godinez, H.; Manchester, Ward; Koller, J.; Kosovichev, A.; Scherrer, P.

    2013-01-01

    The Sun lies at the center of space weather and is the source of its variability. The primary input to coronal and solar wind models is the activity of the magnetic field in the solar photosphere. Recent advancements in solar observations and numerical simulations provide a basis for developing physics-based models for the dynamics of the magnetic field from the deep convection zone of the Sun to the corona with the goal of providing robust near real-time boundary conditions at the base of space weather forecast models. The goal is to develop new strategic capabilities that enable characterization and prediction of the magnetic field structure and flow dynamics of the Sun by assimilating data from helioseismology and magnetic field observations into physics-based realistic magnetohydrodynamics (MHD) simulations. The integration of first-principle modeling of solar magnetism and flow dynamics with real-time observational data via advanced data assimilation methods is a new, transformative step in space weather research and prediction. This approach will substantially enhance an existing model of magnetic flux distribution and transport developed by the Air Force Research Lab. The development plan is to use the Space Weather Modeling Framework (SWMF) to develop Coupled Models for Emerging flux Simulations (CMES) that couples three existing models: (1) an MHD formulation with the anelastic approximation to simulate the deep convection zone (FSAM code), (2) an MHD formulation with full compressible Navier-Stokes equations and a detailed description of radiative transfer and thermodynamics to simulate near-surface convection and the photosphere (Stagger code), and (3) an MHD formulation with full, compressible Navier-Stokes equations and an approximate description of radiative transfer and heating to simulate the corona (Module in BATS-R-US). CMES will enable simulations of the emergence of magnetic structures from the deep convection zone to the corona. Finally, a plan

  17. Physical models on discrete space and time

    International Nuclear Information System (INIS)

    Lorente, M.

    1986-01-01

    The idea of space and time quantum operators with a discrete spectrum has been proposed frequently since the discovery that some physical quantities exhibit measured values that are multiples of fundamental units. This paper first reviews a number of these physical models. They are: the method of finite elements proposed by Bender et al; the quantum field theory model on discrete space-time proposed by Yamamoto; the finite dimensional quantum mechanics approach proposed by Santhanam et al; the idea of space-time as lattices of n-simplices proposed by Kaplunovsky et al; and the theory of elementary processes proposed by Weizsaecker and his colleagues. The paper then presents a model proposed by the authors and based on the (n+1)-dimensional space-time lattice where fundamental entities interact among themselves 1 to 2n in order to build up a n-dimensional cubic lattice as a ground field where the physical interactions take place. The space-time coordinates are nothing more than the labelling of the ground field and take only discrete values. 11 references

  18. Estimation methods for nonlinear state-space models in ecology

    DEFF Research Database (Denmark)

    Pedersen, Martin Wæver; Berg, Casper Willestofte; Thygesen, Uffe Høgsbro

    2011-01-01

    The use of nonlinear state-space models for analyzing ecological systems is increasing. A wide range of estimation methods for such models are available to ecologists; however, it is not always clear which is the appropriate method to choose. To this end, three approaches to estimation in the theta...... logistic model for population dynamics were benchmarked by Wang (2007). Similarly, we examine and compare the estimation performance of three alternative methods using simulated data. The first approach is to partition the state-space into a finite number of states and formulate the problem as a hidden...... Markov model (HMM). The second method uses the mixed effects modeling and fast numerical integration framework of the AD Model Builder (ADMB) open-source software. The third alternative is to use the popular Bayesian framework of BUGS. The study showed that state and parameter estimation performance...

  19. Simulation of the space debris environment in LEO using a simplified approach

    Science.gov (United States)

    Kebschull, Christopher; Scheidemann, Philipp; Hesselbach, Sebastian; Radtke, Jonas; Braun, Vitali; Krag, H.; Stoll, Enrico

    2017-01-01

    Several numerical approaches exist to simulate the evolution of the space debris environment. These simulations usually rely on the propagation of a large population of objects in order to determine the collision probability for each object. Explosion and collision events are triggered randomly using a Monte-Carlo (MC) approach. So in many different scenarios different objects are fragmented and contribute to a different version of the space debris environment. The results of the single Monte-Carlo runs therefore represent the whole spectrum of possible evolutions of the space debris environment. For the comparison of different scenarios, in general the average of all MC runs together with its standard deviation is used. This method is computationally very expensive due to the propagation of thousands of objects over long timeframes and the application of the MC method. At the Institute of Space Systems (IRAS) a model capable of describing the evolution of the space debris environment has been developed and implemented. The model is based on source and sink mechanisms, where yearly launches as well as collisions and explosions are considered as sources. The natural decay and post mission disposal measures are the only sink mechanisms. This method reduces the computational costs tremendously. In order to achieve this benefit a few simplifications have been applied. The approach of the model partitions the Low Earth Orbit (LEO) region into altitude shells. Only two kinds of objects are considered, intact bodies and fragments, which are also divided into diameter bins. As an extension to a previously presented model the eccentricity has additionally been taken into account with 67 eccentricity bins. While a set of differential equations has been implemented in a generic manner, the Euler method was chosen to integrate the equations for a given time span. For this paper parameters have been derived so that the model is able to reflect the results of the numerical MC
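
    The source/sink idea can be caricatured in a few lines, as shown below: one altitude shell, two object classes, yearly launches as a source, atmospheric decay and collisions as sink and source, integrated with an explicit Euler step. All rates are invented for illustration and are not the calibrated parameters of the model.

```python
# Highly simplified source/sink sketch in the spirit of a shell-based debris
# model: one altitude shell, intacts and fragments, explicit Euler integration.
# Every rate below is an illustrative assumption, not a calibrated value.
import numpy as np

years, dt = 100, 1.0
launch_rate = 50.0            # intact objects added per year
decay_intact = 0.02           # fraction of intacts decaying per year
decay_frag = 0.05             # fraction of fragments decaying per year
collision_coeff = 1e-7        # collisions per year per object pair (toy value)
frags_per_collision = 500.0   # fragments generated per catastrophic collision

intacts, frags = 3000.0, 10000.0
for _ in range(int(years / dt)):
    total = intacts + frags
    collisions = collision_coeff * intacts * total       # catastrophic collisions per year
    d_intacts = launch_rate - decay_intact * intacts - collisions
    d_frags = frags_per_collision * collisions - decay_frag * frags
    intacts += dt * d_intacts                             # explicit Euler step
    frags += dt * d_frags

print(f"after {years} years: {intacts:.0f} intacts, {frags:.0f} fragments")
```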

  20. Study of Modern Approach to Build the Functional Models of Managerial and Engineering Systems in Training Specialists for Space Industry

    Directory of Open Access Journals (Sweden)

    N. V. Arhipova

    2016-01-01

    Full Text Available The SM8 Chair at Bauman Moscow State Technical University (BMSTU) trains specialists majoring not only in design and manufacture, but also in the operation and maintenance of space ground-based infrastructure. The learning courses in design, production, and operation of components of missile and space technology give much prominence to modeling. The same attention should be given to the modeling of the managerial and engineering systems that both an expert and a team leader deal with. It is important to choose the modeling tools for the managerial and engineering systems with which they are to work and to learn how to apply these tools. The study of a modern approach to functional modeling of managerial and engineering systems is held in the format of a business game in a laboratory class. The structural analysis and design technique IDEF0 is considered as the means of modeling. The article stresses the advantages of the IDEF0 approach, namely: a comprehensible graphical language, applicability to modeling managerial and engineering systems of all types and at all levels of hierarchy, popularity, version control means, and teamwork tools. Moreover, IDEF0 allows us to illustrate such notions as point of view, system boundaries, structure, control and feedback as applied to managerial and engineering systems. The article offers a modified procedure for creating an IDEF0 model in the context of a training session. It also suggests a step-by-step procedure for the instruction session, for student self-training to earn study credits, and for the defense of the work (final test). The approach under consideration can be applied to other training courses; the article supports this with information about positive experience of its application.

  1. A Reparametrization Approach for Dynamic Space-Time Models

    OpenAIRE

    Lee, Hyeyoung; Ghosh, Sujit K.

    2008-01-01

    Researchers in diverse areas such as environmental and health sciences are increasingly working with data collected across space and time. The space-time processes that are generally used in practice are often complicated in the sense that the auto-dependence structure across space and time is non-trivial, often non-separable and non-stationary in space and time. Moreover, the dimension of such data sets across both space and time can be very large leading to computational difficulties due to...

  2. A real-space stochastic density matrix approach for density functional electronic structure.

    Science.gov (United States)

    Beck, Thomas L

    2015-12-21

    The recent development of real-space grid methods has led to more efficient, accurate, and adaptable approaches for large-scale electrostatics and density functional electronic structure modeling. With the incorporation of multiscale techniques, linear-scaling real-space solvers are possible for density functional problems if localized orbitals are used to represent the Kohn-Sham energy functional. These methods still suffer from high computational and storage overheads, however, due to extensive matrix operations related to the underlying wave function grid representation. In this paper, an alternative stochastic method is outlined that aims to solve directly for the one-electron density matrix in real space. In order to illustrate aspects of the method, model calculations are performed for simple one-dimensional problems that display some features of the more general problem, such as spatial nodes in the density matrix. This orbital-free approach may prove helpful considering a future involving increasingly parallel computing architectures. Its primary advantage is the near-locality of the random walks, allowing for simultaneous updates of the density matrix in different regions of space partitioned across the processors. In addition, it allows for testing and enforcement of the particle number and idempotency constraints through stabilization of a Feynman-Kac functional integral as opposed to the extensive matrix operations in traditional approaches.

  3. A state space approach for the eigenvalue problem of marine risers

    KAUST Repository

    Alfosail, Feras; Nayfeh, Ali H.; Younis, Mohammad I.

    2017-01-01

    A numerical state-space approach is proposed to examine the natural frequencies and critical buckling limits of marine risers. A large axial tension in the riser model causes numerical limitations. These limitations are overcome by using the modified Gram–Schmidt orthonormalization process as an intermediate step during the numerical integration process with the fourth-order Runge–Kutta scheme.

  4. Reliability modeling of a hard real-time system using the path-space approach

    International Nuclear Information System (INIS)

    Kim, Hagbae

    2000-01-01

    A hard real-time system, such as a fly-by-wire system, fails catastrophically (e.g. losing stability) if its control inputs are not updated by its digital controller computer within a certain timing constraint called the hard deadline. To assess and validate those systems' reliabilities by using a semi-Markov model that explicitly contains the deadline information, we propose a path-space approach deriving the upper and lower bounds of the probability of system failure. These bounds are derived by using only simple parameters, and they are especially suitable for highly reliable systems which should recover quickly. Analytical bounds are derived for the commonly encountered exponential and Weibull failure distributions, and are shown to be effective through numerical examples, while considering three repair strategies: repair-as-good-as-new, repair-as-good-as-old, and repair-better-than-old
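
    For reference, in our notation rather than the record's, the two failure-time models named above have the standard reliability (survival) functions

```latex
\[
R_{\mathrm{exp}}(t) = e^{-\lambda t}, \qquad
R_{\mathrm{Weibull}}(t) = e^{-(t/\eta)^{\beta}}
\]
```

    with constant failure rate \lambda in the exponential case, and scale \eta and shape \beta in the Weibull case; the path-space bounds are evaluated under both.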

  5. An Effective Approach Control Scheme for the Tethered Space Robot System

    Directory of Open Access Journals (Sweden)

    Zhongjie Meng

    2014-09-01

    Full Text Available The tethered space robot system (TSR, which is composed of a platform, a gripper and a space tether, has great potential in future space missions. Given the relative motion among the platform, tether, gripper and the target, an integrated approach model is derived. Then, a novel coordinated approach control scheme is presented, in which the tether tension, thrusters and the reaction wheel are all utilized. It contains the open-loop trajectory optimization, the feedback trajectory control and attitude control. The numerical simulation results show that the rendezvous between TSR and the target can be realized by the proposed coordinated control scheme, and the propellant consumption is efficiently reduced. Moreover, the control scheme performs well in the presence of the initial state's perturbations, actuator characteristics and sensor errors.

  6. Coset Space Dimensional Reduction approach to the Standard Model

    International Nuclear Information System (INIS)

    Farakos, K.; Kapetanakis, D.; Koutsoumbas, G.; Zoupanos, G.

    1988-01-01

    We present a unified theory in ten dimensions based on the gauge group E8, which is dimensionally reduced to the Standard Model SU(3)c × SU(2)L × U(1), which breaks further spontaneously to SU(3)c × U(1)em. The model gives similar predictions for sin²θW and proton decay as the minimal SU(5) G.U.T., while a natural choice of the coset space radii predicts light Higgs masses a la Coleman-Weinberg

  7. Representative Model of the Learning Process in Virtual Spaces Supported by ICT

    Science.gov (United States)

    Capacho, José

    2014-01-01

    This paper shows the results of research activities for building a representative model of the learning process in virtual spaces (e-Learning). The formal bases of the model are supported by the analysis of models of learning assessment in virtual spaces, and specifically by Dembo's teaching-learning model, the systemic approach to evaluating…

  8. Approaching control for tethered space robot based on disturbance observer using super twisting law

    Science.gov (United States)

    Hu, Yongxin; Huang, Panfeng; Meng, Zhongjie; Wang, Dongke; Lu, Yingbo

    2018-05-01

    Approaching control is a key mission for the tethered space robot (TSR) when performing the task of removing space debris. However, uncertainties of the TSR, such as changes in model parameters, have an important effect on the approaching mission. Considering the space tether and the attitude of the gripper, the dynamic model of the TSR is derived using the Lagrange method. Then a disturbance observer is designed to estimate the uncertainty based on the super-twisting (STW) control method. Using the disturbance observer, a controller is designed, and its performance is compared with that of a dynamic inverse controller; the comparison shows that the proposed controller performs better. Numerical simulation validates the feasibility of the proposed controller for position and attitude tracking of the TSR.

  9. Grms or graphical representation of model spaces. Vol. I Basics

    International Nuclear Information System (INIS)

    Duch, W.

    1986-01-01

    This book presents a novel approach to the many-body problem in quantum chemistry, nuclear shell theory and solid-state theory. Many-particle model spaces are visualized using graphs, each path of a graph labeling a single basis function or a subspace of functions. Spaces of very high dimension are represented by small graphs. Model spaces have a structure that is reflected in the architecture of the corresponding graphs, which in turn is reflected in the structure of the matrices corresponding to operators acting in these spaces. Insight into this structure leads to the formulation of very efficient computer algorithms. Calculation of matrix elements is reduced to comparison of paths in a graph, without ever looking at the functions themselves. Using only very rudimentary mathematical tools, graphical rules for matrix element calculation in abelian cases are derived; in particular, the segmentation rules obtained in the unitary group approach are rederived. The graphs are solutions of Diophantine equations of the type appearing in different branches of applied mathematics. Graphical representation of model spaces should find as many applications as have been found for diagrammatic methods in perturbation theory

  10. Representing a Model Using Data Mining Approach for Maximizing Profit with Considering Product Assortment and Space Allocation Decisions

    Directory of Open Access Journals (Sweden)

    Manoochehr Ansari

    2016-12-01

    Full Text Available The choice of which products to stock among numerous competing products, and how much space to allocate to those products, are central decisions for retailers. This study applied a data mining approach to extract the needed information from large datasets of sales transactions, in order to find the relations between products and to form product assortments. On this basis, we present a model for product assortment and space allocation. The research population was the transactional data of a store; the sample comprised the transactional data of a one-month period in the time series. Data were collected in October and November 2015 from the Shaghayegh store. 525 transactions involving 79 different products were analyzed. Based on the results, 10 product assortments were formed, although some products were allocated to more than one product category. By solving the profit equation and finding volume increase indices, we allocated space to each product assortment.
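
    A hypothetical sketch of the pairwise co-occurrence step behind this kind of assortment building; the basket contents and the support threshold below are invented for illustration and are not taken from the study.

```python
# Hypothetical co-occurrence mining sketch: count how often product pairs appear
# together and keep pairs above a support threshold as assortment candidates.
# Products, baskets and the threshold are invented for the example.
from itertools import combinations
from collections import Counter

transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
]

min_support = 0.5  # minimum fraction of transactions containing the pair
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

n = len(transactions)
candidates = {pair: c / n for pair, c in pair_counts.items() if c / n >= min_support}
print(candidates)  # pairs of products frequently bought together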

  11. Elements of a pragmatic approach for dealing with bias and uncertainty in experiments through predictions: experiment design and data conditioning; "real space" model validation and conditioning; hierarchical modeling and extrapolative prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose

    2011-11-01

    This report explores some important considerations in devising a practical and consistent framework and methodology for utilizing experiments and experimental data to support modeling and prediction. A pragmatic and versatile 'Real Space' approach is outlined for confronting experimental and modeling bias and uncertainty to mitigate risk in modeling and prediction. The elements of experiment design and data analysis, data conditioning, model conditioning, model validation, hierarchical modeling, and extrapolative prediction under uncertainty are examined. An appreciation can be gained for the constraints and difficulties at play in devising a viable end-to-end methodology. Rationale is given for the various choices underlying the Real Space end-to-end approach. The approach adopts and refines some elements and constructs from the literature and adds pivotal new elements and constructs. Crucially, the approach reflects a pragmatism and versatility derived from working many industrial-scale problems involving complex physics and constitutive models, steady-state and time-varying nonlinear behavior and boundary conditions, and various types of uncertainty in experiments and models. The framework benefits from a broad exposure to integrated experimental and modeling activities in the areas of heat transfer, solid and structural mechanics, irradiated electronics, and combustion in fluids and solids.

  12. Optimal Decision-Making in Fuzzy Economic Order Quantity (EOQ) Model under Restricted Space: A Non-Linear Programming Approach

    Directory of Open Access Journals (Sweden)

    M. Pattnaik

    2013-08-01

    Full Text Available In this paper the concept of fuzzy non-linear programming is applied to solve an economic order quantity (EOQ) model under restricted space. Since various types of uncertainty and imprecision are inherent in real inventory problems, they are classically modeled using approaches from probability theory. However, there are uncertainties that cannot be appropriately treated by the usual probabilistic models, which raises the questions of how to define inventory optimization tasks in such an environment and how to interpret optimal solutions. This paper modifies the single-item EOQ model in the presence of a fuzzy decision-making process, where demand is related to the unit price and the setup cost varies with the quantity produced/purchased. It considers the modification of the objective function and storage area in the presence of imprecisely estimated parameters. The model is developed for the problem by employing different modeling approaches over an infinite planning horizon. It incorporates all concepts of a fuzzy arithmetic approach to the quantity ordered and the demand per unit, and compares both fuzzy non-linear and other models. Investigation of the properties of an optimal solution allows an algorithm to be developed whose validity is illustrated through an example problem; using MATLAB (R2009a) software, two- and three-dimensional diagrams are presented for the application. Sensitivity analysis of the optimal solution is also studied with respect to changes in different parameter values, to draw managerial insights into the decision problem.
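
    For orientation, the crisp (non-fuzzy) starting point of such models is the textbook EOQ cost function and optimum; the symbols below are the usual ones and are not taken from the record.

```latex
\[
TC(Q) = \frac{D}{Q} K + \frac{h Q}{2}, \qquad
Q^{*} = \sqrt{\frac{2 D K}{h}}
\]
```

    Here D is the demand rate, K the setup/ordering cost per order, and h the holding cost per unit per period; the paper fuzzifies these inputs and adds a storage-space constraint.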

  13. A reference model for space data system interconnection services

    Science.gov (United States)

    Pietras, John; Theis, Gerhard

    1993-01-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open Systems Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  14. Requirements for high level models supporting design space exploration in model-based systems engineering

    NARCIS (Netherlands)

    Haveman, Steven; Bonnema, Gerrit Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few effective approaches exist that can effectively tie these lower level models to a high level system model during design space exploration. This complicates the validation of high level system requirements during detailed design.

  15. 3rd International Conference on Particle Physics Beyond the Standard Model : Accelerator, Non-Accelerator and Space Approaches

    CERN Document Server

    Beyond The Desert 2002

    2003-01-01

    The third conference on particle physics beyond the Standard Model (BEYOND THE DESERT'02 - Accelerator, Non-accelerator and Space Approaches) was held during 2-7 June 2002 in the Finnish town of Oulu, almost at the Arctic Circle. It was the first of the BEYOND conference series held outside Germany (CERN Courier March 2003, pp. 29-30). Traditionally the scientific programme of the BEYOND conferences, brought to life in 1997 (see CERN Courier, November 1997, pp. 16-18), covers almost all topics of modern particle physics (see contents).

  16. Space-time modeling of electricity spot prices

    DEFF Research Database (Denmark)

    Abate, Girum Dagnachew; Haldrup, Niels

    In this paper we derive a space-time model for electricity spot prices. A general spatial Durbin model that incorporates the temporal as well as spatial lags of spot prices is presented. Joint modeling of space-time effects is necessarily important when prices and loads are determined in a network...... in the spot price dynamics. Estimation of the spatial Durbin model shows that the spatial lag variable is as important as the temporal lag variable in describing the spot price dynamics. We use the partial derivatives impact approach to decompose the price impacts into direct and indirect effects and we show...... that price effects transmit to neighboring markets and decline with distance. In order to examine the evolution of the spatial correlation over time, a time-varying-parameter spot price spatial Durbin model is estimated using recursive estimation. It is found that the spatial correlation within the Nord...
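
    A generic dynamic spatial Durbin specification of the kind described reads, in our notation (the record's exact regressors may differ):

```latex
\[
p_t = \tau\, p_{t-1} + \rho\, W p_t + \gamma\, W p_{t-1} + X_t \beta + \varepsilon_t
\]
```

    Here p_t stacks the area spot prices at time t, W is the spatial weight matrix, \tau and \rho are the temporal and spatial lag coefficients, and the W p_{t-1} term carries the space-time diffusion that the direct/indirect effect decomposition unpacks.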

  17. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-Level Rule-Based Models in Cell Biology.

    Science.gov (United States)

    Bittig, Arne T; Uhrmacher, Adelinde M

    2017-01-01

    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial stochastic simulation algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt the spatial resolution of models.
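
    As a point of reference for the stochastic layer, here is a minimal, non-spatial Gillespie simulation of a single reaction A + B -> C; it is only a toy illustration of the stochastic simulation algorithm that spatial variants such as ML-Space extend, with arbitrary rate constant and counts.

```python
# Toy, non-spatial Gillespie SSA for the single reaction A + B -> C.
# Rate constant, initial counts and end time are arbitrary example values.
import random

def gillespie(a=100, b=80, c=0, k=0.01, t_end=5.0):
    t = 0.0
    while t < t_end:
        propensity = k * a * b               # only one reaction channel here
        if propensity == 0.0:
            break                            # no reactants left
        t += random.expovariate(propensity)  # exponential waiting time
        if t > t_end:
            break                            # next event would fall past the horizon
        a, b, c = a - 1, b - 1, c + 1        # fire A + B -> C once
    return t, a, b, c

print(gillespie())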

  18. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    Science.gov (United States)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system-level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a test program, with a facility at JPL, for testing system parts and subsystems. The magnetic modeling, simulation and analysis capability was set up and carried out by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to, or in lieu of, magnetic tests. Because of the sensitive nature of the fields and particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not magnetically contaminated by flight system magnetic interference. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a cost-effective path to a magnetically clean spacecraft. This paper presents lessons learned from the JUNO magnetic testing approach and from Aerospace's modeling, simulation and analysis activities used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.

  19. Modeling and Analysis of Space Based Transceivers

    Science.gov (United States)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  20. Dynamical modeling approach to risk assessment for radiogenic leukemia among astronauts engaged in interplanetary space missions.

    Science.gov (United States)

    Smirnova, Olga A; Cucinotta, Francis A

    2018-02-01

    A recently developed, biologically motivated dynamical model for the assessment of the excess relative risk (ERR) of radiogenic leukemia among acutely/continuously irradiated humans (Smirnova, 2015, 2017) is applied to estimate the ERR of radiogenic leukemia among astronauts engaged in long-term interplanetary space missions. Numerous scenarios of space radiation exposure during space missions are used in the modeling studies. The dependence of the ERR for leukemia among astronauts on several mission parameters, including the dose equivalent rates of galactic cosmic rays (GCR) and large solar particle events (SPEs), the number of large SPEs, the time interval between SPEs, mission duration, the degree of the astronauts' additional shielding during SPEs, the degree of their additional 12-hour daily shielding, as well as the total mission dose equivalent, is examined. The results of the estimation of ERR for radiogenic leukemia among astronauts, obtained in the framework of the developed dynamical model for various scenarios of space radiation exposure, are compared with the corresponding results computed by the commonly used linear model. It is shown that the developed dynamical model, like the linear model, can be applied to estimate the ERR for radiogenic leukemia among astronauts engaged in long-term interplanetary space missions within the range of applicability of the latter. In turn, the developed dynamical model is capable of predicting the ERR for leukemia among astronauts for irradiation regimes beyond the applicability range of the linear model, in emergency cases. As a supplement to the estimations of cancer incidence and death (REIC and REID) (Cucinotta et al., 2013, 2017), the developed dynamical model for the assessment of the ERR for leukemia can be employed in the pre-mission design phase for, e.g., the optimization of the regimes of astronauts' additional shielding in the course of interplanetary space missions. The developed model can

  1. A state space approach for the eigenvalue problem of marine risers

    KAUST Repository

    Alfosail, Feras

    2017-10-05

    A numerical state-space approach is proposed to examine the natural frequencies and critical buckling limits of marine risers. A large axial tension in the riser model causes numerical limitations. These limitations are overcome by using the modified Gram–Schmidt orthonormalization process as an intermediate step during the numerical integration process with the fourth-order Runge–Kutta scheme. The obtained results are validated against those obtained with other numerical methods, such as the finite-element, Galerkin, and power-series methods, and are found to be in good agreement. The state-space approach is shown to be computationally more efficient than the other methods. Also, we investigate the effect of a high applied tension, a high apparent weight, and higher-order modes on the accuracy of the numerical scheme. We demonstrate that, by applying the orthonormalization process, the stability and convergence of the approach are significantly improved.
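
    A small sketch of a modified Gram-Schmidt orthonormalization pass, of the kind that can be interleaved with the Runge-Kutta sweeps to keep the columns of the solution matrix numerically independent under large axial tension; the input vectors below are placeholders, not riser solutions.

```python
# Modified Gram-Schmidt orthonormalization of the columns of a matrix.
# The placeholder columns are nearly parallel, mimicking the loss of
# independence that large axial tension causes during integration.
import numpy as np

def modified_gram_schmidt(v):
    """Return a matrix whose columns are an orthonormalized version of v's columns."""
    q = np.array(v, dtype=float)
    for j in range(q.shape[1]):
        for i in range(j):                      # subtract components along earlier columns
            q[:, j] -= np.dot(q[:, i], q[:, j]) * q[:, i]
        q[:, j] /= np.linalg.norm(q[:, j])      # normalize
    return q

v = np.array([[1.0, 1.0],
              [1e-8, 0.0],
              [0.0, 1e-8]])
print(modified_gram_schmidt(v))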

  2. Contaminant ingress into multizone buildings: An analytical state-space approach

    KAUST Repository

    Parker, Simon

    2013-08-13

    The ingress of exterior contaminants into buildings is often assessed by treating the building interior as a single well-mixed space. Multizone modelling provides an alternative way of representing buildings that can estimate concentration time series in different internal locations. A state-space approach is adopted to represent the concentration dynamics within multizone buildings. Analysis based on this approach is used to demonstrate that the exposure in every interior location is limited to the exterior exposure in the absence of removal mechanisms. Estimates are also developed for the short term maximum concentration and exposure in a multizone building in response to a step-change in concentration. These have considerable potential for practical use. The analytical development is demonstrated using a simple two-zone building with an inner zone and a range of existing multizone models of residential buildings. Quantitative measures are provided of the standard deviation of concentration and exposure within a range of residential multizone buildings. Ratios of the maximum short term concentrations and exposures to single zone building estimates are also provided for the same buildings. © 2013 Tsinghua University Press and Springer-Verlag Berlin Heidelberg.
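
    A minimal sketch, in our notation, of the state-space form behind such multizone analyses: with the zone concentrations stacked in c(t) and a single exterior concentration c_ext(t),

```latex
\[
\dot{c}(t) = A\, c(t) + b\, c_{\mathrm{ext}}(t), \qquad
c(t) = e^{A t} c(0) + \int_0^{t} e^{A (t-s)}\, b\, c_{\mathrm{ext}}(s)\, ds
\]
```

    where A collects the interzonal and exterior air-exchange rates; bounds on concentration and exposure of the kind quoted above follow from the properties of this linear system.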

  3. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows the maximum entropy method to be used to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
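
    As a reminder of the maximum entropy step in generic form (not the record's specific constraints): maximizing the entropy over probabilities p_i subject to normalization and to fixed averages of functions f_k gives

```latex
\[
p_i = \frac{1}{Z} \exp\!\Big( -\sum_k \lambda_k f_k(i) \Big), \qquad
Z = \sum_i \exp\!\Big( -\sum_k \lambda_k f_k(i) \Big)
\]
```

    with the Lagrange multipliers \lambda_k fixed by the constraints; different choices of propositions and constraints are what connect this generic form to the Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein distributions mentioned above.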

  4. Modal-space reference-model-tracking fuzzy control of earthquake excited structures

    Science.gov (United States)

    Park, Kwan-Soon; Ok, Seung-Yong

    2015-01-01

    This paper describes an adaptive modal-space reference-model-tracking fuzzy control technique for the vibration control of earthquake-excited structures. In the proposed approach, fuzzy logic is introduced to update the optimal control force so that the controlled structural response can track the desired response of a reference model. For easy and practical implementation, the reference model is constructed by assigning the target damping ratios to the first few dominant modes in modal space. The numerical simulation results demonstrate that the proposed approach achieves not only adaptive fault-tolerant control against partial actuator failures but also robust performance against variations of the uncertain system properties, by redistributing the feedback control forces to the available actuators.

  5. Robust control of uncertain dynamic systems a linear state space approach

    CERN Document Server

    Yedavalli, Rama K

    2014-01-01

    This textbook aims to provide a clear understanding of the various tools of analysis and design for robust stability and performance of uncertain dynamic systems. In model-based control design and analysis, mathematical models can never completely represent the “real world” system that is being modeled, and thus it is imperative to incorporate and accommodate a level of uncertainty into the models. This book directly addresses these issues from a deterministic uncertainty viewpoint and focuses on the interval parameter characterization of uncertain systems. Various tools of analysis and design are presented in a consolidated manner. This volume fills a current gap in published works by explicitly addressing the subject of control of dynamic systems from a linear state space framework, namely using a time-domain, matrix-theory-based approach. This book also: Presents and formulates the robustness problem in a linear state space model framework; Illustrates various systems level methodologies with examples and...

  6. An Approach to Quality Estimation in Model-Based Development

    DEFF Research Database (Denmark)

    Holmegaard, Jens Peter; Koch, Peter; Ravn, Anders Peter

    2004-01-01

    We present an approach to estimation of parameters for design space exploration in Model-Based Development, where synthesis of a system is done in two stages. Component qualities like space, execution time or power consumption are defined in a repository by platform dependent values. Connectors...

  7. Approach to developing reliable space reactor power systems

    International Nuclear Information System (INIS)

    Mondt, J.F.; Shinbrot, C.H.

    1991-01-01

    The Space Reactor Power System Project is in the engineering development phase of a three-phase program. During Phase II, the Engineering Development Phase, the SP-100 Project has defined and is pursuing a new approach to developing reliable power systems. The approach to developing such a system during the early technology phase is described in this paper along with some preliminary examples to help explain the approach. Developing reliable components to meet space reactor power system requirements is based on a top down systems approach which includes a point design based on a detailed technical specification of a 100 kW power system

  8. Discrete- vs. Continuous-Time Modeling of Unequally Spaced Experience Sampling Method Data

    Directory of Open Access Journals (Sweden)

    Silvia de Haan-Rietdijk

    2017-10-01

    Full Text Available The Experience Sampling Method is a common approach in psychological research for collecting intensive longitudinal data with high ecological validity. One characteristic of ESM data is that it is often unequally spaced, because the measurement intervals within a day are deliberately varied, and measurement continues over several days. This poses a problem for discrete-time (DT) modeling approaches, which are based on the assumption that all measurements are equally spaced. Nevertheless, DT approaches such as (vector) autoregressive modeling are often used to analyze ESM data, for instance in the context of affective dynamics research. There are equivalent continuous-time (CT) models, but they are more difficult to implement. In this paper we take a pragmatic approach and evaluate the practical relevance of the violated model assumption in DT AR(1) and VAR(1) models, for the N = 1 case. We use simulated data under an ESM measurement design to investigate the bias in the parameters of interest under four different model implementations, ranging from the true CT model that accounts for all the exact measurement times, to the crudest possible DT model implementation, where even the nighttime is treated as a regular interval. An analysis of empirical affect data illustrates how the differences between DT and CT modeling can play out in practice. We find that the size and the direction of the bias in DT (V)AR models for unequally spaced ESM data depend quite strongly on the true parameter, in addition to data characteristics. Our recommendation is to use CT modeling whenever possible, especially now that new software implementations have become available.
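
    The CT-DT link that drives this comparison can be sketched for a univariate process (a standard result, in our notation): a continuous-time first-order process with mean-reversion rate \theta observed after an interval \Delta t implies the discrete-time autoregressive coefficient

```latex
\[
\phi(\Delta t) = e^{-\theta\, \Delta t}
\]
```

    so a DT AR(1) model that assumes one common \phi forces the same coefficient onto observations whose true lagged effect shrinks as the interval grows, which is exactly the assumption violated by unequally spaced ESM data.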

  9. State Space Modeling Using SAS

    Directory of Open Access Journals (Sweden)

    Rajesh Selukar

    2011-05-01

    Full Text Available This article provides a brief introduction to the state space modeling capabilities in SAS, a well-known statistical software system. SAS provides state space modeling in a few different settings. SAS/ETS, the econometric and time series analysis module of the SAS system, contains many procedures that use state space models to analyze univariate and multivariate time series data. In addition, SAS/IML, an interactive matrix language in the SAS system, provides Kalman filtering and smoothing routines for stationary and nonstationary state space models. SAS/IML also provides support for linear algebra and nonlinear function optimization, which makes it a convenient environment for general-purpose state space modeling.
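
    To make the filtering idea concrete, a generic scalar Kalman filter for a simple AR(1)-plus-noise state space model is sketched below; it is only an illustration of the recursion, not the SAS routines, and every number is made up.

```python
# Generic scalar Kalman filter for the state space model
#   x_t = phi * x_{t-1} + w_t,  w_t ~ N(0, q)   (state equation)
#   y_t = x_t + v_t,            v_t ~ N(0, r)   (observation equation)
import numpy as np

def kalman_filter(y, phi=0.9, q=1.0, r=0.5, x0=0.0, p0=10.0):
    x, p = x0, p0
    filtered = []
    for obs in y:
        # prediction step
        x_pred = phi * x
        p_pred = phi * p * phi + q
        # update step
        k = p_pred / (p_pred + r)          # Kalman gain
        x = x_pred + k * (obs - x_pred)
        p = (1.0 - k) * p_pred
        filtered.append(x)
    return np.array(filtered)

print(kalman_filter([1.2, 0.8, 1.5, 2.0, 1.7]))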

  10. Space Environment Modeling

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Collection includes presentation materials and outputs from operational space environment models produced by the NOAA Space Weather Prediction Center (SWPC) and...

  11. Space-time latent component Modeling of Geo-referenced health data

    OpenAIRE

    Lawson, Andrew B.; Song, Hae-Ryoung; Cai, Bo; Hossain, Md Monir; Huang, Kun

    2010-01-01

    Latent structure models have been proposed in many applications. For space-time health data it is often important to be able to find underlying trends in time which are supported by subsets of small areas. Latent structure modeling is one approach to this analysis. This paper presents a mixture-based approach that can be applied to component selection. The analysis of a Georgia ambulatory asthma county-level data set is presented and a simulation-based evaluation is made.

  12. Accounting for sampling error when inferring population synchrony from time-series data: a Bayesian state-space modelling approach with applications.

    Directory of Open Access Journals (Sweden)

    Hugues Santin-Janin

    Full Text Available BACKGROUND: Data collected to inform time variations in natural population size are tainted by sampling error. Ignoring sampling error in population dynamics models induces bias in parameter estimators, e.g., density-dependence. In particular, when sampling errors are independent among populations, the classical estimator of the synchrony strength (zero-lag correlation) is biased downward. However, this bias is rarely taken into account in synchrony studies, although it may lead to overemphasizing the role of intrinsic factors (e.g., dispersal) with respect to extrinsic factors (the Moran effect) in generating population synchrony, as well as to underestimating the extinction risk of a metapopulation. METHODOLOGY/PRINCIPAL FINDINGS: The aim of this paper was first to illustrate the extent of the bias that can be encountered in empirical studies when sampling error is neglected. Second, we present a state-space modelling approach that explicitly accounts for sampling error when quantifying population synchrony. Third, we exemplify our approach with datasets for which sampling variance (i) has been previously estimated, and (ii) has to be jointly estimated with population synchrony. Finally, we compare our results to those of a standard approach neglecting sampling variance. We show that ignoring sampling variance can mask a synchrony pattern whatever its true value, and that the common practice of averaging a few replicates of population size estimates performs poorly at decreasing the bias of the classical estimator of the synchrony strength. CONCLUSION/SIGNIFICANCE: The state-space model used in this study provides a flexible way of accurately quantifying the strength of synchrony patterns from most population size data encountered in field studies, including over-dispersed count data. We provide a user-friendly R-program and a tutorial example to encourage further studies aiming at quantifying the strength of population synchrony to account for
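
    A minimal version of this kind of formulation, in our notation rather than the record's: the survey index y is the latent log-abundance x plus sampling error,

```latex
\[
y_{i,t} = x_{i,t} + \varepsilon_{i,t}, \quad \varepsilon_{i,t} \sim N(0, \sigma^2_{\mathrm{obs}}), \qquad
x_{i,t+1} = x_{i,t} + r_i + \eta_{i,t}
\]
```

    where i indexes populations and the cross-population correlation of the process noises \eta_{i,t}, rather than of the observed y, is what measures synchrony free of sampling-error attenuation.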

  13. Model-Based Trade Space Exploration for Near-Earth Space Missions

    Science.gov (United States)

    Cohen, Ronald H.; Boncyk, Wayne; Brutocao, James; Beveridge, Iain

    2005-01-01

    We developed a capability for model-based trade space exploration to be used in the conceptual design of Earth-orbiting space missions. We have created a set of reusable software components to model various subsystems and aspects of space missions. Several example mission models were created to test the tools and process. This technique and toolset has demonstrated itself to be valuable for space mission architectural design.

  14. A perturbative approach to the redshift space power spectrum: beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Bose, Benjamin; Koyama, Kazuya, E-mail: benjamin.bose@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk [Institute of Cosmology and Gravitation, University of Portsmouth, Burnaby Road, Portsmouth, Hampshire, PO1 3FX (United Kingdom)

    2016-08-01

    We develop a code to produce the power spectrum in redshift space based on standard perturbation theory (SPT) at 1-loop order. The code can be applied to a wide range of modified gravity and dark energy models using a recently proposed numerical method by A. Taruya to find the SPT kernels. This includes Horndeski's theory with a general potential, which accommodates both chameleon and Vainshtein screening mechanisms and provides a non-linear extension of the effective theory of dark energy up to the third order. Focus is on a recent non-linear model of the redshift space power spectrum which has been shown to model the anisotropy very well at relevant scales for the SPT framework, as well as capturing relevant non-linear effects typical of modified gravity theories. We provide consistency checks of the code against established results and elucidate its application within the light of upcoming high precision RSD data.
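
    For orientation, the linear-theory limit that such 1-loop SPT models extend is the Kaiser formula (standard notation, not the code's full expression):

```latex
\[
P_s(k, \mu) = \big( b + f \mu^2 \big)^2 P_{\mathrm{lin}}(k)
\]
```

    where \mu is the cosine of the angle to the line of sight, b the linear bias and f the linear growth rate; in many modified gravity models f can become scale dependent, which is part of what the non-linear treatment has to capture.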

  15. Constructive approaches to the space NPP designing

    International Nuclear Information System (INIS)

    Eremin, A.G.; Korobkov, L.S.; Matveev, A.V.; Trukhanov, Yu.L.; Pyshko, A.P.

    2000-01-01

    An example of designing a space NPP intended for the power supply of a telecommunication satellite is considered. It is shown that the design approach based on the introduction of a leading criterion and the division of the design problems into two independent groups (reactor with radiation shield, and equipment module) permits the development of an optimal space NPP design

  16. Computational Fluid Dynamics Model for Saltstone Vault 4 Vapor Space

    International Nuclear Information System (INIS)

    Lee, Si Young

    2005-01-01

    Computational fluid dynamics (CFD) methods have been used to estimate the flow patterns in the vapor space inside Saltstone Vault No. 4 under different operating scenarios. The purpose of this work is to examine the gas motions inside the vapor space under the current vault configurations. The CFD model took a three-dimensional, transient, momentum-energy coupled approach for the vapor space domain of the vault. The modeling calculations were based on prototypic vault geometry and expected normal operating conditions as defined by Waste Solidification Engineering. The modeling analysis was focused on the air flow patterns near the ventilated corner zones of the vapor space inside the Saltstone vault. The turbulence behavior and natural convection mechanism used in the present model were benchmarked against the literature and theoretical results. The verified model was applied to the Saltstone vault geometry for the transient assessment of the air flow patterns inside the vapor space of the vault region, using the boundary conditions provided by the customer. The present model considered two cases for the estimation of the flow patterns within the vapor space: a reference baseline case, and a case with a negative temperature gradient between the inner roof and top grout surface temperatures, intended as a potential bounding condition. The flow patterns of the vapor space calculated by the CFD model demonstrate that ambient air enters the vapor space of the vault through the lower-end ventilation hole and is heated by a Bénard-cell-type circulation before leaving the vault via the higher-end ventilation hole. The calculated results are consistent with the literature.

  17. A Description Of Space Relations In An NLP Model: The ABBYY Compreno Approach

    Directory of Open Access Journals (Sweden)

    Aleksey Leontyev

    2015-12-01

    Full Text Available The current paper is devoted to a formal analysis of the space category and, especially, to questions connected with the representation of space relations in a formal NLP model. The aim is to demonstrate how linguistic and cognitive problems relating to spatial categorization, the definition of spatial entities, and the expression of different locative senses in natural languages can be solved in an artificial intelligence system. We offer a description of the locative groups in the ABBYY Compreno formalism – an integral NLP framework applied to machine translation, semantic search, fact extraction, and other tasks based on the semantic analysis of texts. The model is based on a universal semantic hierarchy of the thesaurus type and includes a description of all the semantic and syntactic links every word can attach. In this work we define the set of semantic locative relations between words, suggest different tools for their syntactic presentation, give formal restrictions on the word classes that can denote spaces, and show different strategies for dealing with locative prepositions, especially as far as the problem of their machine translation is concerned.

  18. Review of NASA approach to space radiation risk assessments for Mars exploration.

    Science.gov (United States)

    Cucinotta, Francis A

    2015-02-01

    Long duration space missions present unique radiation protection challenges due to the complexity of the space radiation environment, which includes high charge and energy particles and other highly ionizing radiation such as neutrons. Based on a recommendation by the National Council on Radiation Protection and Measurements, a 3% lifetime risk of exposure-induced death for cancer has been used as a basis for risk limitation by the National Aeronautics and Space Administration (NASA) for low-Earth orbit missions. NASA has developed a risk-based approach to radiation exposure limits that accounts for individual factors (age, gender, and smoking history) and assesses the uncertainties in risk estimates. New radiation quality factors with associated probability distribution functions to represent the quality factor's uncertainty have been developed based on track structure models and recent radiobiology data for high charge and energy particles. The current radiation dose limits are reviewed for spaceflight and the various qualitative and quantitative uncertainties that impact the risk of exposure-induced death estimates using the NASA Space Cancer Risk (NSCR) model. NSCR estimates of the number of "safe days" in deep space to be within exposure limits and risk estimates for a Mars exploration mission are described.

  19. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and

  20. Field-theoretic approach to gravity in the flat space-time

    Energy Technology Data Exchange (ETDEWEB)

    Cavalleri, G [Centro Informazioni Studi Esperienze, Milan (Italy); Milan Univ. (Italy). Ist. di Fisica]; Spinelli, G [Istituto di Matematica del Politecnico di Milano, Milano (Italy)]

    1980-01-01

    This paper discusses how the field-theoretical approach to gravity, starting from flat space-time, is wider than the Einstein approach. The flat approach is able to predict the structure of the observable space as a consequence of the behaviour of the particle proper masses. The field equations are formally equal to Einstein's equations without the cosmological term.

  1. Space-time latent component modeling of geo-referenced health data.

    Science.gov (United States)

    Lawson, Andrew B; Song, Hae-Ryoung; Cai, Bo; Hossain, Md Monir; Huang, Kun

    2010-08-30

    Latent structure models have been proposed in many applications. For space-time health data it is often important to be able to find the underlying trends in time, which are supported by subsets of small areas. Latent structure modeling is one such approach to this analysis. This paper presents a mixture-based approach that can be applied to component selection. The analysis of a Georgia ambulatory asthma county-level data set is presented and a simulation-based evaluation is made. Copyright (c) 2010 John Wiley & Sons, Ltd.

  2. Optimization of the graph model of the water conduit network, based on the approach of search space reducing

    Science.gov (United States)

    Korovin, Iakov S.; Tkachenko, Maxim G.

    2018-03-01

    In this paper we present a heuristic approach that improves the efficiency of methods used to create an efficient architecture of water distribution networks. The essence of the approach is to reduce the search space by limiting the range of available pipe diameters that can be used for each edge of the network graph. To perform the reduction, two opposite boundary scenarios for the distribution of flows are analysed, after which the resulting range is further narrowed by applying a flow-rate limitation for each edge of the network. The first boundary scenario provides the most uniform distribution of the flow in the network; the opposite scenario creates the network with the highest possible flow level. The parameters of both distributions are calculated by optimizing systems of quadratic functions in a confined space, which can be performed effectively at small time cost. This approach was used to modify a genetic algorithm (GA). The proposed GA provides a variable number of variants for each gene, according to the number of diameters in the list, taking flow restrictions into account. The proposed approach was applied to the evaluation of a well-known test network - the Hanoi water distribution network [1] - and the results were compared with those of a classical GA with an unrestricted search space. On the test data, the proposed approach significantly reduced the search space and provided faster and more obvious convergence in comparison with the classical version of the GA.
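
    A hypothetical sketch of the per-edge reduction step: keep, for each edge, only the diameters whose flow velocities stay within limits for that edge's flow range taken from the two boundary scenarios. The diameter catalogue, flows and velocity limits below are invented for illustration and are not from the paper.

```python
# Per-edge search-space reduction for a pipe-sizing GA: restrict the gene
# alphabet of one edge to diameters compatible with its flow bounds.
import math

AVAILABLE_DIAMETERS = [0.3, 0.4, 0.5, 0.6, 0.8, 1.0]  # metres (example catalogue)

def admissible_diameters(q_min, q_max, v_min=0.2, v_max=2.5):
    """Keep diameters whose velocity lies in [v_min, v_max] for flows in m^3/s."""
    feasible = []
    for d in AVAILABLE_DIAMETERS:
        area = math.pi * d ** 2 / 4.0
        if q_max / area <= v_max and q_min / area >= v_min:
            feasible.append(d)
    return feasible

# flow bounds for one edge from the uniform-flow and maximum-flow scenarios
print(admissible_diameters(q_min=0.05, q_max=0.30))   # reduced gene alphabet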

  3. Accessibility of green space in urban areas: an examination of various approaches to measure it

    OpenAIRE

    Zhang, Xin

    2007-01-01

    In the present research, we attempt to improve the methods used for measuring accessibility of green spaces by combining two components of accessibility: distance, and demand relative to supply. Three modified approaches (the Joseph and Bantock gravity model measure, the two-step floating catchment area measure, and a measure based on kernel densities) are applied to measure accessibility to green spaces. We select parks and public open spaces (metropolitan open land) of south London as a cas...

  4. Requirements for High Level Models Supporting Design Space Exploration in Model-based Systems Engineering

    OpenAIRE

    Haveman, Steven P.; Bonnema, G. Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few effective approaches exist that can effectively tie these lower level models to a high level system model during design space exploration. This complicates the validation of high level system requirements during detailed design. In this paper, we define requirements for a high level model that is firstly driven by key systems engineering challenges present in industry and secondly connects to several formal and d...

  5. Space ecoliteracy - five informal education models for community empowerment

    Science.gov (United States)

    Venkataramaiah, Jagannatha; Jagannath, Sahana; J, Spandana; J, Sadhana; Jagannath, Shobha

    Space ecoliteracy is a historical necessity and a vital aspect of the space age. Space Situational Awareness has taught mankind to look inward while stretching beyond the cradle in human endeavours. Quality of life for everyone on the only home of mankind - TERRA - will be feasible only after realizing space ecoliteracy amongst all stakeholders in the space quest. The objectives of Informal Environmental Education (UNESCO/UNEP/IEEP, 1977) mandate awareness, attitude, knowledge, skill and participation in the individual and community domains. The application of space technology in both the telecommunications and remote sensing domains has begun to make clear that mankind faces the challenge of learning and affirming earthmanship. The community empowerment focus that followed the Earth Summit 1992 mandate of sustainable development has produced a deluge of best practices in the agriculture, urban, industry and service sectors all over the globe. Further, the deployment of space technologies has proved its immense potential only after adopting the participatory approach at individual and community levels. The Indian Space Programme, in its 44th year of space service to national development, has demonstrated self-reliance in space technology for human development. Space technology for the most underdeveloped is a success story in both communication and information tools for quality of life. This presentation covers five space ecoliteracy models for informal environmental education, designed and validated from 1985 to date, namely: 1) Ecological Environmental Studies by Students - EESS (1988), cited as one of the 20 best eco-education models by the Earth Day Network; 2) Community Eco Literacy Campaign - CEL (2000), cited as a partner under the Clean Up the World Campaign, UN; 3) Space Eco Literacy (2011), an informal 8-week space eco literacy training reported at the 39th COSPAR assembly; 4) Space Eco Literacy by Practice (2014), an interface with formal education at institutions; and 5) Space Ecoliteracy Mission as a space outreach in

  6. Modeling Physarum space exploration using memristors

    International Nuclear Information System (INIS)

    Ntinas, V; Sirakoulis, G Ch; Vourkas, I; Adamatzky, A I

    2017-01-01

    Slime mold Physarum polycephalum optimizes its foraging behaviour by minimizing the distances between the sources of nutrients it spans. When two sources of nutrients are present, the slime mold connects the sources with its protoplasmic tubes along the shortest path. We present a two-dimensional mesh-grid, memristor-based model as an approach to emulate Physarum's foraging strategy, which includes space exploration and reinforcement of the optimally formed interconnection network in the presence of multiple nutrient sources. The proposed algorithmic approach utilizes memristors and LC contours and is tested on two of the most popular computational challenges for Physarum, namely mazes and transportation networks. Furthermore, the presented model is enriched with the presence of noise, which contributes positively to collective behavior and enables us to move from deterministic to robust results. Consequently, the corresponding simulation results reproduce, in a much better qualitative way, the expected transportation networks. (paper)

  7. Real-space grids and the Octopus code as tools for the development of new simulation approaches for electronic systems

    Science.gov (United States)

    Andrade, Xavier; Strubbe, David; De Giovannini, Umberto; Larsen, Ask Hjorth; Oliveira, Micael J. T.; Alberdi-Rodriguez, Joseba; Varas, Alejandro; Theophilou, Iris; Helbig, Nicole; Verstraete, Matthieu J.; Stella, Lorenzo; Nogueira, Fernando; Aspuru-Guzik, Alán; Castro, Alberto; Marques, Miguel A. L.; Rubio, Angel

    Real-space grids are a powerful alternative for the simulation of electronic systems. One of the main advantages of the approach is the flexibility and simplicity of working directly in real space, where the different fields are discretized on a grid, combined with competitive numerical performance and great potential for parallelization. These properties constitute a great advantage when implementing and testing new physical models. Based on our experience with the Octopus code, in this article we discuss how the real-space approach has allowed for the recent development of new ideas for the simulation of electronic systems. Among these applications are approaches to calculate response properties, modeling of photoemission, optimal control of quantum systems, simulation of plasmonic systems, and the exact solution of the Schrödinger equation for low-dimensionality systems.

  8. Bayesian state space models for dynamic genetic network construction across multiple tissues.

    Science.gov (United States)

    Liang, Yulan; Kelemen, Arpad

    2016-08-01

    Construction of gene-gene interaction networks and potential pathways is a challenging and important problem in genomic research for complex diseases, and estimating the dynamic changes of the temporal correlations and the non-stationarity is key to this process. In this paper, we develop dynamic state space models with hierarchical Bayesian settings to tackle this challenge and to infer the dynamic profiles and genetic networks associated with disease treatments. We treat both the stochastic transition matrix and the observation matrix as time-variant and include temporal correlation structures in the covariance matrix estimation in the multivariate Bayesian state space models. The unevenly spaced short time courses with unseen time points are treated as hidden state variables. Hierarchical Bayesian approaches with various prior and hyper-prior models, together with Markov chain Monte Carlo and Gibbs sampling algorithms, are used to estimate the model parameters and the hidden state variables. We apply the proposed hierarchical Bayesian state space models to multi-tissue (liver, skeletal muscle, and kidney) Affymetrix time course data sets following corticosteroid (CS) drug administration. Both simulation and real data analysis results show that the genomic changes over time and the gene-gene interactions in response to CS treatment can be well captured by the proposed models. The proposed dynamic hierarchical Bayesian state space modeling approach could be expanded and applied to other large-scale genomic data, such as next generation sequencing (NGS) data combined with real-time and time-varying electronic health records (EHR), for more comprehensive and robust systematic and network-based analysis, in order to transform big biomedical data into predictions and diagnostics for precision medicine and personalized healthcare with better decision making and patient outcomes.

  9. Reflected kinetics model for nuclear space reactor kinetics and control scoping calculations

    Energy Technology Data Exchange (ETDEWEB)

    Washington, K.E.

    1986-05-01

    The objective of this research is to develop a model that offers an alternative to the point kinetics (PK) modelling approach in the analysis of space reactor kinetics and control studies. Modelling effort will focus on the explicit treatment of control drums as reactivity input devices so that the transition to automatic control can be smoothly done. The proposed model is developed for the specific integration of automatic control and the solution of the servo mechanism problem. The integration of the kinetics model with an automatic controller will provide a useful tool for performing space reactor scoping studies for different designs and configurations. Such a tool should prove to be invaluable in the design phase of a space nuclear system from the point of view of kinetics and control limitations.

  10. Reflected kinetics model for nuclear space reactor kinetics and control scoping calculations

    International Nuclear Information System (INIS)

    Washington, K.E.

    1986-05-01

    The objective of this research is to develop a model that offers an alternative to the point kinetics (PK) modelling approach in the analysis of space reactor kinetics and control studies. Modelling effort will focus on the explicit treatment of control drums as reactivity input devices so that the transition to automatic control can be smoothly done. The proposed model is developed for the specific integration of automatic control and the solution of the servo mechanism problem. The integration of the kinetics model with an automatic controller will provide a useful tool for performing space reactor scoping studies for different designs and configurations. Such a tool should prove to be invaluable in the design phase of a space nuclear system from the point of view of kinetics and control limitations

  11. Three-Dimensional Crane Modelling and Control Using Euler-Lagrange State-Space Approach and Anti-Swing Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Aksjonov Andrei

    2015-12-01

    Full Text Available The mathematical model of a three-dimensional crane is derived using the Euler-Lagrange approach. A state-space representation of the derived model is proposed and explored in the Simulink® environment and on the laboratory stand. The obtained control design was simulated, analyzed and compared with the existing encoder-based system provided by the three-dimensional (3D) crane manufacturer Inteco®. In addition, an anti-swing fuzzy logic control has been developed, simulated, and analyzed. The obtained control algorithm is compared with the existing anti-swing proportional-integral controller designed by the 3D crane manufacturer Inteco®. Five-degree-of-freedom (5DOF) control schemes are designed, examined and compared for various load masses. The topicality of the problem is due to the wide use of gantry cranes in industry. The solution is proposed as a basis for future research in sensorless and intelligent control of complex motor-driven applications.
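
    Generically (our sketch, not the record's exact matrices), linearizing the Euler-Lagrange equations about the hanging equilibrium yields the standard state-space form

```latex
\[
\dot{x} = A x + B u, \qquad y = C x
\]
```

    with the state x collecting the cart and bridge positions, the rope length and the payload swing angles together with their rates, and u the motor inputs; a representation of this kind is what is explored in Simulink and used for the 5DOF control design.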

  12. The algebraic approach to space-time geometry

    International Nuclear Information System (INIS)

    Heller, M.; Multarzynski, P.; Sasin, W.

    1989-01-01

    A differential manifold can be defined in terms of the smooth real functions carried by it. By rejecting the postulate in such a definition that demands local diffeomorphism of the manifold to Euclidean space, one obtains the so-called differential space concept. Every subset of R^n turns out to be a differential space. Extensive parts of differential geometry on differential spaces, developed by Sikorski, are reviewed and adapted to relativistic purposes. The differential space is proposed as a new model of space-time. The Lorentz structure and Einstein's field equations on differential spaces are discussed. 20 refs. (author)

  13. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    Science.gov (United States)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    System engineering practices for complex systems and networks now require that requirement, architecture, and concept of operations product development teams harmonize their activities simultaneously to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, the networks are geographically distributed, and so are their subject matter experts, so the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from its top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to an accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based systems engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

  14. Modeling nonstationarity in space and time.

    Science.gov (United States)

    Shand, Lyndsay; Li, Bo

    2017-09-01

    We propose to model a spatio-temporal random field that has nonstationary covariance structure in both space and time domains by applying the concept of the dimension expansion method in Bornn et al. (2012). Simulations are conducted for both separable and nonseparable space-time covariance models, and the model is also illustrated with a streamflow dataset. Both simulation and data analyses show that modeling nonstationarity in both space and time can improve the predictive performance over stationary covariance models or models that are nonstationary in space but stationary in time. © 2017, The International Biometric Society.
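
    A rough sketch of the dimension expansion idea (with the latent coordinates z taken as given rather than estimated, as Bornn et al. do by fitting an empirical covariance): a stationary covariance evaluated in the expanded coordinates [s, z(s)] induces a nonstationary covariance in the original space.

        import numpy as np
        from scipy.spatial.distance import cdist

        def expanded_covariance(coords, z, sigma2=1.0, phi=1.0):
            """Stationary exponential covariance in the expanded space [s, z(s)];
            z are assumed latent coordinates, one or more per site."""
            expanded = np.hstack([coords, z])          # n x (d + d_latent)
            dists = cdist(expanded, expanded)
            return sigma2 * np.exp(-dists / phi)

        sites = np.random.rand(5, 2)                   # toy monitoring locations
        z = np.random.rand(5, 1)                       # hypothetical latent dimension
        print(expanded_covariance(sites, z).round(2))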

  15. A Principled Approach to the Specification of System Architectures for Space Missions

    Science.gov (United States)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.

  16. Space debris: modeling and detectability

    Science.gov (United States)

    Wiedemann, C.; Lorenz, J.; Radtke, J.; Kebschull, C.; Horstmann, A.; Stoll, E.

    2017-01-01

    High precision orbit determination is required for the detection and removal of space debris. Knowledge of the distribution of debris objects in orbit is necessary for orbit determination by active or passive sensors. The results can be used to investigate the orbits on which objects of a certain size at a certain frequency can be found. The knowledge of the orbital distribution of the objects as well as their properties in accordance with sensor performance models provide the basis for estimating the expected detection rates. Comprehensive modeling of the space debris environment is required for this. This paper provides an overview of the current state of knowledge about the space debris environment. In particular non-cataloged small objects are evaluated. Furthermore, improvements concerning the update of the current space debris model are addressed. The model of the space debris environment is based on the simulation of historical events, such as fragmentations due to explosions and collisions that actually occurred in Earth orbits. The orbital distribution of debris is simulated by propagating the orbits considering all perturbing forces up to a reference epoch. The modeled object population is compared with measured data and validated. The model provides a statistical distribution of space objects, according to their size and number. This distribution is based on the correct consideration of orbital mechanics. This allows for a realistic description of the space debris environment. Subsequently, a realistic prediction can be provided concerning the question, how many pieces of debris can be expected on certain orbits. To validate the model, a software tool has been developed which allows the simulation of the observation behavior of ground-based or space-based sensors. Thus, it is possible to compare the results of published measurement data with simulated detections. This tool can also be used for the simulation of sensor measurement campaigns. It is

  17. Semiclassical moment of inertia shell-structure within the phase-space approach

    International Nuclear Information System (INIS)

    Gorpinchenko, D V; Magner, A G; Bartel, J; Blocki, J P

    2015-01-01

    The moment of inertia for nuclear collective rotations is derived within a semiclassical approach based on the cranking model and the Strutinsky shell-correction method by using the non-perturbative periodic-orbit theory in the phase-space variables. This moment of inertia for adiabatic (statistical-equilibrium) rotations can be approximated by the generalized rigid-body moment of inertia accounting for the shell corrections of the particle density. A semiclassical phase-space trace formula allows us to express the shell components of the moment of inertia quite accurately in terms of the free-energy shell corrections for integrable and partially chaotic Fermi systems, which is in good agreement with the corresponding quantum calculations. (paper)

  18. Collaborative Approaches in Developing Environmental and Safety Management Systems for Commercial Space Transportation

    Science.gov (United States)

    Zee, Stacey; Murray, D.

    2009-01-01

    The Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST) licenses and permits U.S. commercial space launch and reentry activities, and licenses the operation of non-federal launch and reentry sites. AST's mission is to ensure the protection of the public, property, and the national security and foreign policy interests of the United States during commercial space transportation activities and to encourage, facilitate, and promote U.S. commercial space transportation. AST faces unique challenges of ensuring the protection of public health and safety while facilitating and promoting U.S. commercial space transportation. AST has developed an Environmental Management System (EMS) and a Safety Management System (SMS) to help meet its mission. Although the EMS and SMS were developed independently, the systems share similar elements. Both systems follow a Plan-Do-Act-Check model in identifying potential environmental aspects or public safety hazards, assessing significance in terms of severity and likelihood of occurrence, developing approaches to reduce risk, and verifying that the risk is reduced. This paper will describe the similarities between AST's EMS and SMS elements and how AST is building a collaborative approach in environmental and safety management to reduce impacts to the environment and risks to the public.

  19. Topology-based description of the NCA cathode configurational space and an approach of its effective reduction

    Directory of Open Access Journals (Sweden)

    Zolotarev Pavel

    2018-01-01

    Full Text Available Modification of existing solid electrolyte and cathode materials is a topic of interest for theoreticians and experimentalists. In particular, it requires elucidation of the influence of dopants on the characteristics of the studied materials. Because of the high complexity of the configurational space of doped/deintercalated systems, application of computer modeling approaches is hindered, despite the significant advances of computational facilities in the last decades. In this study, we propose a scheme which allows one to reduce the set of structures of a modeled configurational space for subsequent study by means of time-consuming quantum chemistry methods. Application of the proposed approach is exemplified through the study of the configurational space of the commercial LiNi0.8Co0.15Al0.05O2 (NCA) cathode material approximant.

  20. Topology-based description of the NCA cathode configurational space and an approach of its effective reduction

    Science.gov (United States)

    Zolotarev, Pavel; Eremin, Roman

    2018-04-01

    Modification of existing solid electrolyte and cathode materials is a topic of interest for theoreticians and experimentalists. In particular, it requires elucidation of the influence of dopants on the characteristics of the studied materials. Because of the high complexity of the configurational space of doped/deintercalated systems, application of computer modeling approaches is hindered, despite the significant advances of computational facilities in the last decades. In this study, we propose a scheme which allows one to reduce the set of structures of a modeled configurational space for subsequent study by means of time-consuming quantum chemistry methods. Application of the proposed approach is exemplified through the study of the configurational space of the commercial LiNi0.8Co0.15Al0.05O2 (NCA) cathode material approximant.

  1. Space nuclear reactor system diagnosis: Knowledge-based approach

    International Nuclear Information System (INIS)

    Ting, Y.T.D.

    1990-01-01

    SP-100 space nuclear reactor system development is a joint effort by the Department of Energy, the Department of Defense and the National Aeronautics and Space Administration. The system is designed to operate in isolation for many years, and is possibly subject to little or no remote maintenance. This dissertation proposes a knowledge-based diagnostic system which, in principle, can diagnose the faults which can either cause reactor shutdown or lead to another serious problem. This framework in general can be applied to the fully specified system if detailed design information becomes available. The set of faults considered herein is identified based on heuristic knowledge about the system operation. A suitable approach to diagnostic problem solving is proposed after investigating the most prevalent methodologies in Artificial Intelligence as well as the causal analysis of the system. Deep causal knowledge modeling based on digraph, fault-tree or logic flowgraph methodology would require a knowledge representation able to handle time-dependent system behavior. A qualitative temporal knowledge modeling methodology, using rules with specified time delays among the process variables, has been proposed and is used to develop the diagnostic sufficient rule set. The rule set has been modified by using a time zone approach to obtain a robust system design. The sufficient rule set is transformed into a sufficient and necessary one by searching the whole knowledge base. Qualitative data analysis is proposed for analyzing the measured data in a real-time situation. An expert system shell, Intelligence Compiler, is used to develop the prototype system. Frames are used for the process variables. Forward chaining rules are used in monitoring and backward chaining rules are used in diagnosis.
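
    The rule-with-time-delay idea can be sketched generically (hypothetical variable and fault names; not the SP-100 rule base or the Intelligence Compiler syntax): a fault is concluded when the effect variable turns abnormal within a tolerance window (the "time zone") around the expected delay after the cause variable.

        from dataclasses import dataclass

        @dataclass
        class TemporalRule:
            cause: str      # antecedent variable flagged abnormal first
            effect: str     # consequent variable expected to follow
            delay: float    # expected delay in seconds
            zone: float     # tolerance window around the delay
            fault: str      # diagnosis concluded when the rule fires

        def forward_chain(abnormal_times, rules):
            """abnormal_times maps variable name -> time it was flagged abnormal."""
            diagnoses = []
            for r in rules:
                if r.cause in abnormal_times and r.effect in abnormal_times:
                    dt = abnormal_times[r.effect] - abnormal_times[r.cause]
                    if abs(dt - r.delay) <= r.zone:
                        diagnoses.append(r.fault)
            return diagnoses

        rules = [TemporalRule("pump_speed_low", "coolant_flow_low", 2.0, 1.0,
                              "primary pump degradation")]
        print(forward_chain({"pump_speed_low": 10.0, "coolant_flow_low": 11.5}, rules))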

  2. Space-time least-squares Petrov-Galerkin projection in nonlinear model reduction.

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Youngsoo [Sandia National Laboratories (SNL-CA), Livermore, CA (United States). Extreme-scale Data Science and Analytics Dept.; Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Carlberg, Kevin Thomas [Sandia National Laboratories (SNL-CA), Livermore, CA (United States). Extreme-scale Data Science and Analytics Dept.

    2017-09-01

    Our work proposes a space-time least-squares Petrov-Galerkin (ST-LSPG) projection method for model reduction of nonlinear dynamical systems. In contrast to typical nonlinear model-reduction methods that first apply Petrov-Galerkin projection in the spatial dimension and subsequently apply time integration to numerically resolve the resulting low-dimensional dynamical system, the proposed method applies projection in space and time simultaneously. To accomplish this, the method first introduces a low-dimensional space-time trial subspace, which can be obtained by computing tensor decompositions of state-snapshot data. The method then computes discrete-optimal approximations in this space-time trial subspace by minimizing the residual arising after time discretization over all space and time in a weighted ℓ2-norm. This norm can be defined to enable complexity reduction (i.e., hyper-reduction) in time, which leads to space-time collocation and space-time GNAT variants of the ST-LSPG method. Advantages of the approach relative to typical spatial-projection-based nonlinear model reduction methods such as Galerkin projection and least-squares Petrov-Galerkin projection include: (1) a reduction of both the spatial and temporal dimensions of the dynamical system, (2) the removal of spurious temporal modes (e.g., unstable growth) from the state space, and (3) error bounds that exhibit slower growth in time. Numerical examples performed on model problems in fluid dynamics demonstrate the ability of the method to generate orders-of-magnitude computational savings relative to spatial-projection-based reduced-order models without sacrificing accuracy.
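
    Stated schematically (symbols chosen here for illustration rather than taken from the report), the ST-LSPG reduced solution is the space-time generalized coordinate vector that minimizes the fully discrete residual, stacked over all time steps, in a weighted norm:

        \hat{\mathbf{w}} = \arg\min_{\mathbf{w}\in\mathbb{R}^{n_{st}}}
            \left\| \mathbf{r}\!\left(\boldsymbol{\Pi}\,\mathbf{w}\right) \right\|_{\boldsymbol{\Theta}}^{2},
        \qquad
        \|\mathbf{v}\|_{\boldsymbol{\Theta}}^{2} = \mathbf{v}^{T}\boldsymbol{\Theta}\,\mathbf{v},

    where Π denotes the space-time trial basis (e.g., obtained from a tensor decomposition of state snapshots), r stacks the time-discretized residuals over all space and time, and the choice of the weighting matrix Θ yields the collocation and GNAT variants through hyper-reduction.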

  3. Combining Statistical Methodologies in Water Quality Monitoring in a Hydrological Basin - Space and Time Approaches

    OpenAIRE

    Costa, Marco; A. Manuela Gonçalves

    2012-01-01

    In this work, some statistical approaches are discussed that combine multivariate statistical techniques and time series analysis in order to describe and model spatial patterns and temporal evolution, by observing hydrological series of water quality variables recorded in time and space. These approaches are illustrated with a data set collected in the River Ave hydrological basin located in the Northwest region of Portugal.

  4. Data Model Management for Space Information Systems

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Ramirez, Paul; Mattmann, Chris

    2006-01-01

    The Reference Architecture for Space Information Management (RASIM) suggests the separation of the data model from software components to promote the development of flexible information management systems. RASIM allows the data model to evolve independently from the software components and results in a robust implementation that remains viable as the domain changes. However, the development and management of data models within RASIM are difficult and time consuming tasks involving the choice of a notation, the capture of the model, its validation for consistency, and the export of the model for implementation. Current limitations to this approach include the lack of ability to capture comprehensive domain knowledge, the loss of significant modeling information during implementation, the lack of model visualization and documentation capabilities, and exports being limited to one or two schema types. The advent of the Semantic Web and its demand for sophisticated data models has addressed this situation by providing a new level of data model management in the form of ontology tools. In this paper we describe the use of a representative ontology tool to capture and manage a data model for a space information system. The resulting ontology is implementation independent. Novel on-line visualization and documentation capabilities are available automatically, and the ability to export to various schemas can be added through tool plug-ins. In addition, the ingestion of data instances into the ontology allows validation of the ontology and results in a domain knowledge base. Semantic browsers are easily configured for the knowledge base. For example the export of the knowledge base to RDF/XML and RDFS/XML and the use of open source metadata browsers provide ready-made user interfaces that support both text- and facet-based search. This paper will present the Planetary Data System (PDS) data model as a use case and describe the import of the data model into an ontology tool

  5. A state-space model for estimating detailed movements and home range from acoustic receiver data

    DEFF Research Database (Denmark)

    Pedersen, Martin Wæver; Weng, Kevin

    2013-01-01

    We present a state-space model for acoustic receiver data to estimate detailed movement and home range of individual fish while accounting for spatial bias. An integral part of the approach is the detection function, which models the probability of logging tag transmissions as a function of distance ... that the location error scales log-linearly with detection range and movement speed. This result can be used as a guideline for designing network layout when species movement capacity and acoustic environment are known or can be estimated prior to network deployment. Finally, as an example, the state-space model is used to estimate home range and movement of a reef fish in the Pacific Ocean.
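
    A toy version of the detection function (a logistic curve with made-up parameters, not the fitted form from the paper) illustrates how logging probability can be modelled as a function of distance:

        import numpy as np

        def detection_prob(distance_m, d50=150.0, k=0.03):
            """Probability that a receiver logs a tag transmission; d50 is the
            distance of 50% detection and k the steepness (both assumed)."""
            return 1.0 / (1.0 + np.exp(k * (distance_m - d50)))

        for d in (50, 150, 300):
            print(d, round(detection_prob(d), 3))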

  6. A Mellin space approach to the conformal bootstrap

    Energy Technology Data Exchange (ETDEWEB)

    Gopakumar, Rajesh [International Centre for Theoretical Sciences (ICTS-TIFR),Survey No. 151, Shivakote, Hesaraghatta Hobli, Bangalore North 560 089 (India); Kaviraj, Apratim [Centre for High Energy Physics, Indian Institute of Science,C.V. Raman Avenue, Bangalore 560012 (India); Sen, Kallol [Centre for High Energy Physics, Indian Institute of Science,C.V. Raman Avenue, Bangalore 560012 (India); Kavli Institute for the Physics and Mathematics of the Universe (WPI),The University of Tokyo Institutes for Advanced Study, Kashiwa, Chiba 277-8583 (Japan); Sinha, Aninda [Centre for High Energy Physics, Indian Institute of Science,C.V. Raman Avenue, Bangalore 560012 (India)

    2017-05-05

    We describe in more detail our approach to the conformal bootstrap which uses the Mellin representation of CFT_d four point functions and expands them in terms of crossing symmetric combinations of AdS_{d+1} Witten exchange functions. We consider arbitrary external scalar operators and set up the conditions for consistency with the operator product expansion. Namely, we demand cancellation of spurious powers (of the cross ratios, in position space) which translate into spurious poles in Mellin space. We discuss two contexts in which we can immediately apply this method by imposing the simplest set of constraint equations. The first is the epsilon expansion. We mostly focus on the Wilson-Fisher fixed point as studied in an epsilon expansion about d=4. We reproduce Feynman diagram results for operator dimensions to O(ε^3) rather straightforwardly. This approach also yields new analytic predictions for OPE coefficients to the same order which fit nicely with recent numerical estimates for the Ising model (at ε=1). We will also mention some leading order results for scalar theories near three and six dimensions. The second context is a large spin expansion, in any dimension, where we are able to reproduce and go a bit beyond some of the results recently obtained using the (double) light cone expansion. We also have a preliminary discussion about numerical implementation of the above bootstrap scheme in the absence of a small parameter.

  7. Generalized Wigner functions in curved spaces: A new approach

    International Nuclear Information System (INIS)

    Kandrup, H.E.

    1988-01-01

    It is well known that, given a quantum field in Minkowski space, one can define Wigner functions f_W^N(x_1,p_1,...,x_N,p_N) which (a) are convenient to analyze since, unlike the field itself, they are c-number quantities and (b) can be interpreted in a limited sense as "quantum distribution functions." Recently, Winter and Calzetta, Habib and Hu have shown one way in which these flat-space Wigner functions can be generalized to a curved-space setting, deriving thereby approximate kinetic equations which make sense "quasilocally" for "short-wavelength modes." This paper suggests a completely orthogonal approach for defining curved-space Wigner functions which generalizes instead an object such as the Fourier-transformed f_W^1(k,p), which is effectively a two-point function viewed in terms of the "natural" creation and annihilation operators a†(p - k/2) and a(p + k/2). The approach suggested here lacks the precise phase-space interpretation implicit in the approach of Winter or Calzetta, Habib, and Hu, but it is useful in that (a) it is geared to handle any "natural" mode decomposition, so that (b) it can facilitate exact calculations at least in certain limits, such as for a source-free linear field in a static spacetime

  8. Compact state-space models for complex superconducting radio-frequency structures based on model order reduction and concatenation methods

    International Nuclear Information System (INIS)

    Flisgen, Thomas

    2015-01-01

    The modeling of large chains of superconducting cavities with couplers is a challenging task in computational electrical engineering. The direct numerical treatment of these structures can easily lead to problems with more than ten million degrees of freedom. Problems of this complexity are typically solved with the help of parallel programs running on supercomputing infrastructures. However, these infrastructures are expensive to purchase, to operate, and to maintain. The aim of this thesis is to introduce and to validate an approach which allows for modeling large structures on a standard workstation. The novel technique is called State-Space Concatenations and is based on the decomposition of the complete structure into individual segments. The radio-frequency properties of the generated segments are described by a set of state-space equations which either emerge from analytical considerations or from numerical discretization schemes. The model order of these equations is reduced using dedicated model order reduction techniques. In a final step, the reduced-order state-space models of the segments are concatenated in accordance with the topology of the complete structure. The concatenation is based on algebraic continuity constraints of electric and magnetic fields on the decomposition planes and results in a compact state-space system of the complete radio-frequency structure. Compared to the original problem, the number of degrees of freedom is drastically reduced, i.e. a problem with more than ten million degrees of freedom can be reduced on a standard workstation to a problem with less than one thousand degrees of freedom. The final state-space system allows for determining frequency-domain transfer functions, field distributions, resonances, and quality factors of the complete structure in a convenient manner. This thesis presents the theory of the state-space concatenation approach and discusses several validation and application examples. The examples
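
    The thesis couples segments through continuity constraints on the decomposition planes; as a purely structural illustration of how one compact system emerges from reduced blocks, the sketch below forms the series interconnection of two reduced-order state-space models (a generic coupling, not the field-based concatenation itself).

        import numpy as np

        def series_concat(A1, B1, C1, D1, A2, B2, C2, D2):
            """Series interconnection: the output of block 1 drives block 2,
            yielding one compact state-space model for the cascaded segments."""
            n1, n2 = A1.shape[0], A2.shape[0]
            A = np.block([[A1, np.zeros((n1, n2))],
                          [B2 @ C1, A2]])
            B = np.vstack([B1, B2 @ D1])
            C = np.hstack([D2 @ C1, C2])
            D = D2 @ D1
            return A, B, C, D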

  9. IASM: Individualized activity space modeler

    Science.gov (United States)

    Hasanzadeh, Kamyar

    2018-01-01

    Researchers from various disciplines have long been interested in analyzing and describing human mobility patterns. Activity space (AS), defined as an area encapsulating daily human mobility and activities, has been at the center of this interest. However, given the applied nature of research in this field and the complexity that advanced geographical modeling can pose to its users, the proposed models remain simplistic and inaccurate in many cases. Individualized Activity Space Modeler (IASM) is a geographic information system (GIS) toolbox, written in Python programming language using ESRI's Arcpy module, comprising four tools aiming to facilitate the use of advanced activity space models in empirical research. IASM provides individual-based and context-sensitive tools to estimate home range distances, delineate activity spaces, and model place exposures using individualized geographical data. In this paper, we describe the design and functionality of IASM, and provide an example of how it performs on a spatial dataset collected through an online map-based survey.

  10. Dynamic Model Averaging in Large Model Spaces Using Dynamic Occam's Window.

    Science.gov (United States)

    Onorante, Luca; Raftery, Adrian E

    2016-01-01

    Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method for this situation which allows us to perform DMA without considering the whole model space, but using a subset of models and dynamically optimizing the choice of models at each point in time. This yields a dynamic form of Occam's window. We evaluate the method in the context of the problem of nowcasting GDP in the Euro area. We find that its forecasting performance compares well with that of other methods.
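
    One recursion step of a DMA-style update with a dynamic Occam's window can be sketched as follows (an illustrative simplification with assumed forgetting factor and window threshold, not the authors' estimation code).

        import numpy as np

        def dma_step(post_prev, pred_lik, alpha=0.99, window=0.001):
            """post_prev: model probabilities at t-1; pred_lik: predictive
            likelihood of the new observation under each model; alpha:
            forgetting factor; models falling below window * max are dropped."""
            prior = post_prev ** alpha
            prior /= prior.sum()
            post = prior * pred_lik
            post /= post.sum()
            post = np.where(post >= window * post.max(), post, 0.0)
            return post / post.sum()

        print(dma_step(np.full(4, 0.25), np.array([0.9, 0.5, 0.1, 0.001])))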

  11. A Systems Approach to Developing an Affordable Space Ground Transportation Architecture using a Commonality Approach

    Science.gov (United States)

    Garcia, Jerry L.; McCleskey, Carey M.; Bollo, Timothy R.; Rhodes, Russel E.; Robinson, John W.

    2012-01-01

    This paper presents a structured approach for achieving a compatible Ground System (GS) and Flight System (FS) architecture that is affordable, productive and sustainable. This paper is an extension of the paper titled "Approach to an Affordable and Productive Space Transportation System" by McCleskey et al. This paper integrates systems engineering concepts and operationally efficient propulsion system concepts into a structured framework for achieving GS and FS compatibility in the mid-term and long-term time frames. It also presents a functional and quantitative relationship for assessing system compatibility called the Architecture Complexity Index (ACI). This paper: (1) focuses on systems engineering fundamentals as it applies to improving GS and FS compatibility; (2) establishes mid-term and long-term spaceport goals; (3) presents an overview of transitioning a spaceport to an airport model; (4) establishes a framework for defining a ground system architecture; (5) presents the ACI concept; (6) demonstrates the approach by presenting a comparison of different GS architectures; and (7) presents a discussion on the benefits of using this approach with a focus on commonality.

  12. Parametric cost models for space telescopes

    Science.gov (United States)

    Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtnay

    2017-11-01

    Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And, they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model. In fact, published models vary greatly [1]. Thus, there is a need for parametric space telescopes cost models. An effort is underway to develop single variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data and applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; technology development as a function of time reduces cost at the rate of 50% per 17 years; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and increasing mass reduces cost.
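
    A notional cost estimating relationship of the kind described (placeholder coefficients; the published CERs should be consulted for actual values) can be written as a power law in aperture diameter with a 50%-per-17-years technology discount:

        def telescope_cost(aperture_m, year, a=1.0, b=1.7, ref_year=2000):
            """Notional CER: cost ~ a * D**b * 0.5**((year - ref_year)/17).
            a and b are illustrative placeholders, not the published fit."""
            return a * aperture_m ** b * 0.5 ** ((year - ref_year) / 17.0)

        # relative cost of doubling the aperture for a telescope built in 2017
        print(telescope_cost(2.4, 2017) / telescope_cost(1.2, 2017))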

  13. Parametric Cost Models for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtney

    2010-01-01

    Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And, they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model. In fact, published models vary greatly [1]. Thus, there is a need for parametric space telescopes cost models. An effort is underway to develop single variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data and applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; technology development as a function of time reduces cost at the rate of 50% per 17 years; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and increasing mass reduces cost.

  14. My Life with State Space Models

    DEFF Research Database (Denmark)

    Lundbye-Christensen, Søren

    2007-01-01

    The conceptual idea behind the state space model is that the evolution over time in the object we are observing and the measurement process itself are modelled separately. My very first serious analysis of a data set was done using a state space model, and since then I seem to have been "haunted" by state space...

  15. Theory and experiments in model-based space system anomaly management

    Science.gov (United States)

    Kitts, Christopher Adam

    This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms were developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.

  16. Short-term wind speed prediction using an unscented Kalman filter based state-space support vector regression approach

    International Nuclear Information System (INIS)

    Chen, Kuilin; Yu, Jie

    2014-01-01

    Highlights: • A novel hybrid modeling method is proposed for short-term wind speed forecasting. • Support vector regression model is constructed to formulate nonlinear state-space framework. • Unscented Kalman filter is adopted to recursively update states under random uncertainty. • The new SVR–UKF approach is compared to several conventional methods for short-term wind speed prediction. • The proposed method demonstrates higher prediction accuracy and reliability. - Abstract: Accurate wind speed forecasting is becoming increasingly important to improve and optimize renewable wind power generation. Particularly, reliable short-term wind speed prediction can enable model predictive control of wind turbines and real-time optimization of wind farm operation. However, this task remains challenging due to the strong stochastic nature and dynamic uncertainty of wind speed. In this study, unscented Kalman filter (UKF) is integrated with support vector regression (SVR) based state-space model in order to precisely update the short-term estimation of wind speed sequence. In the proposed SVR–UKF approach, support vector regression is first employed to formulate a nonlinear state-space model and then unscented Kalman filter is adopted to perform dynamic state estimation recursively on wind sequence with stochastic uncertainty. The novel SVR–UKF method is compared with artificial neural networks (ANNs), SVR, autoregressive (AR) and autoregressive integrated with Kalman filter (AR-Kalman) approaches for predicting short-term wind speed sequences collected from three sites in Massachusetts, USA. The forecasting results indicate that the proposed method has much better performance in both one-step-ahead and multi-step-ahead wind speed predictions than the other approaches across all the locations
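
    A stripped-down sketch of the SVR state-space component only (the unscented Kalman filter correction is omitted here; lag length, kernel and data are assumptions) shows how a nonlinear one-step transition can be learned and iterated for multi-step-ahead prediction.

        import numpy as np
        from sklearn.svm import SVR

        def fit_svr_transition(series, lag=3):
            """Learn w[t] = f(w[t-lag], ..., w[t-1]) with support vector regression."""
            X = np.array([series[i - lag:i] for i in range(lag, len(series))])
            y = np.array(series[lag:])
            return SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)

        rng = np.random.default_rng(0)
        wind = np.sin(np.arange(200) / 10.0) + 0.1 * rng.standard_normal(200)
        model = fit_svr_transition(wind)

        history = list(wind[-3:])
        for _ in range(5):                      # multi-step-ahead recursion
            history.append(model.predict([history[-3:]])[0])
        print(np.round(history[-5:], 3))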

  17. Mapping the Hot Spots: A Zoning Approach to Space Analysis and Design

    Science.gov (United States)

    Bunnell, Adam; Carpenter, Russell; Hensley, Emily; Strong, Kelsey; Williams, ReBecca; Winter, Rachel

    2016-01-01

    This article examines a preliminary approach to space design developed and implemented in Eastern Kentucky University's Noel Studio for Academic Creativity. The approach discussed here is entitled "hot spots," which has allowed the research team to observe trends in space usage and composing activities among students. This approach has…

  18. Next Generation Space Interconnect Standard (NGSIS): a modular open standards approach for high performance interconnects for space

    Science.gov (United States)

    Collier, Charles Patrick

    2017-04-01

    The Next Generation Space Interconnect Standard (NGSIS) effort is a Government-Industry collaboration to define a set of standards for interconnects between space system components with the goal of cost effectively removing bandwidth as a constraint for future space systems. The NGSIS team has selected the ANSI/VITA 65 OpenVPX™ standard family for the physical baseline. The RapidIO protocol has been selected as the basis for the digital data transport. The NGSIS standards are developed to provide sufficient flexibility to enable users to implement a variety of system configurations, while meeting goals for interoperability and robustness for space. The NGSIS approach and effort represent a radical departure from past approaches to achieve a Modular Open System Architecture (MOSA) for space systems and serve as an exemplar for the civil, commercial, and military Space communities as well as a broader high reliability terrestrial market.

  19. Discrete random walk models for space-time fractional diffusion

    International Nuclear Information System (INIS)

    Gorenflo, Rudolf; Mainardi, Francesco; Moretti, Daniele; Pagnini, Gianni; Paradisi, Paolo

    2002-01-01

    A physical-mathematical approach to anomalous diffusion may be based on generalized diffusion equations (containing derivatives of fractional order in space or/and time) and related random walk models. By space-time fractional diffusion equation we mean an evolution equation obtained from the standard linear diffusion equation by replacing the second-order space derivative with a Riesz-Feller derivative of order α ∈ (0,2] and skewness θ (|θ| ≤ min{α, 2−α}), and the first-order time derivative with a Caputo derivative of order β ∈ (0,1]. Such evolution equation implies for the flux a fractional Fick's law which accounts for spatial and temporal non-locality. The fundamental solution (for the Cauchy problem) of the fractional diffusion equation can be interpreted as a probability density evolving in time of a peculiar self-similar stochastic process that we view as a generalized diffusion process. By adopting appropriate finite-difference schemes of solution, we generate models of random walk discrete in space and time suitable for simulating random variables whose spatial probability density evolves in time according to this fractional diffusion equation
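
    A Monte Carlo illustration of the space-fractional special case (β = 1, θ = 0; not the paper's finite-difference scheme) treats the walk as a Lévy flight with symmetric α-stable jumps, assuming SciPy's levy_stable sampler is available.

        import numpy as np
        from scipy.stats import levy_stable

        def levy_walk(n_walkers=2000, n_steps=100, alpha=1.5, dt=1e-2, seed=1):
            """Positions at the final time of walkers whose jumps are symmetric
            alpha-stable; the dt**(1/alpha) scaling reflects the self-similarity
            of the generalized diffusion process."""
            jumps = levy_stable.rvs(alpha, 0.0, scale=dt ** (1.0 / alpha),
                                    size=(n_steps, n_walkers), random_state=seed)
            return jumps.sum(axis=0)

        x = levy_walk()
        print(np.median(np.abs(x)))   # heavy tails: the variance does not exist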

  20. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasters, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
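
    For the CME arrival-time comparison, the 2x2 contingency-table scores can be computed with a small helper like the one below (an illustrative function, not MOSWOC code; the counts in the example are made up).

        def contingency_scores(hits, misses, false_alarms, correct_negatives):
            """Probability of detection, false alarm ratio, critical success
            index and Heidke skill score from a 2x2 contingency table."""
            n = hits + misses + false_alarms + correct_negatives
            pod = hits / (hits + misses)
            far = false_alarms / (hits + false_alarms)
            csi = hits / (hits + misses + false_alarms)
            expected = ((hits + misses) * (hits + false_alarms)
                        + (misses + correct_negatives) * (false_alarms + correct_negatives)) / n
            hss = (hits + correct_negatives - expected) / (n - expected)
            return {"POD": pod, "FAR": far, "CSI": csi, "HSS": hss}

        print(contingency_scores(hits=18, misses=7, false_alarms=5, correct_negatives=30))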

  1. Cost Modeling for Space Telescope

    Science.gov (United States)

    Stahl, H. Philip

    2011-01-01

    Parametric cost models are an important tool for planning missions, compare concepts and justify technology investments. This paper presents on-going efforts to develop single variable and multi-variable cost models for space telescope optical telescope assembly (OTA). These models are based on data collected from historical space telescope missions. Standard statistical methods are used to derive CERs for OTA cost versus aperture diameter and mass. The results are compared with previously published models.

  2. Space and place concepts analysis based on semiology approach in residential architecture

    Directory of Open Access Journals (Sweden)

    Mojtaba Parsaee

    2015-12-01

    Full Text Available Space and place are among the fundamental concepts in architecture about which many discussions have been held, and whose complexity and importance have often been emphasized. This research introduces an approach to a better understanding of these architectural concepts based on the theory and method of semiology in linguistics. Hence, the research first investigates the concepts of space and place and explains their characteristics in architecture. Then, it reviews semiology theory and explores its concepts and ideas. After obtaining the principles and the method of semiology, they are redefined for an architectural system based on an adaptive method. Finally, the research offers a conceptual model, called the semiology approach, which considers the architectural system as a system of signs. The approach can be used to decode the content of meanings and forms and to analyze the architectural mechanism in order to obtain its meanings and concepts. On this basis, the residential architecture of the traditional city of Bushehr, Iran, was analyzed as a case study and its concepts were extracted. The results of this research demonstrate the effectiveness of this approach in detecting and identifying the structure of an architectural system. In addition, the approach can be used in processes of sustainable development and can serve as a basis for the deconstruction of architectural texts. The research methods of this study are qualitative, based on comparative and descriptive analyses.

  3. Preliminary Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Prince, F. Andrew; Smart, Christian; Stephens, Kyle; Henrichs, Todd

    2009-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. However, great care is required. Some space telescope cost models, such as those based only on mass, lack sufficient detail to support such analysis and may lead to inaccurate conclusions. Similarly, using ground based telescope models which include the dome cost will also lead to inaccurate conclusions. This paper reviews current and historical models. Then, based on data from 22 different NASA space telescopes, this paper tests those models and presents preliminary analysis of single and multi-variable space telescope cost models.

  4. State space model extraction of thermohydraulic systems – Part II: A linear graph approach applied to a Brayton cycle-based power conversion unit

    International Nuclear Information System (INIS)

    Uren, Kenneth Richard; Schoor, George van

    2013-01-01

    This second paper in a two part series presents the application of a developed state space model extraction methodology applied to a Brayton cycle-based PCU (power conversion unit) of a PBMR (pebble bed modular reactor). The goal is to investigate if the state space extraction methodology can cope with larger and more complex thermohydraulic systems. In Part I the state space model extraction methodology for the purpose of control was described in detail and a state space representation was extracted for a U-tube system to illustrate the concept. In this paper a 25th order nonlinear state space representation in terms of the different energy domains is extracted. This state space representation is solved and the responses of a number of important states are compared with results obtained from a PBMR PCU Flownex ® model. Flownex ® is a validated thermo fluid simulation software package. The results show that the state space model closely resembles the dynamics of the PBMR PCU. This kind of model may be used for nonlinear MIMO (multi-input, multi-output) type of control strategies. However, there is still a need for linear state space models since many control system design and analysis techniques require a linear state space model. This issue is also addressed in this paper by showing how a linear state space model can be derived from the extracted nonlinear state space model. The linearised state space model is also validated by comparing the state space model to an existing linear Simulink ® model of the PBMR PCU system. - Highlights: • State space model extraction of a pebble bed modular reactor PCU (power conversion unit). • A 25th order nonlinear time varying state space model is obtained. • Linearisation of a nonlinear state space model for use in power output control. • Non-minimum phase characteristic that is challenging in terms of control. • Models derived are useful for MIMO control strategies
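
    The linearisation step can be mimicked numerically (the paper derives its linear model from the extracted nonlinear equations; the finite-difference Jacobian below is only a generic stand-in, applied to a toy system).

        import numpy as np

        def linearize(f, x_eq, u_eq, eps=1e-6):
            """Finite-difference Jacobians of x_dot = f(x, u) about (x_eq, u_eq),
            giving A, B of the local linear model x_dot ~ A*(x - x_eq) + B*(u - u_eq)."""
            n, m = len(x_eq), len(u_eq)
            f0 = np.asarray(f(x_eq, u_eq))
            A, B = np.zeros((n, n)), np.zeros((n, m))
            for i in range(n):
                dx = np.zeros(n); dx[i] = eps
                A[:, i] = (np.asarray(f(x_eq + dx, u_eq)) - f0) / eps
            for j in range(m):
                du = np.zeros(m); du[j] = eps
                B[:, j] = (np.asarray(f(x_eq, u_eq + du)) - f0) / eps
            return A, B

        # toy nonlinear system: x1' = x2, x2' = -sin(x1) + u
        f = lambda x, u: np.array([x[1], -np.sin(x[0]) + u[0]])
        A, B = linearize(f, np.zeros(2), np.zeros(1))
        print(A.round(3), B.round(3))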

  5. War-gaming application for future space systems acquisition part 2: acquisition and bidding war-gaming modeling and simulation approaches for FFP and FPIF

    Science.gov (United States)

    Nguyen, Tien M.; Guillen, Andy T.

    2017-05-01

    This paper describes cooperative and non-cooperative static Bayesian game models with complete and incomplete information for the development of optimum acquisition strategies associated with the Program and Technical Baseline (PTB) solutions obtained from Part 1 of this paper [1]. The optimum acquisition strategies discussed focus on achieving "Affordability" by incorporating contractors' bidding strategies into the government acquisition strategies for acquiring future space systems. The paper discusses System Engineering (SE) frameworks, analytical and simulation approaches and modeling for developing the optimum acquisition strategies from both the government and contractor perspectives for Firm Fixed Price (FFP) and Fixed Price Incentive Firm (FPIF) contract types.

  6. System resiliency quantification using non-state-space and state-space analytic models

    International Nuclear Information System (INIS)

    Ghosh, Rahul; Kim, DongSeong; Trivedi, Kishor S.

    2013-01-01

    Resiliency is becoming an important service attribute for large scale distributed systems and networks. Key problems in resiliency quantification are the lack of consensus on the definition of resiliency and the lack of a systematic approach to quantifying system resiliency. In general, resiliency is defined as the ability of a system/person/organization to recover from, defy, or resist any shock, insult, or disturbance [1]. Many researchers interpret resiliency as a synonym for fault-tolerance and reliability/availability. However, the effect of failure/repair on systems is already covered by reliability/availability measures, and that on individual jobs is well covered under the umbrella of performability [2] and task completion time analysis [3]. We use the definition of Laprie [4] and Simoncini [5], in which resiliency is the persistence of service delivery that can justifiably be trusted, when facing changes. The changes we are referring to here are beyond the envelope of system configurations already considered during system design, that is, beyond fault tolerance. In this paper, we outline a general approach for system resiliency quantification. Using examples of non-state-space and state-space stochastic models, we analytically–numerically quantify the resiliency of system performance, reliability, availability and performability measures w.r.t. structural and parametric changes.

  7. Quasirelativistic quark model in quasipotential approach

    CERN Document Server

    Matveev, V A; Savrin, V I; Sissakian, A N

    2002-01-01

    The interaction of relativistic particles is described within the framework of the quasipotential approach. The presentation is based on the so-called covariant simultaneous formulation of quantum field theory, whereby the theory is considered on a space-like three-dimensional hypersurface in Minkowski space. Special attention is paid to methods of constructing various quasipotentials as well as to applications of the quasipotential approach to describing the characteristics of relativistic particle interactions in quark models, namely: the elastic scattering amplitudes of hadrons, the mass spectra and decay widths of mesons, and the cross sections of deep inelastic lepton scattering on hadrons

  8. Monitoring Murder Crime in Namibia Using Bayesian Space-Time Models

    Directory of Open Access Journals (Sweden)

    Isak Neema

    2012-01-01

    Full Text Available This paper focuses on the analysis of murder in Namibia using a Bayesian spatial smoothing approach with temporal trends. The analysis was based on the reported cases from the 13 regions of Namibia for the period 2002–2006, complemented with regional population sizes. The evaluated random effects include space-time structured heterogeneity measuring the effect of regional clustering, unstructured heterogeneity, time, space and time interaction, and population density. The model consists of carefully chosen prior and hyper-prior distributions for parameters and hyper-parameters, with inference conducted using a Gibbs sampling algorithm and a sensitivity test for model validation. The posterior mean estimates of the parameters from the model, using DIC as the model selection criterion, show that most of the variation in the relative risk of murder is due to regional clustering, while the effect of population density and time was insignificant. The sensitivity analysis indicates that both the intrinsic and Laplace CAR priors can be adopted as the prior distribution for the space-time heterogeneity. In addition, the relative risk map shows a risk structure with an increasing north-south gradient, pointing to low risk in the northern regions of Namibia, while the Karas and Khomas regions experience a long-term increase in murder risk.

  9. Modeling Growth and Yield of Schizolobium amazonicum under Different Spacings

    Directory of Open Access Journals (Sweden)

    Gilson Fernandes da Silva

    2013-01-01

    Full Text Available This study aimed to present an approach to model the growth and yield of the species Schizolobium amazonicum (Paricá), based on a study of different spacings located in Pará, Brazil. Whole-stand models were employed, and two modeling strategies (Strategies A and B) were tested. Moreover, the following three scenarios were evaluated to assess the accuracy of the model in estimating total and commercial volumes at five years of age: complete absence of data (S1); available information about the variables basal area, site index, dominant height, and number of trees at two years of age (S2); and this information available at five years of age (S3). The results indicated that the 3 × 2 spacing has a higher mortality rate than normal and that, in general, greater spacing corresponds to larger diameter and average height and smaller basal area and volume per hectare. In estimating the total and commercial volumes for the three scenarios tested, Strategy B seems to be the most appropriate method to estimate the growth and yield of Paricá plantations in the study region, particularly because Strategy A showed a significant bias in its estimates.

  10. Fractal electrodynamics via non-integer dimensional space approach

    Science.gov (United States)

    Tarasov, Vasily E.

    2015-09-01

    Using the recently suggested vector calculus for non-integer dimensional space, we consider electrodynamics problems in isotropic case. This calculus allows us to describe fractal media in the framework of continuum models with non-integer dimensional space. We consider electric and magnetic fields of fractal media with charges and currents in the framework of continuum models with non-integer dimensional spaces. An application of the fractal Gauss's law, the fractal Ampere's circuital law, the fractal Poisson equation for electric potential, and equation for fractal stream of charges are suggested. Lorentz invariance and speed of light in fractal electrodynamics are discussed. An expression for effective refractive index of non-integer dimensional space is suggested.

  11. Pulses in the Zero-Spacing Limit of the GOY Model

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Jensen, M.H.; Nielsen, J.L.

    2000-01-01

    We study the propagation of localised disturbances in a turbulent, but momentarily quiescent and unforced shell model (an approximation of the Navier-Stokes equations on a set of exponentially spaced momentum shells). These disturbances represent bursts of turbulence travelling down the inertial range, which is thought to be responsible for the intermittency observed in turbulence. Starting from the GOY shell model, we go to the limit where the distance between succeeding shells approaches zero ("the zero spacing limit") and helicity conservation is retained. We obtain a discrete field theory which is numerically shown to have pulse solutions travelling with constant speed and with unchanged form. We give numerical evidence that the model might even be exactly integrable, although the continuum limit seems to be singular and the pulses show an unusual super exponential decay to zero as exp...

  12. The +vbar breakout during approach to Space Station Freedom

    Science.gov (United States)

    Dunham, Scott D.

    1993-01-01

    A set of burn profiles was developed to provide bounding jet firing histories for a +vbar breakout during approaches to Space Station Freedom. The delta-v sequences were designed to place the Orbiter on a safe trajectory under worst case conditions and to try to minimize plume impingement on Space Station Freedom structure.

  13. a Web Service Approach for Linking Sensors and Cellular Spaces

    Science.gov (United States)

    Isikdag, U.

    2013-09-01

    More and more devices are starting to be connected to the Internet. In the future the Internet will not only be a communication medium for people, it will in fact be a communication environment for devices. The connected devices, which are also referred to as Things, will have the ability to interact with other devices over the Internet, i.e. to i) provide information in an interoperable form and ii) consume/utilize such information with the help of sensors embedded in them. This overall concept is known as the Internet-of-Things (IoT). This requires new approaches to be investigated for system architectures that establish relations between spaces and sensors. The research presented in this paper elaborates on an architecture developed with this aim, i.e. linking spaces and sensors using a RESTful approach. The objective is making spaces aware of (sensor-embedded) devices, and making devices aware of spaces, in a loosely coupled way (i.e. a state/usage/function change in the spaces would not have an effect on sensors; similarly, a location/state/usage/function change in sensors would not have any effect on spaces). The proposed architecture also enables the automatic assignment of sensors to spaces depending on space geometry and sensor location.
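
    A minimal RESTful sketch of the loose-coupling idea (hypothetical resource names and a Flask service assumed purely for illustration): spaces and sensors are separate resources, and their association is computed on request from sensor location and space geometry, so a change on either side does not break the other.

        from flask import Flask, jsonify

        app = Flask(__name__)
        SPACES = {"room-101": {"bbox": [0.0, 0.0, 5.0, 4.0]}}   # toy 2D extents
        SENSORS = {"t-17": {"x": 1.0, "y": 2.0, "type": "temperature"}}

        def contains(bbox, x, y):
            xmin, ymin, xmax, ymax = bbox
            return xmin <= x <= xmax and ymin <= y <= ymax

        @app.route("/spaces/<space_id>/sensors")
        def sensors_in_space(space_id):
            """Sensors currently located inside the requested space."""
            bbox = SPACES[space_id]["bbox"]
            found = {sid: s for sid, s in SENSORS.items()
                     if contains(bbox, s["x"], s["y"])}
            return jsonify(found)

        if __name__ == "__main__":
            app.run(port=5000)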

  14. Application of Bayesian approach to estimate average level spacing

    International Nuclear Information System (INIS)

    Huang Zhongfu; Zhao Zhixiang

    1991-01-01

    A method to estimate the average level spacing from a set of resolved resonance parameters using a Bayesian approach is given. Using the information contained in the distributions of both the level spacings and the neutron widths, the levels missing from the measured sample can be corrected for more precisely, so that a better estimate of the average level spacing can be obtained. The calculation for s-wave resonances has been done and a comparison with other work was carried out.
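
    As a toy illustration only (the actual method combines the level-spacing and neutron-width distributions to correct for missed levels), a conjugate Bayesian update for the average spacing under a simple exponential spacing model could look like this:

```python
import numpy as np

def posterior_mean_spacing(spacings, a0=1.0, b0=1.0):
    """Toy Bayesian estimate of the average level spacing D.
    Assumes spacings ~ Exponential(rate) with a Gamma(a0, b0) prior on the rate;
    the real method additionally models the Wigner spacing distribution and the
    neutron-width distribution to correct for missed levels."""
    spacings = np.asarray(spacings, dtype=float)
    a_post = a0 + spacings.size          # updated shape parameter
    b_post = b0 + spacings.sum()         # updated rate (inverse scale) parameter
    mean_rate = a_post / b_post          # posterior mean of the level density
    return 1.0 / mean_rate               # point estimate of the average spacing

resonance_energies = np.array([12.3, 25.1, 36.8, 51.0, 60.2, 75.9])  # eV, illustrative
print(posterior_mean_spacing(np.diff(resonance_energies)))
```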

  15. Development of the three dimensional flow model in the SPACE code

    International Nuclear Information System (INIS)

    Oh, Myung Taek; Park, Chan Eok; Kim, Shin Whan

    2014-01-01

    SPACE (Safety and Performance Analysis CodE) is a nuclear plant safety analysis code, which has been developed in the Republic of Korea through joint research between the Korean nuclear industry and research institutes. The SPACE code has been developed with multi-dimensional capabilities as a requirement of the next-generation safety code. It allows users to more accurately model the multi-dimensional flow behavior that can be exhibited in components such as the core, lower plenum, upper plenum and downcomer region. Based on generalized models, the code can model any configuration or type of fluid system. All the geometric quantities of the mesh are described in terms of cell volume, centroid, face area, and face center, so that it can naturally represent not only one-dimensional (1D) or three-dimensional (3D) Cartesian systems but also cylindrical mesh systems. It is possible to simulate large and complex domains by modelling the complex parts with a 3D approach and the rest of the system with a 1D approach. Through 1D/3D co-simulation, more realistic conditions and component models can be obtained, providing a deeper understanding of complex systems, and it is expected to overcome the shortcomings of 1D system codes. (author)
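
    A minimal sketch of the kind of generalized mesh description listed above (cell volume, centroid, face area, face center); the structures and names are hypothetical, not the SPACE code's internal data layout:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Face:
    area: float                          # face area
    center: Tuple[float, float, float]   # face center coordinates
    neighbor: int                        # index of the adjacent cell (-1 for a boundary)

@dataclass
class Cell:
    volume: float                        # cell volume
    centroid: Tuple[float, float, float]
    faces: List[Face] = field(default_factory=list)

# The same structures can describe a 1D pipe (two faces per cell), a 3D Cartesian
# block (six faces per cell), or a cylindrical mesh, which is what allows mixed
# 1D/3D co-simulation of a plant model.
pipe_cell = Cell(volume=0.01, centroid=(0.05, 0.0, 0.0),
                 faces=[Face(0.001, (0.0, 0.0, 0.0), -1),
                        Face(0.001, (0.1, 0.0, 0.0), 1)])
print(len(pipe_cell.faces))
```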

  16. Stochastic modeling and control system designs of the NASA/MSFC Ground Facility for large space structures: The maximum entropy/optimal projection approach

    Science.gov (United States)

    Hsia, Wei-Shen

    1986-01-01

    In the Control Systems Division of the Systems Dynamics Laboratory of the NASA/MSFC, a Ground Facility (GF), in which the dynamics and control system concepts being considered for Large Space Structures (LSS) applications can be verified, was designed and built. One of the important aspects of the GF is to design an analytical model which will be as close to experimental data as possible so that a feasible control law can be generated. Using Hyland's Maximum Entropy/Optimal Projection Approach, a procedure was developed in which the maximum entropy principle is used for stochastic modeling and the optimal projection technique is used for a reduced-order dynamic compensator design for a high-order plant.

  17. Research of features and structure of electoral space of Ukraine in 2014 with the use of synthetic approach

    Directory of Open Access Journals (Sweden)

    M. M. Shelemba

    2015-02-01

    Full Text Available The article aims to justify the use of a synthetic authorial model for studying the features and structure of the electoral space of Ukraine in 2014. The methodological principles of the synthetic model are set out using both qualitative and quantitative methods of electoral space research, among them factor and correlation analysis. The synthetic model (approach), built on the best available scientific approaches, takes into account the features and development trends of the electoral space of Ukraine. The features and structure of the electoral space of Ukraine in 2014 are analysed using the proposed model. Applying the authorial synthetic model with factor and correlation analysis makes it possible to explain support for political parties during election campaigns in terms of the most important factors and correlates. It was found that electoral choice depends above all on how these factors match the expectations of each region. The article shows that using the Human Development Index as the most significant social correlate during election campaigns in Ukraine at this stage is justified and yields reliable results. It is argued that a high level of correlation corresponds to a high level of party support and, consequently, to the high relevance of the social correlate across all variants of the expert research.

  18. Qualitative models for space system engineering

    Science.gov (United States)

    Forbus, Kenneth D.

    1990-01-01

    The objectives of this project were: (1) to investigate the implications of qualitative modeling techniques for problems arising in the monitoring, diagnosis, and design of Space Station subsystems and procedures; (2) to identify the issues involved in using qualitative models to enhance and automate engineering functions. These issues include representing operational criteria, fault models, alternate ontologies, and modeling continuous signals at a functional level of description; and (3) to develop a prototype collection of qualitative models for fluid and thermal systems commonly found in Space Station subsystems. Potential applications of qualitative modeling to space-systems engineering, including the notion of intelligent computer-aided engineering are summarized. Emphasis is given to determining which systems of the proposed Space Station provide the most leverage for study, given the current state of the art. Progress on using qualitative models, including development of the molecular collection ontology for reasoning about fluids, the interaction of qualitative and quantitative knowledge in analyzing thermodynamic cycles, and an experiment on building a natural language interface to qualitative reasoning is reported. Finally, some recommendations are made for future research.

  19. Planning additional drilling campaign using two-space genetic algorithm: A game theoretical approach

    Science.gov (United States)

    Kumral, Mustafa; Ozer, Umit

    2013-03-01

    Grade and tonnage are the most important technical uncertainties in mining ventures because of the use of estimations/simulations, which are mostly generated from drill data. Open pit mines are planned and designed on the basis of the blocks representing the entire orebody. Each block has a different estimation/simulation variance, reflecting uncertainty to some extent. The estimation/simulation realizations are submitted to the mine production scheduling process. However, the use of a block model with varying estimation/simulation variances will lead to serious risk in the scheduling. In the setting of multiple simulations, the dispersion variances of blocks can be thought of as reflecting technical uncertainties. However, the dispersion variance cannot handle the uncertainty associated with varying estimation/simulation variances of blocks. This paper proposes an approach that generates the configuration of the best additional drilling campaign so as to produce more homogeneous estimation/simulation variances of blocks. In other words, the objective is to find the best drilling configuration in such a way as to minimize grade uncertainty under a budget constraint. The uncertainty measure of the optimization process in this paper is the interpolation variance, which considers data locations and grades. The problem is expressed as a minmax problem, which focuses on finding the best worst-case performance, i.e., minimizing the interpolation variance of the block generating the maximum interpolation variance. Since the optimization model requires computing the interpolation variances of the blocks being simulated/estimated in each iteration, the problem cannot be solved by standard optimization tools. This motivates the use of a two-space genetic algorithm (GA) approach to solve the problem. The technique has two spaces: feasible drill hole configurations, over which the interpolation variance is minimized, and drill hole simulations, over which the interpolation variance is maximized. The two spaces interact to find a minmax solution.
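
    A hedged sketch of the two-space min-max search (a toy with a distance-based proxy in place of the interpolation variance, and a simple mutation-only GA; not the authors' implementation):

```python
import random

# Outer space: candidate drill-hole configurations (minimize the worst block variance).
# Inner space: blocks (the worst case is the block with maximum variance).
# A simple distance-based proxy stands in for the interpolation variance used in the paper.

BLOCKS = [(i + 0.5, j + 0.5) for i in range(10) for j in range(10)]   # block centroids
N_HOLES, POP, GENS = 5, 20, 40

def proxy_variance(block, holes):
    # Proxy: squared distance to the nearest drill hole (larger = more uncertain).
    return min((block[0] - h[0]) ** 2 + (block[1] - h[1]) ** 2 for h in holes)

def worst_case(holes):
    # Inner maximization over the block space.
    return max(proxy_variance(b, holes) for b in BLOCKS)

def random_config():
    return [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(N_HOLES)]

def mutate(holes, sigma=0.5):
    return [(x + random.gauss(0, sigma), y + random.gauss(0, sigma)) for x, y in holes]

population = [random_config() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=worst_case)                    # outer minimization
    elite = population[: POP // 2]
    population = elite + [mutate(random.choice(elite)) for _ in range(POP - len(elite))]

best = min(population, key=worst_case)
print("worst-case proxy variance:", round(worst_case(best), 3))
```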

  20. The Robust Software Feedback Model: An Effective Waterfall Model Tailoring for Space SW

    Science.gov (United States)

    Tipaldi, Massimo; Gotz, Christoph; Ferraguto, Massimo; Troiano, Luigi; Bruenjes, Bernhard

    2013-08-01

    The selection of the most suitable software life cycle process is of paramount importance in any space SW project. Despite being the preferred choice, the waterfall model is often exposed to some criticism. As a matter of fact, its main assumption of moving to a phase only when the preceding one is completed and perfected is not easily attainable under demanding SW schedule constraints. In this paper, a tailoring of the software waterfall model (named “Robust Software Feedback Model”) is presented. The proposed methodology sorts out these issues by combining a SW waterfall model with a SW prototyping approach. The former is aligned with the main SW production line and is based on the full ECSS-E-ST-40C life-cycle reviews, whereas the latter is carried out in advance of the main SW streamline (so as to inject its lessons learnt into the main streamline) and is based on a lightweight approach.

  1. The manifold model for space-time

    International Nuclear Information System (INIS)

    Heller, M.

    1981-01-01

    Physical processes happen on a space-time arena. It turns out that all contemporary macroscopic physical theories presuppose a common mathematical model for this arena, the so-called manifold model of space-time. The first part of the study is a heuristic introduction to the concept of a smooth manifold, starting with the intuitively clearer concepts of a curve and a surface in Euclidean space. In the second part the definitions of the C∞ manifold and of certain structures, which arise in a natural way from the manifold concept, are given. The role of the enveloping Euclidean space (i.e. of the Euclidean space appearing in the manifold definition) in these definitions is stressed. The Euclidean character of the enveloping space induces local Euclidean (topological and differential) properties on the manifold. A suggestion is made that replacing the enveloping Euclidean space by a discrete non-Euclidean space would be a correct way towards the quantization of space-time. (author)

  2. A Hybrid 3D Indoor Space Model

    Directory of Open Access Journals (Sweden)

    A. Jamali

    2016-10-01

    Full Text Available GIS integrates spatial information and spatial analysis. An important example of such integration is emergency response, which requires route planning inside and outside of a building. Route planning requires detailed information related to the indoor and outdoor environment. Indoor navigation network models, including the Geometric Network Model (GNM), the Navigable Space Model, the sub-division model and the regular-grid model, lack indoor data sources and abstraction methods. In this paper, a hybrid indoor space model is proposed. In the proposed method, 3D modeling of the indoor navigation network is based on surveying control points and is less dependent on the 3D geometrical building model. This research proposes a method of indoor space modeling for buildings which do not have proper 2D/3D geometrical models or which lack semantic or topological information. The proposed hybrid model consists of topological, geometrical and semantic space.

  3. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into spaceflight-related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the models.

  4. Dynamic Model Averaging in Large Model Spaces Using Dynamic Occam’s Window*

    Science.gov (United States)

    Onorante, Luca; Raftery, Adrian E.

    2015-01-01

    Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method for this situation which allows us to perform DMA without considering the whole model space, but using a subset of models and dynamically optimizing the choice of models at each point in time. This yields a dynamic form of Occam’s window. We evaluate the method in the context of the problem of nowcasting GDP in the Euro area. We find that its forecasting performance compares well with that of other methods. PMID:26917859
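
    A minimal sketch of the recursion behind DMA with a dynamic Occam's window (a simplified illustration with hypothetical inputs; in the full method, dropped models can later re-enter the active set, which is omitted here):

```python
import numpy as np

def dma_occams_window(model_logliks, alpha=0.99, window=0.01):
    """Sketch of Dynamic Model Averaging with a dynamic Occam's window.
    model_logliks: dict {model_id: array of one-step predictive log-likelihoods, length T}.
    alpha: forgetting factor applied to the model probabilities.
    window: models whose posterior probability falls below `window` times the best
    model's probability are dropped at that time step (the dynamic Occam's window)."""
    models = list(model_logliks)
    T = len(next(iter(model_logliks.values())))
    probs = {m: 1.0 / len(models) for m in models}        # initial model probabilities
    active_sets = []
    for t in range(T):
        # Prediction step: flatten the probabilities with the forgetting factor.
        pred = {m: probs[m] ** alpha for m in probs}
        norm = sum(pred.values())
        pred = {m: p / norm for m, p in pred.items()}
        # Update step with the period-t predictive likelihoods (log-sum-exp for stability).
        logpost = {m: np.log(pred[m]) + model_logliks[m][t] for m in pred}
        mx = max(logpost.values())
        post = {m: np.exp(lp - mx) for m, lp in logpost.items()}
        norm = sum(post.values())
        post = {m: p / norm for m, p in post.items()}
        # Dynamic Occam's window: keep only models close enough to the best one.
        best = max(post.values())
        post = {m: p for m, p in post.items() if p >= window * best}
        norm = sum(post.values())
        probs = {m: p / norm for m, p in post.items()}
        active_sets.append(sorted(probs))
    return probs, active_sets
```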

  5. Pump Component Model in SPACE Code

    International Nuclear Information System (INIS)

    Kim, Byoung Jae; Kim, Kyoung Doo

    2010-08-01

    This technical report describes the pump component model in the SPACE code. A literature survey was made of pump models in existing system codes. The models embedded in the SPACE code were examined to check for conflicts with intellectual property rights. Design specifications, computer coding implementation, and test results are included in this report.

  6. A behavioral approach to shared mapping of peripersonal space between oneself and others.

    Science.gov (United States)

    Teramoto, Wataru

    2018-04-03

    Recent physiological studies have shown that some visuotactile brain areas respond to others' peripersonal spaces (PPS) as they would their own. This study investigates this PPS remapping phenomenon in terms of human behavior. Participants placed their left hands on a tabletop screen where visual stimuli were projected. A vibrotactile stimulator was attached to the tip of their index finger. While a white disk approached or receded from the hand in the participant's near or far space, the participant was instructed to quickly detect a target (a vibrotactile stimulation, a change in the moving disk's color, or both). When performing this task alone, the participants exhibited shorter detection times when the disk approached the hand in their near space. In contrast, when performing the task with a partner across the table, the participants exhibited shorter detection times both when the disk approached their own hand in their near space and when it approached the partner's hand in the partner's near space but the participant's far space. This phenomenon was also observed when the body parts from which the visual stimuli approached/receded differed between the participant and partner. These results suggest that humans can share PPS representations and/or body-derived attention/arousal mechanisms with others.

  7. Mentoring SFRM: A New Approach to International Space Station Flight Control Training

    Science.gov (United States)

    Huning, Therese; Barshi, Immanuel; Schmidt, Lacey

    2009-01-01

    The Mission Operations Directorate (MOD) of the Johnson Space Center is responsible for providing continuous operations support for the International Space Station (ISS). Operations support requires flight controllers who are skilled in team performance as well as the technical operations of the ISS. Space Flight Resource Management (SFRM), a NASA-adapted variant of Crew Resource Management (CRM), is the competency model used in the MOD. ISS flight controller certification has evolved to include a balanced focus on development of SFRM and technical expertise. The latest challenge the MOD faces is how to certify an ISS flight controller (Operator) to a basic level of effectiveness in 1 year. SFRM training uses a two-pronged approach to expediting operator certification: 1) embed SFRM skills training into all Operator technical training and 2) use senior flight controllers as mentors. This paper focuses on how the MOD uses senior flight controllers as mentors to train SFRM skills.

  8. Recording and Modelling of MONUMENTS' Interior Space Using Range and Optical Sensors

    Science.gov (United States)

    Georgiadis, Charalampos; Patias, Petros; Tsioukas, Vasilios

    2016-06-01

    Three-dimensional modelling of artefacts and building interiors is a highly active research field nowadays. Several techniques are utilized to perform such a task, ranging from traditional surveying techniques and photogrammetry to structured light scanners, laser scanners and so on. New technological advancements in both hardware and software create new recording techniques, tools and approaches. In this paper we present a new recording and modelling approach based on the SwissRanger SR4000 range camera coupled with a Canon 400D dSLR camera. The hardware component of our approach consists of a fixed base, which encloses the range and dSLR cameras. The two sensors are fully calibrated and registered to each other; thus we are able to produce colorized point clouds acquired from the range camera. We present the initial design and calibration of the system along with experimental data regarding the accuracy of the proposed approach. We also provide results regarding the modelling of interior spaces and artefacts, accompanied by accuracy tests against other modelling approaches based on photogrammetry and laser scanning.
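
    The colorization step described (projecting registered range-camera points into the dSLR image) can be sketched generically as follows; the calibration inputs are assumptions, not the SR4000/400D calibration itself:

```python
import numpy as np

def colorize_point_cloud(points_range, R, t, K, image):
    """Colorize range-camera points with a registered RGB image.
    points_range: (N, 3) points in the range-camera frame.
    R, t: rotation (3x3) and translation (3,) from the range camera to the RGB camera
          frame, obtained from the system calibration.
    K: (3, 3) intrinsic matrix of the RGB camera.
    image: (H, W, 3) RGB image. Returns an (M, 6) array of x, y, z, r, g, b."""
    pts_cam = points_range @ R.T + t                 # transform into the RGB camera frame
    uvw = pts_cam @ K.T                              # pinhole projection
    u = uvw[:, 0] / uvw[:, 2]
    v = uvw[:, 1] / uvw[:, 2]
    h, w = image.shape[:2]
    valid = (uvw[:, 2] > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors = image[v[valid].astype(int), u[valid].astype(int)]
    return np.hstack([points_range[valid], colors])

# Toy usage with an identity extrinsic calibration and a 100x100 synthetic image
pts = np.array([[0.0, 0.0, 2.0], [0.5, 0.2, 3.0]])
K = np.array([[100.0, 0.0, 50.0], [0.0, 100.0, 50.0], [0.0, 0.0, 1.0]])
img = np.zeros((100, 100, 3), dtype=np.uint8)
print(colorize_point_cloud(pts, np.eye(3), np.zeros(3), K, img).shape)
```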

  9. Emulating a flexible space structure: Modeling

    Science.gov (United States)

    Waites, H. B.; Rice, S. C.; Jones, V. L.

    1988-01-01

    Control Dynamics, in conjunction with Marshall Space Flight Center, has participated in the modeling and testing of Flexible Space Structures. Through the series of configurations tested and the many techniques used for collecting, analyzing, and modeling the data, many valuable insights have been gained and important lessons learned. This paper discusses the background of the Large Space Structure program, Control Dynamics' involvement in testing and modeling of the configurations (especially the Active Control Technique Evaluation for Spacecraft (ACES) configuration), the results from these two processes, and insights gained from this work.

  10. Space-time trajectories of wind power generation: Parameterized precision matrices under a Gaussian copula approach

    DEFF Research Database (Denmark)

    Tastu, Julija; Pinson, Pierre; Madsen, Henrik

    2015-01-01

    Emphasis is placed on generating space-time trajectories of wind power generation, consisting of paths sampled from high-dimensional joint predictive densities, describing wind power generation at a number of contiguous locations and successive lead times. A modelling approach taking advantage of …-correlations. Estimation is performed in a maximum likelihood framework. Based on a test case application in Denmark, with spatial dependencies over 15 areas and temporal ones for 43 hourly lead times (hence, for a dimension of n = 645), it is shown that accounting for space-time effects is crucial for generating skilful …
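
    A hedged sketch of the sampling mechanism described (a Gaussian copula driven by a parameterized precision matrix); the precision structure and marginals below are illustrative stand-ins, not the estimated Danish model:

```python
import numpy as np
from scipy import stats

def sample_spacetime_trajectories(Q, marginal_cdfs_inv, n_traj=10, seed=0):
    """Q: (n, n) precision matrix describing space-time dependence of the latent
    Gaussian variables (n = number of sites x number of lead times).
    marginal_cdfs_inv: list of n callables mapping uniforms to wind power values
    (the inverse predictive CDFs of the marginal probabilistic forecasts).
    Returns an (n_traj, n) array of space-time trajectories."""
    rng = np.random.default_rng(seed)
    n = Q.shape[0]
    L = np.linalg.cholesky(Q)                    # Q = L L^T
    z = rng.standard_normal((n, n_traj))
    x = np.linalg.solve(L.T, z)                  # columns x ~ N(0, Q^{-1})
    u = stats.norm.cdf(x.T)                      # Gaussian copula: map to uniforms
    return np.column_stack([marginal_cdfs_inv[i](u[:, i]) for i in range(n)])

# Toy usage with 3 sites x 2 lead times and Beta-shaped marginals (illustrative only)
n = 6
Q = np.eye(n) + 0.4 * np.diag(np.ones(n - 1), 1) + 0.4 * np.diag(np.ones(n - 1), -1)
marginals = [lambda u: stats.beta.ppf(u, 2, 5)] * n
print(sample_spacetime_trajectories(Q, marginals).shape)
```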

  11. A model that allows teachers to reflect on their ict approaches

    DEFF Research Database (Denmark)

    Kjeldsen, Lars Peter Bech; Kjærgaard, Hanne Wacher

    2016-01-01

    The increased global availability of technology and its entry onto the educational stage of Higher Education (HE) requires changes in the way we think of education and learning. This article will briefly describe and shed light on the new conditions for learning that are challenging our traditional pedagogical principles and present a model for pedagogical reflection that we call the Convergent Learning Space (CLS), consisting of the elements: learning approaches; learning tools; learning spaces; availability; lifeworlds. The model reflects the choices and priorities teachers must make in relation …

  12. A general-model-space diagrammatic perturbation theory

    International Nuclear Information System (INIS)

    Hose, G.; Kaldor, U.

    1980-01-01

    A diagrammatic many-body perturbation theory applicable to arbitrary model spaces is presented. The necessity of having a complete model space (all possible occupancies of the partially-filled shells) is avoided. This requirement may be troublesome for systems with several well-spaced open shells, such as most atomic and molecular excited states, as a complete model space spans a very broad energy range and leaves out states within that range, leading to poor or no convergence of the perturbation series. The method presented here would be particularly useful for such states. The solution of a model problem (the excited Σg+ states of He2) is demonstrated. (Auth.)

  13. A simulation based optimization approach to model and design life support systems for manned space missions

    Science.gov (United States)

    Aydogan, Selen

    This dissertation considers the problem of process synthesis and design of life-support systems for manned space missions. A life-support system is a set of technologies to support human life for short and long-term spaceflights, by providing the basic life-support elements, such as oxygen, potable water, and food. The design of the system needs to meet the crewmember demand for the basic life-support elements (products of the system) and it must process the loads generated by the crewmembers. The system is subject to a myriad of uncertainties because most of the technologies involved are still under development. The result is a high level of uncertainty in the estimates of the model parameters, such as recovery rates or process efficiencies. Moreover, due to the high recycle rates within the system, the uncertainties are amplified and propagated within the system, resulting in a complex problem. In this dissertation, two algorithms have been developed to help make design decisions for life-support systems. The algorithms utilize a simulation-based optimization approach that combines a stochastic discrete-event simulation and a deterministic mathematical programming approach to generate multiple, unique realizations of the controlled evolution of the system. The timelines are analyzed using time series data mining techniques and statistical tools to determine the necessary technologies, their deployment schedules and capacities, and the necessary basic life-support element amounts to support crew life and activities for the mission duration.

  14. Keeping it real: revisiting a real-space approach to running ensembles of cosmological N-body simulations

    International Nuclear Information System (INIS)

    Orban, Chris

    2013-01-01

    In setting up initial conditions for ensembles of cosmological N-body simulations there are, fundamentally, two choices: either maximizing the correspondence of the initial density field to the assumed Fourier-space clustering or, instead, matching to real-space statistics and allowing the DC mode (i.e. overdensity) to vary from box to box as it would in the real universe. As a stringent test of both approaches, I perform ensembles of simulations using a power-law and a "power law times a bump" model inspired by baryon acoustic oscillations (BAO), exploiting the self-similarity of these initial conditions to quantify the accuracy of the matter-matter two-point correlation results. The real-space method, which was originally proposed by Pen 1997 [1] and implemented by Sirko 2005 [2], performed well in producing the expected self-similar behavior and corroborated the non-linear evolution of the BAO feature observed in conventional simulations, even in the strongly-clustered regime (σ8 ≳ 1). In revisiting the real-space method championed by [2], it was also noticed that this earlier study overlooked an important integral constraint correction to the correlation function in results from the conventional approach that can be important in ΛCDM simulations with Lbox ∼ 1 h⁻¹ Gpc and on scales r ≳ Lbox/10. Rectifying this issue shows that the Fourier-space and real-space methods are about equally accurate and efficient for modeling the evolution and growth of the correlation function, contrary to previous claims. An appendix provides a useful independent-of-epoch analytic formula for estimating the importance of the integral constraint bias on correlation function measurements in ΛCDM simulations

  15. A Novel Approach to Implement Takagi-Sugeno Fuzzy Models.

    Science.gov (United States)

    Chang, Chia-Wen; Tao, Chin-Wang

    2017-09-01

    This paper proposes new algorithms based on the fuzzy c-regression model algorithm for Takagi-Sugeno (T-S) fuzzy modeling of complex nonlinear systems. The fuzzy c-regression state model (FCRSM) algorithm yields a T-S fuzzy model in which a functional antecedent and a state-space-model-type consequent are identified from the available input-output data. The antecedent and consequent forms of the proposed FCRSM offer two main advantages: first, the FCRSM has a low computational load because only one input variable is considered in the antecedent part; second, the unknown system can be modeled not only in polynomial form but also in state-space form. Moreover, the FCRSM can be extended to the FCRSM-ND and FCRSM-Free algorithms. The FCRSM-ND algorithm is presented to find the T-S fuzzy state-space model of a nonlinear system when the input-output data cannot be precollected and an assumed effective controller is available. In practical applications, the mathematical model of the controller may be hard to obtain. In this case, an online tuning algorithm, FCRSM-Free, is designed so that the parameters of a T-S fuzzy controller and the T-S fuzzy state model of an unknown system can be tuned online simultaneously. Four numerical simulations are given to demonstrate the effectiveness of the proposed approach.
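
    A simplified sketch of the fuzzy c-regression iteration that this family of algorithms builds on (a generic FCRM with affine consequents, not the paper's FCRSM variants):

```python
import numpy as np

def fuzzy_c_regression(X, y, c=2, m=2.0, n_iter=50, seed=0):
    """Generic fuzzy c-regression model (FCRM) iteration: alternate weighted least
    squares per rule with fuzzy membership updates based on regression residuals."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    Xb = np.hstack([X, np.ones((N, 1))])           # affine consequents
    U = rng.random((c, N))
    U /= U.sum(axis=0)                             # fuzzy partition matrix
    for _ in range(n_iter):
        thetas = []
        for i in range(c):                         # weighted least squares per rule
            w = U[i] ** m
            A = Xb * w[:, None]
            thetas.append(np.linalg.lstsq(A.T @ Xb, A.T @ y, rcond=None)[0])
        E = np.array([(y - Xb @ th) ** 2 + 1e-12 for th in thetas])   # squared residuals
        # Membership update: u_ik = 1 / sum_j (E_ik / E_jk)^(1/(m-1))
        U = 1.0 / np.sum((E[:, None, :] / E[None, :, :]) ** (1.0 / (m - 1)), axis=1)
    return thetas, U

# Toy usage: data drawn from two linear regimes
X = np.linspace(0, 1, 100)[:, None]
y = np.where(X[:, 0] < 0.5, 2 * X[:, 0], -X[:, 0] + 1.5)
thetas, U = fuzzy_c_regression(X, y)
print([np.round(t, 2) for t in thetas])
```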

  16. Modeling Fluid’s Dynamics with Master Equations in Ultrametric Spaces Representing the Treelike Structure of Capillary Networks

    Directory of Open Access Journals (Sweden)

    Andrei Khrennikov

    2016-07-01

    Full Text Available We present a new conceptual approach for modeling fluid flows in random porous media based on explicit exploration of the treelike geometry of complex capillary networks. Such patterns can be represented mathematically as ultrametric spaces and the dynamics of fluids by ultrametric diffusion. The images of p-adic fields, extracted from real multiscale rock samples and from some reference images, are depicted. In this model the porous background is treated as the environment contributing to the coefficients of the evolutionary equations. For the simplest trees, these equations are considerably less complicated than those with fractional differential operators, which are commonly applied in geological studies looking for fractional analogs to conventional Euclidean space but with anomalous scaling and diffusion properties. It is possible to solve the former equation analytically and, in particular, to find stationary solutions. The main aim of this paper is to attract the attention of researchers working on the modeling of geological processes to the novel ultrametric approach and to show some examples from petroleum reservoir static and dynamic characterization, able to integrate the p-adic approach with multifractals, thermodynamics and scaling. We also present a non-mathematician-friendly review of trees, ultrametric spaces and pseudo-differential operators on such spaces.
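
    The master equations the abstract refers to are ultrametric diffusion equations; a generic p-adic form (a sketch of the commonly used operator, not necessarily the authors' exact equation, with transition kernel w and Haar measure μ) is:

```latex
% Generic ultrametric (p-adic) master equation for the fluid density f(x,t);
% w(|x-y|_p) is the transition rate between balls at ultrametric distance |x-y|_p.
\frac{\partial f(x,t)}{\partial t}
  \;=\; \int_{\mathbb{Q}_p} w\!\left(|x-y|_p\right)\,
        \bigl[f(y,t)-f(x,t)\bigr]\, d\mu(y).
```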

  17. On discrete models of space-time

    International Nuclear Information System (INIS)

    Horzela, A.; Kempczynski, J.; Kapuscik, E.; Georgia Univ., Athens, GA; Uzes, Ch.

    1992-02-01

    Analyzing the Einstein radiolocation method we come to the conclusion that results of any measurement of space-time coordinates should be expressed in terms of rational numbers. We show that this property is Lorentz invariant and may be used in the construction of discrete models of space-time different from the models of the lattice type constructed in the process of discretization of continuous models. (author)

  18. A Conceptual Approach for Optimising Bus Stop Spacing

    Science.gov (United States)

    Johar, Amita; Jain, S. S.; Garg, P. k.

    2017-06-01

    An efficient public transportation system is essential for any country. The growth, development and shape of urban areas are mainly due to the availability of good transportation (Shah et al. in Inst Town Plan India J 5(3):50-59, 1). In developing countries, like India, travel by local bus in a city is very common. Accidents, congestion, pollution and the appropriate location of bus stops are major problems in metropolitan cities. Among all the metropolitan cities in India, Delhi has the highest growth rate of population and vehicles. Therefore, it is important to adopt efficient and effective ways to improve mobility in metropolitan cities in order to overcome these problems and to reduce the number of private vehicles on the road. The primary objective of this paper is to present a methodology for developing a model for optimum bus stop spacing (OBSS). It describes the evaluation of an existing urban bus route, data collection, development of a model for optimizing the urban bus route, and application of the model. In this work, the bus passenger generalized cost method is used to optimize the spacing between bus stops. For the development of the model, a computer program is required. In the first phase, the applicability of the model was evaluated with data from an urban bus route of the Delhi Transport Corporation (DTC) in an Excel sheet; later, it is proposed to implement the model in C++. The developed model is expected to be useful to transport planners for the rational design of bus stop spacing, saving travel time and operating cost. The analysis indicates that the spacing between bus stops should be between 250 and 500 m. The proposed spacing also takes into account that stops should not be placed too close to metro/rail stations, flyover entries or exits, or traffic signals.
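
    The generalized-cost trade-off mentioned above (access walking time versus in-vehicle stopping delay) can be sketched as follows; all parameter values are illustrative assumptions, not the calibrated DTC values:

```python
import numpy as np

def generalized_cost(spacing_m, trip_len_m=5000.0, walk_speed=1.2,
                     stop_delay_s=30.0, value_walk=2.0, value_ride=1.0):
    """Toy passenger generalized cost for a given bus stop spacing.
    Walking cost falls as stops get closer; riding delay rises with more stops."""
    walk_time = (spacing_m / 4.0) / walk_speed            # mean access walk, seconds
    stops_passed = trip_len_m / spacing_m                 # stops traversed in-vehicle
    ride_delay = stops_passed * stop_delay_s              # dwell + deceleration time, s
    return value_walk * walk_time + value_ride * ride_delay

spacings = np.arange(100, 1001, 25)
costs = [generalized_cost(s) for s in spacings]
best = spacings[int(np.argmin(costs))]
print("cost-minimizing spacing:", best, "m")
```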

  19. Formulating state space models in R with focus on longitudinal regression models

    DEFF Research Database (Denmark)

    Dethlefsen, Claus; Lundbye-Christensen, Søren

    We provide a language for formulating a range of state space models. The described methodology is implemented in the R package sspir available from cran.r-project.org. A state space model is specified similarly to a generalized linear model in R, by marking the time-varying terms in the form…

  20. Space Science Cloud: a Virtual Space Science Research Platform Based on Cloud Model

    Science.gov (United States)

    Hu, Xiaoyan; Tong, Jizhou; Zou, Ziming

    Through independent and cooperative science missions, the Strategic Pioneer Program (SPP) on Space Science, the new space science initiative in China approved by CAS and implemented by the National Space Science Center (NSSC), is dedicated to seeking new discoveries and new breakthroughs in space science, thus deepening the understanding of the universe and planet Earth. In the framework of this program, in order to support the operations of space science missions and satisfy the demand of related research activities for e-Science, NSSC is developing a virtual space science research platform based on a cloud model, namely the Space Science Cloud (SSC). In order to support mission demonstration, SSC integrates an interactive satellite orbit design tool, a satellite structure and payload layout design tool, a payload observation coverage analysis tool, etc., to help scientists analyze and verify space science mission designs. Another important function of SSC is supporting mission operations, which run through the space satellite data pipelines. Mission operators can acquire and process observation data, then distribute the data products to other systems or issue the data and archives with the services of SSC. In addition, SSC provides useful data, tools and models for space researchers. Several databases in the field of space science are integrated and an efficient retrieval system is being developed. Common tools for data visualization, deep processing (e.g., smoothing and filtering tools), analysis (e.g., FFT analysis tool and minimum variance analysis tool) and mining (e.g., proton event correlation analysis tool) are also integrated to help the researchers to better utilize the data. The space weather models on SSC include a magnetic storm forecast model, a multi-station middle and upper atmospheric climate model, a solar energetic particle propagation model and so on. All the above-mentioned services are based on the e-Science infrastructures of CAS, e.g. cloud storage and …

  1. Hierarchical Bayesian modeling of the space - time diffusion patterns of cholera epidemic in Kumasi, Ghana

    NARCIS (Netherlands)

    Osei, Frank B.; Osei, F.B.; Duker, Alfred A.; Stein, A.

    2011-01-01

    This study analyses the joint effects of the two transmission routes of cholera on the space-time diffusion dynamics. Statistical models are developed and presented to investigate the transmission network routes of cholera diffusion. A hierarchical Bayesian modelling approach is employed for a joint

  2. Understanding space weather with new physical, mathematical and philosophical approaches

    Science.gov (United States)

    Mateev, Lachezar; Velinov, Peter; Tassev, Yordan

    2016-07-01

    The actual problems of solar-terrestrial physics, in particular of space weather, are related to the prediction of the state of the space environment and are solved by means of different analyses and models. These investigations can also be approached from another side: a philosophical and mathematical view of this physical reality. What does it consist of? We have a set of physical processes which occur in the Sun and in interplanetary space. All these processes interact with each other and simultaneously participate in the general process which forms space weather. Let us now consider Leibniz's monads (G.W. von Leibniz, 1714, Monadologie, Wien; Id., 1710, Théodicée, Amsterdam) and use some of their properties. There are in total 90 theses on monads in Leibniz's work (1714), e.g. "(1) The Monad, of which we shall here speak, is nothing but a simple substance, which enters into compounds. By 'simple' is meant 'without parts'. (Theod. 10.); … (56) Now this connexion or adaptation of all created things to each and of each to all, means that each simple substance has relations which express all the others, and, consequently, that it is a perpetual living mirror of the universe. (Theod. 130, 360.); (59) … this universal harmony, according to which every substance exactly expresses all others through the relations it has with them. (63) … every Monad is, in its own way, a mirror of the universe, and the universe is ruled according to a perfect order. (Theod. 403.)", etc. Let us substitute the word "process" for the word "monad" in the properties of the monads. We obtain the following statement: each process reflects all other processes and all other processes reflect this process. This analogy is not merely formal; it accurately reflects the relation between the physical processes and their unity. The category of the monad, which in Leibniz's Monadology carries a general philosophical sense, is fully identical with the …

  3. Space Weather Models at the CCMC And Their Capabilities

    Science.gov (United States)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2007-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second focus of CCMC activities is on validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products, which may be useful for space weather operators. In this presentation, we will provide an overview of the community-provided, space weather-relevant, model suite, which resides at CCMC. We will discuss current capabilities, and analyze expected future developments of space weather related modeling.

  4. An Hilbert space approach for a class of arbitrage free implied volatilities models

    OpenAIRE

    Brace, A.; Fabbri, G.; Goldys, B.

    2007-01-01

    We present a Hilbert space formulation for a set of implied volatility models introduced in \cite{BraceGoldys01}, in which the authors studied conditions for a family of European call options, varying the maturity time $T$ and the strike price $K$, to be arbitrage free. The arbitrage-free conditions give a system of stochastic PDEs for the evolution of the implied volatility surface ${\hat\sigma}_t(T,K)$. We will focus on the family obtained by fixing a strike $K$ and varying $T$. In order to …

  5. Modeling of space environment impact on nanostructured materials. General principles

    Science.gov (United States)

    Voronina, Ekaterina; Novikov, Lev

    2016-07-01

    In accordance with the resolution of the ISO TC20/SC14 WG4/WG6 joint meeting, the Technical Specification (TS) 'Modeling of space environment impact on nanostructured materials. General principles', which describes computer simulation methods for the space environment impact on nanostructured materials, is being prepared. Nanomaterials surpass traditional materials for space applications in many aspects due to their unique properties associated with the nanoscale size of their constituents. This superiority in mechanical, thermal, electrical and optical properties will evidently inspire a wide range of applications in next-generation spacecraft intended for long-term (~15-20 years) operation in near-Earth orbits and for automatic and manned interplanetary missions. Currently, ISO activity on developing standards concerning different issues of nanomaterials manufacturing and applications is quite high. Most such standards are related to the production and characterization of nanostructures; however, there are no ISO documents concerning nanomaterials behavior under different environmental conditions, including the space environment. The given TS deals with the peculiarities of the space environment impact on nanostructured materials (i.e. materials with structured objects whose size in at least one dimension lies within 1-100 nm). The basic purpose of the document is a general description of the methodology of applying computer simulation methods, which relate to different space and time scales, to the modeling of processes occurring in nanostructured materials under space environment impact. This document will emphasize the necessity of applying a multiscale simulation approach and present recommendations for the choice of the most appropriate method (or group of methods) for computer modeling of the various processes that can occur in nanostructured materials under the influence of different space environment components. In addition, the TS includes the description of possible …

  6. Geodetic Space Weather Monitoring by means of Ionosphere Modelling

    Science.gov (United States)

    Schmidt, Michael

    2017-04-01

    The term space weather indicates physical processes and phenomena in space caused by radiation of energy mainly from the Sun. Manifestations of space weather are (1) variations of the Earth's magnetic field, (2) the polar lights in the northern and southern hemispheres, (3) variations within the ionosphere as part of the upper atmosphere characterized by the existence of free electrons and ions, (4) the solar wind, i.e. the permanent emission of electrons and protons, (5) the interplanetary magnetic field, and (6) electric currents, e.g. the van Allen radiation belt. It can be stated that ionospheric disturbances are often caused by so-called solar storms. A solar storm comprises solar events such as solar flares and coronal mass ejections (CMEs) which have different effects on the Earth. Solar flares may cause disturbances in positioning, navigation and communication. CMEs can cause severe disturbances and, in extreme cases, damage to or even destruction of modern infrastructure. Examples are interruptions to satellite services, including the global navigation satellite systems (GNSS), communication systems, Earth observation and imaging systems, or a potential failure of power networks. Currently the measurements of solar satellite missions such as STEREO and SOHO are used to forecast solar events. Besides these measurements, the Earth's ionosphere plays another key role in monitoring space weather, because it responds to solar storms with an increase in electron density. Space-geodetic observation techniques, such as terrestrial GNSS, satellite altimetry, space-borne GPS (radio occultation), DORIS and VLBI, provide valuable global information about the state of the ionosphere. Additionally, geodesy has a long history and broad experience in developing and using sophisticated analysis and combination techniques as well as empirical and physical modelling approaches. Consequently, geodesy is predestined to strongly support space weather monitoring via …

  7. Adaptive Modeling of the International Space Station Electrical Power System

    Science.gov (United States)

    Thomas, Justin Ray

    2007-01-01

    Software simulations provide NASA engineers the ability to experiment with spacecraft systems in a computer-imitated environment. Engineers currently develop software models that encapsulate spacecraft system behavior. These models can be inaccurate due to invalid assumptions, erroneous operation, or system evolution. Increasing accuracy requires manual calibration and domain-specific knowledge. This thesis presents a method for automatically learning system models without any assumptions regarding system behavior. Data stream mining techniques are applied to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). We also explore a knowledge fusion approach that uses traditional engineered EPS models to supplement the learned models. We observed that these engineered EPS models provide useful background knowledge to reduce predictive error spikes when confronted with making predictions in situations that are quite different from the training scenarios used when learning the model. Evaluations using ISS sensor data and existing EPS models demonstrate the success of the adaptive approach. Our experimental results show that adaptive modeling provides reductions in model error anywhere from 80% to 96% over these existing models. Final discussions include impending use of adaptive modeling technology for ISS mission operations and the need for adaptive modeling in future NASA lunar and Martian exploration.

  8. Modeling volatility using state space models.

    Science.gov (United States)

    Timmer, J; Weigend, A S

    1997-08-01

    In time series problems, noise can be divided into two categories: dynamic noise which drives the process, and observational noise which is added in the measurement process, but does not influence future values of the system. In this framework, we show that empirical volatilities (the squared relative returns of prices) exhibit a significant amount of observational noise. To model and predict their time evolution adequately, we estimate state space models that explicitly include observational noise. We obtain relaxation times for shocks in the logarithm of volatility ranging from three weeks (for foreign exchange) to three to five months (for stock indices). In most cases, a two-dimensional hidden state is required to yield residuals that are consistent with white noise. We compare these results with ordinary autoregressive models (without a hidden state) and find that autoregressive models underestimate the relaxation times by about two orders of magnitude since they do not distinguish between observational and dynamic noise. This new interpretation of the dynamics of volatility in terms of relaxators in a state space model carries over to stochastic volatility models and to GARCH models, and is useful for several problems in finance, including risk management and the pricing of derivative securities. Data sets used: Olsen & Associates high frequency DEM/USD foreign exchange rates (8 years). Nikkei 225 index (40 years). Dow Jones Industrial Average (25 years).
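
    The distinction drawn above between dynamic and observational noise is exactly what a linear state-space (Kalman) filter encodes; a hedged sketch with an AR(1) log-volatility state and additive observation noise (illustrative parameters, not the paper's fitted models):

```python
import numpy as np

def kalman_ar1(y, phi, q, r, mu=0.0):
    """Kalman filter for a scalar AR(1) hidden state observed with additive noise.
    y: observations (e.g. log squared returns), phi: AR(1) coefficient,
    q: dynamic-noise variance, r: observational-noise variance.
    Returns the filtered state means."""
    n = len(y)
    h, P = mu, q / (1.0 - phi ** 2)          # stationary initialization
    filtered = np.empty(n)
    for t in range(n):
        h_pred = mu + phi * (h - mu)         # prediction step
        P_pred = phi ** 2 * P + q
        K = P_pred / (P_pred + r)            # Kalman gain
        h = h_pred + K * (y[t] - h_pred)     # update step
        P = (1.0 - K) * P_pred
        filtered[t] = h
    return filtered

# The relaxation time of a shock in the hidden state is -1 / ln(phi):
phi = 0.98                                   # illustrative daily persistence
print("relaxation time (days):", round(-1.0 / np.log(phi), 1))

# Toy usage on a simulated series
rng = np.random.default_rng(0)
h_true = np.zeros(500)
for t in range(1, 500):
    h_true[t] = phi * h_true[t - 1] + 0.1 * rng.standard_normal()
y = h_true + 0.5 * rng.standard_normal(500)  # noisy observations
print(kalman_ar1(y, phi, q=0.01, r=0.25)[:3])
```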

  9. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    Science.gov (United States)

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao

    2014-10-07

    In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA) that is based on the idea of model population analysis (MPA) is proposed for variable selection. Unlike most of the existing optimization methods for variable selection, VISSA statistically evaluates the performance of variable space in each step of optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks in each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied in most of the existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website: https://sourceforge.net/projects/multivariateanalysis/files/VISSA/.
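
    A loose sketch of the weighted binary matrix sampling loop (an approximation of the VISSA idea using scikit-learn PLS as the sub-model; the parameter values and the shrinking rule are simplified assumptions, not the authors' implementation):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

def vissa_like_selection(X, y, n_sub=200, top_frac=0.1, n_iter=10, seed=0):
    """Each variable carries a sampling weight; sub-models are drawn from a weighted
    binary matrix, the best sub-models update the weights, and the variable space
    gradually shrinks toward the informative variables."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    weights = np.full(p, 0.5)                       # inclusion probabilities
    for _ in range(n_iter):
        B = rng.random((n_sub, p)) < weights        # weighted binary sampling matrix
        scores = []
        for row in B:
            if row.sum() < 2:
                scores.append(-np.inf)
                continue
            model = PLSRegression(n_components=min(3, int(row.sum())))
            scores.append(cross_val_score(model, X[:, row], y, cv=5,
                                          scoring="neg_mean_squared_error").mean())
        best = np.argsort(scores)[-int(top_frac * n_sub):]   # best-performing sub-models
        weights = B[best].mean(axis=0)              # new weights = frequency in best set
    return np.where(weights > 0.5)[0]               # retained variable indices

# Toy usage: only variables 3 and 7 carry signal
X = np.random.default_rng(1).standard_normal((60, 20))
y = X[:, 3] - 2 * X[:, 7] + 0.1 * np.random.default_rng(2).standard_normal(60)
print(vissa_like_selection(X, y, n_sub=50, n_iter=5))
```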

  10. A discrete-space urban model with environmental amenities

    Science.gov (United States)

    Liaila Tajibaeva; Robert G. Haight; Stephen Polasky

    2008-01-01

    This paper analyzes the effects of providing environmental amenities associated with open space in a discrete-space urban model and characterizes optimal provision of open space across a metropolitan area. The discrete-space model assumes distinct neighborhoods in which developable land is homogeneous within a neighborhood but heterogeneous across neighborhoods. Open...

  11. Lateral skull base approaches in the management of benign parapharyngeal space tumors.

    Science.gov (United States)

    Prasad, Sampath Chandra; Piccirillo, Enrico; Chovanec, Martin; La Melia, Claudio; De Donato, Giuseppe; Sanna, Mario

    2015-06-01

    To evaluate the role of lateral skull base approaches in the management of benign parapharyngeal space tumors and to propose an algorithm for their surgical approach. Retrospective study of patients with benign parapharyngeal space tumors. The clinical features, radiology and preoperative management of skull base neurovasculature, the surgical approaches and overall results were recorded. 46 patients presented with 48 tumors. 12 were prestyloid and 36 poststyloid. 19 (39.6%) tumors were paragangliomas, 15 (31.25%) were schwannomas and 11 (23%) were pleomorphic adenomas. Preoperative embolization was performed in 19, stenting of the internal carotid artery in 4 and permanent balloon occlusion in 2 patients. 19 tumors were approached by the transcervical, 13 by transcervical-transparotid, 5 by transcervical-transmastoid, 6, 1 and 2 tumors by the infratemporal fossa approach types A, B and D, respectively. Total radical tumor removal was achieved in 46 (96%) of the cases. Lateral skull base approaches have an advantage over other approaches in the management of benign tumors of the parapharyngeal space due to the fact that they provide excellent exposure with less morbidity. The use of microscope combined with bipolar cautery reduces morbidity. Stenting of internal carotid artery gives a chance for complete tumor removal with arterial preservation. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. Application of the Quality by Design Approach to the Freezing Step of Freeze-Drying: Building the Design Space.

    Science.gov (United States)

    Arsiccio, Andrea; Pisano, Roberto

    2018-06-01

    The present work shows a rational method for the development of the freezing step of a freeze-drying cycle. The current approach to the selection of freezing conditions is still empirical and nonsystematic, thus resulting in poor robustness of control strategy. The final aim of this work is to fill this gap, describing a rational procedure, based on mathematical modeling, for properly choosing the freezing conditions. Mechanistic models are used for the prediction of temperature profiles during freezing and dimension of ice crystals being formed. Mathematical description of the drying phase of freeze-drying is also coupled with the results obtained by freezing models, thus providing a comprehensive characterization of the lyophilization process. In this framework, deep understanding of the phenomena involved is required, and according to the Quality by Design approach, this knowledge can be used to build the design space. The step-by-step procedure for building the design space for freezing is thus described, and examples of applications are provided. The calculated design space is validated upon experimental data, and we show that it allows easy control of the freezing process and fast selection of appropriate operating conditions. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  13. Approach to transaction management for Space Station Freedom

    Science.gov (United States)

    Easton, C. R.; Cressy, Phil; Ohnesorge, T. E.; Hector, Garland

    1989-01-01

    An approach to managing the operations of the Space Station Freedom based on their external effects is described. It is assumed that there is a conflict-free schedule that, if followed, will allow only appropriate operations to occur. The problem is then reduced to that of ensuring that the operations initiated are within the limits allowed by the schedule, or that the external effects of such operations are within those allowed by the schedule. The main features of the currently adopted transaction management approach are discussed.

  14. Query Language for Location-Based Services: A Model Checking Approach

    Science.gov (United States)

    Hoareau, Christian; Satoh, Ichiro

    We present a model checking approach to the rationale, implementation, and applications of a query language for location-based services. Such query mechanisms are necessary so that users, objects, and/or services can effectively benefit from the location-awareness of their surrounding environment. The underlying data model is founded on a symbolic model of space organized in a tree structure. Once this model is extended to a semantic model for modal logic, we regard location query processing as a model checking problem, and thus define location queries as hybrid logic-based formulas. Our approach is unique among existing research because it explores the connection between location models and query processing in ubiquitous computing systems, relies on a sound theoretical basis, and provides modal logic-based query mechanisms for expressive searches over a decentralized data structure. A prototype implementation is also presented and will be discussed.
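
    A minimal sketch of a tree-structured symbolic location model and a containment query over it (hypothetical names; the paper evaluates such queries as hybrid-logic formulas via model checking):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    """A node of the symbolic, tree-structured space model (building, floor, room, ...)."""
    name: str
    parent: Optional["Node"] = None
    children: List["Node"] = field(default_factory=list)

    def add(self, child: "Node") -> "Node":
        child.parent = self
        self.children.append(child)
        return child

def is_located_in(entity: Node, space: Node) -> bool:
    """Evaluate a simple containment query against the tree model:
    true if `space` is an ancestor of `entity` (or the entity itself)."""
    node = entity
    while node is not None:
        if node is space:
            return True
        node = node.parent
    return False

# Toy model: building > floor-1 > room-12 > tag of a user
building = Node("building")
floor1 = building.add(Node("floor-1"))
room12 = floor1.add(Node("room-12"))
user_tag = room12.add(Node("user:alice"))
print(is_located_in(user_tag, floor1))   # True: alice is somewhere on floor 1
```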

  15. An evaluation of behavior inferences from Bayesian state-space models: A case study with the Pacific walrus

    Science.gov (United States)

    Beatty, William; Jay, Chadwick V.; Fischbach, Anthony S.

    2016-01-01

    State-space models offer researchers an objective approach to modeling complex animal location data sets, and state-space model behavior classifications are often assumed to have a link to animal behavior. In this study, we evaluated the behavioral classification accuracy of a Bayesian state-space model in Pacific walruses using Argos satellite tags with sensors to detect animal behavior in real time. We fit a two-state discrete-time continuous-space Bayesian state-space model to data from 306 Pacific walruses tagged in the Chukchi Sea. We matched predicted locations and behaviors from the state-space model (resident, transient behavior) to true animal behavior (foraging, swimming, hauled out) and evaluated classification accuracy with kappa statistics (κ) and root mean square error (RMSE). In addition, we compared biased random bridge utilization distributions generated with resident behavior locations to true foraging behavior locations to evaluate differences in space use patterns. Results indicated that the two-state model fairly classified true animal behavior (0.06 ≤ κ ≤ 0.26, 0.49 ≤ RMSE ≤ 0.59). Kernel overlap metrics indicated utilization distributions generated with resident behavior locations were generally smaller than utilization distributions generated with true foraging behavior locations. Consequently, we encourage researchers to carefully examine parameters and priors associated with behaviors in state-space models, and reconcile these parameters with the study species and its expected behaviors.
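
    The evaluation described (matching state-space behavior classes to observed behaviors with kappa statistics and RMSE) can be reproduced generically; a small sketch with illustrative labels, not the walrus data:

```python
import numpy as np

def cohens_kappa(true_labels, pred_labels):
    """Cohen's kappa for agreement between observed behaviors and model classifications."""
    true_labels, pred_labels = np.asarray(true_labels), np.asarray(pred_labels)
    labels = np.unique(np.concatenate([true_labels, pred_labels]))
    p_obs = np.mean(true_labels == pred_labels)                      # observed agreement
    p_exp = sum(np.mean(true_labels == c) * np.mean(pred_labels == c)  # chance agreement
                for c in labels)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Toy usage: 1 = resident/foraging, 0 = transient/swimming (illustrative labels)
true_b = np.array([1, 1, 0, 1, 0, 0, 1, 0])
pred_b = np.array([1, 0, 0, 1, 0, 1, 1, 0])
print("kappa:", round(cohens_kappa(true_b, pred_b), 2))
print("RMSE:", round(float(np.sqrt(np.mean((true_b - pred_b) ** 2))), 2))
```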

  16. Space Weather in the Machine Learning Era: A Multidisciplinary Approach

    Science.gov (United States)

    Camporeale, E.; Wing, S.; Johnson, J.; Jackman, C. M.; McGranaghan, R.

    2018-01-01

    The workshop entitled Space Weather: A Multidisciplinary Approach took place at the Lorentz Center, University of Leiden, Netherlands, on 25-29 September 2017. The aim of this workshop was to bring together members of the Space Weather, Mathematics, Statistics, and Computer Science communities to address the use of advanced techniques such as Machine Learning, Information Theory, and Deep Learning, to better understand the Sun-Earth system and to improve space weather forecasting. Although individual efforts have been made toward this goal, the community consensus is that establishing interdisciplinary collaborations is the most promising strategy for fully utilizing the potential of these advanced techniques in solving Space Weather-related problems.

  17. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    International Nuclear Information System (INIS)

    Ahmadi, Rouhollah; Khamehchi, Ehsan

    2013-01-01

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike the previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space where all the primary and secondary data are easily mapped onto. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  18. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com [Amirkabir University of Technology, PhD Student at Reservoir Engineering, Department of Petroleum Engineering (Iran, Islamic Republic of); Khamehchi, Ehsan [Amirkabir University of Technology, Faculty of Petroleum Engineering (Iran, Islamic Republic of)

    2013-12-15

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike the previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space where all the primary and secondary data are easily mapped onto. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.
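
    As a hedged illustration of the intermediate-space idea (not the authors' implementation), the sketch below maps hypothetical secondary well-log data and a primary fracture-intensity indicator onto a single learned score using a small neural network, with scikit-learn's MLPRegressor standing in for the artificial intelligence tool; all data, layer sizes and variable names are invented.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(1)

      # Hypothetical training data: two well-log attributes (secondary data) and a
      # fracture-intensity indicator (primary data) observed at the wells.
      logs = rng.normal(size=(200, 2))                       # e.g. sonic and gamma-ray logs
      fracture_intensity = 1.0 / (1.0 + np.exp(-(1.5 * logs[:, 0] - logs[:, 1])))
      fracture_intensity += 0.05 * rng.normal(size=200)      # observation noise

      # The "intermediate space" is here a single learned score onto which both data
      # types are mapped; an MLP stands in for the ANN/fuzzy-inference tool.
      mapper = MLPRegressor(hidden_layer_sizes=(8, 4), max_iter=5000, random_state=0)
      mapper.fit(logs, fracture_intensity)

      # At unsampled grid nodes, secondary data alone are mapped into the same space
      # and can then serve as soft conditioning data for the MPS simulation.
      grid_logs = rng.normal(size=(10, 2))
      soft_probability = np.clip(mapper.predict(grid_logs), 0.0, 1.0)
      print(np.round(soft_probability, 3))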

  19. A model-based approach to predict muscle synergies using optimization: application to feedback control

    Directory of Open Access Journals (Sweden)

    Reza eSharif Razavian

    2015-10-01

    This paper presents a new model-based method to define muscle synergies. Unlike the conventional factorization approach, which extracts synergies from electromyographic data, the proposed method employs a biomechanical model and formally defines the synergies as the solution of an optimal control problem. As a result, the number of required synergies is directly related to the dimensions of the operational space. The estimated synergies are posture-dependent, which correlate well with the results of standard factorization methods. Two examples are used to showcase this method: a two-dimensional forearm model, and a three-dimensional driver arm model. It has been shown here that the synergies need to be task-specific (i.e., they are defined for the specific operational spaces: the elbow angle and the steering wheel angle in the two systems). This functional definition of synergies results in a low-dimensional control space, in which every force in the operational space is accurately created by a unique combination of synergies. As such, there is no need for extra criteria (e.g., minimizing effort) in the process of motion control. This approach is motivated by the need for fast and bio-plausible feedback control of musculoskeletal systems, and can have important implications in engineering, motor control, and biomechanics.

  20. A model-based approach to predict muscle synergies using optimization: application to feedback control.

    Science.gov (United States)

    Sharif Razavian, Reza; Mehrabi, Naser; McPhee, John

    2015-01-01

    This paper presents a new model-based method to define muscle synergies. Unlike the conventional factorization approach, which extracts synergies from electromyographic data, the proposed method employs a biomechanical model and formally defines the synergies as the solution of an optimal control problem. As a result, the number of required synergies is directly related to the dimensions of the operational space. The estimated synergies are posture-dependent, which correlate well with the results of standard factorization methods. Two examples are used to showcase this method: a two-dimensional forearm model, and a three-dimensional driver arm model. It has been shown here that the synergies need to be task-specific (i.e., they are defined for the specific operational spaces: the elbow angle and the steering wheel angle in the two systems). This functional definition of synergies results in a low-dimensional control space, in which every force in the operational space is accurately created by a unique combination of synergies. As such, there is no need for extra criteria (e.g., minimizing effort) in the process of motion control. This approach is motivated by the need for fast and bio-plausible feedback control of musculoskeletal systems, and can have important implications in engineering, motor control, and biomechanics.

  1. Near-Earth Space Radiation Models

    Science.gov (United States)

    Xapsos, Michael A.; O'Neill, Patrick M.; O'Brien, T. Paul

    2012-01-01

    Review of models of the near-Earth space radiation environment is presented, including recent developments in trapped proton and electron, galactic cosmic ray and solar particle event models geared toward spacecraft electronics applications.

  2. Modeling microbial community structure and functional diversity across time and space.

    Science.gov (United States)

    Larsen, Peter E; Gibbons, Sean M; Gilbert, Jack A

    2012-07-01

    Microbial communities exhibit exquisitely complex structure. Many aspects of this complexity, from the number of species to the total number of interactions, are currently very difficult to examine directly. However, extraordinary efforts are being made to make these systems accessible to scientific investigation. While recent advances in high-throughput sequencing technologies have improved accessibility to the taxonomic and functional diversity of complex communities, monitoring the dynamics of these systems over time and space - using appropriate experimental design - is still expensive. Fortunately, modeling can be used as a lens to focus low-resolution observations of community dynamics to enable mathematical abstractions of functional and taxonomic dynamics across space and time. Here, we review the approaches for modeling bacterial diversity at both the very large and the very small scales at which microbial systems interact with their environments. We show that modeling can help to connect biogeochemical processes to specific microbial metabolic pathways.

  3. General background and approach to multibody dynamics for space applications

    Science.gov (United States)

    Santini, Paolo; Gasbarri, Paolo

    2009-06-01

    Multibody dynamics for space applications is dictated by the space environment: space-varying gravity forces, orbital and attitude perturbations, and control forces, if any. Several methods and formulations devoted to the modeling of flexible bodies undergoing large overall motions were developed in recent years. Most of these formulations were aimed at one of the main problems concerning the analysis of spacecraft dynamics, namely the reduction of computer simulation time. By virtue of this, the use of symbolic manipulation, recursive formulation and parallel processing algorithms was proposed. All these approaches fall into two categories, the one based on Newton/Euler methods and the one based on Lagrangian methods; both of them have their advantages and disadvantages, although in general Newtonian approaches lend themselves to a better understanding of the physics of problems and in particular of the magnitude of the reactions and of the corresponding structural stresses. Another important issue which must be addressed carefully in multibody space dynamics concerns the correct choice of kinematic variables. In fact, when dealing with flexible multibody systems the resulting equations include two different types of state variables, the ones associated with large (rigid) displacements and the ones associated with elastic deformations. These two sets of variables generally have two different time scales, if we think of the attitude motion of a satellite whose period of oscillation, due to the gravity gradient effects, is of the same order of magnitude as the orbital period, which is much bigger than the one associated with the structural vibration of the satellite itself. Therefore, the numerical integration of the equations of the system represents a challenging problem. This was the abstract and some of the arguments that Professor Paolo Santini intended to present for the Breakwell Lecture; unfortunately a deadly disease attacked him and shortly took him

  4. An introduction to Space Weather Integrated Modeling

    Science.gov (United States)

    Zhong, D.; Feng, X.

    2012-12-01

    The need for a software toolkit that integrates space weather models and data is one of many challenges we are facing when applying the models to space weather forecasting. To meet this challenge, we have developed Space Weather Integrated Modeling (SWIM), which is capable of analyzing and visualizing the results from a diverse set of space weather models. SWIM has a modular design and is written in Python, using NumPy, matplotlib, and the Visualization ToolKit (VTK). SWIM provides a data management module to read a variety of spacecraft data products and the specific data format of the Solar-Interplanetary Conservation Element/Solution Element MHD model (SIP-CESE MHD model) for the study of solar-terrestrial phenomena. Data analysis, visualization and graphical user interface modules are also provided so that the integrated models can be run and the 2-D and 3-D data sets visualized interactively in a user-friendly way. With these tools we can rapidly analyze model results locally or remotely, for example extracting data at specific locations in time-sequence data sets, plotting interplanetary magnetic field lines, multi-slicing the solar wind speed, volume rendering the solar wind density, animating time-sequence data sets, and comparing model results with observational data. To speed up the analysis, an in-situ visualization interface supports visualizing the data 'on the fly'. We also accelerated some critical time-consuming analysis and visualization methods with the aid of GPUs and multi-core CPUs. We have used this tool to visualize the data of the SIP-CESE MHD model in real time, and have integrated the Database Model of shock arrival, the Shock Propagation Model, the Dst forecasting model and the SIP-CESE MHD model developed by the SIGMA Weather Group at the State Key Laboratory of Space Weather/CAS.

  5. A Proposal for the Common Safety Approach of Space Programs

    Science.gov (United States)

    Grimard, Max

    2002-01-01

    For all applications, business and systems related to Space programs, Quality is mandatory and is a key factor for the technical as well as the economic performance. Up to now the differences of applications (launchers, manned space-flight, sciences, telecommunications, Earth observation, planetary exploration, etc.) and the differences in technical culture and background of the leading countries (USA, Russia, Europe) have generally led to different approaches in terms of standards and processes for Quality. At a time when international cooperation is quite usual for the institutional programs and globalization is the key word for the commercial business, it is considered of prime importance to aim at common standards and approaches for Quality in Space Programs. For that reason, the International Academy of Astronautics has set up a Study Group whose mandate is to "Make recommendations to improve the Quality, Reliability, Efficiency, and Safety of space programmes, taking into account the overall environment in which they operate : economical constraints, harsh environments, space weather, long life, no maintenance, autonomy, international co-operation, norms and standards, certification." The paper introduces the activities of this Study Group and describes a first list of topics to be addressed. Through this paper it is expected to open the discussion, to update and enlarge this list of topics, and to call for contributors to the Study Group.

  6. Statistical sampling approaches for soil monitoring

    NARCIS (Netherlands)

    Brus, D.J.

    2014-01-01

    This paper describes three statistical sampling approaches for regional soil monitoring, a design-based, a model-based and a hybrid approach. In the model-based approach a space-time model is exploited to predict global statistical parameters of interest such as the space-time mean. In the hybrid

  7. A latent low-dimensional common input drives a pool of motor neurons: a probabilistic latent state-space model.

    Science.gov (United States)

    Feeney, Daniel F; Meyer, François G; Noone, Nicholas; Enoka, Roger M

    2017-10-01

    Motor neurons appear to be activated with a common input signal that modulates the discharge activity of all neurons in the motor nucleus. It has proven difficult for neurophysiologists to quantify the variability in a common input signal, but characterization of such a signal may improve our understanding of how the activation signal varies across motor tasks. Contemporary methods of quantifying the common input to motor neurons rely on compiling discrete action potentials into continuous time series, assuming the motor pool acts as a linear filter, and requiring signals to be of sufficient duration for frequency analysis. We introduce a state-space model in which the discharge activity of motor neurons is modeled as inhomogeneous Poisson processes and propose a method to quantify an abstract latent trajectory that represents the common input received by motor neurons. The approach also approximates the variation in synaptic noise in the common input signal. The model is validated with four data sets: a simulation of 120 motor units, a pair of integrate-and-fire neurons with a Renshaw cell providing inhibitory feedback, the discharge activity of 10 integrate-and-fire neurons, and the discharge times of concurrently active motor units during an isometric voluntary contraction. The simulations revealed that a latent state-space model is able to quantify the trajectory and variability of the common input signal across all four conditions. When compared with the cumulative spike train method of characterizing common input, the state-space approach was more sensitive to the details of the common input current and was less influenced by the duration of the signal. The state-space approach appears to be capable of detecting rather modest changes in common input signals across conditions. NEW & NOTEWORTHY We propose a state-space model that explicitly delineates a common input signal sent to motor neurons and the physiological noise inherent in synaptic signal
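
    A minimal sketch, assuming an invented sinusoidal latent drive and invented gains and baselines, of the generative idea in the abstract: motor neurons modeled as inhomogeneous Poisson processes driven by one common input, with a crude smoothed-population estimate of that input (this is not the authors' state-space estimator).

      import numpy as np

      rng = np.random.default_rng(2)
      dt, T = 0.001, 5.0                                   # 1 ms bins, 5 s trial
      t = np.arange(0.0, T, dt)

      # Hypothetical latent common input: a slow sinusoidal drive shared by all
      # motor neurons plus a small random-walk component standing in for noise.
      common_drive = 0.5 + 0.4 * np.sin(2 * np.pi * 0.5 * t)
      common_drive += np.cumsum(0.002 * rng.normal(size=t.size))

      # Each motor neuron is an inhomogeneous Poisson process whose rate is a
      # gain-scaled version of the latent input (gains and baselines invented).
      n_units = 10
      gains = rng.uniform(10.0, 30.0, size=n_units)        # spikes/s per unit drive
      baselines = rng.uniform(2.0, 8.0, size=n_units)      # spikes/s
      rates = baselines[:, None] + gains[:, None] * np.clip(common_drive, 0.0, None)
      spikes = rng.random((n_units, t.size)) < rates * dt  # Bernoulli thinning per bin

      # A crude estimate of the common input: the smoothed population rate.
      window = int(0.2 / dt)
      kernel = np.ones(window) / window
      population_rate = np.convolve(spikes.sum(axis=0), kernel, mode="same") / (dt * n_units)
      print("correlation with the true latent drive:",
            round(float(np.corrcoef(population_rate, common_drive)[0, 1]), 3))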

  8. A morphing technique for signal modelling in a multidimensional space of coupling parameters

    CERN Document Server

    The ATLAS collaboration

    2015-01-01

    This note describes a morphing method that produces signal models for fits to data in which both the affected event yields and kinematic distributions are simultaneously taken into account. The signal model is morphed in a continuous manner through the available multi-dimensional parameter space. Searches for deviations from Standard Model predictions for Higgs boson properties have so far used information either from event yields or kinematic distributions. The combined approach described here is expected to substantially enhance the sensitivity to beyond the Standard Model contributions.

  9. State space model approach for forecasting the use of electrical energy (a case study on: PT. PLN (Persero) district of Kroya)

    Science.gov (United States)

    Kurniati, Devi; Hoyyi, Abdul; Widiharih, Tatik

    2018-05-01

    Time series data are observations taken or measured at regular time intervals, and time series analysis takes the effect of time into account. The purpose of time series analysis is to identify the characteristics and patterns of the data and to predict values in future periods based on past data. One of the forecasting methods used for time series data is the state space model. This study discusses the modeling and forecasting of electric energy consumption using the state space model for univariate data. The modeling stage begins with optimal autoregressive (AR) order selection, determination of the state vector through canonical correlation analysis, parameter estimation, and forecasting. The results show that a state space model of order 4 forecasts electric energy consumption with a Mean Absolute Percentage Error (MAPE) of 3.655%, placing the model in the very good forecasting category.
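
    As an illustration of the workflow only (AR order selection, fitting, forecasting and MAPE), the sketch below uses a synthetic consumption series and statsmodels' AutoReg in place of the canonical-correlation state-space construction described in the abstract; all numbers are invented.

      import numpy as np
      from statsmodels.tsa.ar_model import AutoReg, ar_select_order

      rng = np.random.default_rng(3)

      # Hypothetical monthly electricity-consumption series; the real study used
      # PT. PLN data for the Kroya district, which are not reproduced here.
      n = 120
      trend = np.linspace(100.0, 160.0, n)
      season = 10.0 * np.sin(2 * np.pi * np.arange(n) / 12.0)
      y = trend + season + rng.normal(scale=3.0, size=n)

      train, test = y[:-12], y[-12:]

      # Select the AR order by AIC (the paper arrives at an order-4 state-space
      # model; the canonical correlation step is replaced by a plain AR fit here).
      selection = ar_select_order(train, maxlag=12, ic="aic")
      lags = selection.ar_lags or [1]          # fall back to AR(1) if nothing is selected
      model = AutoReg(train, lags=lags).fit()

      forecast = model.predict(start=len(train), end=len(train) + len(test) - 1)
      mape = 100.0 * np.mean(np.abs((test - forecast) / test))
      print(f"selected lags: {lags}, MAPE = {mape:.2f}%")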

  10. Calculational models of close-spaced thermionic converters

    International Nuclear Information System (INIS)

    McVey, J.B.

    1983-01-01

    Two new calculational models have been developed in conjunction with the SAVTEC experimental program. These models have been used to analyze data from experimental close-spaced converters, providing values for spacing, electrode work functions, and converter efficiency. They have also been used to make performance predictions for such converters over a wide range of conditions. Both models are intended for use in the collisionless (Knudsen) regime. They differ from each other in that the simpler one uses a Langmuir-type formulation which only considers electrons emitted from the emitter. This approach is implemented in the LVD (Langmuir Vacuum Diode) computer program, which has the virtue of being both simple and fast. The more complex model also includes both Saha-Langmuir emission of positive cesium ions from the emitter and collector back emission. Computer implementation is by the KMD1 (Knudsen Mode Diode) program. The KMD1 model derives the particle distribution functions from the Vlasov equation. From these the particle densities are found for various interelectrode motive shapes. Substituting the particle densities into Poisson's equation gives a second order differential equation for potential. This equation can be integrated once analytically. The second integration, which gives the interelectrode motive, is performed numerically by the KMD1 program. This is complicated by the fact that the integrand is often singular at one end point of the integration interval. The program performs a transformation on the integrand to make it finite over the entire interval. Once the motive has been computed, the output voltage, current density, power density, and efficiency are found. The program is presently unable to operate when the ion richness ratio β is between about .8 and 1.0, due to the occurrence of oscillatory motives
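
    The endpoint-singularity transformation mentioned for the motive integral can be illustrated generically (this is not the KMD1 code): substituting x = u^2 turns an integrand that diverges like 1/sqrt(x) at the lower limit into a smooth one, so ordinary quadrature converges. The test integrand below is an arbitrary example, not one of the converter integrals.

      import math
      import numpy as np

      def trapezoid(y, x):
          """Composite trapezoidal rule."""
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

      # Test integrand: integrable but singular (like 1/sqrt(x)) at the lower end point.
      def integrand(x):
          return np.exp(-x) / np.sqrt(x)

      # Direct quadrature has to dodge x = 0 and converges poorly near the singularity.
      x = np.linspace(1e-8, 1.0, 20001)
      naive = trapezoid(integrand(x), x)

      # Substituting x = u**2 (dx = 2*u*du) removes the singularity: the transformed
      # integrand 2*exp(-u**2) is smooth over the whole interval.
      u = np.linspace(0.0, 1.0, 201)
      transformed = trapezoid(2.0 * np.exp(-u**2), u)

      exact = math.sqrt(math.pi) * math.erf(1.0)     # analytic value of the test integral
      print(f"naive: {naive:.4f}  transformed: {transformed:.4f}  exact: {exact:.4f}")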

  11. Exploiting Orbital Data and Observation Campaigns to Improve Space Debris Models

    Science.gov (United States)

    Braun, V.; Horstmann, A.; Reihs, B.; Lemmens, S.; Merz, K.; Krag, H.

    The European Space Agency (ESA) has been developing the Meteoroid and Space Debris Terrestrial Environment Reference (MASTER) software as the European reference model for space debris for more than 25 years. It is an event-based simulation of all known individual debris-generating events since 1957, including breakups, solid rocket motor firings and nuclear reactor core ejections. In 2014, the upgraded Debris Risk Assessment and Mitigation Analysis (DRAMA) tool suite was released. In the same year an ESA instruction made the standard ISO 24113:2011 on space debris mitigation requirements, adopted via the European Cooperation for Space Standardization (ECSS), applicable to all ESA missions. In order to verify the compliance of a space mission with those requirements, the DRAMA software is used to assess collision avoidance statistics, estimate the remaining orbital lifetime and evaluate the on-ground risk for controlled and uncontrolled reentries. In this paper, the approach to validate the MASTER and DRAMA tools is outlined. For objects larger than 1 cm, thus potentially being observable from ground, the MASTER model has been validated through dedicated observation campaigns. Recent campaign results shall be discussed. Moreover, catalogue data from the Space Surveillance Network (SSN) has been used to correlate the larger objects. In DRAMA, the assessment of collision avoidance statistics is based on orbit uncertainty information derived from Conjunction Data Messages (CDM) provided by the Joint Space Operations Center (JSpOC). They were collected for more than 20 ESA spacecraft in the recent years. The way this information is going to be used in a future DRAMA version is outlined and the comparison of estimated manoeuvre rates with real manoeuvres from the operations of ESA spacecraft is shown.

  12. Toward a global space exploration program: A stepping stone approach

    Science.gov (United States)

    Ehrenfreund, Pascale; McKay, Chris; Rummel, John D.; Foing, Bernard H.; Neal, Clive R.; Masson-Zwaan, Tanja; Ansdell, Megan; Peter, Nicolas; Zarnecki, John; Mackwell, Steve; Perino, Maria Antionetta; Billings, Linda; Mankins, John; Race, Margaret

    2012-01-01

    In response to the growing importance of space exploration in future planning, the Committee on Space Research (COSPAR) Panel on Exploration (PEX) was chartered to provide independent scientific advice to support the development of exploration programs and to safeguard the potential scientific assets of solar system objects. In this report, PEX elaborates a stepwise approach to achieve a new level of space cooperation that can help develop world-wide capabilities in space science and exploration and support a transition that will lead to a global space exploration program. The proposed stepping stones are intended to transcend cross-cultural barriers, leading to the development of technical interfaces and shared legal frameworks and fostering coordination and cooperation on a broad front. Input for this report was drawn from expertise provided by COSPAR Associates within the international community and via the contacts they maintain in various scientific entities. The report provides a summary and synthesis of science roadmaps and recommendations for planetary exploration produced by many national and international working groups, aiming to encourage and exploit synergies among similar programs. While science and technology represent the core and, often, the drivers for space exploration, several other disciplines and their stakeholders (Earth science, space law, and others) should be more robustly interlinked and involved than they have been to date. The report argues that a shared vision is crucial to this linkage, and to providing a direction that enables new countries and stakeholders to join and engage in the overall space exploration effort. Building a basic space technology capacity within a wider range of countries, ensuring new actors in space act responsibly, and increasing public awareness and engagement are concrete steps that can provide a broader interest in space exploration, worldwide, and build a solid basis for program sustainability. By engaging

  13. A state-space modeling approach to estimating canopy conductance and associated uncertainties from sap flux density data.

    Science.gov (United States)

    Bell, David M; Ward, Eric J; Oishi, A Christopher; Oren, Ram; Flikkema, Paul G; Clark, James S

    2015-07-01

    Uncertainties in ecophysiological responses to environment, such as the impact of atmospheric and soil moisture conditions on plant water regulation, limit our ability to estimate key inputs for ecosystem models. Advanced statistical frameworks provide coherent methodologies for relating observed data, such as stem sap flux density, to unobserved processes, such as canopy conductance and transpiration. To address this need, we developed a hierarchical Bayesian State-Space Canopy Conductance (StaCC) model linking canopy conductance and transpiration to tree sap flux density from a 4-year experiment in the North Carolina Piedmont, USA. Our model builds on existing ecophysiological knowledge, but explicitly incorporates uncertainty in canopy conductance, internal tree hydraulics and observation error to improve estimation of canopy conductance responses to atmospheric drought (i.e., vapor pressure deficit), soil drought (i.e., soil moisture) and above canopy light. Our statistical framework not only predicted sap flux observations well, but it also allowed us to simultaneously gap-fill missing data as we made inference on canopy processes, marking a substantial advance over traditional methods. The predicted and observed sap flux data were highly correlated (mean sensor-level Pearson correlation coefficient = 0.88). Variations in canopy conductance and transpiration associated with environmental variation across days to years were many times greater than the variation associated with model uncertainties. Because some variables, such as vapor pressure deficit and soil moisture, were correlated at the scale of days to weeks, canopy conductance responses to individual environmental variables were difficult to interpret in isolation. Still, our results highlight the importance of accounting for uncertainty in models of ecophysiological and ecosystem function where the process of interest, canopy conductance in this case, is not observed directly. The StaCC modeling

  14. An Autonomous Sensor Tasking Approach for Large Scale Space Object Cataloging

    Science.gov (United States)

    Linares, R.; Furfaro, R.

    The field of Space Situational Awareness (SSA) has progressed over the last few decades with new sensors coming online, the development of new approaches for making observations, and new algorithms for processing them. Although there has been success in the development of new approaches, a missing piece is the translation of SSA goals to sensors and resource allocation, otherwise known as the Sensor Management Problem (SMP). This work solves the SMP using an artificial intelligence approach called Deep Reinforcement Learning (DRL). Stable methods for training DRL approaches based on neural networks exist, but most of these approaches are not suitable for high dimensional systems. The Asynchronous Advantage Actor-Critic (A3C) method is a recently developed and effective approach for high dimensional systems, and this work leverages these results and applies this approach to decision making in SSA. The decision space for SSA problems can be high dimensional, even for the tasking of a single telescope. Since the number of SOs in space is relatively high, each sensor will have a large number of possible actions at a given time. Therefore, efficient DRL approaches are required when solving the SMP for SSA. This work develops an A3C-based method for DRL applied to SSA sensor tasking. One of the key benefits of DRL approaches is the ability to handle high dimensional data; DRL methods have, for example, been applied to image processing for autonomous cars, where a 256x256 RGB image has 196,608 input values (256*256*3), and deep learning approaches routinely take images like this as inputs. Therefore, when applied to the whole catalog, the DRL approach offers the ability to solve this high dimensional problem. This work has the potential to solve, for the first time, the non-myopic sensor tasking problem for the whole SO catalog (over 22,000 objects), providing a truly revolutionary result.

  15. Analyzing energy consumption of wireless networks. A model-based approach

    Energy Technology Data Exchange (ETDEWEB)

    Yue, Haidi

    2013-03-04

    During the last decades, wireless networking has continuously been a hot topic both in academia and in industry. Many different wireless networks have been introduced, like wireless local area networks, wireless personal networks, wireless ad hoc networks, and wireless sensor networks. If these networks are to remain usable in the long term, the power consumed by the wireless devices in each of them needs to be managed efficiently. Hence, a lot of effort has been devoted to the analysis and improvement of energy efficiency, either for a specific network layer (protocol) or for new cross-layer designs. In this thesis, we apply a model-based approach for the analysis of energy consumption of different wireless protocols. The protocols under consideration are: one leader election protocol, one routing protocol, and two medium access control protocols. By a model-based approach we mean that all four protocols are formalized as formal models, more precisely as discrete-time Markov chains (DTMCs), Markov decision processes (MDPs), or stochastic timed automata (STA). For the first two models, DTMCs and MDPs, we model them in PRISM, a prominent model checker for probabilistic model checking, and apply model checking techniques to analyze them. Model checking belongs to the family of formal methods. It discovers exhaustively all possible (reachable) states of the models, and checks whether these models meet a given specification. Specifications are system properties that we want to study, usually expressed in some logic, for instance probabilistic computation tree logic (PCTL). However, while model checking relies on rigorous mathematical foundations and automatically explores the entire state space of a model, its applicability is also limited by the so-called state space explosion problem -- even systems of moderate size often yield models with an exponentially larger state space that thwarts their analysis. Hence for the STA models in this thesis, since there

  16. Modeling beams with elements in phase space

    International Nuclear Information System (INIS)

    Nelson, E.M.

    1998-01-01

    Conventional particle codes represent beams as a collection of macroparticles. An alternative is to represent the beam as a collection of current-carrying elements in phase space. While such a representation has limitations, it may be less noisy than a macroparticle model, and it may provide insights about the transport of space charge dominated beams which would otherwise be difficult to gain from macroparticle simulations. The phase space element model of a beam is described, and progress toward an implementation and difficulties with this implementation are discussed. A simulation of an axisymmetric beam using 1-D elements in phase space is demonstrated.

  17. An alternative approach for modeling strength differential effect in sheet metals with symmetric yield functions

    Science.gov (United States)

    Kurukuri, Srihari; Worswick, Michael J.

    2013-12-01

    An alternative approach is proposed to utilize symmetric yield functions for modeling the tension-compression asymmetry commonly observed in hcp materials. In this work, the strength differential (SD) effect is modeled by choosing separate symmetric plane stress yield functions (for example, Barlat Yld 2000-2d) for tension (i.e., in the first quadrant of principal stress space) and compression (i.e., in the third quadrant of principal stress space). In the second and fourth quadrants, the yield locus is constructed by adopting interpolating functions between uniaxial tensile and compressive stress states. In this work, different interpolating functions are chosen and the predictive capability of each approach is discussed. The main advantage of this proposed approach is that the yield locus parameters are deterministic and relatively easy to identify when compared to the Cazacu family of yield functions commonly used for modeling the SD effect observed in hcp materials.

  18. SWIFF: Space weather integrated forecasting framework

    Directory of Open Access Journals (Sweden)

    Frederiksen Jacob Trier

    2013-02-01

    SWIFF is a project funded by the Seventh Framework Programme of the European Commission to study the mathematical-physics models that form the basis for space weather forecasting. The phenomena of space weather span a tremendous range of densities and temperatures, with scales ranging over 10 orders of magnitude in space and time. Additionally, even in local regions there are concurrent processes developing at the electron, ion and global scales that strongly interact with each other. The fundamental challenge in modelling space weather is the need to address multiple physics and multiple scales. Here we present our approach of taking existing expertise in fluid and kinetic models and producing an integrated mathematical approach and software infrastructure that allows fluid and kinetic processes to be modelled together. SWIFF also aims at using this new infrastructure to model specific coupled processes in the solar corona, in interplanetary space and in the interaction with the Earth's magnetosphere.

  19. An integrated mission approach to the space exploration initiative will ensure success

    Science.gov (United States)

    Coomes, Edmund P.; Dagle, Jefferey E.; Bamberger, Judith A.; Noffsinger, Kent E.

    1991-01-01

    The direction of the American space program, as defined by President Bush and the National Commission on Space, is to expand human presence into the solar system. Landing an American on Mars by the 50th anniversary of the Apollo 11 lunar landing is the goal. This challenge has produced a level of excitement among young Americans not seen for nearly three decades. The exploration and settlement of the space frontier will occupy the creative thoughts and energies of generations of Americans well into the next century. The return of Americans to the moon and beyond must be viewed as a national effort with strong public support if it is to become a reality. Key to making this an actuality is the mission approach selected. Developing a permanent presence in space requires a continual stepping outward from Earth in a logical, progressive manner. If we seriously plan to go and to stay, then not only must we plan what we are to do and how we are to do it, we must address the logistic support infrastructure that will allow us to stay there once we arrive. A fully integrated approach to mission planning is needed if the Space Exploration Initiative (SEI) is to be successful. Only in this way can a permanent human presence in space be sustained. An integrated infrastructure approach would reduce the number of new systems and technologies requiring development. The resultant horizontal commonality of systems and hardware would reduce the direct economic impact of SEI, while an early return on investment through technology spin-offs would be an economic benefit by greatly enhancing our international technical competitiveness. If the exploration, development, and colonization of space is to be affordable and acceptable, careful consideration must be given to such things as "return on investment" and "commercial product potential" of the technologies developed. This integrated approach will win the Congressional support needed to secure the financial backing necessary to assure

  20. The space-time model according to dimensional continuous space-time theory

    International Nuclear Information System (INIS)

    Martini, Luiz Cesar

    2014-01-01

    This article results from the Dimensional Continuous Space-Time Theory, whose introductory theoretical framework was presented in [1]. A theoretical model of the Continuous Space-Time is presented. The wave equation of time in an absolutely stationary empty-space referential is described in detail. The complex time, that is, the time fixed in the referential of infinite phase time speed, is deduced from the New View of Relativity Theory, which is being submitted simultaneously with this article at this congress. Finally, considering the inseparable Space-Time, the wave-particle duality equation is presented.

  1. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    Science.gov (United States)

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies, however, primarily on visual EEG scoring by experts. We introduced a model-based approach to EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independently of visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome and with pathological EEG patterns such as generalized periodic discharges. Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy.
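
    A minimal sketch, on synthetic single-channel data, of one way to compute a spectral-variability measure of the kind described (epochs mapped to points in a band-power state space, with velocity taken as the distance between consecutive points); the sampling rate, bands and epoch length are assumptions, not the study's settings.

      import numpy as np
      from scipy.signal import welch

      rng = np.random.default_rng(4)
      fs = 250.0                                    # assumed sampling rate in Hz
      epoch_len = int(10 * fs)                      # 10 s epochs
      eeg = rng.normal(size=60 * epoch_len)         # hypothetical single-channel EEG

      bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

      def band_powers(epoch):
          f, pxx = welch(epoch, fs=fs, nperseg=int(4 * fs))
          return np.array([pxx[(f >= lo) & (f < hi)].mean() for lo, hi in bands.values()])

      # Map every epoch to a point in the spectral "state space" (log band powers).
      points = np.log(np.array([band_powers(eeg[i:i + epoch_len])
                                for i in range(0, len(eeg), epoch_len)]))

      # State-space velocity: distance travelled between consecutive epochs; low mean
      # velocity (little background variability) is the marker the study associates
      # with poor outcome.
      velocity = np.linalg.norm(np.diff(points, axis=0), axis=1)
      print("mean state-space velocity:", round(float(velocity.mean()), 3))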

  2. Space-Hotel Early Bird - An Educational and Public Outreach Approach

    Science.gov (United States)

    Amekrane, R.; Holze, C.

    2002-01-01

    In April 2001 the German Aerospace Society DGLR e.V., in cooperation with the Technical University of Darmstadt, Germany, initiated an interdisciplinary student contest for the summer term 2001, under the patronage of Mr. Joerg Feustel-Buechl, the Director of Manned Spaceflight and Microgravity, European Space Agency (ESA). It was directed at graduate architecture students, who had to conceive and design a space hotel with specific technical, economic and social requirements. The space hotel, to be placed in a low Earth orbit, had to accommodate 220 guests. It was of utmost importance that this contest became an integral part of the students' tuition and that professors of the different academic and industrial institutions supported the project idea. During the summer term 2001 about fifty students occupied themselves with the topic, "design of an innovative space-hotel". The overall challenge was to create rooms to be used in a microgravity environment, which meant overcoming existing definitions and finding a new definition of living space. Because none of the students were able to experience such a room under microgravity, they were forced to rely on their imagination. The students moreover attended a number of lectures on different space-related technical subjects and went on several space-related excursions. Having space specialists involved as volunteers ensured that the designs retained a realistic chance of being realized. Within the summer term, seventeen major designs developed from conceptual status into highly sophisticated concepts and, later on, into corresponding models. A competition combined with a public exhibition, which took place within the Annual German Aeronautics and Astronautics Congress, and intense media relations concluded the project. The project idea of "Early Bird - Visions of a Space Hotel", which was developed within six months, is a remarkable example of how

  3. Framework for analyzing ecological trait-based models in multidimensional niche spaces

    Science.gov (United States)

    Biancalani, Tommaso; DeVille, Lee; Goldenfeld, Nigel

    2015-05-01

    We develop a theoretical framework for analyzing ecological models with a multidimensional niche space. Our approach relies on the fact that ecological niches are described by sequences of symbols, which allows us to include multiple phenotypic traits. Ecological drivers, such as competitive exclusion, are modeled by introducing the Hamming distance between two sequences. We show that a suitable transform diagonalizes the community interaction matrix of these models, making it possible to predict the conditions for niche differentiation and, close to the instability onset, the asymptotically long time population distributions of niches. We exemplify our method using the Lotka-Volterra equations with an exponential competition kernel.
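
    As a hedged illustration of the model structure (not the paper's analysis), the sketch below builds a competition matrix from Hamming distances between binary trait sequences and integrates the corresponding Lotka-Volterra equations; the kernel width, growth rate and carrying capacity are invented.

      import numpy as np
      from itertools import product

      rng = np.random.default_rng(5)

      # Niches are binary trait sequences of length L; competition decays
      # exponentially with the Hamming distance between two sequences.
      L = 4
      niches = np.array(list(product([0, 1], repeat=L)))            # all 2**L niches
      hamming = (niches[:, None, :] != niches[None, :, :]).sum(axis=2)
      alpha = np.exp(-hamming / 1.5)                                 # competition matrix

      r, K = 1.0, 100.0                                              # growth rate, capacity
      x = rng.uniform(1.0, 5.0, size=len(niches))                    # initial abundances

      dt, steps = 0.01, 20000
      for _ in range(steps):
          x += dt * r * x * (1.0 - alpha @ x / K)                    # Lotka-Volterra step
          x = np.clip(x, 0.0, None)

      # Stationary abundances of the 2**L niches under Hamming-distance competition.
      print(np.round(x, 2))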

  4. A Belief-Space Approach to Integrated Intelligence - Research Area 10.3: Intelligent Networks

    Science.gov (United States)

    2017-12-05

    Report on the project "A Belief-Space Approach to Integrated Intelligence - Research Area 10.3: Intelligent Networks", carried out at the Massachusetts Institute of Technology (MIT). During the reporting period, students presented progress and received feedback from the research group, and wrote papers on their research that were submitted to leading conferences.

  5. [Optimize dropping process of Ginkgo biloba dropping pills by using design space approach].

    Science.gov (United States)

    Shen, Ji-Chen; Wang, Qing-Qing; Chen, An; Pan, Fang-Lai; Gong, Xing-Chu; Qu, Hai-Bin

    2017-07-01

    In this paper, a design space approach was applied to optimize the dropping process of Ginkgo biloba dropping pills. Firstly, potential critical process parameters and potential process critical quality attributes were determined through literature research and pre-experiments. Secondly, experiments were carried out according to Box-Behnken design. Then the critical process parameters and critical quality attributes were determined based on the experimental results. Thirdly, second-order polynomial models were used to describe the quantitative relationships between critical process parameters and critical quality attributes. Finally, a probability-based design space was calculated and verified. The verification results showed that efficient production of Ginkgo biloba dropping pills can be guaranteed by operating within the design space parameters. The recommended operation ranges for the critical dropping process parameters of Ginkgo biloba dropping pills were as follows: dropping distance of 5.5-6.7 cm, and dropping speed of 59-60 drops per minute, providing a reference for industrial production of Ginkgo biloba dropping pills.
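
    A minimal sketch of the general workflow only (a second-order response surface fitted to designed experiments, followed by a Monte-Carlo, probability-based design space); the parameter values, responses and acceptance limit below are invented and do not reproduce the study's data.

      import numpy as np

      rng = np.random.default_rng(6)

      # Hypothetical Box-Behnken-style data: two critical process parameters
      # (dropping distance in cm, dropping speed in drops/min) and one critical
      # quality attribute (e.g. pill-weight variability).  All values are invented.
      distance = np.array([5.0, 5.0, 7.0, 7.0, 6.0, 6.0, 6.0, 5.0, 7.0])
      speed    = np.array([55., 65., 55., 65., 60., 60., 60., 60., 60.])
      cqa      = np.array([4.2, 4.9, 4.6, 5.8, 3.1, 3.0, 3.2, 3.8, 4.1])

      # Second-order polynomial (response surface) model fitted by least squares.
      def design(d, s):
          return np.column_stack([np.ones_like(d), d, s, d * s, d**2, s**2])

      coef, *_ = np.linalg.lstsq(design(distance, speed), cqa, rcond=None)
      resid_sd = np.std(cqa - design(distance, speed) @ coef, ddof=6)

      # Probability-based design space: Monte-Carlo probability that the CQA stays
      # below an acceptance limit at each candidate operating point.
      limit = 4.0
      d_grid, s_grid = np.meshgrid(np.linspace(5, 7, 21), np.linspace(55, 65, 21))
      pred = design(d_grid.ravel(), s_grid.ravel()) @ coef
      prob = np.mean(pred[:, None] + resid_sd * rng.normal(size=(pred.size, 2000)) < limit,
                     axis=1).reshape(d_grid.shape)
      print("operating points with P(CQA < limit) >= 0.9:", int((prob >= 0.9).sum()))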

  6. Application of Gauss's law space-charge limited emission model in iterative particle tracking method

    Energy Technology Data Exchange (ETDEWEB)

    Altsybeyev, V.V., E-mail: v.altsybeev@spbu.ru; Ponomarev, V.A.

    2016-11-01

    The particle tracking method with a so-called gun iteration for modeling the space charge is discussed in this paper. We suggest applying an emission model based on Gauss's law for the calculation of the space-charge-limited current density distribution with this method. Based on the presented emission model we have developed a numerical algorithm for these calculations. This approach allows us to perform accurate and computationally inexpensive numerical simulations for different vacuum sources with curved emitting surfaces, also in the presence of additional physical effects such as bipolar flows and backscattered electrons. The results of simulations of a cylindrical diode and of a diode with an elliptical emitter, using axisymmetric coordinates, are presented. The high efficiency and accuracy of the suggested approach are confirmed by the obtained results and comparisons with the analytical solutions.
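
    For orientation only, the classical planar Child-Langmuir expression gives the space-charge-limited current density against which such emission models are commonly checked; the voltage and gap below are arbitrary example values, not those of the simulated diodes.

      import numpy as np
      from scipy.constants import epsilon_0, elementary_charge, electron_mass

      def child_langmuir_current_density(voltage, gap):
          """Classical planar space-charge-limited current density in A/m^2."""
          return (4.0 * epsilon_0 / 9.0) * np.sqrt(2.0 * elementary_charge / electron_mass) \
                 * voltage**1.5 / gap**2

      # Illustrative numbers only: a 1 kV planar vacuum diode with a 1 mm gap.
      J = child_langmuir_current_density(voltage=1.0e3, gap=1.0e-3)
      print(f"space-charge limited current density: {J / 1e4:.2f} A/cm^2")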

  7. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    Science.gov (United States)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation on a system level, particularly for early mission phases during which specifications and requirements are not yet crystallised and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step to any program budget is a representative cost estimate, which usually hinges on a particular estimation approach, or methodology. Therefore, appropriate selection of specific cost models, methods and tools is paramount, a difficult task given the highly variable nature, scope as well as scientific and technical requirements applicable to each program. Numerous methods, models and tools exist. However, new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. For highly specific vehicles such as reusable launchers with a manned capability, the lack of historical data implies that using either the classic heuristic approach, such as parametric cost estimation based on underlying CERs, or the analogy approach is, by definition, limited. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. Current approaches which strategically amalgamate various cost estimation strategies both for formulation and validation

  8. Task-space separation principle: a force-field approach to motion planning for redundant manipulators.

    Science.gov (United States)

    Tommasino, Paolo; Campolo, Domenico

    2017-02-03

    In this work, we address human-like motor planning in redundant manipulators. Specifically, we want to capture postural synergies such as Donders' law, experimentally observed in humans during kinematically redundant tasks, and infer a minimal set of parameters to implement similar postural synergies in a kinematic model. For the model itself, although the focus of this paper is to solve redundancy by implementing postural strategies derived from experimental data, we also want to ensure that such postural control strategies do not interfere with other possible forms of motion control (in the task-space), i.e. solving the posture/movement problem. The redundancy problem is framed as a constrained optimization problem, traditionally solved via the method of Lagrange multipliers. The posture/movement problem can be tackled via the separation principle which, derived from experimental evidence, posits that the brain processes static torques (i.e. posture-dependent, such as gravitational torques) separately from dynamic torques (i.e. velocity-dependent). The separation principle has traditionally been applied at a joint torque level. Our main contribution is to apply the separation principle to Lagrange multipliers, which act as task-space force fields, leading to a task-space separation principle. In this way, we can separate postural control (implementing Donders' law) from various types of task-space movement planners. As an example, the proposed framework is applied to the (redundant) task of pointing with the human wrist. Nonlinear inverse optimization (NIO) is used to fit the model parameters and to capture motor strategies displayed by six human subjects during pointing tasks. The novelty of our NIO approach is that (i) the fitted motor strategy, rather than raw data, is used to filter and down-sample human behaviours; (ii) our framework is used to efficiently simulate model behaviour iteratively, until it converges towards the experimental human strategies.
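
    As a generic illustration of the constrained-optimization framing (not the authors' wrist model or NIO fit), the sketch below resolves the redundancy of a planar three-joint linkage by minimizing a quadratic postural cost subject to a one-dimensional task constraint; the link lengths, weights and preferred posture are invented.

      import numpy as np
      from scipy.optimize import minimize

      # Invented planar 3-joint linkage: the task is one-dimensional (fingertip
      # horizontal position), so infinitely many postures satisfy it; a quadratic
      # postural cost picks one, mimicking a Donders-like postural strategy.
      link = np.array([1.0, 0.8, 0.6])
      preferred = np.deg2rad([10.0, 0.0, -5.0])          # preferred posture (assumed)
      W = np.diag([1.0, 2.0, 1.5])                        # postural cost weights (assumed)

      def task(q):
          angles = np.cumsum(q)                           # absolute link angles
          return np.sum(link * np.cos(angles))            # fingertip x-coordinate

      def posture_cost(q):
          d = q - preferred
          return 0.5 * d @ W @ d

      target_x = 1.7
      result = minimize(posture_cost, x0=np.zeros(3), method="SLSQP",
                        constraints={"type": "eq", "fun": lambda q: task(q) - target_x})
      print("selected posture (deg):", np.round(np.rad2deg(result.x), 2),
            " task error:", round(abs(task(result.x) - target_x), 6))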

  9. A BRDF statistical model applying to space target materials modeling

    Science.gov (United States)

    Liu, Chenghao; Li, Zhi; Xu, Can; Tian, Qichen

    2017-10-01

    To address the poor performance of the five-parameter semi-empirical model in fitting densely measured BRDF data, a refined statistical BRDF model suitable for modeling multiple classes of space target materials is proposed. The refined model improves on the Torrance-Sparrow model while retaining the modeling advantages of the five-parameter model. Compared with existing empirical models, the model contains six simple parameters, which can approximate the roughness distribution of the material surface, the intensity of the Fresnel reflectance phenomenon, and the attenuation of the reflected brightness as the azimuth angle changes. The model achieves fast parameter inversion with no extra loss of accuracy. A genetic algorithm was used to invert the parameters of 11 samples of materials commonly used on space targets, and the fitting errors for all materials were below 6%, much lower than those of the five-parameter model. The refined model is further verified by comparing the fitting results for three samples at different incident zenith angles at 0° azimuth angle. Finally, three-dimensional visualizations of these samples over the upper hemisphere are given, in which the differing optical scattering strengths of the materials are clearly shown, demonstrating the refined model's ability to characterize materials.
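
    A hedged sketch of the fitting workflow only: a simplified Torrance-Sparrow-style BRDF with three parameters is fitted to synthetic in-plane measurements using SciPy's differential evolution as a stand-in for the genetic algorithm; the refined six-parameter model of the paper is not reproduced here, and all values are invented.

      import numpy as np
      from scipy.optimize import differential_evolution

      # Simplified Torrance-Sparrow-style BRDF: a Gaussian micro-facet specular lobe
      # plus a Lambertian diffuse term, with only (ks, kd, roughness m).
      def brdf(params, theta_i, theta_r):
          ks, kd, m = params
          half = 0.5 * (theta_i + theta_r)                 # crude in-plane half angle
          specular = ks * np.exp(-(np.tan(half) / m) ** 2) / np.cos(theta_r)
          return specular + kd / np.pi

      rng = np.random.default_rng(7)
      theta_i = np.full(50, np.deg2rad(30.0))              # fixed incidence angle
      theta_r = np.deg2rad(np.linspace(0.0, 70.0, 50))     # scan of reflection angles
      true_params = (0.6, 0.2, 0.25)                        # invented "material"
      measured = brdf(true_params, theta_i, theta_r) * (1 + 0.03 * rng.normal(size=50))

      def fitting_error(params):
          return np.mean((brdf(params, theta_i, theta_r) - measured) ** 2)

      result = differential_evolution(fitting_error,
                                      bounds=[(0, 2), (0, 1), (0.01, 1.0)], seed=0)
      print("recovered parameters:", np.round(result.x, 3))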

  10. Alpine Windharvest: development of information base regarding potentials and the necessary technical, legal and socio-economic conditions for expanding wind energy in the Alpine Space - Alpine Space wind map - Modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Schaffner, B.; Remund, J. [Meteotest, Berne (Switzerland)

    2005-07-01

    This report describes the development work carried out by the Swiss meteorology specialists of the company METEOTEST as part of a project carried out together with the Swiss wind-energy organisation 'Suisse Eole'. The framework for the project is the EU Interreg IIIB Alpine Space Programme, a European Community Initiative Programme funded by the European Regional Development Fund. The project investigated the use of digital relief-analysis. The series of reports describes the development and use of a basic information system to aid the investigation of the technical, legal and socio-economic conditions for the use of wind energy in the alpine area. This report discusses two modelling approaches investigated for use in the definition of a wind map for the alpine area. The method chosen and its application are discussed. The various sources of information for input to the model are listed and discussed.

  11. Community Coordinated Modeling Center: A Powerful Resource in Space Science and Space Weather Education

    Science.gov (United States)

    Chulaki, A.; Kuznetsova, M. M.; Rastaetter, L.; MacNeice, P. J.; Shim, J. S.; Pulkkinen, A. A.; Taktakishvili, A.; Mays, M. L.; Mendoza, A. M. M.; Zheng, Y.; Mullinix, R.; Collado-Vega, Y. M.; Maddox, M. M.; Pembroke, A. D.; Wiegand, C.

    2015-12-01

    Community Coordinated Modeling Center (CCMC) is a NASA affiliated interagency partnership with the primary goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research. Additionally, over the past ten years it has established itself as a global space science education resource supporting undergraduate and graduate education and research, and spreading space weather awareness worldwide. A unique combination of assets, capabilities and close ties to the scientific and educational communities enable this small group to serve as a hub for raising generations of young space scientists and engineers. CCMC resources are publicly available online, providing unprecedented global access to the largest collection of modern space science models (developed by the international research community). CCMC has revolutionized the way simulations are utilized in classrooms settings, student projects, and scientific labs and serves hundreds of educators, students and researchers every year. Another major CCMC asset is an expert space weather prototyping team primarily serving NASA's interplanetary space weather needs. Capitalizing on its unrivaled capabilities and experiences, the team provides in-depth space weather training to students and professionals worldwide, and offers an amazing opportunity for undergraduates to engage in real-time space weather monitoring, analysis, forecasting and research. In-house development of state-of-the-art space weather tools and applications provides exciting opportunities to students majoring in computer science and computer engineering fields to intern with the software engineers at the CCMC while also learning about the space weather from the NASA scientists.

  12. Modelling of air-conditioned and heated spaces

    Energy Technology Data Exchange (ETDEWEB)

    Moehl, U

    1987-01-01

    A space represents a complex system involving numerous components, manipulated variables and disturbances which need to be described if the dynamic behaviour of the space air is to be determined. A justifiable amount of simulation input is obtained by applying adjusted modelling of the individual components. The determination of natural air exchange in heated spaces and of space-air flow in air-conditioned spaces is a primary source of uncertainty.

  13. The Hyper-Envelope Modeling Interface (HEMI): A Novel Approach Illustrated Through Predicting Tamarisk (Tamarix spp.) Habitat in the Western USA

    Science.gov (United States)

    Graham, Jim; Young, Nick; Jarnevich, Catherine S.; Newman, Greg; Evangelista, Paul; Stohlgren, Thomas J.

    2013-01-01

    Habitat suitability maps are commonly created by modeling a species’ environmental niche from occurrences and environmental characteristics. Here, we introduce the hyper-envelope modeling interface (HEMI), providing a new method for creating habitat suitability models using Bezier surfaces to model a species niche in environmental space. HEMI allows modeled surfaces to be visualized and edited in environmental space based on expert knowledge and does not require absence points for model development. The modeled surfaces require relatively few parameters compared to similar modeling approaches and may produce models that better match ecological niche theory. As a case study, we modeled the invasive species tamarisk (Tamarix spp.) in the western USA. We compare results from HEMI with those from existing similar modeling approaches (including BioClim, BioMapper, and Maxent). We used synthetic surfaces to create visualizations of the various models in environmental space and used modified area under the curve (AUC) statistic and akaike information criterion (AIC) as measures of model performance. We show that HEMI produced slightly better AUC values, except for Maxent and better AIC values overall. HEMI created a model with only ten parameters while Maxent produced a model with over 100 and BioClim used only eight. Additionally, HEMI allowed visualization and editing of the model in environmental space to develop alternative potential habitat scenarios. The use of Bezier surfaces can provide simple models that match our expectations of biological niche models and, at least in some cases, out-perform more complex approaches.

  15. Properties of Brownian Image Models in Scale-Space

    DEFF Research Database (Denmark)

    Pedersen, Kim Steenstrup

    2003-01-01

    In this paper it is argued that the Brownian image model is the least committed, scale invariant, statistical image model which describes the second order statistics of natural images. Various properties of three different types of Gaussian image models (white noise, Brownian and fractional Brownian images) will be discussed in relation to linear scale-space theory, and it will be shown empirically that the second order statistics of natural images mapped into jet space may, within some scale interval, be modeled by the Brownian image model. This is consistent with the 1/f² power spectrum law that apparently governs natural images. Furthermore, the distribution of Brownian images mapped into jet space is Gaussian and an analytical expression can be derived for the covariance matrix of Brownian images in jet space. This matrix is also a good approximation of the covariance matrix…

  16. State-space prediction model for chaotic time series

    Science.gov (United States)

    Alparslan, A. K.; Sayar, M.; Atilgan, A. R.

    1998-08-01

    A simple method for predicting the continuation of scalar chaotic time series ahead in time is proposed. The false nearest neighbors technique, in connection with time-delayed embedding, is employed to reconstruct the state space. A local forecasting model based upon the time evolution of the topological neighborhood in the reconstructed phase space is suggested. A moving root-mean-square error is utilized in order to monitor the error along the prediction horizon. The model is tested on the convection amplitude of the Lorenz model. The results indicate that, for approximately 100 cycles of training data, the prediction follows the actual continuation very closely for about six cycles. The proposed model, like other state-space forecasting models, captures the long-term behavior of the system due to the use of spatial neighbors in the state space.
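
    The procedure can be illustrated in a few lines: embed the scalar series in delay coordinates, find the nearest past neighbour of the current state, and take that neighbour's successor as the one-step prediction. The sketch below is in the spirit of the paper rather than its implementation; the Lorenz parameters, embedding dimension, delay and Theiler window are illustrative choices.

        # Minimal delay-embedding, nearest-neighbour forecaster (illustrative sketch).
        import numpy as np
        from scipy.integrate import solve_ivp

        def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = s
            return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

        t = np.arange(0.0, 60.0, 0.01)
        sol = solve_ivp(lorenz, (t[0], t[-1]), [1.0, 1.0, 1.0], t_eval=t)
        series = sol.y[0]                         # scalar observable (convection amplitude)

        m, tau = 3, 10                            # embedding dimension and delay (samples)

        def embed(x, m, tau):
            """Rows are delay vectors [x(i), x(i - tau), ..., x(i - (m - 1) tau)]."""
            idx = np.arange((m - 1) * tau, len(x))
            return np.column_stack([x[idx - k * tau] for k in range(m)])

        def predict_next(history):
            """One-step prediction: the value that followed the nearest past neighbour."""
            vecs = embed(history, m, tau)
            current = vecs[-1]
            past = vecs[:-50]                     # Theiler window: skip temporally close states
            nn = np.argmin(np.linalg.norm(past - current, axis=1))
            return history[(m - 1) * tau + nn + 1]

        history, test = list(series[:5000]), series[5000:5010]
        for true_value in test:
            pred = predict_next(np.asarray(history))
            print(f"predicted {pred:+.3f}   actual {true_value:+.3f}")
            history.append(true_value)            # roll the window forward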

  17. Phase-space densities and effects of resonance decays in a hydrodynamic approach to heavy ion collisions

    International Nuclear Information System (INIS)

    Akkelin, S.V.; Sinyukov, Yu.M.

    2004-01-01

    A method allowing analysis of the overpopulation of phase space in heavy ion collisions in a model-independent way is proposed within the hydrodynamic approach. It makes it possible to extract a chemical potential of thermal pions at freeze-out, irrespective of the form of freeze-out (isothermal) hypersurface in Minkowski space and transverse flows on it. The contributions of resonance (with masses up to 2 GeV) decays to spectra, interferometry volumes, and phase-space densities are calculated and discussed in detail. The estimates of average phase-space densities and chemical potentials of thermal pions are obtained for SPS and RHIC energies. They demonstrate that multibosonic phenomena at those energies might be considered as a correction factor rather than as a significant physical effect. The analysis of the evolution of the pion average phase-space density in chemically frozen hadron systems shows that it is almost constant or slightly increases with time while the particle density and phase-space density at each space point decreases rapidly during the system's expansion. We found that, unlike the particle density, the average phase-space density has no direct link to the freeze-out criterion and final thermodynamic parameters, being connected rather to the initial phase-space density of hadronic matter formed in relativistic nucleus-nucleus collisions

  18. Quantum harmonic Brownian motion in a general environment: A modified phase-space approach

    International Nuclear Information System (INIS)

    Yeh, L.

    1993-01-01

    After extensive investigations over three decades, the linear-coupling model and its equivalents have become the standard microscopic models for quantum harmonic Brownian motion, in which a harmonically bound Brownian particle is coupled to a quantum dissipative heat bath of general type modeled by infinitely many harmonic oscillators. The dynamics of these models have been studied by many authors using the quantum Langevin equation, the path-integral approach, quasi-probability distribution functions (e.g., the Wigner function), etc. However, the quantum Langevin equation is only applicable to some special problems, while other approaches all involve complicated calculations due to the inevitable reduction (i.e., contraction) operation for ignoring/eliminating the degrees of freedom of the heat bath. In this dissertation, the author proposes an improved methodology via a modified phase-space approach which employs the characteristic function (the symplectic Fourier transform of the Wigner function) as the representative of the density operator. This representative is claimed to be the most natural one for performing the reduction, not only because of its simplicity but also because of its manifestation of geometric meaning. Accordingly, it is particularly convenient for studying the time evolution of the Brownian particle with an arbitrary initial state. The power of this characteristic function is illuminated through a detailed study of several physically interesting problems, including the environment-induced damping of quantum interference, the exact quantum Fokker-Planck equations, and the relaxation of non-factorizable initial states. All derivations and calculations are shown to be much simplified in comparison with other approaches. In addition to dynamical problems, a novel derivation of the fluctuation-dissipation theorem which is valid for all quantum linear systems is presented

  19. Analysis of Fault Spacing in Thrust-Belt Wedges Using Numerical Modeling

    Science.gov (United States)

    Regensburger, P. V.; Ito, G.

    2017-12-01

    Numerical modeling is invaluable in studying the mechanical processes governing the evolution of geologic features such as thrust-belt wedges. The mechanisms controlling thrust fault spacing in wedges are not well understood. Our numerical model treats the thrust belt as a visco-elastic-plastic continuum and uses a finite-difference, marker-in-cell method to solve for conservation of mass and momentum. From these conservation laws, stress is calculated and Byerlee's law is used to determine the shear stress required for a fault to form. Each model consists of a layer of crust, initially 3 km thick, carried on top of a basal décollement, which moves at a constant speed towards a rigid backstop. A series of models were run with varied material properties, focusing on the angle of basal friction at the décollement, the angle of friction within the crust, and the cohesion of the crust. We investigate how these properties affect the spacing between thrusts that have the greatest time-integrated history of slip and therefore have the greatest effect on the large-scale undulations in surface topography. The surface positions of these faults, which extend through most of the crustal layer, are identifiable as local maxima in the positive curvature of surface topography. Tracking the temporal evolution of faults, we find that thrust blocks are widest when they first form at the front of the wedge and then tend to contract over time as more crustal material is carried to the wedge. Within each model, thrust blocks form with similar initial widths, but individual thrust blocks develop differently and may approach an asymptotic width over time. The median of thrust block widths across the whole wedge tends to decrease with time. Median fault spacing shows a positive correlation with both wedge cohesion and internal friction. In contrast, median fault spacing exhibits a negative correlation at small angles of basal friction […] laws that can be used to predict fault spacing in […]

  20. A Procedure for Identification of Appropriate State Space and ARIMA Models Based on Time-Series Cross-Validation

    Directory of Open Access Journals (Sweden)

    Patrícia Ramos

    2016-11-01

    In this work, a cross-validation procedure is used to identify an appropriate autoregressive integrated moving average (ARIMA) model and an appropriate state space model for a time series. A minimum size for the training set is specified. The procedure is based on one-step forecasts and uses different training sets, each containing one more observation than the previous one. All possible state space models and all ARIMA models whose orders are allowed to range reasonably are fitted, considering raw data and log-transformed data with regular differencing (up to second order differences) and, if the time series is seasonal, seasonal differencing (up to first order differences). The root mean squared error for each model is calculated over the one-step forecasts obtained. The model which has the lowest root mean squared error and passes the Ljung–Box test using all of the available data at a reasonable significance level is selected from among all the ARIMA and state space models considered. The procedure is exemplified in this paper with a case study of retail sales of different categories of women’s footwear from a Portuguese retailer, and its accuracy is compared with three reliable forecasting approaches. The results show that our procedure consistently forecasts more accurately than the other approaches and that the improvements in accuracy are significant.
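
    The evaluation protocol itself (an expanding training window, one-step-ahead forecasts, RMSE per candidate model) is easy to sketch. The snippet below only illustrates that protocol with two simple stand-in forecasters, a naive last-value forecast and a least-squares AR(1); it does not reproduce the paper's search over ARIMA and state space candidates or the Ljung–Box screening, and the synthetic series is an assumption.

        # Rolling-origin (expanding window) one-step-ahead cross-validation sketch.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 200
        y = 10 + 0.05 * np.arange(n) + rng.normal(0.0, 1.0, n)    # synthetic series

        def forecast_naive(train):
            return train[-1]

        def forecast_ar1(train):
            x, z = train[:-1], train[1:]
            A = np.column_stack([np.ones_like(x), x])
            c, phi = np.linalg.lstsq(A, z, rcond=None)[0]          # z_t = c + phi * z_{t-1}
            return c + phi * train[-1]

        models = {"naive": forecast_naive, "ar1": forecast_ar1}
        min_train = 50                                             # minimum training-set size

        errors = {name: [] for name in models}
        for t in range(min_train, n - 1):
            train = y[:t + 1]                                      # expanding training window
            for name, fc in models.items():
                errors[name].append(y[t + 1] - fc(train))          # one-step forecast error

        for name, e in errors.items():
            print(f"{name}: one-step RMSE = {np.sqrt(np.mean(np.square(e))):.3f}")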

  1. Space Culture: Innovative Cultural Approaches To Public Engagement With Astronomy, Space Science And Astronautics

    Science.gov (United States)

    Malina, Roger F.

    2012-01-01

    In recent years a number of cultural organizations have established ongoing programs of public engagement with astronomy, space science and astronautics. Many involve elements of citizen science initiatives, artists’ residencies in scientific laboratories and agencies, art and science festivals, and social network projects as well as more traditional exhibition venues. Recognizing these programs, several agencies and organizations have established mechanisms for facilitating public engagement with astronomy and space science through cultural activities. The International Astronautics Federation has established a Technical Activities Committee for the Cultural Utilization of Space. Over the past year the NSF and NEA have organized disciplinary workshops to develop recommendations relating to art-science interaction and community-building efforts. Rationales for encouraging public engagement via cultural projects range from theories of creativity, innovation and invention to cultural appropriation in the context of ‘socially robust science’ as advocated by Helga Nowotny of the European Research Council. Public engagement with science, as opposed to science education and outreach initiatives, requires different approaches. Just as organizations have employed education professionals to lead education activities, so they must employ cultural professionals if they wish to develop public engagement projects via arts and culture. One outcome of the NSF and NEA workshops has been the development of a rationale for converting STEM to STEAM by including the arts in STEM methodologies, particularly for K-12, where students can access science via arts and cultural contexts. Often these require new kinds of informal education approaches that exploit locative media, gaming platforms, artists’ projects and citizen science. Incorporating astronomy and space science content in art and cultural projects requires new skills in ‘cultural translation’ and ‘trans-mediation’ and new kinds…

  2. UCLA space-time area law model: A persuasive foundation for hadronization

    International Nuclear Information System (INIS)

    Abachi, S.; Buchanan, C.; Chien, A.; Chun, S.; Hartfiel, B.

    2007-01-01

    From the studies of rates and distributions of heavy quark (c, b) mesons we have developed additional evidence that hadron formation, at least in the simplest environment of e+e− collisions, is dominantly controlled by a space-time area law ("STAL"), an approach suggested by both non-perturbative QCD and relativistic string models. From the dynamics of heavy quarks whose classical space-time world-lines deviate significantly from the light-cone, we report the exact calculation of the relevant space-time area and the derivation of a Lorentz invariant variable, z_eff, which reduces to the light-cone momentum fraction z for low mass quarks. Using z_eff in the exponent of our fragmentation function in place of z, we find persuasive agreement with L=0,1 charmed and bottom meson data as well as for u, d, s L=0 states. Presuming STAL to be a valid first-order description for all these meson data, we find the scale of other possible second-order effects to be limited to ∼20% or less of the observed rates. The model favors a b-quark mass of ∼4.5 GeV. (orig.)

  3. Bottom-up modeling approach for the quantitative estimation of parameters in pathogen-host interactions.

    Science.gov (United States)

    Lehnert, Teresa; Timme, Sandra; Pollmächer, Johannes; Hünniger, Kerstin; Kurzai, Oliver; Figge, Marc Thilo

    2015-01-01

    Opportunistic fungal pathogens can cause bloodstream infection and severe sepsis upon entering the blood stream of the host. The early immune response in human blood comprises the elimination of pathogens by antimicrobial peptides and innate immune cells, such as neutrophils or monocytes. Mathematical modeling is a predictive method to examine these complex processes and to quantify the dynamics of pathogen-host interactions. Since model parameters are often not directly accessible from experiment, their estimation is required by calibrating model predictions with experimental data. Depending on the complexity of the mathematical model, parameter estimation can be associated with excessively high computational costs in terms of run time and memory. We apply a strategy for reliable parameter estimation where different modeling approaches with increasing complexity are used that build on one another. This bottom-up modeling approach is applied to an experimental human whole-blood infection assay for Candida albicans. Aiming for the quantification of the relative impact of different routes of the immune response against this human-pathogenic fungus, we start from a non-spatial state-based model (SBM), because this level of model complexity allows estimating a priori unknown transition rates between various system states by the global optimization method simulated annealing. Building on the non-spatial SBM, an agent-based model (ABM) is implemented that incorporates the migration of interacting cells in three-dimensional space. The ABM takes advantage of estimated parameters from the non-spatial SBM, leading to a decreased dimensionality of the parameter space. This space can be scanned using a local optimization approach, i.e., least-squares error estimation based on an adaptive regular grid search, to predict cell migration parameters that are not accessible in experiment. In the future, spatio-temporal simulations of whole-blood samples may enable timely
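
    As a much-reduced sketch of the first, non-spatial step, the snippet below calibrates two a priori unknown transition rates of a toy two-route killing model against synthetic data using simulated annealing. The model structure, rate bounds and data are illustrative assumptions rather than the whole-blood assay model, and scipy's dual_annealing stands in for the global optimization scheme mentioned above.

        # Toy state-based model: pathogens are killed either by immune cells (rate k_n)
        # or extracellularly (rate k_e); both rates are estimated by global optimization.
        import numpy as np
        from scipy.optimize import dual_annealing

        t_data = np.array([0.5, 1.0, 2.0, 4.0])                    # hours
        killed_n_obs = np.array([0.21, 0.35, 0.52, 0.63])          # synthetic "data"
        killed_e_obs = np.array([0.10, 0.18, 0.26, 0.32])

        def model(theta, t):
            """Analytic solution of dP/dt = -(k_n + k_e) P with P(0) = 1."""
            k_n, k_e = theta
            k = k_n + k_e
            decay = 1.0 - np.exp(-k * t)
            return (k_n / k) * decay, (k_e / k) * decay            # killed fractions per route

        def sse(theta):
            kn_pred, ke_pred = model(theta, t_data)
            return np.sum((kn_pred - killed_n_obs) ** 2 + (ke_pred - killed_e_obs) ** 2)

        bounds = [(1e-3, 5.0), (1e-3, 5.0)]                        # per-hour rate bounds
        result = dual_annealing(sse, bounds, seed=1)
        print("estimated rates (1/h):", np.round(result.x, 3))     # roughly (0.5, 0.25)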

  4. Application of a Systems Engineering Approach to Support Space Reactor Development

    International Nuclear Information System (INIS)

    Wold, Scott

    2005-01-01

    In 1992, approximately 25 Russian and 12 U.S. engineers and technicians were involved in the transport, assembly, inspection, and testing of over 90 tons of Russian equipment associated with the Thermionic System Evaluation Test (TSET) Facility. The entire Russian Baikal Test Stand, consisting of a 5.79 m tall vacuum chamber and related support equipment, was reassembled and tested at the TSET facility in less than four months. In November 1992, the first non-nuclear operational test of a complete thermionic power reactor system in the U.S. was accomplished three months ahead of schedule and under budget. A major factor in this accomplishment was the application of a disciplined top-down systems engineering approach and application of a spiral development model to achieve the desired objectives of the TOPAZ International Program (TIP). Systems Engineering is a structured discipline that helps programs and projects conceive, develop, integrate, test and deliver products and services that meet customer requirements within cost and schedule. This paper discusses the impact of Systems Engineering and a spiral development model on the success of the TOPAZ International Program and how the application of a similar approach could help ensure the success of future space reactor development projects

  5. Foundation plate on the elastic half-space, deterministic and probabilistic approach

    Directory of Open Access Journals (Sweden)

    Tvrdá Katarína

    2017-01-01

    Interaction between a foundation plate and the subgrade can be described by different mathematical-physical models. The elastic foundation can be modelled by different types of models, e.g. a one-parameter model, a two-parameter model, or a comprehensive model; here the Boussinesq elastic half-space has been used. The article deals with deterministic and probabilistic analysis of the deflection of a foundation plate on the elastic half-space. Contact between the foundation plate and the subsoil was modelled using node-to-node contact elements. At the end, the obtained results are presented.
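
    For orientation, the comprehensive model mentioned above reduces, for a single vertical point load on the surface, to the classical Boussinesq settlement formula w(r) = P(1 - nu^2)/(pi E r), which couples deflections across the whole surface; a one-parameter (Winkler) foundation has no such coupling. The snippet below evaluates both with illustrative soil parameters and is not the plate-subsoil finite element model of the paper.

        # Boussinesq half-space settlement under a point load vs. a Winkler foundation.
        import math

        E = 30.0e6        # subsoil Young's modulus, Pa (illustrative)
        nu = 0.35         # Poisson's ratio
        P = 100.0e3       # vertical point load, N

        def boussinesq_settlement(r):
            """Surface deflection (m) at radial distance r (m) from the point load."""
            return P * (1.0 - nu**2) / (math.pi * E * r)

        for r in (0.5, 1.0, 2.0, 5.0):
            print(f"r = {r:4.1f} m   w = {1000 * boussinesq_settlement(r):6.3f} mm")

        # One-parameter (Winkler) comparison: deflection depends only on the local pressure
        # p and the subgrade modulus k; the surface coupling above is exactly what it lacks.
        k = 50.0e6        # modulus of subgrade reaction, N/m^3 (illustrative)
        A = 1.0           # loaded footprint area, m^2
        print(f"Winkler w = {1000 * (P / A) / k:6.3f} mm")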

  6. Space-time uncertainty and approaches to D-brane field theory

    International Nuclear Information System (INIS)

    Yoneya, Tamiaki

    2008-01-01

    In connection with the space-time uncertainty principle which gives a simple qualitative characterization of non-local or non-commutative nature of short-distance space-time structure in string theory, the author's recent approaches toward field theories for D-branes are briefly outlined, putting emphasis on some key ideas lying in the background. The final section of the present report is devoted partially to a tribute to Yukawa on the occasion of the centennial of his birth. (author)

  7. Topology-based description of the NCA cathode configurational space and an approach of its effective reduction

    OpenAIRE

    Zolotarev Pavel; Eremin Roman

    2018-01-01

    Modification of existing solid electrolyte and cathode materials is a topic of interest for theoreticians and experimentalists. In particular, it requires elucidation of the influence of dopants on the characteristics of the materials under study. Owing to the high complexity of the configurational space of doped/deintercalated systems, application of computer modeling approaches is hindered, despite significant advances in computational facilities over the last decades. In this study, we propose a...

  8. THE PRINCIPLES AND METHODS OF INFORMATION AND EDUCATIONAL SPACE SEMANTIC STRUCTURING BASED ON ONTOLOGIC APPROACH REALIZATION

    Directory of Open Access Journals (Sweden)

    Yurij F. Telnov

    2014-01-01

    This article reveals the principles of semantic structuring of the information and educational space of objects of knowledge and scientific and educational services, using methods of ontological engineering. The novelty of the offered approach is the interfacing of the content ontology with the ontology of scientific and educational services, which allows effective composition of services and objects of knowledge according to models of professional competences and the requirements of those being trained. As a result of applying these methods of semantic structuring of the information and educational space, integrated use of the diverse distributed scientific and educational content by educational institutions for carrying out scientific research, methodical development and training is provided.

  9. An integrated mission approach to the space exploration initiative will ensure success

    International Nuclear Information System (INIS)

    Coomes, E.P.; Dagle, J.E.; Bamberger, J.A.; Noffsinger, K.E.

    1991-01-01

    The direction of the American space program, as defined by President Bush and the National Commission on Space, is to expand human presence into the solar system. Landing an American on Mars by the 50th anniversary of the Apollo 11 lunar landing is the goal. This challenge has produced a level of excitement among young Americans not seen for nearly three decades. The exploration and settlement of the space frontier will occupy the creative thoughts and energies of generations of Americans well into the next century. The return of Americans to the moon and beyond must be viewed as a national effort with strong public support if it is to become a reality. Key to making this an actuality is the mission approach selected. Developing a permanent presence in space requires a continual stepping outward from Earth in a logical progressive manner. If we seriously plan to go and to stay, then not only must we plan what we are to do and how we are to do it, we must address the logistic support infrastructure that will allow us to stay there once we arrive. A fully integrated approach to mission planning is needed if the Space Exploration Initiative (SEI) is to be successful. Only in this way can a permanent human presence in space be sustained. An integrated infrastructure approach would reduce the number of new systems and technologies requiring development. The resultant horizontal commonality of systems and hardware would reduce the direct economic impact of SEI while an early return on investment through technology spin-offs would be an economic benefit by greatly enhancing our international technical competitiveness. If the exploration, development, and colonization of space is to be affordable and acceptable, careful consideration must be given to such things as "return on investment" and "commercial product potential" of the technologies developed

  10. A composite model of the space-time and 'colors'

    International Nuclear Information System (INIS)

    Terazawa, Hidezumi.

    1987-03-01

    A pregeometric and pregauge model of the space-time and "colors" in which the space-time metric and "color" gauge fields are both composite is presented. By the non-triviality of the model, the number of space-time dimensions is restricted to be not larger than the number of "colors". The long conjectured space-color correspondence is realized in the model action of the Nambu-Goto type which is invariant under both general-coordinate and local-gauge transformations. (author)

  11. Approach to an Affordable and Sustainable Space Transportation System

    Science.gov (United States)

    McCleskey, Casey M.; Rhodes, R. E.; Robinson, J. W.; Henderson, E. M.

    2012-01-01

    This paper describes an approach and a general procedure for creating space transportation architectural concepts that are at once affordable and sustainable. Previous papers by the authors and other members of the Space Propulsion Synergy Team (SPST) focused on a functional system breakdown structure for an architecture and definition of high-payoff design techniques with a technology integration strategy. This paper follows up by using a structured process that derives architectural solutions focused on achieving life cycle affordability and sustainability. Further, the paper includes an example concept that integrates key design techniques discussed in previous papers.

  12. State space approach to mixed boundary value problems.

    Science.gov (United States)

    Chen, C. F.; Chen, M. M.

    1973-01-01

    A state-space procedure for the formulation and solution of mixed boundary value problems is established. This procedure is a natural extension of the method used in initial value problems; however, certain special theorems and rules must be developed. The scope of the applications of the approach includes beam, arch, and axisymmetric shell problems in structural analysis, boundary layer problems in fluid mechanics, and eigenvalue problems for deformable bodies. Many classical methods in these fields developed by Holzer, Prohl, Myklestad, Thomson, Love-Meissner, and others can be either simplified or unified under new light shed by the state-variable approach. A beam problem is included as an illustration.
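
    As a concrete miniature of the state-variable idea, the sketch below propagates the beam state vector [w, theta, M, V] along a uniform Euler-Bernoulli cantilever with a field transfer matrix and imposes the mixed end conditions, in the spirit of the Holzer/Myklestad-type methods cited above. The section properties, tip load and sign convention are assumptions of this illustration, not values from the paper.

        # Transfer-matrix (state-space) solution of a tip-loaded cantilever.
        # Assumed sign convention: theta = w', M = EI * w'', V = EI * w'''.
        import numpy as np

        E, I, L = 210e9, 8.0e-6, 2.0     # Young's modulus (Pa), second moment (m^4), span (m)
        P = 10.0e3                       # transverse tip force, N
        EI = E * I

        def field_matrix(x):
            """Transfer matrix of an unloaded uniform segment of length x."""
            return np.array([[1.0, x,   x**2 / (2 * EI), x**3 / (6 * EI)],
                             [0.0, 1.0, x / EI,          x**2 / (2 * EI)],
                             [0.0, 0.0, 1.0,             x],
                             [0.0, 0.0, 0.0,             1.0]])

        T = field_matrix(L)

        # Cantilever: w0 = theta0 = 0 at the fixed end with M0, V0 unknown; at the free end
        # M(L) = 0 and V(L) = P. Rows 2 and 3 of T give two equations for (M0, V0).
        A = T[2:, 2:]
        M0, V0 = np.linalg.solve(A, np.array([0.0, P]))

        zL = T @ np.array([0.0, 0.0, M0, V0])
        # The sign of zL[0] follows the assumed convention; its magnitude is the classic result.
        print(f"tip deflection magnitude {abs(zL[0]) * 1000:.3f} mm, "
              f"P*L^3/(3*EI) = {P * L**3 / (3 * EI) * 1000:.3f} mm")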

  13. Approach to transaction management for Space Station Freedom

    Science.gov (United States)

    Easton, C. R.; Cressy, Phil; Ohnesorge, T. E.; Hector, Garland

    1990-01-01

    The Space Station Freedom Manned Base (SSFMB) will support the operation of the many payloads that may be located within the pressurized modules or on external attachment points. The transaction management (TM) approach presented provides a set of overlapping features that will assure the effective and safe operation of the SSFMB and provide a schedule that makes potentially hazardous operations safe, allocates resources within the capability of the resource providers, and maintains an environment conducive to the operations planned. This approach provides for targets of opportunity and schedule adjustments that give the operators the flexibility to conduct a vast majority of their operations with no conscious involvement with the TM function.

  14. A Gaussian graphical model approach to climate networks

    International Nuclear Information System (INIS)

    Zerenner, Tanja; Friederichs, Petra; Hense, Andreas; Lehnertz, Klaus

    2014-01-01

    Distinguishing between direct and indirect connections is essential when interpreting network structures in terms of dynamical interactions and stability. When constructing networks from climate data the nodes are usually defined on a spatial grid. The edges are usually derived from a bivariate dependency measure, such as Pearson correlation coefficients or mutual information. Thus, the edges indistinguishably represent direct and indirect dependencies. Interpreting climate data fields as realizations of Gaussian Random Fields (GRFs), we have constructed networks according to the Gaussian Graphical Model (GGM) approach. In contrast to the widely used method, the edges of GGM networks are based on partial correlations denoting direct dependencies. Furthermore, GRFs can be represented not only on points in space, but also by expansion coefficients of orthogonal basis functions, such as spherical harmonics. This leads to a modified definition of network nodes and edges in spectral space, which is motivated from an atmospheric dynamics perspective. We construct and analyze networks from climate data in grid point space as well as in spectral space, and derive the edges from both Pearson and partial correlations. Network characteristics, such as mean degree, average shortest path length, and clustering coefficient, reveal that the networks possess an ordered and strongly locally interconnected structure rather than small-world properties. Despite this, the network structures differ strongly depending on the construction method. Straightforward approaches to infer networks from climate data while not regarding any physical processes may contain simplifications too strong to describe the dynamics of the climate system appropriately
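
    The contrast the paper draws between marginal and partial correlations is easy to demonstrate. The sketch below builds a small synthetic chain of dependent series, inverts the sample covariance to obtain the precision matrix, converts it to partial correlations and keeps the strong entries as direct-dependency edges; the threshold and data are illustrative, and the spectral-space construction of the paper is not reproduced.

        # GGM-style edges from partial correlations on a synthetic chain x0 -> x1 -> ...
        import numpy as np

        rng = np.random.default_rng(42)
        n_time, n_nodes = 500, 6
        X = np.zeros((n_time, n_nodes))
        X[:, 0] = rng.normal(size=n_time)
        for j in range(1, n_nodes):
            X[:, j] = 0.8 * X[:, j - 1] + rng.normal(scale=0.6, size=n_time)

        S = np.cov(X, rowvar=False)               # sample covariance
        Theta = np.linalg.inv(S)                  # precision matrix

        d = np.sqrt(np.diag(Theta))
        partial = -Theta / np.outer(d, d)         # rho_ij = -Theta_ij / sqrt(Theta_ii * Theta_jj)
        np.fill_diagonal(partial, 1.0)

        threshold = 0.2                           # illustrative edge threshold
        edges = [(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)
                 if abs(partial[i, j]) > threshold]
        print("direct-dependency edges:", edges)
        print("marginal corr(0,2) =", round(np.corrcoef(X[:, 0], X[:, 2])[0, 1], 2),
              "  partial corr(0,2) =", round(partial[0, 2], 2))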

  16. Formulating state space models in R with focus on longitudinal regression models

    DEFF Research Database (Denmark)

    Dethlefsen, Claus; Lundbye-Christensen, Søren

    2006-01-01

    We provide a language for formulating a range of state space models with response densities within the exponential family. The described methodology is implemented in the R-package sspir. A state space model is specified similarly to a generalized linear model in R, and then the time-varying terms...

  17. Comparing Laser Interferometry and Atom Interferometry Approaches to Space-Based Gravitational-Wave Measurement

    Science.gov (United States)

    Baker, John; Thorpe, Ira

    2012-01-01

    Thoroughly studied classic space-based gravitational-wave mission concepts such as the Laser Interferometer Space Antenna (LISA) are based on laser-interferometry techniques. Ongoing developments in atom-interferometry techniques have spurred recently proposed alternative mission concepts. These different approaches can be understood on a common footing. We present a comparative analysis of how each type of instrument responds to some of the noise sources which may limit gravitational-wave mission concepts. Sensitivity to laser frequency instability is essentially the same for either approach. Spacecraft acceleration reference stability sensitivities are different, allowing smaller spacecraft separations in the atom interferometry approach, but acceleration noise requirements are nonetheless similar. Each approach has distinct additional measurement noise issues.

  18. State-Space Modelling in Marine Science

    DEFF Research Database (Denmark)

    Albertsen, Christoffer Moesgaard

    State-space models provide a natural framework for analysing time series that cannot be observed without error. This is the case for fisheries stock assessments and movement data from marine animals. In fisheries stock assessments, the aim is to estimate the stock size; however, the only data available is the number of fish removed from the population and samples on a small fraction of the population. In marine animal movement, accurate position systems such as GPS cannot be used. Instead, inaccurate alternatives must be used, yielding observations with large errors. Both assessment and individual animal movement models are important for management and conservation of marine animals. Consequently, models should be developed to be operational in a management context while adequately evaluating uncertainties in the models. This thesis develops state-space models using the Laplace approximation...

  19. Advantage of Animal Models with Metabolic Flexibility for Space Research Beyond Low Earth Orbit

    Science.gov (United States)

    Griko, Yuri V.; Rask, Jon C.; Raychev, Raycho

    2017-01-01

    As the world's space agencies and commercial entities continue to expand beyond Low Earth Orbit (LEO), novel approaches to carry out biomedical experiments with animals are required to address the challenge of adaptation to space flight and new planetary environments. The extended time and distance of space travel, along with reduced involvement of Earth-based mission support, increases the cumulative impact of the risks encountered in space. To respond to these challenges, it becomes increasingly important to develop the capability to manage an organism's self-regulatory control system, which would enable survival in extraterrestrial environments. To significantly reduce the risk to animals on future long duration space missions, we propose the use of metabolically flexible animal models as pathfinders, which are capable of tolerating the environmental extremes exhibited in spaceflight, including altered gravity, exposure to space radiation, chemically reactive planetary environments and temperature extremes. In this report we survey several of the pivotal metabolic flexibility studies and discuss the importance of utilizing animal models with metabolic flexibility, with particular attention given to the ability to suppress the organism's metabolism in spaceflight experiments beyond LEO. The presented analysis demonstrates the adjuvant benefits of these factors to minimize damage caused by exposure to spaceflight and extreme planetary environments. Examples of microorganisms and animal models with dormancy capabilities suitable for space research are considered in the context of their survivability under hostile or deadly environments outside of Earth. Potential steps toward implementation of metabolic control technology in spaceflight architecture and its benefits for animal experiments and manned space exploration missions are discussed.

  20. State-space models for bio-loggers: A methodological road map

    DEFF Research Database (Denmark)

    Jonsen, I.D.; Basson, M.; Bestley, S.

    2012-01-01

    Ecologists have an unprecedented array of bio-logging technologies available to conduct in situ studies of horizontal and vertical movement patterns of marine animals. These tracking data provide key information about foraging, migratory, and other behaviours that can be linked with bio-physical...... development of state-space modelling approaches for animal movement data provides statistical rigor for inferring hidden behavioural states, relating these states to bio-physical data, and ultimately for predicting the potential impacts of climate change. Despite the widespread utility, and current popularity...

  1. On The Development of Biophysical Models for Space Radiation Risk Assessment

    Science.gov (United States)

    Cucinotta, F. A.; Dicello, J. F.

    1999-01-01

    Experimental techniques in molecular biology are being applied to study biological risks from space radiation. The use of molecular assays presents a challenge to biophysical models which in the past have relied on descriptions of energy deposition and phenomenological treatments of repair. We describe a biochemical kinetics model of cell cycle control and DNA damage response proteins in order to model cellular responses to radiation exposures. Using models of cyclin-cdk, pRB, E2Fs, p53, and G1 inhibitors, we show that simulations of cell cycle populations and G1 arrest can be described by our biochemical approach. We consider radiation damaged DNA as a substrate for signal transduction processes and consider a dose and dose-rate reduction effectiveness factor (DDREF) for protein expression.

  2. Modal Analysis and Model Correlation of the Mir Space Station

    Science.gov (United States)

    Kim, Hyoung M.; Kaouk, Mohamed

    2000-01-01

    This paper will discuss on-orbit dynamic tests, modal analysis, and model refinement studies performed as part of the Mir Structural Dynamics Experiment (MiSDE). Mir is the Russian permanently manned Space Station whose construction first started in 1986. The MiSDE was sponsored by the NASA International Space Station (ISS) Phase 1 Office and was part of the Shuttle-Mir Risk Mitigation Experiment (RME). One of the main objectives for MiSDE is to demonstrate the feasibility of performing on-orbit modal testing on large space structures to extract modal parameters that will be used to correlate mathematical models. The experiment was performed over a one-year span on the Mir-alone and Mir with a Shuttle docked. A total of 45 test sessions were performed including: Shuttle and Mir thruster firings, Shuttle-Mir and Progress-Mir dockings, crew exercise and pushoffs, and ambient noise during night-to-day and day-to-night orbital transitions. Test data were recorded with a variety of existing and new instrumentation systems that included: the MiSDE Mir Auxiliary Sensor Unit (MASU), the Space Acceleration Measurement System (SAMS), the Russian Mir Structural Dynamic Measurement System (SDMS), the Mir and Shuttle Inertial Measurement Units (IMUs), and the Shuttle payload bay video cameras. Modal analysis was performed on the collected test data to extract modal parameters, i.e. frequencies, damping factors, and mode shapes. A special time-domain modal identification procedure was used on free-decay structural responses. The results from this study show that modal testing and analysis of large space structures is feasible within operational constraints. Model refinements were performed on both the Mir alone and the Shuttle-Mir mated configurations. The design sensitivity approach was used for refinement, which adjusts structural properties in order to match analytical and test modal parameters. To verify the refinement results, the analytical responses calculated using

  3. Space Vehicle Reliability Modeling in DIORAMA

    Energy Technology Data Exchange (ETDEWEB)

    Tornga, Shawn Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-12

    When modeling system performance of space-based detection systems it is important to consider spacecraft reliability. As space vehicles age the components become prone to failure for a variety of reasons such as radiation damage. Additionally, some vehicles may lose the ability to maneuver once they exhaust fuel supplies. Typically failure is divided into two categories: engineering mistakes and technology surprise. This document will report on a method of simulating space vehicle reliability in the DIORAMA framework.
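
    The simulation itself is not described in the abstract; as a generic sketch of the kind of calculation involved (and not the DIORAMA implementation), the snippet below draws component wear-out times from Weibull distributions, imposes a hard fuel-exhaustion limit on manoeuvring, and estimates the probability that a vehicle remains fully functional at a given mission age. All distributions and parameters are illustrative placeholders.

        # Monte Carlo sketch of space vehicle reliability (illustrative parameters only).
        import numpy as np

        rng = np.random.default_rng(7)
        n_vehicles = 100_000

        # Weibull wear-out for two critical component classes (scale in years; shape > 1
        # gives an increasing hazard, e.g. accumulating radiation damage).
        t_electronics = 12.0 * rng.weibull(2.0, n_vehicles)
        t_attitude = 15.0 * rng.weibull(1.5, n_vehicles)
        fuel_exhaustion = 11.0            # years until manoeuvre capability is lost

        vehicle_life = np.minimum(np.minimum(t_electronics, t_attitude), fuel_exhaustion)

        for t in (2.0, 5.0, 8.0, 10.0):
            print(f"P(fully functional beyond {t:4.1f} y) = {np.mean(vehicle_life > t):.3f}")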

  4. Disease severity, not operative approach, drives organ space infection after pediatric appendectomy.

    Science.gov (United States)

    Kelly, Kristin N; Fleming, Fergal J; Aquina, Christopher T; Probst, Christian P; Noyes, Katia; Pegoli, Walter; Monson, John R T

    2014-09-01

    This study examines patient and operative factors associated with organ space infection (OSI) in children after appendectomy, specifically focusing on the role of operative approach. Although controversy exists regarding the risk of increased postoperative intra-abdominal infections after laparoscopic appendectomy, this approach has been largely adopted in the treatment of pediatric acute appendicitis. Children aged 2 to 18 years undergoing open or laparoscopic appendectomy for acute appendicitis were selected from the 2012 American College of Surgeons Pediatric National Surgical Quality Improvement Program database. Univariate analysis compared patient and operative characteristics with 30-day OSI and incisional complication rates. Factors with a P value of less than 0.1 and clinical importance were included in the multivariable logistic regression models. A P value less than 0.05 was considered significant. For 5097 children undergoing appendectomy, 4514 surgical procedures (88.6%) were performed laparoscopically. OSI occurred in 155 children (3%), with half of these infections developing postdischarge. Significant predictors for OSI included complicated appendicitis, preoperative sepsis, wound class III/IV, and longer operative time. Although 5.2% of patients undergoing open surgery developed OSI (odds ratio = 1.82; 95% confidence interval, 1.21-2.76; P = 0.004), operative approach was not associated with increased relative odds of OSI (odds ratio = 0.99; confidence interval, 0.64-1.55; P = 0.970) after adjustment for other risk factors. Overall, the model had excellent predictive ability (c-statistic = 0.837). This model suggests that disease severity, not operative approach, as previously suggested, drives OSI development in children. Although 88% of appendectomies in this population were performed laparoscopically, these findings support utilization of the surgeon's preferred surgical technique and may help guide postoperative counsel in high-risk children.

  5. Thermal control of high energy nuclear waste, space option. [mathematical models

    Science.gov (United States)

    Peoples, J. A.

    1979-01-01

    Problems related to the temperature and packaging of nuclear waste material for disposal in space are explored. An approach is suggested for solving both problems with emphasis on high energy density waste material. A passive cooling concept is presented which utilized conduction rods that penetrate the inner core. Data are presented to illustrate the effectiveness of the rods and the limit of their capability. A computerized thermal model is discussed and developed for the cooling concept.

  6. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), the NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunities for hands-on demonstrations. Brief descriptions of the tools are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation by using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling of the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for the automated count; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  7. Requirements and approach for a space tourism launch system

    Science.gov (United States)

    Penn, Jay P.; Lindley, Charles A.

    2003-01-01

    Market surveys suggest that a viable space tourism industry will require flight rates about two orders of magnitude higher than those required for conventional spacelift. Although enabling round-trip cost goals for a viable space tourism business are about $240/pound ($529/kg), or $72,000/passenger round-trip, goals should be about $50/pound ($110/kg), or approximately $15,000 for a typical passenger and baggage. The lower price will probably open space tourism to the general population. Vehicle reliabilities must approach those of commercial aircraft as closely as possible. This paper addresses the development of spaceplanes optimized for the ultra-high flight rate and high reliability demands of the space tourism mission. It addresses the fundamental operability, reliability, and cost drivers needed to satisfy this mission need. Figures of merit similar to those used to evaluate the economic viability of conventional commercial aircraft are developed, including items such as payload/vehicle dry weight, turnaround time, propellant cost per passenger, and insurance and depreciation costs, which show that infrastructure can be developed for a viable space tourism industry. A reference spaceplane design optimized for space tourism is described. Subsystem allocations for reliability, operability, and costs are made and a route to developing such a capability is discussed. The vehicle's ability to satisfy the traditional spacelift market is also shown.

  8. Indoor Semantic Modelling for Routing: The Two-Level Routing Approach for Indoor Navigation

    Directory of Open Access Journals (Sweden)

    Liu Liu

    2017-11-01

    Humans perform many activities indoors and they show a growing need for indoor navigation, especially in unfamiliar buildings such as airports, museums and hospitals. The complexity of such buildings poses many challenges for building managers and visitors. Indoor navigation services play an important role in supporting these indoor activities. Indoor navigation covers extensive topics such as: (1) indoor positioning and localization; (2) indoor space representation for navigation model generation; (3) indoor routing computation; (4) human wayfinding behaviours; and (5) indoor guidance (e.g., textual directories). So far, a large number of studies of pedestrian indoor navigation have presented diverse navigation models and routing algorithms/methods. However, the major challenge is rarely addressed: how to represent the complex indoor environment for pedestrians and conduct routing according to the different roles and sizes of users. Such complex buildings contain irregular shapes, large open spaces, complicated obstacles and different types of passages. A navigation model can be very complicated if the indoors are accurately represented. Although most research demonstrates feasible indoor navigation models and related routing methods in regular buildings, the focus is still on a general navigation model for pedestrians, who are simplified as circles. In fact, pedestrians have different sizes, motion abilities and preferences (e.g., described in user profiles), which should be reflected in navigation models and be considered for indoor routing (e.g., relevant Spaces of Interest and Points of Interest). In order to address this challenge, this thesis proposes an innovative indoor modelling and routing approach, two-level routing. It specifically targets the case of routing in complex buildings for distinct users. The conceptual (first) level uses general free indoor spaces: this is represented by the logical network whose nodes represent the spaces and edges…
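
    The conceptual (first) level described above amounts to shortest-path routing on a logical network of spaces, with edges filtered by the user profile before routing; the geometric refinement inside each space (the second level) is not shown. The sketch below illustrates this with networkx; all space names, costs and profile rules are hypothetical.

        # First-level (logical network) routing with per-profile edge filtering.
        import networkx as nx

        G = nx.Graph()
        # (space_a, space_b, walking_cost, passage_type)
        passages = [("entrance_hall", "corridor_A", 20, "open"),
                    ("corridor_A", "room_101", 5, "door"),
                    ("corridor_A", "stairs_1", 8, "stairs"),
                    ("corridor_A", "elevator_1", 10, "elevator"),
                    ("stairs_1", "corridor_B", 12, "stairs"),
                    ("elevator_1", "corridor_B", 15, "elevator"),
                    ("corridor_B", "room_201", 6, "door")]
        for a, b, w, kind in passages:
            G.add_edge(a, b, weight=w, kind=kind)

        def route(graph, start, goal, profile):
            """Conceptual-level route for a user profile; a second, geometric level would
            refine the path inside every space returned here."""
            usable = [(a, b) for a, b, d in graph.edges(data=True)
                      if d["kind"] not in profile.get("avoid", ())]
            return nx.shortest_path(graph.edge_subgraph(usable), start, goal, weight="weight")

        print(route(G, "entrance_hall", "room_201", {"avoid": ()}))           # default pedestrian
        print(route(G, "entrance_hall", "room_201", {"avoid": ("stairs",)}))  # wheelchair profile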

  9. A new approach to the analysis of the phase space of f(R)-gravity

    Energy Technology Data Exchange (ETDEWEB)

    Carloni, S., E-mail: sante.carloni@tecnico.ulisboa.pt [Centro Multidisciplinar de Astrofisica—CENTRA, Instituto Superior Tecnico – IST, Universidade de Lisboa – UL, Avenida Rovisco Pais 1, 1049-001 (Portugal)

    2015-09-01

    We propose a new dynamical system formalism for the analysis of f(R) cosmologies. The new approach eliminates the need for cumbersome inversions to close the dynamical system and allows the analysis of the phase space of f(R)-gravity models which cannot be investigated using the standard technique. Differently from previously proposed similar techniques, the new method is constructed in such a way as to associate with the fixed points scale factors that contain four integration constants (i.e. solutions of fourth order differential equations). In this way a new light is shed on the physical meaning of the fixed points. We apply this technique to some f(R) Lagrangians relevant for inflationary and dark energy models.

  10. Symbolic Solution Approach to Wind Turbine based on Doubly Fed Induction Generator Model

    DEFF Research Database (Denmark)

    Cañas–Carretón, M.; Gómez–Lázaro, E.; Martín–Martínez, S.

    2015-01-01

    This paper describes an alternative approach based on symbolic computations to simulate wind turbines equipped with Doubly-Fed Induction Generator (DFIG). The actuator disk theory is used to represent the aerodynamic part, and the one-mass model simulates the mechanical part. The 5th-order induction generator is selected to model the electric machine, being this approach suitable to estimate the DFIG performance under transient conditions. The corresponding non-linear integro-differential equation system has been reduced to a linear state-space system by using an ad-hoc local linearization...
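
    The local-linearization step can be shown symbolically. The sketch below takes a small stand-in nonlinear model (a one-mass rotational system with a nonlinear torque term, not the paper's 5th-order DFIG model), computes the Jacobians with sympy and evaluates them at an operating point to obtain the linear state-space matrices A and B; every symbol and number is an illustrative assumption.

        # Symbolic local linearization of dx/dt = f(x, u) to x' = A x + B u.
        import sympy as sp

        delta, omega = sp.symbols("delta omega")        # state variables
        u = sp.symbols("u")                             # input (torque-like quantity)
        J, D, K = sp.symbols("J D K", positive=True)    # inertia, damping, stiffness

        f = sp.Matrix([omega,
                       (u - D * omega - K * sp.sin(delta)) / J])

        states, inputs = sp.Matrix([delta, omega]), sp.Matrix([u])
        A_sym = f.jacobian(states)                      # symbolic state matrix
        B_sym = f.jacobian(inputs)                      # symbolic input matrix

        # Substitute an operating point and parameter values (illustrative numbers).
        op = {delta: 0.1, omega: 0, u: 0, J: 4.0, D: 0.5, K: 2.0}
        print("A =", A_sym.subs(op).evalf(4))
        print("B =", B_sym.subs(op).evalf(4))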

  11. Space charge models and PATH

    International Nuclear Information System (INIS)

    Wald, H.B.

    1990-01-01

    The 'PATH' codes are used to design magnetic optics subsystems for neutral particle beam systems. They include a 2-1/2D and three 3-D space charge models, two of which have recently been added. This paper describes the 3-D models and reports on preliminary benchmark studies in which these models are checked for stability as the cloud size is varied and for consistency with each other. Differences between the models are investigated and the computer time requirements for running these models are established

  12. A phase space approach to wave propagation with dispersion.

    Science.gov (United States)

    Ben-Benjamin, Jonathan S; Cohen, Leon; Loughlin, Patrick J

    2015-08-01

    A phase space approximation method for linear dispersive wave propagation with arbitrary initial conditions is developed. The results expand on a previous approximation in terms of the Wigner distribution of a single mode. In contrast to this previously considered single-mode case, the approximation presented here is for the full wave and is obtained by a different approach. This solution requires one to obtain (i) the initial modal functions from the given initial wave, and (ii) the initial cross-Wigner distribution between different modal functions. The full wave is the sum of modal functions. The approximation is obtained for general linear wave equations by transforming the equations to phase space, and then solving in the new domain. It is shown that each modal function of the wave satisfies a Schrödinger-type equation where the equivalent "Hamiltonian" operator is the dispersion relation corresponding to the mode and where the wavenumber is replaced by the wavenumber operator. Application to the beam equation is considered to illustrate the approach.

  13. Payload maintenance cost model for the space telescope

    Science.gov (United States)

    White, W. L.

    1980-01-01

    An optimum maintenance cost model for the space telescope for a fifteen year mission cycle was developed. Various documents and subsequent updates of failure rates and configurations were made. The reliability of the space telescope for one year, two and one half years, and five years was determined using the failure rates and configurations. The failure rates and configurations were also used in the maintenance simulation computer model, which simulates the failure patterns for the fifteen year mission life of the space telescope. Cost algorithms associated with the maintenance options as indicated by the failure patterns were developed and integrated into the model.

  14. Conformally invariant models: A new approach

    International Nuclear Information System (INIS)

    Fradkin, E.S.; Palchik, M.Ya.; Zaikin, V.N.

    1996-02-01

    A pair of mathematical models of quantum field theory in D dimensions is analyzed, particularly, a model of a charged scalar field defined by two generations of secondary fields in the space of even dimensions D>=4 and a model of a neutral scalar field defined by two generations of secondary fields in two-dimensional space. 6 refs

  15. Forward modeling of space-borne gravitational wave detectors

    International Nuclear Information System (INIS)

    Rubbo, Louis J.; Cornish, Neil J.; Poujade, Olivier

    2004-01-01

    Planning is underway for several space-borne gravitational wave observatories to be built in the next 10 to 20 years. Realistic and efficient forward modeling will play a key role in the design and operation of these observatories. Space-borne interferometric gravitational wave detectors operate very differently from their ground-based counterparts. Complex orbital motion, virtual interferometry, and finite size effects complicate the description of space-based systems, while nonlinear control systems complicate the description of ground-based systems. Here we explore the forward modeling of space-based gravitational wave detectors and introduce an adiabatic approximation to the detector response that significantly extends the range of the standard low frequency approximation. The adiabatic approximation will aid in the development of data analysis techniques, and improve the modeling of astrophysical parameter extraction

  16. Mentoring SFRM: A New Approach to International Space Station Flight Controller Training

    Science.gov (United States)

    Huning, Therese; Barshi, Immanuel; Schmidt, Lacey

    2008-01-01

    The Mission Operations Directorate (MOD) of the Johnson Space Center is responsible for providing continuous operations support for the International Space Station (ISS). Operations support requires flight controllers who are skilled in team performance as well as the technical operations of the ISS. Space Flight Resource Management (SFRM), a NASA-adapted variant of Crew Resource Management (CRM), is the competency model used in the MOD. ISS flight controller certification has evolved to include a balanced focus on development of SFRM and technical expertise. The latest challenge the MOD faces is how to certify an ISS flight controller (operator) to a basic level of effectiveness in 1 year. SFRM training uses a two-pronged approach to expediting operator certification: 1) imbed SFRM skills training into all operator technical training and 2) use senior flight controllers as mentors. This paper focuses on how the MOD uses senior flight controllers as mentors to train SFRM skills. Methods: A mentor works with an operator throughout the training flow. Inserted into the training flow are guided-discussion sessions and on-the-job observation opportunities focusing on specific SFRM skills, including: situational leadership, conflict management, stress management, cross-cultural awareness, self care and team care while on-console, communication, workload management, and situation awareness. The mentor and operator discuss the science and art behind the skills, cultural effects on skills applications, recognition of good and bad skills applications, recognition of how skills application changes subtly in different situations, and individual goals and techniques for improving skills. Discussion: This mentoring program provides an additional means of transferring SFRM knowledge compared to traditional CRM training programs. Our future endeavors in training SFRM skills (as well as those of other organizations) may benefit from adding team performance skills mentoring. This paper…

  17. Space-Charge-Limited Emission Models for Particle Simulation

    Science.gov (United States)

    Verboncoeur, J. P.; Cartwright, K. L.; Murphy, T.

    2004-11-01

    Space-charge-limited (SCL) emission of electrons from various materials is a common method of generating the high current beams required to drive high power microwave (HPM) sources. In the SCL emission process, sufficient space charge is extracted from a surface, often of complicated geometry, to drive the electric field normal to the surface close to zero. The emitted current is highly dominated by space charge effects as well as ambient fields near the surface. In this work, we consider computational models for the macroscopic SCL emission process including application of Gauss's law and the Child-Langmuir law for space-charge-limited emission. Models are described for ideal conductors, lossy conductors, and dielectrics. Also considered is the discretization of these models, and the implications for the emission physics. Previous work on primary and dual-cell emission models [Watrous et al., Phys. Plasmas 8, 289-296 (2001)] is reexamined, and aspects of the performance, including fidelity and noise properties, are improved. Models for one-dimensional diodes are considered, as well as multidimensional emitting surfaces, which include corners and transverse fields.
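
    As a rough numerical companion to the Child-Langmuir law referenced in this record, the sketch below evaluates the ideal one-dimensional planar-diode limit J = (4/9) eps0 sqrt(2e/m) V^(3/2) / d^2; it is a minimal illustration in SI units with an arbitrary example voltage and gap, not the dual-cell or Gauss's-law emission algorithms discussed above.

```python
import math

EPS0 = 8.8541878128e-12      # vacuum permittivity [F/m]
E_CHARGE = 1.602176634e-19   # elementary charge [C]
M_E = 9.1093837015e-31       # electron mass [kg]

def child_langmuir_current_density(voltage_v, gap_m):
    """Space-charge-limited current density J [A/m^2] of an ideal 1D planar diode:
    J = (4/9) * eps0 * sqrt(2e/m) * V^(3/2) / d^2 (Child-Langmuir law)."""
    return (4.0 / 9.0) * EPS0 * math.sqrt(2.0 * E_CHARGE / M_E) \
        * voltage_v ** 1.5 / gap_m ** 2

# Example: 100 kV across a 1 cm gap gives roughly 7e5 A/m^2
print(child_langmuir_current_density(1.0e5, 1.0e-2))
```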

  18. A phase-space approach to atmospheric dynamics based on observational data. Theory and applications

    International Nuclear Information System (INIS)

    Wang Risheng.

    1994-01-01

    This thesis is an attempt to develop systematically a phase-space approach to atmospheric dynamics based on theoretical achievements and application experience in nonlinear time-series analysis. In particular, it is concerned with the derivation of quantities for describing the geometrical structure of the observed dynamics in phase space (dimension estimation) and the examination of the observed atmospheric fluctuations in the light of phase-space representation. The thesis is, therefore, composed of three major parts, i.e. a general survey of the theory of statistical approaches to dynamical systems, the methodology designed for the present study, and specific applications with respect to dimension estimation and to a phase-space analysis of the tropical stratospheric quasi-biennial oscillation. (orig./KW)
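
    For readers unfamiliar with dimension estimation in reconstructed phase space, the sketch below shows the standard delay-embedding and Grassberger-Procaccia correlation-sum ingredients on a toy series; the embedding dimension, delay, radii and data are assumed example values, not the methodology developed in the thesis.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series x into `dim` dimensions."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def correlation_sum(points, r):
    """Fraction of point pairs closer than radius r (Grassberger-Procaccia)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)
    return np.mean(d[iu] < r)

# Crude slope of log C(r) vs log r approximates the correlation dimension
x = np.sin(0.1 * np.arange(2000)) + 0.05 * np.random.randn(2000)  # toy series
emb = delay_embed(x, dim=3, tau=10)[:500]        # subsample to keep the O(N^2) sums cheap
radii = np.logspace(-1.5, 0, 8)
C = np.array([correlation_sum(emb, r) for r in radii])
slope = np.polyfit(np.log(radii), np.log(C + 1e-12), 1)[0]
print(f"estimated correlation dimension ~ {slope:.2f}")
```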

  19. Identified state-space prediction model for aero-optical wavefronts

    Science.gov (United States)

    Faghihi, Azin; Tesch, Jonathan; Gibson, Steve

    2013-07-01

    A state-space disturbance model and associated prediction filter for aero-optical wavefronts are described. The model is computed by system identification from a sequence of wavefronts measured in an airborne laboratory. Estimates of the statistics and flow velocity of the wavefront data are shown and can be computed from the matrices in the state-space model without returning to the original data. Numerical results compare velocity values and power spectra computed from the identified state-space model with those computed from the aero-optical data.
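
    The record does not include the identified matrices, so the sketch below only illustrates the generic one-step-ahead predictor associated with an innovations-form state-space model (A, C, K); the toy matrices and measurement sequence are assumptions for demonstration, not the identified aero-optical model.

```python
import numpy as np

def one_step_predictor(A, C, K, y):
    """One-step-ahead prediction for a scalar measurement sequence y using an
    innovations-form model  x[k+1] = A x[k] + K e[k],  y[k] = C x[k] + e[k]."""
    x = np.zeros(A.shape[0])
    y_pred = np.zeros_like(y)
    for k in range(len(y) - 1):
        e = y[k] - C @ x        # innovation (prediction error) at time k
        x = A @ x + K * e       # predictor state update
        y_pred[k + 1] = C @ x   # prediction of the next measurement
    return y_pred

# Toy 2-state model and a noisy sinusoidal stand-in for a wavefront coefficient series
A = np.array([[0.9, 0.2], [0.0, 0.7]])
C = np.array([1.0, 0.5])
K = np.array([0.3, 0.1])
y = np.sin(0.2 * np.arange(200)) + 0.05 * np.random.randn(200)
print("prediction MSE:", np.mean((one_step_predictor(A, C, K, y)[50:] - y[50:]) ** 2))
```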

  20. Ethical approach to digital skills. Sense and use in virtual educational spaces

    Directory of Open Access Journals (Sweden)

    Juan GARCÍA-GUTIÉRREZ

    2013-12-01

    Full Text Available In the context of technology and cyberspace, should we do everything we can do? The answer usually given to this question is not ethical but political: safety. Safety and security are overshadowing the ethical question about the meaning of technology. Cyberspace imposes a "new logic" and new forms of "ownership". When it comes to children's relationship with the Internet, a logic of accountability toward cyberspace is not always adopted, even though the Internet is not only a technical space but also an ethical one. We talk about a safe Internet, a healthy Internet, an Internet fit for children... so why not talk about Internet ethics? In this work we approach digital skills as those skills that help us to position ourselves and find our way in cyberspace, something that is not possible without ethical skills as well. So, in this article we try to build and propose a model for analyzing virtual learning spaces (and cyberspace in general) based on the categories of "use" and "sense" as different levels of ownership that indicate the types of competences needed to access cyberspace.

  1. Mouse infection models for space flight immunology

    Science.gov (United States)

    Chapes, Stephen Keith; Ganta, Roman Reddy; Chapers, S. K. (Principal Investigator)

    2005-01-01

    Several immunological processes can be affected by space flight. However, there is little evidence to suggest that flight-induced immunological deficits lead to illness. Therefore, one of our goals has been to define models to examine host resistance during space flight. Our working hypothesis is that space flight crews will come from a heterogeneous population; the immune response gene make-up will be quite varied. It is unknown how much the immune response gene variation contributes to the potential threat from infectious organisms, allergic responses or other long-term health problems (e.g. cancer). This article details recent efforts of the Kansas State University gravitational immunology group to assess how population heterogeneity impacts host health, in laboratory experimental situations and/or using the skeletal unloading model of space-flight stress. This paper details our use of several mouse strains with several different genotypes. In particular, mice with varying MHCII allotypes and mice on the C57BL background with different genetic defects have been particularly useful tools with which to study infections by Staphylococcus aureus, Salmonella typhimurium, Pasteurella pneumotropica and Ehrlichia chaffeensis. We propose that some of these experimental challenge models will be useful to assess the effects of space flight on host resistance to infection.

  2. A simple coordinate space approach to three-body problems ...

    Indian Academy of Sciences (India)

    We show how to treat the dynamics of an asymmetric three-body system consisting of one heavy and two identical light particles in a simple coordinate space variational approach. The method is constructive and gives an efficient way of resolving a three-body system to an effective two-body system. It is illustrated by ...

  3. Transforming community access to space science models

    Science.gov (United States)

    MacNeice, Peter; Hesse, Michael; Kuznetsova, Maria; Maddox, Marlo; Rastaetter, Lutz; Berrios, David; Pulkkinen, Antti

    2012-04-01

    Researching and forecasting the ever changing space environment (often referred to as space weather) and its influence on humans and their activities are model-intensive disciplines. This is true because the physical processes involved are complex, but, in contrast to terrestrial weather, the supporting observations are typically sparse. Models play a vital role in establishing a physically meaningful context for interpreting limited observations, testing theory, and producing both nowcasts and forecasts. For example, with accurate forecasting of hazardous space weather conditions, spacecraft operators can place sensitive systems in safe modes, and power utilities can protect critical network components from damage caused by large currents induced in transmission lines by geomagnetic storms.

  4. Economic analysis of open space box model utilization in spacecraft

    Science.gov (United States)

    Mohammad, Atif F.; Straub, Jeremy

    2015-05-01

    It is a known fact that the amount of data about space that is stored is getting larger on an everyday basis. However, the utilization of Big Data and related tools to perform ETL (Extract, Transform and Load) applications will soon be pervasive in the space sciences. We have entered a crucial time where using Big Data can be the difference (for terrestrial applications) between organizations underperforming and outperforming their peers. The same is true for NASA and other space agencies, as well as for individual missions and the highly competitive process of mission data analysis and publication. In most industries, established competitors and new entrants alike will leverage data-driven approaches to revolutionize and capture the value of Big Data archives. The Open Space Box Model is poised to take the proverbial "giant leap", as it provides autonomic data processing and communications for spacecraft. Economic value generated from such data processing can be found in terrestrial organizations in every sector, such as healthcare and retail. Retailers, for example, are performing research on Big Data by utilizing sensor-driven data embedded in products within their stores and warehouses to determine how these products are actually used in the real world.

  5. Architecture and Knowledge-Driven Self-Adaptive Security in Smart Space

    Directory of Open Access Journals (Sweden)

    Antti Evesti

    2013-03-01

    Full Text Available Dynamic and heterogeneous smart spaces cause challenges for security because it is impossible to anticipate all the possible changes at design-time. Self-adaptive security is an applicable solution for this challenge. This paper presents an architectural approach for security adaptation in smart spaces. The approach combines an adaptation loop, an Information Security Measuring Ontology (ISMO) and a smart space security-control model. The adaptation loop includes phases to monitor, analyze, plan and execute changes in the smart space. The ISMO offers input knowledge for the adaptation loop and the security-control model enforces dynamic access control policies. The approach is novel because it defines the whole adaptation loop and the knowledge required in each phase of the adaptation. The contributions are validated as part of the smart space pilot implementation. The approach offers reusable and extensible means to achieve adaptive security in smart spaces and up-to-date access control for devices that appear in the space. Hence, the approach supports the work of smart space application developers.
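
    A bare-bones sketch of the monitor-analyze-plan-execute cycle described above is given below; the threat rules, policy names and data structures are invented placeholders standing in for the ISMO knowledge and the security-control model of the record.

```python
from dataclasses import dataclass

@dataclass
class SecurityState:
    threat_level: str = "low"
    policy: str = "open"

def monitor(space: dict) -> dict:
    # Collect observations about devices currently present in the smart space
    return {"unknown_devices": space.get("unknown_devices", 0)}

def analyze(obs: dict) -> str:
    # Toy rule standing in for ontology-backed threat analysis
    return "high" if obs["unknown_devices"] > 0 else "low"

def plan(threat: str) -> str:
    # Choose the access-control policy the security-control model should enforce
    return "restricted" if threat == "high" else "open"

def adaptation_loop(space: dict) -> SecurityState:
    threat = analyze(monitor(space))            # monitor -> analyze
    return SecurityState(threat, plan(threat))  # plan -> execute (new enforced state)

print(adaptation_loop({"unknown_devices": 2}))  # -> restricted access policy
```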

  6. A stochastic space-time model for intermittent precipitation occurrences

    KAUST Repository

    Sun, Ying; Stein, Michael L.

    2016-01-01

    Modeling a precipitation field is challenging due to its intermittent and highly scale-dependent nature. Motivated by the features of high-frequency precipitation data from a network of rain gauges, we propose a threshold space-time t random field (tRF) model for 15-minute precipitation occurrences. This model is constructed through a space-time Gaussian random field (GRF) with random scaling varying along time or space and time. It can be viewed as a generalization of the purely spatial tRF, and has a hierarchical representation that allows for Bayesian interpretation. Developing appropriate tools for evaluating precipitation models is a crucial part of the model-building process, and we focus on evaluating whether models can produce the observed conditional dry and rain probabilities given that some set of neighboring sites all have rain or all have no rain. These conditional probabilities show that the proposed space-time model has noticeable improvements in some characteristics of joint rainfall occurrences for the data we have considered.

  8. An approach to developing user interfaces for space systems

    Science.gov (United States)

    Shackelford, Keith; McKinney, Karen

    1993-08-01

    Inherent weaknesses in the traditional waterfall model of software development have led to the definition of the spiral model. The spiral software development lifecycle model, however, has not been applied to NASA projects. This paper describes its use in developing real-time user interface software for an Environmental Control and Life Support System (ECLSS) Process Control Prototype at NASA's Marshall Space Flight Center.

  9. Distributed Model Predictive Control over Multiple Groups of Vehicles in Highway Intelligent Space for Large Scale System

    Directory of Open Access Journals (Sweden)

    Tang Xiaofeng

    2014-01-01

    Full Text Available The paper presents three time warning distances for the safe driving of multiple groups of vehicles, treated as a large scale system, in a highway tunnel environment, based on a distributed model predictive control approach. Generally speaking, the system includes two parts. First, the vehicles are divided into multiple groups, and the distributed model predictive control approach is proposed to calculate the information framework of each group. The optimization of each group considers both its local performance and the optimization characteristics of the neighboring subgroups, which ensures the global optimization performance. Second, the three time warning distances are studied based on the basic principles used for highway intelligent space (HIS), and the information framework concept is proposed for the multiple groups of vehicles. A mathematical model is built to prevent chain collisions between vehicles. The results demonstrate that the proposed highway intelligent space method can effectively ensure the driving safety of multiple groups of vehicles under foggy, rainy, or snowy conditions.

  10. Anthropogenic resource subsidies determine space use by Australian arid zone dingoes: an improved resource selection modelling approach.

    Directory of Open Access Journals (Sweden)

    Thomas M Newsome

    Full Text Available Dingoes (Canis lupus dingo) were introduced to Australia and became feral at least 4,000 years ago. We hypothesized that dingoes, being of domestic origin, would be adaptable to anthropogenic resource subsidies and that their space use would be affected by the dispersion of those resources. We tested this by analyzing Resource Selection Functions (RSFs) developed from GPS fixes (locations) of dingoes in arid central Australia. Using Generalized Linear Mixed-effect Models (GLMMs), we investigated resource relationships for dingoes that had access to abundant food near mine facilities, and for those that did not. From these models, we predicted the probability of dingo occurrence in relation to anthropogenic resource subsidies and other habitat characteristics over ~18,000 km². Very small standard errors, and the pervasively "significant" P-values that result, will become more important as the size of data sets, such as our GPS tracking logs, increases. Therefore, we also investigated methods to minimize the effects of serial and spatio-temporal correlation among samples and unbalanced study designs. Using GLMMs, we accounted for some of the correlation structure of GPS animal tracking data; however, parameter standard errors remained very small and all predictors were highly significant. Consequently, we developed an alternative approach that allowed us to review effect sizes at different spatial scales and determine which predictors were sufficiently ecologically meaningful to include in final RSF models. We determined that the most important predictor for dingo occurrence around mine sites was distance to the refuse facility. Away from mine sites, close proximity to human-provided watering points was predictive of dingo dispersion, as were other landscape factors including palaeochannels, rocky rises and elevated drainage depressions. Our models demonstrate that anthropogenically supplemented food and water can alter dingo-resource relationships. The
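
    As a loose illustration of a used-versus-available resource selection function, the sketch below fits a plain logistic regression to synthetic GPS-fix data with a single invented covariate (distance to a hypothetical refuse facility); it omits the random effects, multiple covariates and effect-size screening of the GLMM approach described in this record.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy "used" (GPS fixes) vs. "available" points with one covariate:
# distance to a hypothetical refuse facility (km); values are illustrative only.
n = 1000
dist_used = rng.exponential(scale=2.0, size=n)     # used points cluster near the facility
dist_avail = rng.uniform(0.0, 20.0, size=n)        # available points spread over the landscape
X = np.concatenate([dist_used, dist_avail]).reshape(-1, 1)
y = np.concatenate([np.ones(n), np.zeros(n)])      # 1 = used, 0 = available

rsf = LogisticRegression().fit(X, y)
# A negative coefficient indicates selection for locations close to the facility
print("distance-to-refuse coefficient:", rsf.coef_[0][0])
```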

  11. Space Biology Model Organism Research on the Deep Space Gateway to Pioneer Discovery and Advance Human Space Exploration

    Science.gov (United States)

    Sato, K. Y.; Tomko, D. L.; Levine, H. G.; Quincy, C. D.; Rayl, N. A.; Sowa, M. B.; Taylor, E. M.; Sun, S. C.; Kundrot, C. E.

    2018-02-01

    Model organisms are foundational for conducting physiological and systems biology research to define how life responds to the deep space environment. The organisms, areas of research, and Deep Space Gateway capabilities needed will be presented.

  12. Particle currents in a space-time dependent and CP-violating Higgs background: a field theory approach

    International Nuclear Information System (INIS)

    Comelli, D.; Riotto, A.

    1995-06-01

    Motivated by cosmological applications like electroweak baryogenesis, we develop a field-theoretic approach to the computation of particle currents on a space-time dependent and CP-violating Higgs background. We consider the Standard Model with two Higgs doublets and CP violation in the scalar sector, and compute both fermionic and Higgs currents by means of an expansion in the background fields. We discuss the gauge dependence of the results and the renormalization of the current operators, showing that in the limit of local equilibrium, no extra renormalization conditions are needed in order to specify the system completely. (orig.)

  13. Minimization of required model runs in the Random Mixing approach to inverse groundwater flow and transport modeling

    Science.gov (United States)

    Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco

    2017-04-01

    Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or hydraulic concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. This
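
    The circle-sampling idea described above can be sketched as follows: run the forward model only at n equally spaced mixing angles, interpolate the simulated values at each observation location around the circle, and then search the interpolated response for the best mixing weight. The two fields, the three "observation" locations and the stand-in forward model below are all toy assumptions, not the groundwater solver used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two candidate fields that both honour the conditioning data (stand-ins here)
f1 = rng.standard_normal(100)
f2 = rng.standard_normal(100)
obs_heads = np.array([0.2, -0.1, 0.4])   # observed heads at 3 locations (toy values)

def forward_model(field):
    """Stand-in for the expensive groundwater solver: simulated heads at 3 locations."""
    return np.array([field[:30].mean(), field[30:60].mean(), field[60:].mean()])

def mix(theta):
    # Mixing weights (cos, sin) lie on the unit circle, preserving spatial structure
    return np.cos(theta) * f1 + np.sin(theta) * f2

# Run the solver only at n equally spaced angles on the circle ...
n = 8
angles = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
heads = np.array([forward_model(mix(a)) for a in angles])      # shape (n, 3)

# ... then interpolate each location's solution around the circle (cheap)
# and evaluate the misfit for many candidate mixing angles.
fine = np.linspace(0.0, 2.0 * np.pi, 720, endpoint=False)
heads_fine = np.column_stack([
    np.interp(fine, angles, heads[:, j], period=2.0 * np.pi) for j in range(3)
])
misfit = ((heads_fine - obs_heads) ** 2).sum(axis=1)
print("best mixing angle:", fine[np.argmin(misfit)])
```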

  14. Mid- and long-term runoff predictions by an improved phase-space reconstruction model

    International Nuclear Information System (INIS)

    Hong, Mei; Wang, Dong; Wang, Yuankun; Zeng, Xiankui; Ge, Shanshan; Yan, Hengqian; Singh, Vijay P.

    2016-01-01

    In recent years, the phase-space reconstruction method has usually been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs to be improved. Using the genetic algorithm to improve the phase-space reconstruction method, a new nonlinear model of monthly runoff is constructed. The new model does not rely heavily on embedding dimensions. Recognizing that the rainfall–runoff process is complex, affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, chaotic characteristics of the model are also analyzed, which shows the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that the medium- and long-term runoff is satisfactorily forecasted at the hydrological stations. Not only is the forecasting trend accurate, but the mean absolute percentage error is also no more than 15%. Moreover, the forecast results for wet years and dry years are both good, which means that the improved model can overcome the traditional ‘‘wet years and dry years predictability barrier,’’ to some extent. The model forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides new thinking for research on the association between monthly runoff and other hydrological factors, and also provides a new method for the prediction of monthly runoff. - Highlights: • The improved phase-space reconstruction model of monthly runoff is established. • Two variables (temperature and rainfall) are incorporated
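
    For orientation, the sketch below shows a bare-bones univariate phase-space (delay-reconstruction) forecast using nearest-neighbour analogues on a synthetic seasonal series; the embedding parameters and data are assumptions, and the genetic-algorithm optimisation and multivariate inputs of the improved model above are not reproduced.

```python
import numpy as np

def embed(x, dim, tau):
    """Delay-coordinate reconstruction of a scalar series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def analogue_forecast(x, dim=3, tau=1, k=5):
    """Predict the next value by averaging the successors of the k nearest
    neighbours of the current state in reconstructed phase space."""
    states = embed(x, dim, tau)
    current = states[-1]
    candidates = states[:-1]                       # states with a known successor
    dists = np.linalg.norm(candidates - current, axis=1)
    nearest = np.argsort(dists)[:k]
    successors = x[nearest + (dim - 1) * tau + 1]  # value following each neighbour
    return successors.mean()

# Toy monthly-like series with a seasonal cycle plus noise
t = np.arange(360)
runoff = 100 + 40 * np.sin(2 * np.pi * t / 12) + 5 * np.random.randn(360)
print("one-step forecast:", analogue_forecast(runoff, dim=4, tau=1, k=10))
```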

  16. Modeling in the quality by design environment: Regulatory requirements and recommendations for design space and control strategy appointment.

    Science.gov (United States)

    Djuris, Jelena; Djuric, Zorica

    2017-11-30

    Mathematical models can be used as an integral part of the quality by design (QbD) concept throughout the product lifecycle for a variety of purposes, including appointment of the design space and control strategy, continual improvement and risk assessment. Examples of different mathematical modeling techniques (mechanistic, empirical and hybrid) in pharmaceutical development and process monitoring or control are provided in the presented review. In the QbD context, mathematical models are predominantly used to support design space and/or control strategies. Considering their impact on the final product quality, models can be divided into the following categories: high, medium and low impact models. Although there are regulatory guidelines on the topic of modeling applications, a review of QbD-based submissions containing modeling elements revealed concerns regarding the scale-dependency of design spaces and the verification of model predictions at the commercial scale of manufacturing, especially regarding real-time release (RTR) models. The authors provide a critical overview of good modeling practices and introduce the concepts of multiple-unit, adaptive and dynamic design space, multivariate specifications and methods for process uncertainty analysis. RTR specification with mathematical models and different approaches to multivariate statistical process control supporting process analytical technologies are also presented. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Integrated Space Asset Management Database and Modeling

    Science.gov (United States)

    Gagliano, L.; MacLeod, T.; Mason, S.; Percy, T.; Prescott, J.

    The Space Asset Management Database (SAM-D) was implemented in order to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of an object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools to exploit the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and a user interface for visualizations. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of the information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a Government off-the-Shelf information sharing platform in use throughout DoD and DHS information sharing and situational awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data are shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.

  18. Fitted HBT radii versus space-time variances in flow-dominated models

    International Nuclear Information System (INIS)

    Lisa, Mike; Frodermann, Evan; Heinz, Ulrich

    2007-01-01

    The inability of otherwise successful dynamical models to reproduce the 'HBT radii' extracted from two-particle correlations measured at the Relativistic Heavy Ion Collider (RHIC) is known as the 'RHIC HBT Puzzle'. Most comparisons between models and experiment exploit the fact that for Gaussian sources the HBT radii agree with certain combinations of the space-time widths of the source which can be directly computed from the emission function, without having to evaluate, at significant expense, the two-particle correlation function. Here we study the validity of this approach for realistic emission function models, some of which exhibit significant deviations from simple Gaussian behaviour. By Fourier transforming the emission function we compute the 2-particle correlation function and fit it with a Gaussian to partially mimic the procedure used for measured correlation functions. We describe a novel algorithm to perform this Gaussian fit analytically. We find that for realistic hydrodynamic models the HBT radii extracted from this procedure agree better with the data than the values previously extracted from the space-time widths of the emission function. Although serious discrepancies between the calculated and measured HBT radii remain, we show that a more 'apples-to-apples' comparison of models with data can play an important role in any eventually successful theoretical description of RHIC HBT data. (author)
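
    For reference, the Gaussian-source relations alluded to above are usually quoted in the out-side-long (Bertsch-Pratt) parametrization as sketched below, where tildes denote deviations from the mean emission point and beta is the pair velocity; conventions differ between papers, so this is only the commonly used form, not necessarily the exact one adopted by the authors.

```latex
C(\mathbf{q},\mathbf{K}) \simeq 1 + \lambda\,
  \exp\!\left(-R_o^2 q_o^2 - R_s^2 q_s^2 - R_l^2 q_l^2\right),
\qquad
R_s^2 = \langle \tilde{y}^2 \rangle,\quad
R_o^2 = \langle (\tilde{x}-\beta_\perp \tilde{t})^2 \rangle,\quad
R_l^2 = \langle (\tilde{z}-\beta_l \tilde{t})^2 \rangle .
```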

  19. Space Surveillance Network and Analysis Model (SSNAM) Performance Improvements

    National Research Council Canada - National Science Library

    Butkus, Albert; Roe, Kevin; Mitchell, Barbara L; Payne, Timothy

    2007-01-01

    ... capacity by sensor, models for sensors yet to be created, user-defined weather conditions, a National Aeronautics and Space Administration catalog growth model including space debris, and solar flux, just to name a few...

  20. Implementation of an Open-Scenario, Long-Term Space Debris Simulation Approach

    Science.gov (United States)

    Nelson, Bron; Yang Yang, Fan; Carlino, Roberto; Dono Perez, Andres; Faber, Nicolas; Henze, Chris; Karacalioglu, Arif Goktug; O'Toole, Conor; Swenson, Jason; Stupl, Jan

    2015-01-01

    This paper provides a status update on the implementation of a flexible, long-term space debris simulation approach. The motivation is to build a tool that can assess the long-term impact of various options for debris-remediation, including the LightForce space debris collision avoidance concept that diverts objects using photon pressure [9]. State-of-the-art simulation approaches that assess the long-term development of the debris environment use either completely statistical approaches, or they rely on large time steps on the order of several days if they simulate the positions of single objects over time. They cannot be easily adapted to investigate the impact of specific collision avoidance schemes or de-orbit schemes, because the efficiency of a collision avoidance maneuver can depend on various input parameters, including ground station positions and orbital and physical parameters of the objects involved in close encounters (conjunctions). Furthermore, maneuvers take place on timescales much smaller than days. For example, LightForce only changes the orbit of a certain object (aiming to reduce the probability of collision), but it does not remove entire objects or groups of objects. In the same sense, it is also not straightforward to compare specific de-orbit methods in regard to potential collision risks during a de-orbit maneuver. To gain flexibility in assessing interactions with objects, we implement a simulation that includes every tracked space object in Low Earth Orbit (LEO) and propagates all objects with high precision and variable time-steps as small as one second. It allows the assessment of the (potential) impact of physical or orbital changes to any object. The final goal is to employ a Monte Carlo approach to assess the debris evolution during the simulation time-frame of 100 years and to compare a baseline scenario to debris remediation scenarios or other scenarios of interest. To populate the initial simulation, we use the entire space

  1. Vol. 33 - Compact State-Space Models for Complex Superconducting Radio-Frequency Structures Based on Model Order Reduction and Concatenation Methods

    CERN Document Server

    Flisgen, Thomas

    2015-01-01

    The modeling of large chains of superconducting cavities with couplers is a challenging task in computational electrical engineering. The direct numerical treatment of these structures can easily lead to problems with more than ten million degrees of freedom. Problems of this complexity are typically solved with the help of parallel programs running on supercomputing infrastructures. However, these infrastructures are expensive to purchase, to operate, and to maintain. The aim of this thesis is to introduce and to validate an approach which allows for modeling large structures on a standard workstation. The novel technique is called State-Space Concatenations and is based on the decomposition of the complete structure into individual segments. The radio-frequency properties of the generated segments are described by a set of state-space equations which either emerge from analytical considerations or from numerical discretization schemes. The model order of these equations is reduced...
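
    The following sketch shows the simplest flavour of state-space concatenation, a series (cascade) interconnection of two segment models built with plain NumPy; the thesis's scheme additionally couples segments through their waveguide ports and applies model order reduction, which is not reproduced here, and the toy matrices are assumptions.

```python
import numpy as np

def cascade(ss1, ss2):
    """Concatenate two state-space models in series: the output of ss1 feeds ss2.
    Each model is a tuple (A, B, C, D)."""
    A1, B1, C1, D1 = ss1
    A2, B2, C2, D2 = ss2
    n1, n2 = A1.shape[0], A2.shape[0]
    A = np.block([[A1, np.zeros((n1, n2))],
                  [B2 @ C1, A2]])
    B = np.vstack([B1, B2 @ D1])
    C = np.hstack([D2 @ C1, C2])
    D = D2 @ D1
    return A, B, C, D

# Two toy single-input single-output "segments"
s1 = (np.array([[-1.0]]), np.array([[1.0]]), np.array([[1.0]]), np.array([[0.0]]))
s2 = (np.array([[-2.0]]), np.array([[1.0]]), np.array([[2.0]]), np.array([[0.0]]))
A, B, C, D = cascade(s1, s2)
print(A.shape)   # (2, 2): the concatenated model carries both segments' states
```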

  2. 3D space analysis of dental models

    Science.gov (United States)

    Chuah, Joon H.; Ong, Sim Heng; Kondo, Toshiaki; Foong, Kelvin W. C.; Yong, Than F.

    2001-05-01

    Space analysis is an important procedure by orthodontists to determine the amount of space available and required for teeth alignment during treatment planning. Traditional manual methods of space analysis are tedious and often inaccurate. Computer-based space analysis methods that work on 2D images have been reported. However, as the space problems in the dental arch exist in all three planes of space, a full 3D analysis of the problems is necessary. This paper describes a visualization and measurement system that analyses 3D images of dental plaster models. Algorithms were developed to determine dental arches. The system is able to record the depths of the Curve of Spee, and quantify space liabilities arising from a non-planar Curve of Spee, malalignment and overjet. Furthermore, the difference between total arch space available and the space required to arrange the teeth in ideal occlusion can be accurately computed. The system for 3D space analysis of the dental arch is an accurate, comprehensive, rapid and repeatable method of space analysis to facilitate proper orthodontic diagnosis and treatment planning.

  3. Conceptual Explanation for the Algebra in the Noncommutative Approach to the Standard Model

    International Nuclear Information System (INIS)

    Chamseddine, Ali H.; Connes, Alain

    2007-01-01

    The purpose of this Letter is to remove the arbitrariness of the ad hoc choice of the algebra and its representation in the noncommutative approach to the standard model, which was begging for a conceptual explanation. We assume as before that space-time is the product of a four-dimensional manifold by a finite noncommutative space F. The spectral action is the pure gravitational action for the product space. To remove the above arbitrariness, we classify the irreducible geometries F consistent with imposing reality and chiral conditions on spinors, to avoid the fermion doubling problem, which amounts to having total dimension 10 (in the K-theoretic sense). It gives, almost uniquely, the standard model with all its details, predicting the number of fermions per generation to be 16, their representations and the Higgs breaking mechanism, with very little input

  4. Adaptive Numerical Algorithms in Space Weather Modeling

    Science.gov (United States)

    Toth, Gabor; vanderHolst, Bart; Sokolov, Igor V.; DeZeeuw, Darren; Gombosi, Tamas I.; Fang, Fang; Manchester, Ward B.; Meng, Xing; Nakib, Dalal; Powell, Kenneth G.; hide

    2010-01-01

    Space weather describes the various processes in the Sun-Earth system that present danger to human health and technology. The goal of space weather forecasting is to provide an opportunity to mitigate these negative effects. Physics-based space weather modeling is characterized by disparate temporal and spatial scales as well as by different physics in different domains. A multi-physics system can be modeled by a software framework comprising of several components. Each component corresponds to a physics domain, and each component is represented by one or more numerical models. The publicly available Space Weather Modeling Framework (SWMF) can execute and couple together several components distributed over a parallel machine in a flexible and efficient manner. The framework also allows resolving disparate spatial and temporal scales with independent spatial and temporal discretizations in the various models. Several of the computationally most expensive domains of the framework are modeled by the Block-Adaptive Tree Solar wind Roe Upwind Scheme (BATS-R-US) code that can solve various forms of the magnetohydrodynamics (MHD) equations, including Hall, semi-relativistic, multi-species and multi-fluid MHD, anisotropic pressure, radiative transport and heat conduction. Modeling disparate scales within BATS-R-US is achieved by a block-adaptive mesh both in Cartesian and generalized coordinates. Most recently we have created a new core for BATS-R-US: the Block-Adaptive Tree Library (BATL) that provides a general toolkit for creating, load balancing and message passing in a 1, 2 or 3 dimensional block-adaptive grid. We describe the algorithms of BATL and demonstrate its efficiency and scaling properties for various problems. BATS-R-US uses several time-integration schemes to address multiple time-scales: explicit time stepping with fixed or local time steps, partially steady-state evolution, point-implicit, semi-implicit, explicit/implicit, and fully implicit numerical

  5. A Space-Time Periodic Task Model for Recommendation of Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Xiuhong Zhang

    2018-01-01

    Full Text Available With the rapid development of remote sensing technology, the quantity and variety of remote sensing images are growing so quickly that proactive and personalized access to data has become an inevitable trend. One of the active approaches is remote sensing image recommendation, which can offer related image products to users according to their preference. Although multiple studies on remote sensing retrieval and recommendation have been performed, most of these studies model the user profiles only from the perspective of spatial area or image features. In this paper, we propose a spatiotemporal recommendation method for remote sensing data based on the probabilistic latent topic model, which is named the Space-Time Periodic Task model (STPT. User retrieval behaviors of remote sensing images are represented as mixtures of latent tasks, which act as links between users and images. Each task is associated with the joint probability distribution of space, time and image characteristics. Meanwhile, the von Mises distribution is introduced to fit the distribution of tasks over time. Then, we adopt Gibbs sampling to learn the random variables and parameters and present the inference algorithm for our model. Experiments show that the proposed STPT model can improve the capability and efficiency of remote sensing image data services.
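
    The periodic-time ingredient of the model can be illustrated in isolation: the sketch below fits a von Mises distribution to toy day-of-year request times with SciPy. The data, the angular mapping and the parameters are assumptions, and the full Gibbs-sampled space-time topic model of the record is not reproduced.

```python
import numpy as np
from scipy.stats import vonmises

rng = np.random.default_rng(0)

# Toy day-of-year timestamps of image requests, mapped onto the unit circle
days = rng.normal(loc=180, scale=20, size=500) % 365     # requests peaking mid-year
angles = 2.0 * np.pi * days / 365.0 - np.pi              # map [0, 365) -> [-pi, pi)

# Fit a von Mises distribution (fix scale=1 so kappa and loc are the circular parameters)
kappa, loc, scale = vonmises.fit(angles, fscale=1)
peak_day = ((loc + np.pi) / (2.0 * np.pi) * 365.0) % 365.0
print(f"concentration kappa={kappa:.1f}, preferred day of year ~ {peak_day:.0f}")
```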

  6. New Skeletal-Space-Filling Models

    Science.gov (United States)

    Clarke, Frank H.

    1977-01-01

    Describes plastic, skeletal molecular models that are color-coded and can illustrate both the conformation and overall shape of small molecules. They can also be converted to space-filling counterparts by the addition of color-coded polystyrene spheres. (MLH)

  7. Phase-space dynamics of Bianchi IX cosmological models

    International Nuclear Information System (INIS)

    Soares, I.D.

    1985-01-01

    The complex phase-space dynamical behaviour of a class of Bianchi IX cosmological models is discussed, such as the chaotic gravitational collapse due to Poincaré's homoclinic phenomena, and the n-furcation of periodic orbits and tori in the phase space of the models. Poincaré maps which show this behaviour are constructed numerically and applications are discussed. (Author) [pt

  8. Space modeling with SolidWorks and NX

    CERN Document Server

    Duhovnik, Jože; Drešar, Primož

    2015-01-01

    Through a series of step-by-step tutorials and numerous hands-on exercises, this book aims to equip the reader with both a good understanding of the importance of space in the abstract world of engineers and the ability to create a model of a product in virtual space – a skill essential for any designer or engineer who needs to present ideas concerning a particular product within a professional environment. The exercises progress logically from the simple to the more complex; while SolidWorks or NX is the software used, the underlying philosophy is applicable to all modeling software. In each case, the explanation covers the entire procedure from the basic idea and production capabilities through to the real model; the conversion from 3D model to 2D manufacturing drawing is also clearly explained. Topics covered include modeling of prism, axisymmetric, symmetric, and sophisticated shapes; digitization of physical models using modeling software; creation of a CAD model starting from a physical model; free fo...

  9. Space dynamics

    International Nuclear Information System (INIS)

    Corno, S.E.

    1995-01-01

    Analytical methods for the space dynamics of fission reactors are presented. It is shown how a few sample problems in space dynamics can be solved, within the one- and two-group diffusion model, by purely analytical tools, essentially based on Laplace transform and complex Green function techniques. A quite suggestive generalization of this approach, applicable to fluid-core reactors whose fuel undergoes violent mixing, is reported and briefly discussed. (author)

  10. Space engineering modeling and optimization with case studies

    CERN Document Server

    Pintér, János

    2016-01-01

    This book presents a selection of advanced case studies that cover a substantial range of issues and real-world challenges and applications in space engineering. Vital mathematical modeling, optimization methodologies and numerical solution aspects of each application case study are presented in detail, with discussions of a range of advanced model development and solution techniques and tools. Space engineering challenges are discussed in the following contexts: •Advanced Space Vehicle Design •Computation of Optimal Low Thrust Transfers •Indirect Optimization of Spacecraft Trajectories •Resource-Constrained Scheduling •Packing Problems in Space •Design of Complex Interplanetary Trajectories •Satellite Constellation Image Acquisition •Re-entry Test Vehicle Configuration Selection •Collision Risk Assessment on Perturbed Orbits •Optimal Robust Design of Hybrid Rocket Engines •Nonlinear Regression Analysis in Space Engineering •Regression-Based Sensitivity Analysis and Robust Design ...

  11. Comparison of two Minkowski-space approaches to heavy quarkonia

    Energy Technology Data Exchange (ETDEWEB)

    Leitao, Sofia; Biernat, Elmar P. [Universidade de Lisboa, CFTP, Instituto Superior Tecnico, Lisbon (Portugal); Li, Yang [Iowa State University, Department of Physics and Astronomy, Ames, IA (United States); College of William and Mary, Department of Physics, Williamsburg, VA (United States); Maris, Pieter; Vary, James P. [Iowa State University, Department of Physics and Astronomy, Ames, IA (United States); Pena, M.T. [Universidade de Lisboa, CFTP, Instituto Superior Tecnico, Lisbon (Portugal); Universidade de Lisboa, Departamento de Fisica, Instituto Superior Tecnico, Lisbon (Portugal); Stadler, Alfred [Universidade de Lisboa, CFTP, Instituto Superior Tecnico, Lisbon (Portugal); Universidade de Evora, Departamento de Fisica, Evora (Portugal)

    2017-10-15

    In this work we compare mass spectra and decay constants obtained from two recent, independent, and fully relativistic approaches to the quarkonium bound-state problem: the Basis Light-Front Quantization approach, where light-front wave functions are naturally formulated; and, the Covariant Spectator Theory (CST), based on a reorganization of the Bethe-Salpeter equation. Even though conceptually different, both solutions are obtained in Minkowski space. Comparisons of decay constants for more than ten states of charmonium and bottomonium show favorable agreement between the two approaches as well as with experiment where available. We also apply the Brodsky-Huang-Lepage prescription to convert the CST amplitudes into functions of light-front variables. This provides an ideal opportunity to investigate the similarities and differences at the level of the wave functions. Several qualitative features are observed in remarkable agreement between the two approaches even for the rarely addressed excited states. Leading-twist distribution amplitudes as well as parton distribution functions of heavy quarkonia are also analyzed. (orig.)

  12. The systems approach for applying artificial intelligence to space station automation (Invited Paper)

    Science.gov (United States)

    Grose, Vernon L.

    1985-12-01

    The progress of technology is marked by fragmentation -- dividing research and development into ever narrower fields of specialization. Ultimately, specialists know everything about nothing. And hope for integrating those slender slivers of specialty into a whole fades. Without an integrated, all-encompassing perspective, technology becomes applied in a lopsided and often inefficient manner. A decisionary model, developed and applied for NASA's Chief Engineer toward establishment of commercial space operations, can be adapted to the identification, evaluation, and selection of optimum application of artificial intelligence for space station automation -- restoring wholeness to a situation that is otherwise chaotic due to increasing subdivision of effort. Issues such as functional assignments for space station task, domain, and symptom modules can be resolved in a manner understood by all parties rather than just the person with assigned responsibility -- and ranked by overall significance to mission accomplishment. Ranking is based on the three basic parameters of cost, performance, and schedule. This approach has successfully integrated many diverse specialties in situations like worldwide terrorism control, coal mining safety, medical malpractice risk, grain elevator explosion prevention, offshore drilling hazards, and criminal justice resource allocation -- all of which would have otherwise been subject to "squeaky wheel" emphasis and support of decision-makers.

  13. Aspects of scintillation modelling in LEO-ground free-space optical communications

    Science.gov (United States)

    Moll, Florian

    2017-10-01

    Free-space optical communications can be used to transmit data from low Earth orbit satellites to ground with very high data rates. In the last section of the downlink, the electromagnetic wave propagates through the turbulent atmosphere, which is characterized by random fluctuations of the index of refraction. The propagating wave experiences phase distortions that lead to intensity scintillation in the aperture plane of the receiving telescope. For quantification, an appropriate scintillation model is needed. Approaches to analytically model the scintillation exist. Parameterization of the underlying turbulence profile (Cn2 profile) is however difficult. The Cn2 profiles are often site-specific and thus inappropriate, or generic and thus too complex for feasible deployment. An approach that directly models the scintillation effect based on measurements, without claiming to be generic, is therefore more practical. Since measurements are sparse, a combination with the existing theoretical framework is feasible to develop a new scintillation model that focuses on low Earth orbit to ground free-space optical communications link design with direct detection. The paper addresses several questions one has to answer while analyzing the measurement data and selecting the theoretical models for the LEO downlink scenario. The first is the question of a suitable yet easy-to-use Cn2 profile. The HAP model is analyzed for its feasibility in this scenario, since it includes a more realistic boundary layer profile decay than the HV model. It is found that the HAP model needs to be modified for feasible deployment in the LEO downlink scenario at night time. The validity of the plane wave assumption in the downlink is discussed by model calculations of the scintillation index for a plane and a Gaussian beam wave. Inaccuracies when using the plane earth model instead of the spherical earth model are investigated by analyzing the Rytov index. Impact of beam wander and non
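
    As a numerical companion to the Rytov-index discussion, the sketch below evaluates the standard weak-fluctuation plane-wave Rytov variance for a downlink, sigma_R^2 = 2.25 k^(7/6) sec^(11/6)(zeta) * integral of Cn^2(h)(h-h0)^(5/6) dh, using the widely used Hufnagel-Valley 5/7 profile; the wavelength, zenith angle and integration limits are assumed example values, and the HAP modification discussed in the record is not included.

```python
import numpy as np
from scipy.integrate import quad

def cn2_hv57(h, v=21.0, A=1.7e-14):
    """Hufnagel-Valley 5/7 turbulence profile Cn^2(h) [m^(-2/3)], h in metres."""
    return (0.00594 * (v / 27.0) ** 2 * (1e-5 * h) ** 10 * np.exp(-h / 1000.0)
            + 2.7e-16 * np.exp(-h / 1500.0)
            + A * np.exp(-h / 100.0))

def rytov_variance_downlink(wavelength=1.55e-6, zenith_deg=30.0,
                            h0=0.0, H=3.0e4, cn2=cn2_hv57):
    """Weak-fluctuation plane-wave Rytov variance for a slant-path downlink."""
    k = 2.0 * np.pi / wavelength
    sec_z = 1.0 / np.cos(np.radians(zenith_deg))
    integral, _ = quad(lambda h: cn2(h) * (h - h0) ** (5.0 / 6.0), h0, H, limit=200)
    return 2.25 * k ** (7.0 / 6.0) * sec_z ** (11.0 / 6.0) * integral

# Roughly 0.1 at 1550 nm and 30 deg zenith, i.e. the weak-fluctuation regime
print(f"Rytov variance: {rytov_variance_downlink():.3f}")
```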

  14. A probabilistic approach to safety/reliability of space nuclear power systems

    International Nuclear Information System (INIS)

    Medford, G.; Williams, K.; Kolaczkowski, A.

    1989-01-01

    An ongoing effort is investigating the feasibility of using probabilistic risk assessment (PRA) modeling techniques to construct a living model of a space nuclear power system. This is being done in conjunction with a traditional reliability and survivability analysis of the SP-100 space nuclear power system. The initial phase of the project consists of three major parts with the overall goal of developing a top-level system model and defining initiating events of interest for the SP-100 system. The three major tasks were performing a traditional survivability analysis, performing a simple system reliability analysis, and constructing a top-level system fault-tree model. Each of these tasks and their interim results are discussed in this paper. Initial results from the study support the conclusion that PRA modeling techniques can provide a valuable design and decision-making tool for space reactors. The ability of the model to rank and calculate relative contributions from various failure modes allows design optimization for maximum safety and reliability. Future efforts in the SP-100 program will see data development and quantification of the model to allow parametric evaluations of the SP-100 system. Current efforts have shown the need for formal data development and test programs within such a modeling framework
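
    To make the fault-tree idea concrete, the sketch below evaluates a two-gate toy tree with independent basic events; the event names and probabilities are purely hypothetical and have no connection to the SP-100 model described above.

```python
# Minimal fault-tree evaluation with independent basic events (illustrative only)
def p_or(*probs):
    """Probability that at least one independent event occurs."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*probs):
    """Probability that all independent events occur."""
    q = 1.0
    for p in probs:
        q *= p
    return q

p_pump_a, p_pump_b = 1e-3, 1e-3   # hypothetical redundant coolant pump failures
p_control_fail = 5e-4             # hypothetical control-system failure
loss_of_cooling = p_and(p_pump_a, p_pump_b)        # both pumps must fail (AND gate)
top_event = p_or(loss_of_cooling, p_control_fail)  # either branch fails the system (OR gate)
print(f"top event probability ~ {top_event:.2e}")
```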

  15. Time Evolution Of The Wigner Function In Discrete Quantum Phase Space For A Soluble Quasi-spin Model

    CERN Document Server

    Galetti, D

    2000-01-01

    Summary: The discrete phase space approach to quantum mechanics of degrees of freedom without classical counterparts is applied to the many-fermions/quasi-spin Lipkin model. The Wigner function is written for some chosen states associated to discrete angle and angular momentum variables, and the time evolution is numerically calculated using the discrete von Neumann-Liouville equation. Direct evidences in the time evolution of the Wigner function are extracted that identify a tunnelling effect. A connection with an $SU(2)$-based semiclassical continuous approach to the Lipkin model is also presented.

  16. Implementing CDIO Approach in preparing engineers for Space Industry

    Directory of Open Access Journals (Sweden)

    Daneykin Yury

    2017-01-01

    Full Text Available The necessity to train highly qualified specialists leads to the development of the trajectory that can allow training specialists for the space industry. Several steps have been undertaken to reach this purpose. First, the University founded the Space Instrument Design Center that promotes a wide range of initiatives in the sphere of educating specialists, retraining specialists, carrying out research and collaborating with profiled enterprises. The University introduced Elite Engineering Education system to attract talented specialist and help them to follow individual trajectory to train unique specialist. The paper discusses the targets necessary for achievement to train specialists. Moreover, the paper presents the compliance of the attempts with the CDIO Approach, which is widely used in leading universities to improve engineering programs.

  17. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    Science.gov (United States)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks the conventional practice is to decode motor intentions by using scalp EEG. However, scalp EEG only reveals certain limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG to an unsupervised factor analysis to make it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, that is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis on the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus it may lead to new strategies for BCI-based neurorehabilitation.
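
    The supervised factor-analytic pipeline is specific to the study, but its general flavour (factor-analytic dimensionality reduction of source-space features followed by classification of movement direction) can be sketched with scikit-learn on synthetic data as below; the embedded class labels of the original method are replaced here by a separate LDA stage, and all data sizes and dimensions are assumptions.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Toy stand-in for source-space features: 200 trials x 60 dipole features,
# 4 movement directions with small class-dependent mean shifts.
X = rng.standard_normal((200, 60))
y = rng.integers(0, 4, size=200)
X += 0.8 * np.eye(4)[y] @ rng.standard_normal((4, 60)) / np.sqrt(60)

clf = make_pipeline(FactorAnalysis(n_components=10), LinearDiscriminantAnalysis())
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```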

  18. Exactly solvable string models of curved space-time backgrounds

    International Nuclear Information System (INIS)

    Russo, J.G.

    1995-01-01

    We consider a new 3-parameter class of exact 4-dimensional solutions in closed string theory and solve the corresponding string model, determining the physical spectrum and the partition function. The background fields (4-metric, antisymmetric tensor, two Kaluza-Klein vector fields, dilaton and modulus) generically describe axially symmetric stationary rotating (electro)magnetic flux-tube type universes. Backgrounds of this class include both the ''dilatonic'' (a=1) and ''Kaluza-Klein'' (a=√(3)) Melvin solutions and the uniform magnetic field solution, as well as some singular space-times. Solvability of the string σ-model is related to its connection via duality to a simpler model which is a ''twisted'' product of a flat 2-space and a space dual to 2-plane. We discuss some physical properties of this model (tachyonic instabilities in the spectrum, gyromagnetic ratio, issue of singularities, etc.). It provides one of the first examples of a consistent solvable conformal string model with explicit D=4 curved space-time interpretation. (orig.)

  19. Beyond GLMs: a generative mixture modeling approach to neural system identification.

    Directory of Open Access Journals (Sweden)

    Lucas Theis

    Full Text Available Generalized linear models (GLMs) represent a popular choice for the probabilistic characterization of neural spike responses. While GLMs are attractive for their computational tractability, they also impose strong assumptions and thus only allow for a limited range of stimulus-response relationships to be discovered. Alternative approaches exist that make only very weak assumptions but scale poorly to high-dimensional stimulus spaces. Here we seek an approach which can gracefully interpolate between the two extremes. We extend two frequently used special cases of the GLM (a linear and a quadratic model) by assuming that the spike-triggered and non-spike-triggered distributions can be adequately represented using Gaussian mixtures. Because we derive the model from a generative perspective, its components are easy to interpret as they correspond to, for example, the spike-triggered distribution and the interspike interval distribution. The model is able to capture complex dependencies on high-dimensional stimuli with far fewer parameters than other approaches such as histogram-based methods. The added flexibility comes at the cost of a non-concave log-likelihood. We show that in practice this does not have to be an issue and the mixture-based model is able to outperform generalized linear and quadratic models.

  20. An Open and Holistic Approach for Geo and Space Sciences

    Science.gov (United States)

    Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Toshihiko, Iyemori; Yatagai, Akiyo; Koyama, Yukinobu; Murayama, Yasuhiro; King, Todd; Hughes, Steve; Fung, Shing; Galkin, Ivan; Hapgood, Mike; Belehaki, Anna

    2016-04-01

    domain-specific data servers is necessary. In times of the WWW, or nowadays the Semantic Web, context-enriched and mashed-up open data catalogs pointing to the appropriate data sources will, step by step, help to overcome the users' burden of finding the right data. Further on, the Semantic Web provides an interoperable and universal format for data and metadata. The Resource Description Framework (RDF) inherently enables a domain and cross-domain mashup of data, e.g. as realized in the Linked Open Data project. Scientific work and the resulting papers in the geo and space domain are often based on data, physical models and previous publications, which in turn have depended on data, models and publications. So, in order to guarantee a high quality of scientific work, a complete verification process of the results is necessary. This is nothing new, but in times of Big Data it is a real challenge. So, what do we need for a complete verification of presented results? Above all, we need all the original data which has been used. But it is also necessary to get complete information about the context of the research objectives and the resulting constraints in the preparation of the raw data. Further on, we need knowledge about the methods and the appropriate processing software which have been used to generate the results. The Open Data approach, enriched by the Open Archive idea, provides the concept for sustainable and verifiable scientific work. Open Archive of course stands for the free availability of scientific papers. But furthermore it focuses on mechanisms and methods within the realm of scientific publications for referencing and providing the underlying data, methods and software. Such reference mechanisms include the use of Digital Object Identifiers (DOI) or Uniform Resource Identifiers (URI) within the Semantic Web (in our case for geo and space science data), but also for methods and software code. Nowadays, more and more open and private publishers are demanding such kind of

  1. Lee-Carter state space modeling: Application to the Malaysia mortality data

    Science.gov (United States)

    Zakiyatussariroh, W. H. Wan; Said, Z. Mohammad; Norazan, M. R.

    2014-06-01

    This article presents an approach that formalizes the Lee-Carter (LC) model as a state space model. Maximum likelihood estimation via the Expectation-Maximization (EM) algorithm was used to estimate the model. The methodology is applied to Malaysia's total population mortality data, modeled as age-specific death rates (ASDR) from 1971-2009. The fitted ASDR are compared to the observed values; the comparison shows that the fitted values from the LC-SS model and the original LC model are quite close. In addition, there is little difference between the two models in root mean squared error (RMSE) and the Akaike information criterion (AIC). The LC-SS model estimated in this study can be extended for forecasting ASDR in Malaysia. The accuracy of the LC-SS model relative to the original LC model can then be examined further by assessing forecasting power in an out-of-sample comparison.
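
    For orientation, a minimal sketch (standard Lee-Carter notation, not reproduced from the article) of how the LC model is commonly cast in state-space form, with the period index treated as the latent state:

```latex
% Observation equation: log age-specific death rates
\log m_{x,t} = a_x + b_x\,\kappa_t + \varepsilon_{x,t}, \qquad \varepsilon_{x,t} \sim N(0,\sigma_\varepsilon^2)
% State (transition) equation: random walk with drift for the period index
\kappa_t = \kappa_{t-1} + \delta + \eta_t, \qquad \eta_t \sim N(0,\sigma_\eta^2)
```

    In the EM setting described above, the E-step runs the Kalman filter and smoother to recover $\kappa_t$, and the M-step updates $a_x$, $b_x$, $\delta$ and the variance parameters.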

  2. Developing Viable Financing Models for Space Tourism

    Science.gov (United States)

    Eilingsfeld, F.; Schaetzler, D.

    2002-01-01

    Increasing commercialization of space services and the impending release of government's control of space access promise to make space ventures more attractive. Still, many investors shy away from going into the space tourism market as long as they do not feel secure that their return expectations will be met. First and foremost, attracting investors from the capital markets requires qualifying financing models. Based on earlier research on the cost of capital for space tourism, this paper gives a brief run-through of commercial, technical and financial due diligence aspects. After that, a closer look is taken at different valuation techniques as well as alternative ways of streamlining financials. Experience from earlier ventures has shown that the high cost of capital represents a significant challenge. Thus, the sophistication and professionalism of business plans and financial models needs to be very high. Special emphasis is given to the optimization of the debt-to-equity ratio over time. The different roles of equity and debt over a venture's life cycle are explained. Based on the latter, guidelines for the design of an optimized loan structure are given. These are then applied to simulating the financial performance of a typical space tourism venture over time, including the calculation of Weighted Average Cost of Capital (WACC) and Net Present Value (NPV). Based on a concluding sensitivity analysis, the lessons learned are presented. If applied properly, these will help to make space tourism economically viable.
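
    As an illustration of the financial arithmetic referred to above, the following is a minimal sketch with made-up figures (the venture parameters and cash flows are purely hypothetical, not taken from the paper):

```python
# Minimal WACC / NPV sketch with illustrative (made-up) figures.

def wacc(equity, debt, cost_of_equity, cost_of_debt, tax_rate):
    """Weighted Average Cost of Capital: equity and debt weighted by value,
    with the debt leg reduced by the tax shield."""
    total = equity + debt
    return (equity / total) * cost_of_equity + (debt / total) * cost_of_debt * (1.0 - tax_rate)

def npv(rate, cash_flows):
    """Net Present Value of cash flows, where cash_flows[0] occurs at t=0 (undiscounted)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical venture: 60% equity at 25% required return, 40% debt at 9%, 30% tax rate.
r = wacc(equity=60e6, debt=40e6, cost_of_equity=0.25, cost_of_debt=0.09, tax_rate=0.30)
# Up-front investment followed by ten years of operating cash flow.
flows = [-100e6] + [22e6] * 10
print(f"WACC = {r:.3%}, NPV = {npv(r, flows) / 1e6:.1f} M")
```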

  3. Seemingly Unrelated Regression Approach for GSTARIMA Model to Forecast Rain Fall Data in Malang Southern Region Districts

    Directory of Open Access Journals (Sweden)

    Siti Choirun Nisak

    2016-06-01

    Full Text Available Time series forecasting models can be used to predict phenomena that occur in nature. Generalized Space Time Autoregressive (GSTAR) is one of the time series models used to forecast data that contain both time and space elements. This model is limited to stationary and non-seasonal data. Generalized Space Time Autoregressive Integrated Moving Average (GSTARIMA) is a development of GSTAR that accommodates non-stationary and seasonal data. Ordinary Least Squares (OLS) is the method used to estimate the parameters of the GSTARIMA model, but OLS does not produce efficient estimators if the errors are correlated across locations. OLS assumes a constant error variance-covariance matrix, $\varepsilon \sim (0, \sigma^2 I)$, whereas in fact the observation locations are correlated, so the variance-covariance matrix of the errors is not constant. Therefore, the Seemingly Unrelated Regression (SUR) approach, whose assumption is $\varepsilon \sim (0, \Sigma \otimes I)$, is used to accommodate this weakness of OLS when estimating the parameters of the GSTARIMA model. The SUR parameters are estimated by Generalized Least Squares (GLS). Applying GSTARIMA-SUR models to rainfall data in the Malang region yields a GSTARIMA((1)(1,12,36),(0),(1))-SUR model with an average coefficient of determination of 57.726%.
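
    For reference, the GLS estimator underlying the SUR approach, in its standard textbook form (not specific to this article), for the stacked system $y = X\beta + \varepsilon$ with $\operatorname{Var}(\varepsilon) = \Omega$:

```latex
\hat{\beta}_{\mathrm{GLS}} = \left( X^{\top}\Omega^{-1}X \right)^{-1} X^{\top}\Omega^{-1}y,
\qquad \Omega = \Sigma \otimes I_T ,
```

    where $\Sigma$ is the cross-location error covariance (in practice estimated from OLS residuals, giving feasible GLS); OLS is recovered as the special case $\Omega = \sigma^{2} I$.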

  4. Statistical Software for State Space Methods

    Directory of Open Access Journals (Sweden)

    Jacques J. F. Commandeur

    2011-05-01

    Full Text Available In this paper we review the state space approach to time series analysis and establish the notation that is adopted in this special volume of the Journal of Statistical Software. We first provide some background on the history of state space methods for the analysis of time series. This is followed by a concise overview of linear Gaussian state space analysis including the modelling framework and appropriate estimation methods. We discuss the important class of unobserved component models which incorporate a trend, a seasonal, a cycle, and fixed explanatory and intervention variables for the univariate and multivariate analysis of time series. We continue the discussion by presenting methods for the computation of different estimates for the unobserved state vector: filtering, prediction, and smoothing. Estimation approaches for the other parameters in the model are also considered. Next, we discuss how the estimation procedures can be used for constructing confidence intervals, detecting outlier observations and structural breaks, and testing model assumptions of residual independence, homoscedasticity, and normality. We then show how ARIMA and ARIMA components models fit in the state space framework to time series analysis. We also provide a basic introduction for non-Gaussian state space models. Finally, we present an overview of the software tools currently available for the analysis of time series with state space methods as they are discussed in the other contributions to this special volume.
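
    To make the filtering step concrete, here is a minimal sketch of the Kalman filter for the local level model, the simplest unobserved components model; it is generic and not tied to any of the software packages discussed in the special volume:

```python
import numpy as np

def kalman_filter_local_level(y, sigma2_eps, sigma2_eta, a0=0.0, p0=1e7):
    """Kalman filter for the local level model:
        y_t       = alpha_t + eps_t,      eps_t ~ N(0, sigma2_eps)   (observation)
        alpha_t+1 = alpha_t + eta_t,      eta_t ~ N(0, sigma2_eta)   (state)
    Returns the filtered state means and variances."""
    n = len(y)
    a = np.empty(n)                      # filtered state mean
    p = np.empty(n)                      # filtered state variance
    a_pred, p_pred = a0, p0              # (near-)diffuse prior on the initial level
    for t in range(n):
        # Update with observation y[t]
        f = p_pred + sigma2_eps          # prediction error variance
        k = p_pred / f                   # Kalman gain
        a[t] = a_pred + k * (y[t] - a_pred)
        p[t] = p_pred * (1.0 - k)
        # Predict the next state (random walk)
        a_pred, p_pred = a[t], p[t] + sigma2_eta
    return a, p

# Toy usage on synthetic noisy random-walk data.
rng = np.random.default_rng(0)
level = np.cumsum(rng.normal(0, 0.5, 200))
y = level + rng.normal(0, 1.0, 200)
filtered_mean, filtered_var = kalman_filter_local_level(y, sigma2_eps=1.0, sigma2_eta=0.25)
```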

  5. Space-time modeling of soil moisture

    Science.gov (United States)

    Chen, Zijuan; Mohanty, Binayak P.; Rodriguez-Iturbe, Ignacio

    2017-11-01

    A physically derived space-time mathematical representation of the soil moisture field is carried out via the soil moisture balance equation driven by stochastic rainfall forcing. The model incorporates spatial diffusion and, in its original version, is shown to be unable to reproduce the relatively fast decay in the spatial correlation functions observed in empirical data. This decay, resulting from variations in local topography as well as in local soil and vegetation conditions, is well reproduced via a jitter process acting multiplicatively over the space-time soil moisture field. The jitter is a multiplicative noise acting on the soil moisture dynamics with the objective of deflating its correlation structure at small spatial scales, which are not embedded in the probabilistic structure of the rainfall process that drives the dynamics. These scales, of the order of several meters to several hundred meters, are of great importance in ecohydrologic dynamics. Properties of space-time correlation functions and spectral densities of the model with jitter are explored analytically, and the influence of the jitter parameters, reflecting variabilities of soil moisture at different spatial and temporal scales, is investigated. A case study fitting the derived model to a soil moisture dataset is presented in detail.

  6. Grassmann phase space theory and the Jaynes–Cummings model

    International Nuclear Information System (INIS)

    Dalton, B.J.; Garraway, B.M.; Jeffers, J.; Barnett, S.M.

    2013-01-01

    often developed. However, atomic spin operators satisfy the standard angular momentum commutation rules rather than the commutation rules for bosonic annihilation and creation operators, and are in fact second order combinations of fermionic annihilation and creation operators. Though phase space methods in which the fermionic operators are represented directly by c-number phase space variables have not been successful, the anti-commutation rules for these operators suggest the possibility of using Grassmann variables—which have similar anti-commutation properties. However, in spite of the seminal work by Cahill and Glauber and a few applications, the use of phase space methods in quantum optics to treat fermionic systems by representing fermionic annihilation and creation operators directly by Grassmann phase space variables is rather rare. This paper shows that phase space methods using a positive P type distribution function involving both c-number variables (for the cavity mode) and Grassmann variables (for the TLA) can be used to treat the Jaynes–Cummings model. Although it is a Grassmann function, the distribution function is equivalent to six c-number functions of the two bosonic variables. Experimental quantities are given as bosonic phase space integrals involving the six functions. A Fokker–Planck equation involving both left and right Grassmann differentiations can be obtained for the distribution function, and is equivalent to six coupled equations for the six c-number functions. The approach used involves choosing the canonical form of the (non-unique) positive P distribution function, in which the correspondence rules for the bosonic operators are non-standard and hence the Fokker–Planck equation is also unusual. Initial conditions, such as those above for initially uncorrelated states, are discussed and used to determine the initial distribution function. Transformations to new bosonic variables rotating at the cavity frequency enable the six

  7. Phase-space networks of geometrically frustrated systems.

    Science.gov (United States)

    Han, Yilong

    2009-11-01

    We illustrate a network approach to the phase-space study by using two geometrical frustration models: the antiferromagnet on a triangular lattice and square ice. Their highly degenerate ground states are mapped as discrete networks such that quantitative network analysis can be applied to phase-space studies. The resulting phase spaces share some common features and establish a class of complex networks with unique Gaussian spectral densities. Although phase-space networks are heterogeneously connected, the systems are still ergodic due to the random Poisson processes. This network approach can be generalized to phase spaces of some other complex systems.

  8. Kin-aesthetic Space-making

    DEFF Research Database (Denmark)

    Brabrand, Helle

    2016-01-01

    Body Space Object Symposium 26.02.2016 Strand: The (Moving) Body as Archive Title: Kin-aesthetic Space-making The paper presents a cross-medial practice exchanging body movement and tectonic space. Working with a performative model of gesture, the practice takes up a dialogue with Jean......’s How the Body Shapes the Mind forms part of the theoretical approach to motile kin-aesthetical forces of art-making, underlying this paper. In my practice I work with body- and space gestures, interchanging through a ‘third’ material, featured on screens. The hybrid production includes animated 2 and 3......D drawings, video sequences, and technological treatment constituted by movement of camera, light and diverse editing. Creating a mutable changing sensory surface, the modelling gestures draw attention to their actual occurring in space-time, articulating and transforming space-time configurations...

  9. Application of separable parameter space techniques to multi-tracer PET compartment modeling

    International Nuclear Information System (INIS)

    Zhang, Jeff L; Michael Morey, A; Kadrmas, Dan J

    2016-01-01

    Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg–Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models. (paper)
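
    The core idea of separable (variable projection) least squares can be sketched on a toy problem: for fixed nonlinear rate parameters the amplitudes enter linearly and are solved in closed form, so the outer search runs only over the nonlinear dimensions. The example below fits a sum of two exponentials (not the actual multi-tracer compartment equations) and assumes NumPy and SciPy are available:

```python
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0.0, 10.0, 200)
rng = np.random.default_rng(1)
# Synthetic "time-activity curve": two exponential components plus noise.
y = 3.0 * np.exp(-0.4 * t) + 1.5 * np.exp(-2.5 * t) + rng.normal(0, 0.05, t.size)

def projected_residual(log_rates):
    """For fixed nonlinear rates, solve the linear amplitudes by least squares
    and return the residual sum of squares (the variable projection functional)."""
    rates = np.exp(log_rates)                     # keep rates positive
    basis = np.exp(-np.outer(t, rates))           # columns: exp(-b_i * t)
    amps, *_ = np.linalg.lstsq(basis, y, rcond=None)
    return np.sum((y - basis @ amps) ** 2)

# The outer optimisation runs only over the 2 nonlinear parameters (instead of all 4).
result = minimize(projected_residual, x0=np.log([0.1, 1.0]), method="Nelder-Mead")
print("estimated rates:", np.sort(np.exp(result.x)))
```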

  10. Multimedia Mapping using Continuous State Space Models

    DEFF Research Database (Denmark)

    Lehn-Schiøler, Tue

    2004-01-01

    In this paper a system that transforms speech waveforms into animated faces is proposed. The system relies on continuous state space models to perform the mapping, which makes it possible to ensure video with no sudden jumps and allows continuous control of the parameters in 'face space'. Simulations...... are performed on recordings of 3-5 sec. video sequences with sentences from the Timit database. The model is able to construct an image sequence from an unknown noisy speech sequence fairly well even though the number of training examples is limited....

  11. Long-range planning cost model for support of future space missions by the deep space network

    Science.gov (United States)

    Sherif, J. S.; Remer, D. S.; Buchanan, H. R.

    1990-01-01

    A simple model is suggested for making long-range planning cost estimates for Deep Space Network (DSN) support of future space missions. The model estimates total DSN preparation costs and the annual distribution of these costs for long-range budgetary planning. The cost model is based on actual DSN preparation costs from four space missions: Galileo, Voyager (Uranus), Voyager (Neptune), and Magellan. The model was tested against the four projects and gave cost estimates that range from 18 percent above the actual total preparation costs of the projects to 25 percent below. The model was also compared to two other independent projects: Viking and Mariner Jupiter/Saturn (MJS, which later became Voyager). The model gave cost estimates that range from 2 percent (for Viking) to 10 percent (for MJS) below the actual total preparation costs of these missions.

  12. Exploring galaxy evolution with latent space walks

    Science.gov (United States)

    Schawinski, Kevin; Turp, Dennis; Zhang, Ce

    2018-01-01

    We present a new approach using artificial intelligence to perform data-driven forward models of astrophysical phenomena. We describe how a variational autoencoder can be used to encode galaxies to latent space, independently manipulate properties such as the specific star formation rate, and return it to real space. Such transformations can be used for forward modeling phenomena using data as the only constraints. We demonstrate the utility of this approach using the question of the quenching of star formation in galaxies.
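
    A heavily simplified sketch of the latent-space-walk logic, using a toy linear encoder/decoder as a stand-in for the trained variational autoencoder (which is not reproduced here); all dimensions, labels and data below are made up:

```python
import numpy as np

rng = np.random.default_rng(2)
D, Z = 64, 8                                   # data and latent dimensionalities (arbitrary)
W = rng.normal(0, 1.0 / np.sqrt(D), (Z, D))    # toy linear "encoder" weights (VAE stand-in)

def encode(x):       # placeholder for the VAE encoder mean
    return W @ x

def decode(z):       # placeholder for the VAE decoder (here: pseudo-inverse of the encoder)
    return np.linalg.pinv(W) @ z

# Direction in latent space associated with a property (e.g. specific star formation rate),
# estimated as the difference between the mean latent codes of two labelled groups.
high_sfr = rng.normal(0.5, 1.0, (100, D))      # fake "high-SFR galaxies"
low_sfr = rng.normal(-0.5, 1.0, (100, D))      # fake "low-SFR galaxies"
direction = encode(high_sfr.mean(axis=0)) - encode(low_sfr.mean(axis=0))
direction /= np.linalg.norm(direction)

x = rng.normal(0, 1.0, D)                      # one "galaxy" sample
z = encode(x)
for step in np.linspace(0.0, 2.0, 5):          # walk along the latent direction
    x_modified = decode(z + step * direction)  # forward-modelled counterpart in data space
```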

  13. Conceptual design of jewellery: a space-based aesthetics approach

    Directory of Open Access Journals (Sweden)

    Tzintzi Vaia

    2017-01-01

    Full Text Available Conceptual design is a field that offers various aesthetic approaches to the generation of nature-based product design concepts. Essentially, Conceptual Product Design (CPD) uses similarities based on geometrical forms and functionalities. Furthermore, the CAD-based freehand sketch is a primary conceptual tool in the early stages of the design process. The proposed Conceptual Product Design concept deals with jewellery inspired by space. Specifically, a number of galactic features, such as galaxy shapes, wormholes and graphical representations of planetary magnetic fields, are used as inspiration. Such space-based design ideas at the conceptual level can lead to further opportunities for research and for the economic success of the jewellery industry. A number of illustrative case studies are presented, from which new opportunities for economic success can be derived.

  14. New Approaches in Reusable Booster System Life Cycle Cost Modeling

    Science.gov (United States)

    Zapata, Edgar

    2013-01-01

    This paper presents the results of a 2012 life cycle cost (LCC) study of hybrid Reusable Booster Systems (RBS) conducted by NASA Kennedy Space Center (KSC) and the Air Force Research Laboratory (AFRL). The work included the creation of a new cost estimating model and an LCC analysis, building on past work where applicable, but emphasizing the integration of new approaches in life cycle cost estimation. Specifically, the inclusion of industry processes/practices and indirect costs was a new and significant part of the analysis. The focus of LCC estimation has traditionally been from the perspective of technology, design characteristics, and related factors such as reliability. Technology has informed the cost related support to decision makers interested in risk and budget insight. This traditional emphasis on technology occurs even though it is well established that complex aerospace systems costs are mostly about indirect costs, with likely only partial influence in these indirect costs being due to the more visible technology products. Organizational considerations, processes/practices, and indirect costs are traditionally derived ("wrapped") only by relationship to tangible product characteristics. This traditional approach works well as long as it is understood that no significant changes, and by relation no significant improvements, are being pursued in the area of either the government acquisition or industry's indirect costs. In this sense then, most launch systems cost models ignore most costs. The alternative was implemented in this LCC study, whereby the approach considered technology and process/practices in balance, with as much detail for one as the other. This RBS LCC study has avoided point-designs, for now, instead emphasizing exploring the trade-space of potential technology advances joined with potential process/practice advances. Given the range of decisions, and all their combinations, it was necessary to create a model of the original model

  16. QML-AiNet: An immune network approach to learning qualitative differential equation models.

    Science.gov (United States)

    Pang, Wei; Coghill, George M

    2015-02-01

    In this paper, we explore the application of Opt-AiNet, an immune network approach for search and optimisation problems, to learning qualitative models in the form of qualitative differential equations. The Opt-AiNet algorithm is adapted to qualitative model learning problems, resulting in the proposed system QML-AiNet. The potential of QML-AiNet to address the scalability and multimodal search space issues of qualitative model learning has been investigated. To further improve the efficiency of QML-AiNet, we also modify the mutation operator according to the features of the discrete qualitative model space. Experimental results show that the performance of QML-AiNet is comparable to QML-CLONALG, a QML system using the clonal selection algorithm (CLONALG). More importantly, QML-AiNet with the modified mutation operator can significantly improve the scalability of QML and is much more efficient than QML-CLONALG.
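
    As a rough illustration of the affinity-proportional hypermutation at the heart of immune network algorithms such as Opt-AiNet, here is a generic continuous-optimisation sketch; it is not the discrete qualitative-model mutation operator proposed in the paper, and the objective and parameter choices are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)

def fitness(x):
    # Toy objective to maximise (negative sphere function).
    return -np.sum(x ** 2, axis=-1)

def clone_and_mutate(population, n_clones=10, beta=100.0):
    """For each network cell, generate mutated clones whose mutation strength
    decreases with normalised fitness (better cells are perturbed less),
    then keep the best clone if it improves on its parent."""
    f = fitness(population)
    f_norm = (f - f.min()) / (f.max() - f.min() + 1e-12)    # fitness normalised to [0, 1]
    new_pop = population.copy()
    for i, cell in enumerate(population):
        sigma = (1.0 / beta) * np.exp(-f_norm[i])            # affinity-proportional step size
        clones = cell + sigma * rng.normal(size=(n_clones, cell.size))
        best = clones[np.argmax(fitness(clones))]
        if fitness(best) > f[i]:
            new_pop[i] = best
    return new_pop

pop = rng.uniform(-2, 2, (20, 5))
for _ in range(50):
    pop = clone_and_mutate(pop)
print("best fitness found:", fitness(pop).max())
```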

  17. Hybrid x-space: a new approach for MPI reconstruction.

    Science.gov (United States)

    Tateo, A; Iurino, A; Settanni, G; Andrisani, A; Stifanelli, P F; Larizza, P; Mazzia, F; Mininni, R M; Tangaro, S; Bellotti, R

    2016-06-07

    Magnetic particle imaging (MPI) is a new medical imaging technique capable of recovering the distribution of superparamagnetic particles from their measured induced signals. In the literature there are two main MPI reconstruction techniques: measurement-based (MB) and x-space (XS). The MB method is expensive because it requires a long calibration procedure as well as a reconstruction phase that can be numerically costly. On the other hand, the XS method is simpler than MB, but exact knowledge of the field free point (FFP) motion is essential for its implementation. Our simulation work focuses on the implementation of a new approach for MPI reconstruction, called hybrid x-space (HXS), which combines the two previous methods. Specifically, our approach is based on XS reconstruction because it requires the knowledge of the FFP position and velocity at each time instant. The difference with respect to the original XS formulation is how the FFP velocity is computed: we estimate it from the experimental measurements of the calibration scans typical of the MB approach. Moreover, a compressive sensing technique is applied in order to reduce the calibration time, using a smaller number of sampling positions. Simulations highlight that the HXS and XS methods give similar results. Furthermore, an appropriate use of compressive sensing is crucial for obtaining a good balance between time reduction and reconstructed image quality. Our proposal is suitable for open geometry configurations of human size devices, where incidental factors could make the currents, the fields and the FFP trajectory irregular.

  18. Nonlinear sigma models with compact hyperbolic target spaces

    Energy Technology Data Exchange (ETDEWEB)

    Gubser, Steven [Joseph Henry Laboratories, Princeton University, Princeton, NJ 08544 (United States); Saleem, Zain H. [Department of Physics and Astronomy, University of Pennsylvania,Philadelphia, PA 19104 (United States); National Center for Physics, Quaid-e-Azam University Campus,Islamabad 4400 (Pakistan); Schoenholz, Samuel S. [Department of Physics and Astronomy, University of Pennsylvania,Philadelphia, PA 19104 (United States); Stoica, Bogdan [Walter Burke Institute for Theoretical Physics, California Institute of Technology,452-48, Pasadena, CA 91125 (United States); Stokes, James [Department of Physics and Astronomy, University of Pennsylvania,Philadelphia, PA 19104 (United States)

    2016-06-23

    We explore the phase structure of nonlinear sigma models with target spaces corresponding to compact quotients of hyperbolic space, focusing on the case of a hyperbolic genus-2 Riemann surface. The continuum theory of these models can be approximated by a lattice spin system which we simulate using Monte Carlo methods. The target space possesses interesting geometric and topological properties which are reflected in novel features of the sigma model. In particular, we observe a topological phase transition at a critical temperature, above which vortices proliferate, reminiscent of the Kosterlitz-Thouless phase transition in the O(2) model [V.L. Berezinskii, Destruction of long-range order in one-dimensional and two-dimensional systems having a continuous symmetry group II. Quantum systems, Sov. Phys. JETP 34 (1972) 610; J.M. Kosterlitz and D.J. Thouless, Ordering, metastability and phase transitions in two-dimensional systems, J. Phys. C 6 (1973) 1181]. Unlike in the O(2) case, there are many different types of vortices, suggesting a possible analogy to the Hagedorn treatment of the statistical mechanics of a proliferating number of hadron species. Below the critical temperature the spins cluster around six special points in the target space known as Weierstrass points. The diversity of compact hyperbolic manifolds suggests that our model is only the simplest example of a broad class of statistical mechanical models whose main features can be understood essentially in geometric terms.
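
    For orientation, a minimal Metropolis sketch of a lattice spin simulation of the kind described above; it uses the ordinary O(2)/XY model as a stand-in target space, since the compact hyperbolic (genus-2) target of the paper requires considerably more machinery:

```python
import numpy as np

rng = np.random.default_rng(4)
L, T, sweeps = 16, 0.9, 200                  # lattice size, temperature, Monte Carlo sweeps
theta = rng.uniform(0, 2 * np.pi, (L, L))    # spin angles on a periodic square lattice

def local_energy(th, i, j):
    """Nearest-neighbour XY coupling -sum cos(theta_i - theta_j) around site (i, j)."""
    e = 0.0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        e -= np.cos(th[i, j] - th[(i + di) % L, (j + dj) % L])
    return e

for sweep in range(sweeps):
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        old = theta[i, j]
        e_old = local_energy(theta, i, j)
        theta[i, j] = (old + rng.normal(0, 0.5)) % (2 * np.pi)   # propose a small rotation
        e_new = local_energy(theta, i, j)
        # Metropolis acceptance: accept downhill moves, otherwise with Boltzmann probability.
        if e_new - e_old > 0 and rng.random() >= np.exp(-(e_new - e_old) / T):
            theta[i, j] = old                                     # reject: restore old angle
```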

  19. Nonlinear sigma models with compact hyperbolic target spaces

    International Nuclear Information System (INIS)

    Gubser, Steven; Saleem, Zain H.; Schoenholz, Samuel S.; Stoica, Bogdan; Stokes, James

    2016-01-01

    We explore the phase structure of nonlinear sigma models with target spaces corresponding to compact quotients of hyperbolic space, focusing on the case of a hyperbolic genus-2 Riemann surface. The continuum theory of these models can be approximated by a lattice spin system which we simulate using Monte Carlo methods. The target space possesses interesting geometric and topological properties which are reflected in novel features of the sigma model. In particular, we observe a topological phase transition at a critical temperature, above which vortices proliferate, reminiscent of the Kosterlitz-Thouless phase transition in the O(2) model [V.L. Berezinskii, Destruction of long-range order in one-dimensional and two-dimensional systems having a continuous symmetry group II. Quantum systems, Sov. Phys. JETP 34 (1972) 610; J.M. Kosterlitz and D.J. Thouless, Ordering, metastability and phase transitions in two-dimensional systems, J. Phys. C 6 (1973) 1181]. Unlike in the O(2) case, there are many different types of vortices, suggesting a possible analogy to the Hagedorn treatment of the statistical mechanics of a proliferating number of hadron species. Below the critical temperature the spins cluster around six special points in the target space known as Weierstrass points. The diversity of compact hyperbolic manifolds suggests that our model is only the simplest example of a broad class of statistical mechanical models whose main features can be understood essentially in geometric terms.

  20. Equivalent circuit modeling of space charge dominated magnetically insulated transmission lines

    Energy Technology Data Exchange (ETDEWEB)

    Hiraoka, Kazuki; Nakajima, Mitsuo; Horioka, Kazuhiko

    1997-12-31

    A new equivalent circuit model for space-charge-dominated MITLs (Magnetically Insulated Transmission Lines) was developed. MITLs under high power operation are dominated by the space charge current flowing between anode and cathode. The conventional equivalent circuit model does not account for space charge effects on power flow. The model was therefore modified to describe power transport through high-power MITLs. With this model, it is possible to estimate the effects of space charge current on the power flow efficiency without using complicated particle code simulations. (author). 3 figs., 3 refs.

  1. Predicting temperature and moisture distributions in conditioned spaces using the zonal approach

    Energy Technology Data Exchange (ETDEWEB)

    Mendonca, K.C. [Parana Pontifical Catholic Univ., Curitiba (Brazil); Wurtz, E.; Inard, C. [La Rochelle Univ., La Rochelle, Cedex (France). LEPTAB

    2005-07-01

    Moisture interacts with building elements in a number of different ways that impact upon building performance, causing deterioration of building materials, as well as contributing to poor indoor air quality. In humid climates, moisture represents one of the major loads in conditioned spaces. It is therefore important to understand and model moisture transport accurately. This paper discussed an intermediate zonal approach to building a library of data in order to predict whole hygrothermal behavior in conditioned rooms. The zonal library included 2 models in order to consider building envelope moisture buffering effects as well as taking into account the dynamic aspect of jet airflow in the zonal method. The zonal library was then applied to a case study to show the impact of external humidity on the whole hygrothermal performance of a room equipped with a vertical fan-coil unit. The proposed theory was structured into 3 groups representing 3 building domains: indoor air; envelope; and heating, ventilation and air conditioning (HVAC) systems. The indoor air sub-model related to indoor air space, where airflow speed was considered to be low. The envelope sub-model related to the radiation exchanges between the envelope and its environment as well as to the heat and mass transfers through the envelope material. The HVAC system sub-model referred to the whole system including equipment, control and specific airflow from the equipment. All the models were coupled into SPARK, where the resulting set of non-linear equations were solved simultaneously. A case study of a large office conditioned by a vertical fan-coil unit with a rectangular air supply diffuser was presented. Details of the building's external and internal environment were provided, as well as convective heat and mass transfer coefficients and temperature distributions versus time. Results of the study indicated that understanding building material moisture buffering effects is as important as

  2. Parametric modeling of the intervertebral disc space in 3D: application to CT images of the lumbar spine.

    Science.gov (United States)

    Korez, Robert; Likar, Boštjan; Pernuš, Franjo; Vrtovec, Tomaž

    2014-10-01

    Gradual degeneration of intervertebral discs of the lumbar spine is one of the most common causes of low back pain. Although conservative treatment for low back pain may provide relief to most individuals, surgical intervention may be required for individuals with significant continuing symptoms, which is usually performed by replacing the degenerated intervertebral disc with an artificial implant. For designing implants with good bone contact and continuous force distribution, the morphology of the intervertebral disc space and vertebral body endplates is of considerable importance. In this study, we propose a method for parametric modeling of the intervertebral disc space in three dimensions (3D) and show its application to computed tomography (CT) images of the lumbar spine. The initial 3D model of the intervertebral disc space is generated according to the superquadric approach and therefore represented by a truncated elliptical cone, which is initialized by parameters obtained from 3D models of adjacent vertebral bodies. In an optimization procedure, the 3D model of the intervertebral disc space is incrementally deformed by adding parameters that provide a more detailed morphometric description of the observed shape, and aligned to the observed intervertebral disc space in the 3D image. By applying the proposed method to CT images of 20 lumbar spines, the shape and pose of each of the 100 intervertebral disc spaces were represented by a 3D parametric model. The resulting mean (±standard deviation) accuracy of modeling was 1.06±0.98mm in terms of radial Euclidean distance against manually defined ground truth points, with the corresponding success rate of 93% (i.e. 93 out of 100 intervertebral disc spaces were modeled successfully). As the resulting 3D models provide a description of the shape of intervertebral disc spaces in a complete parametric form, morphometric analysis was straightforwardly enabled and allowed the computation of the corresponding

  3. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    Energy Technology Data Exchange (ETDEWEB)

    Khawli, Toufik Al; Eppelt, Urs; Hermanns, Torsten [RWTH Aachen University, Chair for Nonlinear Dynamics, Steinbachstr. 15, 52047 Aachen (Germany); Gebhardt, Sascha [RWTH Aachen University, Virtual Reality Group, IT Center, Seffenter Weg 23, 52074 Aachen (Germany); Kuhlen, Torsten [Forschungszentrum Jülich GmbH, Institute for Advanced Simulation (IAS), Jülich Supercomputing Centre (JSC), Wilhelm-Johnen-Straße, 52425 Jülich (Germany); Schulz, Wolfgang [Fraunhofer, ILT Laser Technology, Steinbachstr. 15, 52047 Aachen (Germany)

    2016-06-08

    In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most influential parameters, quantifying their contribution to the model output, reducing the model complexity, and enhancing the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial use case whose goal is to optimize a drilling process using a Gaussian laser beam.
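
    A compact sketch of the elementary-effect screening measure mentioned above, in a simplified one-at-a-time form with random base points (not the full Morris trajectory design), applied to a stand-in analytic function rather than the laser-drilling metamodel of the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

def model(x):
    # Stand-in metamodel: x[0] and x[1] matter, x[2] interacts with x[0], x[3] is inert.
    return np.sin(x[0]) + 2.0 * x[1] ** 2 + x[0] * x[2] + 0.0 * x[3]

def elementary_effects(f, dim, n_base=200, delta=0.05):
    """One-at-a-time elementary effects on the unit hypercube:
    EE_i = [f(x + delta * e_i) - f(x)] / delta, evaluated at random base points.
    The mean of |EE_i| ranks parameter influence; the std flags nonlinearity/interactions."""
    ee = np.empty((n_base, dim))
    for k in range(n_base):
        x = rng.uniform(0, 1 - delta, dim)
        fx = f(x)
        for i in range(dim):
            xp = x.copy()
            xp[i] += delta
            ee[k, i] = (f(xp) - fx) / delta
    return np.abs(ee).mean(axis=0), ee.std(axis=0)

mu_star, sigma = elementary_effects(model, dim=4)
print("mu*  :", np.round(mu_star, 2))   # screening measure per input parameter
print("sigma:", np.round(sigma, 2))     # nonlinearity / interaction indicator
```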

  4. Phase space model for transmission of light beam

    International Nuclear Information System (INIS)

    Fu Shinian

    1989-01-01

    Based on Fermat's principle of ray optics, the Hamiltonian of an optical ray is derived by comparison with classical mechanics. A phase space model of the light beam is proposed, assuming that the light beam, regarded as a group of rays, can be described by an ellipse in the μ-phase space. The transmission of the light beam is therefore represented by a phase space matrix transformation. By means of this non-wave formulation, the same results are obtained as those from the wave equation, such as Kogelnik's ABCD law. As an example of the application of this model, the matching problem of an optical cavity is solved
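
    A small sketch of the phase-space matrix picture described above: the beam ellipse is represented by a 2x2 second-moment matrix sigma, and an element with ray transfer matrix M maps it to M sigma M^T. The elements and numbers here are illustrative only:

```python
import numpy as np

def free_space(d):
    """ABCD ray transfer matrix for free-space propagation over distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """ABCD ray transfer matrix for a thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def propagate(sigma, *elements):
    """Transport the beam ellipse (second-moment matrix) through a sequence of elements:
    sigma' = M sigma M^T, with M the ordered product of the individual matrices."""
    M = np.eye(2)
    for el in elements:              # elements listed in the order the beam meets them
        M = el @ M
    return M @ sigma @ M.T

# Illustrative input beam: 1 mm size, 1 mrad divergence, uncorrelated.
sigma0 = np.diag([1.0e-3 ** 2, 1.0e-3 ** 2])
sigma1 = propagate(sigma0, free_space(0.5), thin_lens(0.25), free_space(0.5))
print("output beam size  :", np.sqrt(sigma1[0, 0]))
print("output divergence :", np.sqrt(sigma1[1, 1]))
```

    Because det M = 1 for these elements, det sigma (the area of the phase-space ellipse) is preserved under transport.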

  5. Track structure model of cell damage in space flight

    Science.gov (United States)

    Katz, Robert; Cucinotta, Francis A.; Wilson, John W.; Shinn, Judy L.; Ngo, Duc M.

    1992-01-01

    The phenomenological track-structure model of cell damage is discussed. A description of the application of the track-structure model with the NASA Langley transport code for laboratory and space radiation is given. Comparisons to experimental results for cell survival during exposure to monoenergetic, heavy-ion beams are made. The model is also applied to predict cell damage rates and relative biological effectiveness for deep-space exposures.

  6. A Simulation and Modeling Framework for Space Situational Awareness

    International Nuclear Information System (INIS)

    Olivier, S.S.

    2008-01-01

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated

  7. Modeling the cometary environment using a fluid approach

    Science.gov (United States)

    Shou, Yinsi

    Comets are believed to have preserved the building material of the early solar system and to hold clues to the origin of life on Earth. Abundant remote observations of comets by telescopes and the in-situ measurements by a handful of space missions reveal that the cometary environments are complicated by various physical and chemical processes among the neutral gases and dust grains released from comets, cometary ions, and the solar wind in the interplanetary space. Therefore, physics-based numerical models are in demand to interpret the observational data and to deepen our understanding of the cometary environment. In this thesis, three models using a fluid approach, which include important physical and chemical processes underlying the cometary environment, have been developed to study the plasma, neutral gas, and the dust grains, respectively. Although models based on the fluid approach have limitations in capturing all of the correct physics for certain applications, especially for very low gas density environment, they are computationally much more efficient than alternatives. In the simulations of comet 67P/Churyumov-Gerasimenko at various heliocentric distances with a wide range of production rates, our multi-fluid cometary neutral gas model and multi-fluid cometary dust model have achieved comparable results to the Direct Simulation Monte Carlo (DSMC) model, which is based on a kinetic approach that is valid in all collisional regimes. Therefore, our model is a powerful alternative to the particle-based model, especially for some computationally intensive simulations. Capable of accounting for the varying heating efficiency under various physical conditions in a self-consistent way, the multi-fluid cometary neutral gas model is a good tool to study the dynamics of the cometary coma with different production rates and heliocentric distances. The modeled H2O expansion speeds reproduce the general trend and the speed's nonlinear dependencies of production rate

  8. Making Faces - State-Space Models Applied to Multi-Modal Signal Processing

    DEFF Research Database (Denmark)

    Lehn-Schiøler, Tue

    2005-01-01

    The two main focus areas of this thesis are State-Space Models and multi modal signal processing. The general State-Space Model is investigated and an addition to the class of sequential sampling methods is proposed. This new algorithm is denoted as the Parzen Particle Filter. Furthermore...... optimizer can be applied to speed up convergence. The linear version of the State-Space Model, the Kalman Filter, is applied to multi modal signal processing. It is demonstrated how a State-Space Model can be used to map from speech to lip movements. Besides the State-Space Model and the multi modal...... application an information theoretic vector quantizer is also proposed. Based on interactions between particles, it is shown how a quantizing scheme based on an analytic cost function can be derived....

  9. Minimizing Human Risk: Human Performance Models in the Space Human Factors and Habitability and Behavioral Health and Performance Elements

    Science.gov (United States)

    Gore, Brian F.

    2016-01-01

    Human space exploration has never been more exciting than it is today. A human presence on other worlds is becoming a reality as we leverage much of our prior knowledge for the new mission of going to Mars. Exploring the solar system at greater distances from Earth than ever before will pose some unique challenges, which can be overcome thanks to advances in modeling and simulation technologies. The National Aeronautics and Space Administration (NASA) is at the forefront of exploring our solar system. NASA's Human Research Program (HRP) focuses on discovering the best methods and technologies that support safe and productive human space travel in the extreme and harsh space environment. HRP uses various methods and approaches to answer questions about the impact of long-duration missions on the human in space, including the effects of gravity on the human body, isolation and confinement, hostile environments, space radiation, and the distance from Earth. Predictive models are included in the HRP research portfolio as these models provide valuable insights into human-system operations. This paper provides an overview of NASA's HRP and presents a number of projects that have used modeling and simulation to provide insights into human-system issues (e.g. automation, habitat design, schedules) in anticipation of space exploration.

  10. The Faster, Better, Cheaper Approach to Space Missions: An Engineering Management Assessment

    Science.gov (United States)

    Hamaker, Joe

    2000-01-01

    This paper describes, in viewgraph form, the faster, better, cheaper approach to space missions. The topics include: 1) What drives "Faster, Better, Cheaper"? 2) Why Space Programs are Costly; 3) Background; 4) Aerospace Project Management (Old Culture); 5) Aerospace Project Management (New Culture); 6) Scope of Analysis Limited to Engineering Management Culture; 7) Qualitative Analysis; 8) Some Basic Principles of the New Culture; 9) Cause and Effect; 10) "New Ways of Doing Business" Survey Results; 11) Quantitative Analysis; 12) Recent Space System Cost Trends; 13) Spacecraft Dry Weight Trend; 14) Complexity Factor Trends; 15) Cost Normalization; 16) Cost Normalization Algorithm; 17) Unnormalized Cost vs. Normalized Cost; and 18) Concluding Observations.

  11. A new approach to spatially explicit modelling of forest dynamics: spacing, ageing and neighbourhood competition of mangrove trees

    NARCIS (Netherlands)

    Berger, U.; Hildenbrandt, H.

    2000-01-01

    This paper presents a new approach to spatially explicit modelling that enables the influence of neighbourhood effects on the dynamics of forests and plant communities to be analysed. We refer to this approach as 'field of neighbourhood' (FON). It combines the 'neighbourhood philosophy' of

  12. Lag space estimation in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...

  13. Real-space local polynomial basis for solid-state electronic-structure calculations: A finite-element approach

    International Nuclear Information System (INIS)

    Pask, J.E.; Klein, B.M.; Fong, C.Y.; Sterne, P.A.

    1999-01-01

    We present an approach to solid-state electronic-structure calculations based on the finite-element method. In this method, the basis functions are strictly local, piecewise polynomials. Because the basis is composed of polynomials, the method is completely general and its convergence can be controlled systematically. Because the basis functions are strictly local in real space, the method allows for variable resolution in real space; produces sparse, structured matrices, enabling the effective use of iterative solution methods; and is well suited to parallel implementation. The method thus combines the significant advantages of both real-space-grid and basis-oriented approaches and so promises to be particularly well suited for large, accurate ab initio calculations. We develop the theory of our approach in detail, discuss advantages and disadvantages, and report initial results, including electronic band structures and details of the convergence of the method. copyright 1999 The American Physical Society
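
    To illustrate why a strictly local piecewise-polynomial basis yields sparse, structured matrices, here is a deliberately simplified one-dimensional finite-element sketch (linear hat functions for -u'' = f on [0,1] with homogeneous Dirichlet boundaries); it is only an analogue of the three-dimensional electronic-structure setting described above:

```python
import numpy as np

n_el = 8                        # number of elements
h = 1.0 / n_el                  # uniform element size

# Element stiffness matrix for linear (hat) basis functions: (1/h) [[1, -1], [-1, 1]].
k_local = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])

K = np.zeros((n_el + 1, n_el + 1))
F = np.zeros(n_el + 1)
for e in range(n_el):
    dofs = [e, e + 1]                      # each element touches only its two end nodes
    K[np.ix_(dofs, dofs)] += k_local       # assembly: overlap only between neighbouring basis functions
    F[dofs] += 0.5 * h * 1.0               # consistent load for f(x) = 1 (each hat integrates to h/2 per element)

# Strictly local support => K is tridiagonal (sparse, structured).
K_int, F_int = K[1:-1, 1:-1], F[1:-1]      # eliminate the Dirichlet end nodes
u_int = np.linalg.solve(K_int, F_int)

# Check against the exact solution u(x) = x(1-x)/2 of -u'' = 1, u(0)=u(1)=0
# (nodal values are exact for this 1D problem, so the error is tiny).
x = np.linspace(0, 1, n_el + 1)[1:-1]
print(np.max(np.abs(u_int - 0.5 * x * (1 - x))))
```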

  14. Parametric Cost Modeling of Space Missions Using the Develop New Projects (DNP) Implementation Process

    Science.gov (United States)

    Rosenberg, Leigh; Hihn, Jairus; Roust, Kevin; Warfield, Keith

    2000-01-01

    This paper presents an overview of a parametric cost model that has been built at JPL to estimate the costs of future, deep space, robotic science missions. Due to the recent dramatic changes in JPL business practices brought about by an internal reengineering effort known as develop new products (DNP), high-level historic cost data is no longer considered analogous to future missions. Therefore, the historic data is of little value in forecasting costs for projects developed using the DNP process. This has led to the development of an approach for obtaining expert opinion and for combining actual data with expert opinion to provide a cost database for future missions. In addition, the DNP cost model relies to the maximum extent on objective cost drivers, which reduces the likelihood of model input error. Version 2 is now under development; it expands the model capabilities, links it more tightly with key design technical parameters, and is grounded in more rigorous statistical techniques. The challenges faced in building this model are discussed, as well as its background, development approach, status, validation, and future plans.

  15. State Machine Modeling of the Space Launch System Solid Rocket Boosters

    Science.gov (United States)

    Harris, Joshua A.; Patterson-Hine, Ann

    2013-01-01

    The Space Launch System is a Shuttle-derived heavy-lift vehicle currently in development to serve as NASA's premiere launch vehicle for space exploration. The Space Launch System is a multistage rocket with two Solid Rocket Boosters and multiple payloads, including the Multi-Purpose Crew Vehicle. Planned Space Launch System destinations include near-Earth asteroids, the Moon, Mars, and Lagrange points. The Space Launch System is a complex system with many subsystems, requiring considerable systems engineering and integration. To this end, state machine analysis offers a method to support engineering and operational efforts, identify and avert undesirable or potentially hazardous system states, and evaluate system requirements. Finite State Machines model a system as a finite number of states, with transitions between states controlled by state-based and event-based logic. State machines are a useful tool for understanding complex system behaviors and evaluating "what-if" scenarios. This work contributes to a state machine model of the Space Launch System developed at NASA Ames Research Center. The Space Launch System Solid Rocket Booster avionics and ignition subsystems are modeled using MATLAB/Stateflow software. This model is integrated into a larger model of Space Launch System avionics used for verification and validation of Space Launch System operating procedures and design requirements. This includes testing both nominal and off-nominal system states and command sequences.
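
    A minimal sketch of the event-driven finite-state-machine idea; the states, events and transitions below are illustrative only and do not reproduce the NASA Ames MATLAB/Stateflow model:

```python
# Minimal event-driven finite state machine sketch with hypothetical booster states.

TRANSITIONS = {
    ("SAFED", "arm_command"): "ARMED",
    ("ARMED", "ignition_command"): "IGNITED",
    ("ARMED", "safe_command"): "SAFED",
    ("IGNITED", "burnout_detected"): "BURNOUT",
    ("BURNOUT", "separation_command"): "SEPARATED",
}

class BoosterStateMachine:
    def __init__(self, initial="SAFED"):
        self.state = initial
        self.history = [initial]

    def handle(self, event):
        """Apply an event; undefined (state, event) pairs are treated as off-nominal."""
        nxt = TRANSITIONS.get((self.state, event))
        if nxt is None:
            raise ValueError(f"off-nominal: event '{event}' not allowed in state '{self.state}'")
        self.state = nxt
        self.history.append(nxt)
        return nxt

# A nominal command sequence walks through the expected states;
# an out-of-order command (e.g. ignition while SAFED) is flagged immediately.
fsm = BoosterStateMachine()
for ev in ["arm_command", "ignition_command", "burnout_detected", "separation_command"]:
    fsm.handle(ev)
print(fsm.history)
```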

  16. Data-Model and Inter-Model Comparisons of the GEM Outflow Events Using the Space Weather Modeling Framework

    Science.gov (United States)

    Welling, D. T.; Eccles, J. V.; Barakat, A. R.; Kistler, L. M.; Haaland, S.; Schunk, R. W.; Chappell, C. R.

    2015-12-01

    Two storm periods were selected by the Geospace Environment Modeling Ionospheric Outflow focus group for community collaborative study because of their high magnetospheric activity and extensive data coverage: the September 27 - October 4, 2002 corotating interaction region event and the October 22 - 29 coronal mass ejection event. During both events, the FAST, Polar, Cluster, and other missions made key observations, creating prime periods for data-model comparison. The GEM community has come together to simulate these periods using many different methods in order to evaluate models, compare results, and expand our knowledge of ionospheric outflow and its effects on global dynamics. This paper presents Space Weather Modeling Framework (SWMF) simulations of these important periods compared against observations from the Polar TIDE, Cluster CODIF and EFW instruments. Emphasis will be given to the second event. Density and velocity of oxygen and hydrogen throughout the lobes, plasma sheet, and inner magnetosphere will be the focus of these comparisons. For these simulations, the SWMF couples the multifluid version of BATS-R-US MHD to a variety of ionospheric outflow models of varying complexity. The simplest is outflow arising from constant MHD inner boundary conditions. Two first-principles-based models are also leveraged: the Polar Wind Outflow Model (PWOM), a fluid treatment of outflow dynamics, and the Generalized Polar Wind (GPW) model, which combines fluid and particle-in-cell approaches. Each model is capable of capturing a different set of energization mechanisms, yielding different outflow results. The data-model comparisons will illustrate how well each approach captures reality and which energization mechanisms are most important. Inter-model comparisons will illustrate how the different outflow specifications affect the magnetosphere. Specifically, it is found that the GPW provides increased heavy ion outflow over a broader spatial range than the alternative

  17. Exclusive vector meson production with leading neutrons in a saturation model for the dipole amplitude in mixed space

    Science.gov (United States)

    Amaral, J. T.; Becker, V. M.

    2018-05-01

    We investigate ρ vector meson production in e p collisions at HERA with leading neutrons in the dipole formalism. The interaction of the dipole and the pion is described in a mixed-space approach, in which the dipole-pion scattering amplitude is given by the Marquet-Peschanski-Soyez saturation model, which is based on the traveling wave solutions of the nonlinear Balitsky-Kovchegov equation. We estimate the magnitude of the absorption effects and compare our results with a previous analysis of the same process in full coordinate space. In contrast with this approach, the present study leads to absorption K factors in the range of those predicted by previous theoretical studies on semi-inclusive processes.

  18. Design and implementation of space physics multi-model application integration based on web

    Science.gov (United States)

    Jiang, Wenping; Zou, Ziming

    With the development of research on the space environment and space science, how to provide a networked online computing environment for space weather, space environment and space physics models for the Chinese scientific community has become increasingly important in recent years. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The C/S mode, which is traditional and stand-alone, demands a team or workshop from many disciplines and specialties to build its own multi-model application integrated system, and requires the client to be deployed in different physical regions when users visit the integrated system. This requirement brings two shortcomings: it reduces the efficiency of researchers who use the models for computation, and it makes access to the data inconvenient. Therefore, it is necessary to create a shared network resource access environment which helps users reach the computing resources of space physics models through a terminal quickly, for conducting space science research and forecasting the space environment. The SPMAIS is developed in B/S mode based on high-performance, first-principles computational models of the space environment and uses these models to predict "space weather", to understand space mission data and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. Up to now, the SPMAIS has contained dozens of space environment models, including the international AP8/AE8, IGRF and T96 models, as well as a solar proton prediction model, a geomagnetic transmission model, and other models developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets which offer input data for online high-speed model computation. In this paper, the service-oriented architecture (SOA) concept that divides the system into

  19. A risk-based approach to flammable gas detector spacing.

    Science.gov (United States)

    Defriend, Stephen; Dejmek, Mark; Porter, Leisa; Deshotels, Bob; Natvig, Bernt

    2008-11-15

    Flammable gas detectors allow an operating company to address leaks before they become serious, by automatically alarming and by initiating isolation and safe venting. Without effective gas detection, there is very limited defense against a flammable gas leak developing into a fire or explosion that could cause loss of life or escalate to cascading failures of nearby vessels, piping, and equipment. While it is commonly recognized that some gas detectors are needed in a process plant containing flammable gas or volatile liquids, there is usually a question of how many are needed. The areas that need protection can be determined by dispersion modeling from potential leak sites. Within the areas that must be protected, the spacing of detectors (or alternatively, number of detectors) should be based on risk. Detector design can be characterized by spacing criteria, which is convenient for design - or alternatively by number of detectors, which is convenient for cost reporting. The factors that influence the risk are site-specific, including process conditions, chemical composition, number of potential leak sites, piping design standards, arrangement of plant equipment and structures, design of isolation and depressurization systems, and frequency of detector testing. Site-specific factors such as those just mentioned affect the size of flammable gas cloud that must be detected (within a specified probability) by the gas detection system. A probability of detection must be specified that gives a design with a tolerable risk of fires and explosions. To determine the optimum spacing of detectors, it is important to consider the probability that a detector will fail at some time and be inoperative until replaced or repaired. A cost-effective approach is based on the combined risk from a representative selection of leakage scenarios, rather than a worst-case evaluation. This means that probability and severity of leak consequences must be evaluated together. In marine and
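
    As a rough illustration of how spacing, cloud size and detector availability might combine into a detection probability, the sketch below uses purely illustrative numbers and a crude coverage estimate; it is not the risk methodology of the paper.

```python
# Hedged sketch: probability that a flammable cloud of a given size is detected,
# combining detector availability with the number of grid-spaced detectors the
# cloud envelops. All numbers and the coverage estimate are illustrative only.
import math

def detection_probability(cloud_diameter_m, spacing_m, availability=0.95):
    """Probability that at least one operative detector lies inside the cloud."""
    # crude estimate of how many grid-spaced detectors a cloud of this size envelops
    n_covered = max(1, math.floor(cloud_diameter_m / spacing_m)) ** 2
    # the cloud goes undetected only if every enveloped detector is unavailable
    return 1.0 - (1.0 - availability) ** n_covered

for spacing in (5.0, 10.0, 20.0):
    p = detection_probability(cloud_diameter_m=12.0, spacing_m=spacing)
    print(f"spacing {spacing:5.1f} m -> detection probability {p:.4f}")
```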

  20. Static models, recursive estimators and the zero-variance approach

    KAUST Repository

    Rubino, Gerardo

    2016-01-07

    When evaluating dependability aspects of complex systems, most models belong to the static world, where time is not an explicit variable. These models suffer from the same problems as dynamic ones (stochastic processes), such as the frequent combinatorial explosion of the state spaces. In the Monte Carlo domain, one of the most significant difficulties is the rare event situation. In this talk, we describe this context and a recent technique that appears to be at the top performance level in the area, where we combined ideas that lead to very fast estimation procedures with another approach called zero-variance approximation. Both ideas produced a very efficient method that has the right theoretical robustness property, the Bounded Relative Error property. Some examples illustrate the results.
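
    The toy Python sketch below illustrates the rare-event difficulty and the zero-variance idea for the simple case of estimating P(X > t) for an exponential random variable, where the ideal change of measure is known; it is only a didactic stand-in for the estimators discussed in the talk.

```python
# Hedged toy example (not the talk's estimator): crude Monte Carlo versus an
# importance sampler that realises the zero-variance change of measure for
# P(X > t) with X ~ Exp(1).
import numpy as np

rng = np.random.default_rng(0)
t, n = 15.0, 100_000               # P(X > t) = exp(-15) ~ 3.06e-7
exact = np.exp(-t)

# Crude Monte Carlo: essentially every sample misses the rare event.
x = rng.exponential(1.0, n)
crude = np.mean(x > t)

# Importance sampling from the conditional law (Exp(1) shifted beyond t);
# the likelihood ratio is exp(-t) for every sample, so the variance vanishes.
y = t + rng.exponential(1.0, n)
weights = np.exp(-t) * np.ones(n)
is_est = np.mean(weights)

print(f"exact {exact:.3e}  crude {crude:.3e}  importance sampling {is_est:.3e}")
```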

  1. Exploring a Multiresolution Modeling Approach within the Shallow-Water Equations

    Energy Technology Data Exchange (ETDEWEB)

    Ringler, Todd D.; Jacobsen, Doug; Gunzburger, Max; Ju, Lili; Duda, Michael; Skamarock, William

    2011-11-01

    The ability to solve the global shallow-water equations with a conforming, variable-resolution mesh is evaluated using standard shallow-water test cases. While the long-term motivation for this study is the creation of a global climate modeling framework capable of resolving different spatial and temporal scales in different regions, the process begins with an analysis of the shallow-water system in order to better understand the strengths and weaknesses of the approach developed herein. The multiresolution meshes are spherical centroidal Voronoi tessellations where a single, user-supplied density function determines the region(s) of fine- and coarse-mesh resolution. The shallow-water system is explored with a suite of meshes ranging from quasi-uniform resolution meshes, where the grid spacing is globally uniform, to highly variable resolution meshes, where the grid spacing varies by a factor of 16 between the fine and coarse regions. The potential vorticity is found to be conserved to within machine precision and the total available energy is conserved to within a time-truncation error. This result holds for the full suite of meshes, ranging from quasi-uniform to highly variable resolution meshes. Based on shallow-water test cases 2 and 5, the primary conclusion of this study is that solution error is controlled primarily by the grid resolution in the coarsest part of the model domain. This conclusion is consistent with results obtained by others. When these variable-resolution meshes are used for the simulation of an unstable zonal jet, the core features of the growing instability are found to be largely unchanged as the variation in the mesh resolution increases. The main differences between the simulations occur outside the region of mesh refinement and these differences are attributed to the additional truncation error that accompanies increases in grid spacing. Overall, the results demonstrate support for this approach as a path toward
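
    The sketch below, a planar stand-in for the spherical meshes of the paper, shows how a user-supplied density function can drive a centroidal Voronoi-type arrangement of mesh generators via a probabilistic Lloyd iteration; the density function and all parameters are illustrative assumptions.

```python
# Hedged sketch of a density-driven centroidal Voronoi tessellation in the unit
# square: generators are pulled toward density-weighted centroids, so the region
# of high density ends up with finer "mesh" resolution.
import numpy as np

rng = np.random.default_rng(1)

def density(p):                          # higher density -> finer mesh near (0.25, 0.5)
    return 1.0 + 15.0 * np.exp(-40.0 * ((p[:, 0] - 0.25) ** 2 + (p[:, 1] - 0.5) ** 2))

generators = rng.random((100, 2))        # initial mesh generators in the unit square
samples = rng.random((20_000, 2))        # Monte Carlo quadrature points
weights = density(samples)

for _ in range(30):                      # probabilistic Lloyd iteration
    d2 = ((samples[:, None, :] - generators[None, :, :]) ** 2).sum(axis=-1)
    owner = d2.argmin(axis=1)            # nearest generator for each sample
    for k in range(len(generators)):     # move generators to density-weighted centroids
        mask = owner == k
        if mask.any():
            generators[k] = np.average(samples[mask], axis=0, weights=weights[mask])

near = ((generators[:, 0] - 0.25) ** 2 + (generators[:, 1] - 0.5) ** 2) < 0.01
print("generators packed into the high-density region:", int(near.sum()))
```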

  2. Testing for Level Shifts in Fractionally Integrated Processes: a State Space Approach

    DEFF Research Database (Denmark)

    Monache, Davide Delle; Grassi, Stefano; Santucci de Magistris, Paolo

    Short memory models contaminated by level shifts have similar long-memory features as fractionally integrated processes. This makes it hard to verify whether the true data generating process is a pure fractionally integrated process when employing standard estimation methods based on the autocorrelation function or the periodogram. In this paper, we propose a robust testing procedure, based on an encompassing parametric specification that allows us to disentangle the level shifts from the fractionally integrated component. The estimation is carried out on the basis of a state-space methodology and it leads to a robust estimate of the fractional integration parameter also in the presence of level shifts. Once the memory parameter is correctly estimated, we use the KPSS test for the presence of level shifts. The Monte Carlo simulations show how this approach produces unbiased estimates of the memory parameter...
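
    A quick numerical illustration of the underlying identification problem (not the authors' state-space test) is sketched below: a short-memory AR(1) process contaminated by occasional level shifts develops a slowly decaying sample autocorrelation function that mimics long memory.

```python
# Hedged illustration: level shifts make a short-memory AR(1) series look
# long-memory in its sample ACF. Parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
eps = rng.normal(size=n)

x = np.zeros(n)                          # AR(1) with phi = 0.3 (short memory)
for t in range(1, n):
    x[t] = 0.3 * x[t - 1] + eps[t]

shift_times = rng.random(n) < 0.002      # rare random level shifts
levels = np.cumsum(shift_times * rng.normal(scale=3.0, size=n))
y = x + levels

def acf(z, lag):
    z = z - z.mean()
    return np.dot(z[:-lag], z[lag:]) / np.dot(z, z)

for lag in (1, 10, 50, 100):
    print(f"lag {lag:3d}: ACF without shifts {acf(x, lag):+.3f}, with shifts {acf(y, lag):+.3f}")
```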

  3. From scores to face templates: a model-based approach.

    Science.gov (United States)

    Mohanty, Pranab; Sarkar, Sudeep; Kasturi, Rangachar

    2007-12-01

    Regeneration of templates from match scores has security and privacy implications related to any biometric authentication system. We propose a novel paradigm to reconstruct face templates from match scores using a linear approach. It proceeds by first modeling the behavior of the given face recognition algorithm by an affine transformation. The goal of the modeling is to approximate the distances computed by a face recognition algorithm between two faces by distances between points, representing these faces, in an affine space. Given this space, templates from an independent image set (break-in) are matched only once with the enrolled template of the targeted subject and match scores are recorded. These scores are then used to embed the targeted subject in the approximating affine (non-orthogonal) space. Given the coordinates of the targeted subject in the affine space, the original template of the targeted subject is reconstructed using the inverse of the affine transformation. We demonstrate our ideas using three, fundamentally different, face recognition algorithms: Principal Component Analysis (PCA) with Mahalanobis cosine distance measure, Bayesian intra-extrapersonal classifier (BIC), and a feature-based commercial algorithm. To demonstrate the independence of the break-in set with the gallery set, we select face templates from two different databases: the Face Recognition Grand Challenge (FRGC) database and the Facial Recognition Technology (FERET) database. With an operational point set at 1 percent False Acceptance Rate (FAR) and 99 percent True Acceptance Rate (TAR) for 1,196 enrollments (FERET gallery), we show that at most 600 attempts (score computations) are required to achieve a 73 percent chance of breaking in as a randomly chosen target subject for the commercial face recognition system. With similar operational set up, we achieve a 72 percent and 100 percent chance of breaking in for the Bayesian and PCA based face recognition systems, respectively. With
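
    The embedding step can be caricatured with classical multidimensional scaling, which recovers point coordinates (up to a rigid motion) from pairwise distances; the sketch below uses synthetic points and plain Euclidean distances rather than the affine model of a particular face matcher.

```python
# Hedged, simplified stand-in for the embedding idea: classical MDS recovers an
# embedding that reproduces the given pairwise distances ("match scores").
import numpy as np

rng = np.random.default_rng(3)
true_pts = rng.normal(size=(30, 4))                              # hidden "templates" in R^4
D2 = ((true_pts[:, None] - true_pts[None, :]) ** 2).sum(-1)      # squared distances

n = len(D2)
J = np.eye(n) - np.ones((n, n)) / n                              # centering matrix
B = -0.5 * J @ D2 @ J                                            # double-centered Gram matrix
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1][:4]                               # top eigen-directions
coords = vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# distances computed from the recovered embedding match the originals
D2_rec = ((coords[:, None] - coords[None, :]) ** 2).sum(-1)
print("max distance error:", float(np.abs(np.sqrt(D2_rec) - np.sqrt(D2)).max()))
```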

  4. Phase-Space Models of Solitary Electron Holes

    DEFF Research Database (Denmark)

    Lynov, Jens-Peter; Michelsen, Poul; Pécseli, Hans

    1985-01-01

    Two different phase-space models of solitary electron holes are investigated and compared with results from computer simulations of an actual laboratory experiment, carried out in a strongly magnetized, cylindrical plasma column. In the two models, the velocity distribution of the electrons...

  5. Space Weather Forecasting and Research at the Community Coordinated Modeling Center

    Science.gov (United States)

    Aronne, M.

    2015-12-01

    The Space Weather Research Center (SWRC), within the Community Coordinated Modeling Center (CCMC), provides experimental research forecasts and analysis for NASA's robotic mission operators. Space weather conditions are monitored to provide advance warning and forecasts based on observations and modeling using the integrated Space Weather Analysis Network (iSWA). Space weather forecasters come from a variety of backgrounds, ranging from modelers to astrophysicists to undergraduate students. This presentation will discuss space weather operations and research from an undergraduate perspective. The Space Weather Research, Education, and Development Initiative (SW REDI) is the starting point for many undergraduate opportunities in space weather forecasting and research. Space weather analyst interns play an active role year-round as entry-level space weather analysts. Students develop the technical and professional skills to forecast space weather through a summer internship that includes a two week long space weather boot camp, mentorship, poster session, and research opportunities. My unique development of research projects includes studying high speed stream events as well as a study of 20 historic, high-impact solar energetic particle events. This unique opportunity to combine daily real-time analysis with related research prepares students for future careers in Heliophysics.

  6. Fuzzy parametric uncertainty analysis of linear dynamical systems: A surrogate modeling approach

    Science.gov (United States)

    Chowdhury, R.; Adhikari, S.

    2012-10-01

    Uncertainty propagation in engineering systems poses significant computational challenges. This paper explores the possibility of using a correlated function expansion based metamodelling approach when uncertain system parameters are modeled using fuzzy variables. In particular, the application of High-Dimensional Model Representation (HDMR) is proposed for fuzzy finite element analysis of dynamical systems. The HDMR expansion is a set of quantitative model assessment and analysis tools for capturing high-dimensional input-output system behavior based on a hierarchy of functions of increasing dimensions. The input variables may be either finite-dimensional (i.e., a vector of parameters chosen from the Euclidean space R^M) or may be infinite-dimensional as in the function space C^M[0,1]. The computational effort to determine the expansion functions using the alpha cut method scales polynomially with the number of variables rather than exponentially. This logic is based on the fundamental assumption underlying the HDMR representation that only low-order correlations among the input variables are likely to have significant impacts upon the outputs for most high-dimensional complex systems. The proposed method is integrated with commercial finite element software. Modal analysis of a simplified aircraft wing with fuzzy parameters has been used to illustrate the generality of the proposed approach. In the numerical examples, triangular membership functions have been used and the results have been validated against direct Monte Carlo simulations.
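
    A first-order cut-HDMR expansion around a reference point can be sketched in a few lines; the toy response function below is an illustrative assumption, and the fuzzy (alpha-cut) finite element workflow of the paper is not reproduced.

```python
# Hedged sketch of a first-order cut-HDMR expansion around a reference point c:
# f(x) ~ f(c) + sum_i [ f(c with x_i swapped in) - f(c) ].
import numpy as np

def f(x):                                    # toy model response (illustrative)
    return x[0] ** 2 + 2.0 * x[1] + 0.5 * np.sin(x[2])

c = np.array([1.0, 1.0, 1.0])                # reference ("cut") point
f0 = f(c)

def hdmr1(x):
    total = f0
    for i in range(len(x)):
        xi = c.copy()
        xi[i] = x[i]
        total += f(xi) - f0                  # first-order component function
    return total

x_test = np.array([1.2, 0.8, 1.3])
print("exact:", f(x_test), " first-order HDMR:", hdmr1(x_test))
```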

  7. Real-Space Analysis of Scanning Tunneling Microscopy Topography Datasets Using Sparse Modeling Approach

    Science.gov (United States)

    Miyama, Masamichi J.; Hukushima, Koji

    2018-04-01

    A sparse modeling approach is proposed for analyzing scanning tunneling microscopy topography data, which contain numerous peaks originating from the electron density of surface atoms and/or impurities. The method, based on the relevance vector machine with L1 regularization and k-means clustering, enables separation of the peaks and peak center positioning with accuracy beyond the resolution of the measurement grid. The validity and efficiency of the proposed method are demonstrated using synthetic data in comparison with the conventional least-squares method. An application of the proposed method to experimental data of a metallic oxide thin-film clearly indicates the existence of defects and corresponding local lattice distortions.
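
    A one-dimensional analogue of the peak-separation idea is sketched below, assuming scikit-learn is available: an L1-regularised fit over a dictionary of narrow Gaussians selects a sparse set of candidate centres, which k-means then groups into peak positions. The relevance-vector-machine formulation of the paper is replaced here by an ordinary Lasso for brevity.

```python
# Hedged 1D analogue of sparse peak separation (assumes scikit-learn):
# Lasso over a Gaussian dictionary + k-means clustering of surviving centres.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import KMeans

x = np.linspace(0.0, 10.0, 400)
signal = 1.0 * np.exp(-((x - 3.07) / 0.3) ** 2) + 0.8 * np.exp(-((x - 6.54) / 0.3) ** 2)
signal += 0.02 * np.random.default_rng(4).normal(size=x.size)

centres = np.linspace(0.0, 10.0, 200)                    # candidate peak centres
dictionary = np.exp(-((x[:, None] - centres[None, :]) / 0.3) ** 2)

coef = Lasso(alpha=0.001, max_iter=50_000, positive=True).fit(dictionary, signal).coef_
active = centres[coef > 0.05]                            # surviving centres

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(active.reshape(-1, 1))
print("estimated peak centres:", sorted(float(c) for c in km.cluster_centers_.ravel()))
```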

  8. Space-time PM2.5 mapping in the severe haze region of Jing-Jin-Ji (China) using a synthetic approach.

    Science.gov (United States)

    He, Junyu; Christakos, George

    2018-05-07

    Long- and short-term exposure to PM2.5 is of great concern in China due to its adverse population health effects. Characteristic of the severity of the situation in China is that, in the Jing-Jin-Ji region considered in this work, a total of 2725 excess deaths have been attributed to short-term PM2.5 exposure during the period January 10-31, 2013. Technically, the processing of large space-time PM2.5 datasets and the mapping of the space-time distribution of PM2.5 concentrations often constitute high-cost projects. To address this situation, we propose a synthetic modeling framework based on the integration of (a) the Bayesian maximum entropy method that assimilates auxiliary information from land-use regression and artificial neural network (ANN) model outputs based on PM2.5 monitoring, satellite remote sensing data, land use and geographical records, with (b) a space-time projection technique that transforms the PM2.5 concentration values from the original spatiotemporal domain onto a spatial domain that moves along the direction of the PM2.5 velocity spread. An interesting methodological feature of the synthetic approach is that its components (methods or models) are complementary, i.e., one component can compensate for the occasional limitations of another component. Insight is gained in terms of a PM2.5 case study covering the severe-haze Jing-Jin-Ji region during October 1-31, 2015. The proposed synthetic approach explicitly accounted for physical space-time dependencies of the PM2.5 distribution. Moreover, the assimilation of auxiliary information and the dimensionality reduction achieved by the synthetic approach produced rather impressive results: it generated PM2.5 concentration maps with low estimation uncertainty (even at counties and villages far away from the monitoring stations, whereas during the haze periods the uncertainty reduction was over 50% compared to standard PM2.5 mapping techniques); and it also proved to be computationally very

  9. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    International Nuclear Information System (INIS)

    Herbert, L.T.; Hansen, Z.N.L.

    2016-01-01

    This paper presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and incorporate an intention preserving stochastic semantics able to model both probabilistic- and non-deterministic behaviour. Stochastic model checking techniques are employed to generate the state-space of a given workflow. Possible improvements obtained by restructuring are measured by employing the framework's capacity for tracking real-valued quantities associated with states and transitions of the workflow. The space of possible restructurings of a workflow is explored by means of an evolutionary algorithm, where the goals for improvement are defined in terms of optimising quantities, typically employed to model resources, associated with a workflow. The approach is fully automated and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow. - Highlights: • We present a framework which allows for the automated restructuring of workflows. • This framework seeks to minimise the impact of errors on the workflow. • We illustrate a scalable software implementation of this framework. • We explore the practical utility of this approach through an industry case. • The impact of errors can be substantially reduced by restructuring the workflow.

  10. A Bayesian approach to identifying and compensating for model misspecification in population models.

    Science.gov (United States)

    Thorson, James T; Ono, Kotaro; Munch, Stephan B

    2014-02-01

    State-space estimation methods are increasingly used in ecology to estimate productivity and abundance of natural populations while accounting for variability in both population dynamics and measurement processes. However, functional forms for population dynamics and density dependence often will not match the true biological process, and this may degrade the performance of state-space methods. We therefore developed a Bayesian semiparametric state-space model, which uses a Gaussian process (GP) to approximate the population growth function. This offers two benefits for population modeling. First, it allows data to update a specified "prior" on the population growth function, while reverting to this prior when data are uninformative. Second, it allows variability in population dynamics to be decomposed into random errors around the population growth function ("process error") and errors due to the mismatch between the specified prior and estimated growth function ("model error"). We used simulation modeling to illustrate the utility of GP methods in state-space population dynamics models. Results confirmed that the GP model performs similarly to a conventional state-space model when either (1) the prior matches the true process or (2) data are relatively uninformative. However, GP methods improve estimates of the population growth function when the function is misspecified. Results also demonstrated that the estimated magnitude of "model error" can be used to distinguish cases of model misspecification. We conclude with a discussion of the prospects for GP methods in other state-space models, including age and length-structured, meta-analytic, and individual-movement models.
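
    The core idea of letting a Gaussian process stand in for the population growth function can be sketched as below (assuming scikit-learn); the sketch simulates Ricker dynamics with process error only and omits the measurement-error layer and the Bayesian machinery of the full state-space model.

```python
# Hedged, much-simplified illustration: learn a population growth function
# non-parametrically with a Gaussian process (assumes scikit-learn).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
n = 150
N = np.empty(n); N[0] = 0.2
for t in range(n - 1):                       # Ricker dynamics with process error
    N[t + 1] = N[t] * np.exp(1.0 * (1.0 - N[t]) + rng.normal(scale=0.1))

X = N[:-1].reshape(-1, 1)                    # abundance
y = np.log(N[1:] / N[:-1])                   # realised per-capita growth rate

gp = GaussianProcessRegressor(kernel=RBF(0.5) + WhiteKernel(0.01), normalize_y=True).fit(X, y)
grid = np.linspace(N.min(), N.max(), 5).reshape(-1, 1)
print("abundance:         ", grid.ravel().round(2))
print("GP growth estimate:", gp.predict(grid).round(2))
print("true Ricker growth:", (1.0 * (1.0 - grid.ravel())).round(2))
```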

  11. Field space entanglement entropy, zero modes and Lifshitz models

    Science.gov (United States)

    Huffel, Helmuth; Kelnhofer, Gerald

    2017-12-01

    The field space entanglement entropy of a quantum field theory is obtained by integrating out a subset of its fields. We study an interacting quantum field theory consisting of massless scalar fields on a closed compact manifold M. To this model we associate its Lifshitz dual model. The ground states of both models are invariant under constant shifts. We interpret this invariance as gauge symmetry and subject the models to proper gauge fixing. By applying the heat kernel regularization one can show that the field space entanglement entropies of the massless scalar field model and of its Lifshitz dual are agreeing.

  12. Evaluation of Private Sector Roles in Space Resource Development

    Science.gov (United States)

    Lamassoure, Elisabeth S.; Blair, Brad R.; Diaz, Javier; Oderman, Mark; Duke, Michael B.; Vaucher, Marc; Manvi, Ramachandra; Easter, Robert W.

    2003-01-01

    An integrated engineering and financial modeling approach has been developed and used to evaluate the potential for private sector investment in space resource development, and to assess possible roles of the public sector in fostering private interest. This paper presents the modeling approach and its results for a transportation service using propellant extracted from lunar regolith. The analysis starts with careful case study definition, including an analysis of the customer base and market requirements, which are the basis for design of a modular, scalable space architecture. The derived non-recurring, recurring and operations costs become inputs for a `standard' financial model, as used in any commercial business plan. This model generates pro forma financial statements, calculates the amount of capitalization required, and generates return on equity calculations using two valuation metrics of direct interest to private investors: market enterprise value and multiples of key financial measures. Use of this model on an architecture to sell transportation services in Earth orbit based on lunar propellants shows how to rapidly test various assumptions and identify interesting architectural options, key areas for investment in exploration and technology, or innovative business approaches that could produce an economically viable industry. The same approach can be used to evaluate any other possible private ventures in space, and conclude on the respective roles of NASA and the private sector in space resource development and solar system exploration.

  13. Exactly solvable string models of curved space-time backgrounds

    CERN Document Server

    Russo, J.G.; Russo, J G; Tseytlin, A A

    1995-01-01

    We consider a new 3-parameter class of exact 4-dimensional solutions in closed string theory and solve the corresponding string model, determining the physical spectrum and the partition function. The background fields (4-metric, antisymmetric tensor, two Kaluza-Klein vector fields, dilaton and modulus) generically describe axially symmetric stationary rotating (electro)magnetic flux-tube type universes. Backgrounds of this class include both the dilatonic Melvin solution and the uniform magnetic field solution discussed earlier as well as some singular space-times. Solvability of the string sigma model is related to its connection via duality to a much simpler looking model which is a "twisted" product of a flat 2-space and a space dual to 2-plane. We discuss some physical properties of this model as well as a number of generalizations leading to larger classes of exact 4-dimensional string solutions.

  14. Microscopic calculation of level densities: the shell model Monte Carlo approach

    International Nuclear Information System (INIS)

    Alhassid, Yoram

    2012-01-01

    The shell model Monte Carlo (SMMC) approach provides a powerful technique for the microscopic calculation of level densities in model spaces that are many orders of magnitude larger than those that can be treated by conventional methods. We discuss a number of developments: (i) Spin distribution. We used a spin projection method to calculate the exact spin distribution of energy levels as a function of excitation energy. In even-even nuclei we find an odd-even staggering effect (in spin). Our results were confirmed in a recent analysis of experimental data. (ii) Heavy nuclei. The SMMC approach was extended to heavy nuclei. We have studied the crossover between vibrational and rotational collectivity in families of samarium and neodymium isotopes in model spaces of dimension approximately 10^29. We find good agreement with experimental results for both state densities and ⟨J²⟩ (where J is the total spin). (iii) Collective enhancement factors. We have calculated microscopically the vibrational and rotational enhancement factors of level densities versus excitation energy. We find that the decay of these enhancement factors in heavy nuclei is correlated with the pairing and shape phase transitions. (iv) Odd-even and odd-odd nuclei. The projection on an odd number of particles leads to a sign problem in SMMC. We discuss a novel method to calculate state densities in odd-even and odd-odd nuclei despite the sign problem. (v) State densities versus level densities. The SMMC approach has been used extensively to calculate state densities. However, experiments often measure level densities (where levels are counted without including their spin degeneracies). A spin projection method enables us to also calculate level densities in SMMC. We have calculated the SMMC level density of 162Dy and found it to agree well with experiments.

  15. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  16. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  17. NASA Models of Space Radiation Induced Cancer, Circulatory Disease, and Central Nervous System Effects

    Science.gov (United States)

    Cucinotta, Francis A.; Chappell, Lori J.; Kim, Myung-Hee Y.

    2013-01-01

    effectiveness of radiation mitigators. The NSRM-2014 approaches to model radiation quality dependent lethality and NTEs will be described. CNS effects include both early changes that may occur during long space missions and late effects such as Alzheimer's disease (AD). AD affects 50% of the population above age 80 yr, is a degenerative disease that worsens with time after initial onset leading to death, and has no known cure. AD is difficult to detect at early stages, and the small number of low-LET epidemiology studies undertaken has not identified an association with low dose radiation. However, experimental studies in mice suggest GCR may lead to early onset AD. We discuss modeling approaches to consider mechanisms whereby radiation would lead to earlier onset of AD. Biomarkers of AD include amyloid beta (Aβ) plaques, and neurofibrillary tangles (NFT) made up of aggregates of the hyperphosphorylated form of the microtubule-associated tau protein. Related markers include synaptic degeneration, dendritic spine loss, and neuronal cell loss through apoptosis. Radiation may affect these processes by causing oxidative stress, aberrant signaling following DNA damage, and chronic neuroinflammation. Cell types to be considered in multi-scale models are neurons, astrocytes, and microglia. We developed biochemical and cell kinetics models of DNA damage signaling related to glycogen synthase kinase-3β (GSK3β) and neuroinflammation, and considered multi-scale modeling approaches to develop computer simulations of cell interactions and their relationships to Aβ plaques and NFTs. Comparison of model results to experimental data for the age-specific development of Aβ plaques in transgenic mice will be discussed.

  18. Infinite-mode squeezed coherent states and non-equilibrium statistical mechanics (phase-space-picture approach)

    International Nuclear Information System (INIS)

    Yeh, L.

    1992-01-01

    The phase-space-picture approach to quantum non-equilibrium statistical mechanics via the characteristic function of infinite-mode squeezed coherent states is introduced. We use quantum Brownian motion as an example to show how this approach provides an interesting geometrical interpretation of quantum non-equilibrium phenomena.

  19. Crane cabins' interior space multivariate anthropometric modeling.

    Science.gov (United States)

    Essdai, Ahmed; Spasojević Brkić, Vesna K; Golubović, Tamara; Brkić, Aleksandar; Popović, Vladimir

    2018-01-01

    Previous research has shown that today's crane cabins fail to meet the needs of a large proportion of operators. Performance and financial losses and effects on safety should not be overlooked either. The first aim of this survey is to model the crane cabin interior space using up-to-date crane operator anthropometric data and to compare the multivariate and univariate method anthropometric models. The second aim of the paper is to define the crane cabin interior space dimensions that enable anthropometric convenience. To facilitate the cabin design, the anthropometric dimensions of 64 crane operators in the first sample and 19 more in the second sample were collected in Serbia. The multivariate anthropometric models, spanning 95% of the population on the basis of a set of 8 anthropometric dimensions, have been developed. The percentile method was also used on the same set of data. The dimensions of the interior space necessary for the accommodation of the crane operator are 1174×1080×1865 mm. The percentile results for the 5th and 95th percentile models are within the obtained dimensions. The results of this study may prove useful to crane cabin designers in eliminating anthropometric inconsistencies and improving the health of operators, but can also aid in improving the safety, performance and financial results of the companies where crane cabins operate.

  20. Validation of nuclear models used in space radiation shielding applications

    International Nuclear Information System (INIS)

    Norman, Ryan B.; Blattnig, Steve R.

    2013-01-01

    A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.
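
    The flavour of an interval-based metric can be conveyed with a few lines of Python; the data below are invented and the formulas are simplified stand-ins rather than the exact metrics developed in the paper.

```python
# Hedged sketch of an interval-style validation metric: treat each measurement as
# an interval [y - u, y + u] and score a model by how far its prediction falls
# outside that interval (zero inside). Values are illustrative only.
import numpy as np

measured = np.array([120.0, 95.0, 300.0, 210.0])     # illustrative cross sections (mb)
uncert   = np.array([ 10.0,  8.0,  25.0,  15.0])     # experimental uncertainty (mb)
model    = np.array([128.0, 90.0, 350.0, 205.0])     # illustrative model predictions (mb)

# distance of the prediction from the measurement interval, relative to the measurement
excess = np.maximum(np.abs(model - measured) - uncert, 0.0)
relative = excess / measured

print("cumulative relative deviation:", float(relative.sum()))
print("median relative deviation:   ", float(np.median(relative)))
```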

  1. Efficient Neural Network Modeling for Flight and Space Dynamics Simulation

    Directory of Open Access Journals (Sweden)

    Ayman Hamdy Kassem

    2011-01-01

    Full Text Available This paper presents an efficient technique for neural network modeling of flight and space dynamics simulation. The technique will free the neural network designer from guessing the size and structure of the required neural network model and will help to minimize the number of neurons. For linear flight/space dynamics systems, the technique can find the network weights and biases directly by solving a system of linear equations without the need for training. Nonlinear flight dynamic systems can be easily modeled by training their linearized models while keeping the same network structure. The training is fast, as it uses the linear system knowledge to speed up the training process. The technique is tested on different flight/space dynamic models and showed promising results.
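
    The "no training" observation for linear dynamics can be illustrated as follows: when the network reduces to a linear map, its weights follow from a single least-squares solve. The dynamics matrices below are illustrative assumptions, not the paper's flight models.

```python
# Hedged sketch: for linear dynamics a single linear layer [x, u] -> x_next can be
# obtained directly from a least-squares solve, with no iterative training.
import numpy as np

rng = np.random.default_rng(6)
A = np.array([[0.99, 0.1], [-0.1, 0.95]])            # illustrative discrete-time dynamics
B = np.array([[0.0], [0.1]])

# generate state/input/next-state samples from the linear model
X = rng.normal(size=(500, 2)); U = rng.normal(size=(500, 1))
Y = X @ A.T + U @ B.T

Phi = np.hstack([X, U])
W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)          # network weights in one solve

print("recovered [A | B]^T:\n", W.round(3))
```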

  2. Mathematical Model of the Public Understanding of Space Science

    Science.gov (United States)

    Prisniakov, V.; Prisniakova, L.

    The success of space programs now depends in many respects on citizens' comprehension of the necessity of those programs and on the "space" erudition of the country. The purposefulness and efficiency of "space" teaching and educational activity depend on knowledge of the relationships between the separate variables of this process. Empirical methods of keeping taxpayers informed about space should be supplemented by theoretical models that demonstrate ways of controlling these processes. On the basis of 50 years of educational experience with students of the space-rocket profession, the authors obtain an equation for the "space" state of society, determining its degree of knowledge about space, about achievements in its development, about indispensable lines of investigation, and about the rate of informatization of the population. It is supposed that the change of space information consists of two parts: (1) information about practical achievements and about the development of specialized knowledge requiring independent financing, and (2) the intensity of dissemination of "free" information of a general educational nature reaching the population through mass media, books, the family, and educational institutions as part of the obligatory knowledge of any person. In the proposed model, the level of space awareness of the population depends on the intensity of dissemination of space information in society, on the volume of financing of space-rocket technology, on the share of the population employed in space-rocket programs, on a factor describing the education of the population in adherence to space problems, on the welfare and mentality of the people, and on the rates of unemployment and material inequality. The equation of the space state of society obtained on these principles corresponds to a cusp catastrophe; the analysis has shown ways of controlling the public understanding of space

  3. Validated TRNSYS Model for Solar Assisted Space Heating System

    International Nuclear Information System (INIS)

    Abdalla, Nedal

    2014-01-01

    The present study involves a validated TRNSYS model for a solar assisted space heating system as applied to a residential building in Jordan, using the new detailed radiation models of TRNSYS 17.1 and the geometric building model Trnsys3d for the Google SketchUp 3D drawing program. The annual heating load for a building (Solar House), which is located at the Royal Scientific Society (RSS) in Jordan, is estimated under the climatological conditions of Amman. The aim of this paper is to compare the measured thermal performance of the Solar House with that modeled using TRNSYS. The results showed that the annual measured space heating load for the building was 6,188 kWh while the heating load for the modeled building was 6,391 kWh. Moreover, the measured solar fraction for the solar system was 50% while the modeled solar fraction was 55%. A comparison of modeled and measured data resulted in percentage mean absolute errors for solar energy for space heating, auxiliary heating and solar fraction of 13%, 7% and 10%, respectively. The validated model will be useful for long-term performance simulation under different weather and operating conditions.(author)

  4. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  5. A Web Based Approach to Integrate Space Culture and Education

    Science.gov (United States)

    Gerla, F.

    2002-01-01

    Our intention is to dedicate a large section of our web site to space education. As the national User Support and Operation Center (USOC) for the International Space Station, MARS Center is also willing to provide material, such as videos and data, for educational purposes. In order to base our initiative on authoritative precedents, our first step has been a comparative analysis of different space agency education web sites, such as those of ESA and NASA. As is well known, the Internet is a powerful reality, capable of connecting people all over the world and rendering public a huge amount of information. The first problem, then, is to organize this information in order to use the web as an efficient education tool. That is why studies such as User Modeling (UM), Human Computer Interaction (HCI) and the Semantic Web have become more important in information technology and science. Traditional search engines are unable to provide an optimal retrieval of the contents users are really searching for. The Semantic Web is a valid alternative: according to its theories, web information should be represented using a metadata language. Users should be able and enabled to successfully search, obtain and study new information from the web. Forging knowledge in an intelligent manner, preventing users from making errors, and making this formidable quantity of information easily available have also been the starting points for HCI methodologies for defining adaptable interfaces. Here the information is divided into different sets, on the basis of the intended user profile, in order to prevent users from getting lost. Realized as an adaptable interface, an education web site can help users to effectively retrieve the information necessary for their purposes (teaching for a teacher and learning for a student). For students it is a great advantage to use interfaces designed on the basis of their age and scholastic level. Indeed, an adaptable interface is intended not just for students, but also for teachers

  6. Preliminary Multivariable Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multivariable space telescope cost model. The validity of the previously published models is tested. Cost estimating relationships which are and are not significant cost drivers are identified. Interrelationships between variables are also explored.

  7. Discrete Variational Approach for Modeling Laser-Plasma Interactions

    Science.gov (United States)

    Reyes, J. Paxon; Shadwick, B. A.

    2014-10-01

    The traditional approach for fluid models of laser-plasma interactions begins by approximating fields and derivatives on a grid in space and time, leading to difference equations that are manipulated to create a time-advance algorithm. In contrast, by introducing the spatial discretization at the level of the action, the resulting Euler-Lagrange equations have particular differencing approximations that will exactly satisfy discrete versions of the relevant conservation laws. For example, applying a spatial discretization in the Lagrangian density leads to continuous-time, discrete-space equations and exact energy conservation regardless of the spatial grid resolution. We compare the results of two discrete variational methods using the variational principles from Chen and Sudan and Brizard. Since the fluid system conserves energy and momentum, the relative errors in these conserved quantities are well-motivated physically as figures of merit for a particular method. This work was supported by the U. S. Department of Energy under Contract No. DE-SC0008382 and by the National Science Foundation under Contract No. PHY-1104683.
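
    The benefit of discretising at the level of the action can be seen already in one dimension: the velocity-Verlet scheme, which arises from a discrete Euler-Lagrange equation, keeps the energy of a harmonic oscillator bounded while forward Euler does not. The sketch below is a generic illustration, not the laser-plasma fluid model itself.

```python
# Hedged one-dimensional illustration: a variational integrator (velocity Verlet,
# obtained from a discrete Euler-Lagrange equation) versus forward Euler for a
# harmonic oscillator. Parameters are illustrative only.
m, k, h, steps = 1.0, 1.0, 0.1, 5000

def energy(q, v):
    return 0.5 * m * v ** 2 + 0.5 * k * q ** 2

# velocity-Verlet update, starting from q = 1, v = 0 (exact energy is 0.5)
q, v = 1.0, 0.0
for _ in range(steps):
    a = -(k / m) * q
    q += h * v + 0.5 * h * h * a
    a_new = -(k / m) * q
    v += 0.5 * h * (a + a_new)
print("variational (Verlet) energy drift:", abs(energy(q, v) - 0.5))

# forward Euler for comparison: the energy grows without bound
qe, ve = 1.0, 0.0
for _ in range(steps):
    qe, ve = qe + h * ve, ve - h * (k / m) * qe
print("forward Euler energy drift:      ", abs(energy(qe, ve) - 0.5))
```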

  8. Astronaut Ross Approaches Assembly Concept for Construction of Erectable Space Structure (ACCESS)

    Science.gov (United States)

    1999-01-01

    The crew assigned to the STS-61B mission included Bryan D. O'Connor, pilot; Brewster H. Shaw, commander; Charles D. Walker, payload specialist; mission specialists Jerry L. Ross, Mary L. Cleave, and Sherwood C. Spring; and Rodolfo Neri Vela, payload specialist. Launched aboard the Space Shuttle Atlantis November 28, 1985 at 7:29:00 pm (EST), the STS-61B mission's primary payload included three communications satellites: MORELOS-B (Mexico); AUSSAT-2 (Australia); and SATCOM KU-2 (RCA Americom). Two experiments were conducted to test assembling erectable structures in space: EASE (Experimental Assembly of Structures in Extravehicular Activity), and ACCESS (Assembly Concept for Construction of Erectable Space Structure). In a joint venture between NASA/Langley Research Center in Hampton, Virginia, and the Marshall Space Flight Center (MSFC), EASE and ACCESS were developed and demonstrated at MSFC's Neutral Buoyancy Simulator (NBS). In this STS-61B onboard photo, astronaut Ross, perched on the Manipulator Foot Restraint (MFR), approaches the erected ACCESS. The primary objective of these experiments was to test the structural assembly concepts for suitability as the framework for larger space structures and to identify ways to improve the productivity of space construction.

  9. Field space entanglement entropy, zero modes and Lifshitz models

    Directory of Open Access Journals (Sweden)

    Helmuth Huffel

    2017-12-01

    Full Text Available The field space entanglement entropy of a quantum field theory is obtained by integrating out a subset of its fields. We study an interacting quantum field theory consisting of massless scalar fields on a closed compact manifold M. To this model we associate its Lifshitz dual model. The ground states of both models are invariant under constant shifts. We interpret this invariance as gauge symmetry and subject the models to proper gauge fixing. By applying the heat kernel regularization one can show that the field space entanglement entropies of the massless scalar field model and of its Lifshitz dual are agreeing.

  10. The "Carbon Data Explorer": Web-Based Space-Time Visualization of Modeled Carbon Fluxes

    Science.gov (United States)

    Billmire, M.; Endsley, K. A.

    2014-12-01

    The visualization of and scientific "sense-making" from large datasets varying in both space and time is a challenge; one that is still being addressed in a number of different fields. The approaches taken thus far are often specific to a given academic field due to the unique questions that arise in different disciplines; however, basic approaches such as geographic maps and time series plots are still widely useful. The proliferation of model estimates of increasing size and resolution further complicates what ought to be a simple workflow: model some geophysical phenomenon, obtain results and measure uncertainty, organize and display the data, make comparisons across trials, and share findings. A new tool is in development that is intended to help scientists with the latter parts of that workflow. The tentatively-titled "Carbon Data Explorer" (http://spatial.mtri.org/flux-client/) enables users to access carbon science and related spatio-temporal science datasets over the web. All that is required to access multiple interactive visualizations of carbon science datasets is a compatible web browser and an internet connection. While the application targets atmospheric and climate science datasets, particularly spatio-temporal model estimates of carbon products, the software architecture takes an agnostic approach to the data to be visualized. Any atmospheric, biophysical, or geophysical quantity that varies in space and time, including one or more measures of uncertainty, can be visualized within the application. Within the web application, users have seamless control over a flexible and consistent symbology for map-based visualizations and plots. Where time series data are represented by one or more data "frames" (e.g. a map), users can animate the data. In the "coordinated view," users can make direct comparisons between different frames and different models or model runs, facilitating inter-model comparisons and assessments of spatio-temporal variability. Map

  11. A potential theory approach to an algorithm of conceptual space partitioning

    Directory of Open Access Journals (Sweden)

    Roman Urban

    2017-12-01

    Full Text Available This paper proposes a new classification algorithm for the partitioning of a conceptual space. All the algorithms which have been used until now have mostly been based on the theory of Voronoi diagrams. This paper proposes an approach based on potential theory, with the criteria for measuring similarities between objects in the conceptual space being based on the Newtonian potential function. The notion of a fuzzy prototype, which generalizes the previous definition of a prototype, is introduced. Furthermore, the necessary conditions that a natural concept must meet are discussed. Instead of convexity, as proposed by Gärdenfors, the notion of geodesically convex sets is used. Thus, if a concept corresponds to a set which is geodesically convex, it is a natural concept. This definition applies, for example, if the conceptual space is a Euclidean space. As a by-product of the construction of the algorithm, an extension of the conceptual space to d-dimensional Riemannian manifolds is obtained.
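
    A minimal sketch of potential-based partitioning, with invented prototypes and weights, is given below: each (fuzzy) prototype contributes a Newtonian-style potential m/r, and a point is assigned to the concept whose prototypes generate the largest total potential.

```python
# Hedged sketch of potential-based partitioning: prototypes exert a Newtonian-style
# potential m / r on a point, which is assigned to the concept with the largest
# total potential. Prototypes, weights and concept names are invented.
import numpy as np

prototypes = {                                 # fuzzy prototypes: several weighted points each
    "warm colours": (np.array([[1.0, 0.0], [1.2, 0.3]]), np.array([1.0, 0.5])),
    "cool colours": (np.array([[-1.0, 0.2], [-0.8, -0.3]]), np.array([1.0, 1.0])),
}

def potential(x, points, masses, eps=1e-9):
    r = np.linalg.norm(points - x, axis=1)
    return float((masses / (r + eps)).sum())

def classify(x):
    return max(prototypes, key=lambda c: potential(x, *prototypes[c]))

for x in (np.array([0.6, 0.1]), np.array([-0.2, 0.0])):
    print(x, "->", classify(x))
```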

  12. Fitted Hanbury-Brown Twiss radii versus space-time variances in flow-dominated models

    Science.gov (United States)

    Frodermann, Evan; Heinz, Ulrich; Lisa, Michael Annan

    2006-04-01

    The inability of otherwise successful dynamical models to reproduce the Hanbury-Brown Twiss (HBT) radii extracted from two-particle correlations measured at the Relativistic Heavy Ion Collider (RHIC) is known as the RHIC HBT Puzzle. Most comparisons between models and experiment exploit the fact that for Gaussian sources the HBT radii agree with certain combinations of the space-time widths of the source that can be directly computed from the emission function without having to evaluate, at significant expense, the two-particle correlation function. We here study the validity of this approach for realistic emission function models, some of which exhibit significant deviations from simple Gaussian behavior. By Fourier transforming the emission function, we compute the two-particle correlation function, and fit it with a Gaussian to partially mimic the procedure used for measured correlation functions. We describe a novel algorithm to perform this Gaussian fit analytically. We find that for realistic hydrodynamic models the HBT radii extracted from this procedure agree better with the data than the values previously extracted from the space-time widths of the emission function. Although serious discrepancies between the calculated and the measured HBT radii remain, we show that a more apples-to-apples comparison of models with data can play an important role in any eventually successful theoretical description of RHIC HBT data.
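
    A one-dimensional toy version of the comparison is sketched below, assuming scipy is available: an exponential (non-Gaussian) source is Fourier transformed into a two-particle correlator, a Gaussian fit extracts an "HBT radius", and the result is compared with the radius inferred from the source's space-time variance.

```python
# Hedged 1D toy comparison (assumes scipy): Gaussian-fit radius of the correlator
# versus the variance-based radius for a non-Gaussian emission profile.
import numpy as np
from scipy.optimize import curve_fit

x = np.linspace(-30.0, 30.0, 2001)
dx = x[1] - x[0]
S = np.exp(-np.abs(x) / 3.0)                       # non-Gaussian (exponential) source
norm = S.sum() * dx

q = np.linspace(0.01, 1.0, 100)
S_tilde = (S[None, :] * np.exp(1j * q[:, None] * x[None, :])).sum(axis=1) * dx
C = 1.0 + np.abs(S_tilde) ** 2 / norm ** 2         # two-particle correlator

def gauss(q, lam, R):
    return 1.0 + lam * np.exp(-(q * R) ** 2)

(lam, R_fit), _ = curve_fit(gauss, q, C, p0=(1.0, 3.0))
R_var = np.sqrt((x ** 2 * S).sum() * dx / norm)    # space-time-variance radius

print(f"Gaussian-fit radius {R_fit:.2f}  variance-based radius {R_var:.2f}")
```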

  13. Fitted Hanbury-Brown-Twiss radii versus space-time variances in flow-dominated models

    International Nuclear Information System (INIS)

    Frodermann, Evan; Heinz, Ulrich; Lisa, Michael Annan

    2006-01-01

    The inability of otherwise successful dynamical models to reproduce the Hanbury-Brown-Twiss (HBT) radii extracted from two-particle correlations measured at the Relativistic Heavy Ion Collider (RHIC) is known as the RHIC HBT Puzzle. Most comparisons between models and experiment exploit the fact that for Gaussian sources the HBT radii agree with certain combinations of the space-time widths of the source that can be directly computed from the emission function without having to evaluate, at significant expense, the two-particle correlation function. We here study the validity of this approach for realistic emission function models, some of which exhibit significant deviations from simple Gaussian behavior. By Fourier transforming the emission function, we compute the two-particle correlation function, and fit it with a Gaussian to partially mimic the procedure used for measured correlation functions. We describe a novel algorithm to perform this Gaussian fit analytically. We find that for realistic hydrodynamic models the HBT radii extracted from this procedure agree better with the data than the values previously extracted from the space-time widths of the emission function. Although serious discrepancies between the calculated and the measured HBT radii remain, we show that a more apples-to-apples comparison of models with data can play an important role in any eventually successful theoretical description of RHIC HBT data

  14. Supporting Indoor Navigation Using Access Rights to Spaces Based on Combined Use of IndoorGML and LADM Models

    Directory of Open Access Journals (Sweden)

    Abdullah Alattas

    2017-11-01

    Full Text Available The aim of this research is to investigate the combined use of IndoorGML and the Land Administration Domain Model (LADM to define the accessibility of the indoor spaces based on the ownership and/or the functional right for use. The users of the indoor spaces create a relationship with the space depending on the type of the building and the function of the spaces. The indoor spaces of each building have different usage functions and associated users. By defining the user types of the indoor spaces, LADM makes it possible to establish a relationship between the indoor spaces and the users. LADM assigns rights, restrictions, and responsibilities to each indoor space, which indicates the accessible spaces for each type of user. The three-dimensional (3D geometry of the building will be impacted by assigning such functional rights, and will provide additional knowledge to path computation for an individual or a group of users. As a result, the navigation process will be more appropriate and simpler because the navigation path will avoid all of the non-accessible spaces based on the rights of the party. The combined use of IndoorGML and LADM covers a broad range of information classes: (indoor 3D cell spaces, connectivity, spatial units/boundaries, (access/use rights and restrictions, parties/persons/actors, and groups of them. The new specialized classes for individual students, individual staff members, groups of students, groups of staff members are able to represent cohorts of education programmes and the organizational structure (organogram: faculty, department, group. The model is capable to represent the access times to lecture rooms (based on education/teaching schedules, use rights of meeting rooms, opening hours of offices, etc. The two original standard models remain independent in our approach, we do not propose yet another model, but applications can fully benefit of the potential of the combined use, which is an important contribution

  15. Converting boundary representation solid models to half-space representation models for Monte Carlo analysis

    International Nuclear Information System (INIS)

    Davis, J. E.; Eddy, M. J.; Sutton, T. M.; Altomari, T. J.

    2007-01-01

    Solid modeling computer software systems provide for the design of three-dimensional solid models used in the design and analysis of physical components. The current state-of-the-art in solid modeling representation uses a boundary representation format in which geometry and topology are used to form three-dimensional boundaries of the solid. The geometry representation used in these systems is cubic B-spline curves and surfaces - a network of cubic B-spline functions in three-dimensional Cartesian coordinate space. Many Monte Carlo codes, however, use a geometry representation in which geometry units are specified by intersections and unions of half-spaces. This paper describes an algorithm for converting from a boundary representation to a half-space representation. (authors)

  16. A novel approach to finely tuned supersymmetric standard models: The case of the non-universal Higgs mass model

    Science.gov (United States)

    Yamaguchi, Masahiro; Yin, Wen

    2018-02-01

    Discarding the prejudice about fine tuning, we propose a novel and efficient approach to identify relevant regions of fundamental parameter space in supersymmetric models with some amount of fine tuning. The essential idea is the mapping of experimental constraints at a low-energy scale, rather than the parameter sets, to those of the fundamental parameter space. Applying this method to the non-universal Higgs mass model, we identify a new interesting superparticle mass pattern in which some of the first two generation squarks are light whilst the stops are kept as heavy as 6 TeV. Furthermore, as another application of this method, we show that the discrepancy of the muon anomalous magnetic dipole moment can be accounted for by a supersymmetric contribution within the 1σ level of the experimental and theoretical errors, a possibility overlooked by previous studies due to the extreme fine tuning required.

  17. A growing social network model in geographical space

    Science.gov (United States)

    Antonioni, Alberto; Tomassini, Marco

    2017-09-01

    In this work we propose a new model for the generation of social networks that includes their often ignored spatial aspects. The model is a growing one and links are created either taking space into account, or disregarding space and only considering the degree of target nodes. These two effects can be mixed linearly in arbitrary proportions through a parameter. We numerically show that for a given range of the combination parameter, and for given mean degree, the generated network class shares many important statistical features with those observed in actual social networks, including the spatial dependence of connections. Moreover, we show that the model provides a good qualitative fit to some measured social networks.
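
    A minimal sketch of a growing spatial network of this kind is given below: each new node placed at a random position attaches to existing nodes with probabilities that mix a degree term and a distance term linearly through a parameter alpha. The attachment kernel, the number of links per new node, and all parameter values are illustrative assumptions, not the exact rules of the published model.

```python
# Growing spatial network: linear mix of degree- and distance-based attachment.
import numpy as np

rng = np.random.default_rng(0)

def grow_network(n_nodes=500, m_links=3, alpha=0.5):
    pos = [rng.random(2)]              # node coordinates in the unit square
    degree = [0]
    edges = []
    for new in range(1, n_nodes):
        p_new = rng.random(2)
        d = np.array([np.linalg.norm(p_new - p) for p in pos])
        k = np.array(degree, dtype=float)
        # mix degree attachment and spatial attachment linearly through alpha
        w_deg = (k + 1.0) / (k + 1.0).sum()
        w_spa = (1.0 / (d + 1e-3)) / (1.0 / (d + 1e-3)).sum()
        w = alpha * w_deg + (1.0 - alpha) * w_spa
        targets = rng.choice(len(pos), size=min(m_links, len(pos)),
                             replace=False, p=w)
        for t in targets:
            edges.append((new, int(t)))
            degree[t] += 1
        pos.append(p_new)
        degree.append(len(targets))
    return pos, edges, degree

pos, edges, degree = grow_network()
print("nodes:", len(pos), "edges:", len(edges), "max degree:", max(degree))
```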

  18. The development of a model of creative space and its potential for transfer from non-formal to formal education

    Science.gov (United States)

    White, Irene; Lorenzi, Francesca

    2016-12-01

    Creativity has been emerging as a key concept in educational policies since the mid-1990s, with many Western countries restructuring their education systems to embrace innovative approaches likely to stimulate creative and critical thinking. But despite current intentions of putting more emphasis on creativity in education policies worldwide, there is still a relative dearth of viable models which capture the complexity of creativity and the conditions for its successful infusion into formal school environments. The push for creativity is in direct conflict with the results-driven/competitive performance-oriented culture which continues to dominate formal education systems. The authors of this article argue that incorporating creativity into mainstream education is a complex task and is best tackled by taking a systematic and multifaceted approach. They present a multidimensional model designed to help educators in tackling the challenges of the promotion of creativity. Their model encompasses three distinct yet interrelated dimensions of a creative space - physical, social-emotional and critical. The authors use the metaphor of space to refer to the interplay of the three identified dimensions. Drawing on confluence approaches to the theorisation of creativity, this paper exemplifies the development of a model against the background of a growing trend towards systems theories. The aim of the model is to be helpful in systematising creativity by offering parameters - derived from the evaluation of an example offered by a non-formal educational environment - for the development of creative environments within mainstream secondary schools.

  19. Studying Economic Space: Synthesis of Balance and Game-Theoretic Methods of Modelling

    Directory of Open Access Journals (Sweden)

    Natalia Gennadyevna Zakharchenko

    2015-12-01

    Full Text Available The article addresses questions about the development of models used to study economic space. The author proposes a model that combines balance and game-theoretic methods for estimating the system effects of economic agents' interactions in a multi-level economic space. The model is applied to study interactions between economic agents that are spatially heterogeneous within the Russian Far East. In the model, the economic space of the region is considered in a territorial dimension (the first level of decomposing space) and also in territorial and product dimensions (the second level of decomposing space). The paper shows the mechanism of system-effect formation that exists in the economic space of the region. The author estimates the system effects, analyses the actual allocation of these effects between economic agents and identifies three types of local industrial markets: those with zero, positive and negative system effects

  20. A new epidemic modeling approach: Multi-regions discrete-time model with travel-blocking vicinity optimal control strategy.

    Science.gov (United States)

    Zakary, Omar; Rachik, Mostafa; Elmouki, Ilias

    2017-08-01

    First, we devise a multi-region discrete-time model which describes the spatio-temporal spread of an epidemic that starts in one region and spreads to regions connected with their neighbors by any kind of anthropological movement. We assume homogeneous Susceptible-Infected-Removed (SIR) populations, and in our simulations we consider a grid of colored cells, which represents the whole domain affected by the epidemic, while each cell can represent a sub-domain or region. Second, in order to minimize the number of infected individuals in one region, we propose an optimal control approach based on a travel-blocking vicinity strategy which aims to control only one cell by restricting the movements of infected people coming from all neighboring cells. We then show the influence of the optimal control approach on the controlled cell. We should also note that the cellular modeling approach proposed here can also describe the infection dynamics of regions which are not necessarily attached to one another, even if no empty space can be seen between cells. The theoretical method we follow for the characterization of the travel-blocking optimal controls is based on a discrete version of Pontryagin's maximum principle, while the numerical approach applied to the multi-point boundary value problems we obtain is based on discrete progressive-regressive iterative schemes. We illustrate our modeling and control approaches with an example of 100 regions.
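
    The sketch below gives a highly simplified, discrete-time SIR model on a grid of regions with a crude travel-blocking switch that removes the infectious pressure arriving from the neighbours of one protected cell. All rates, the coupling term, and the blocking rule are illustrative assumptions and do not reproduce the Pontryagin-based optimal control formulation described above.

```python
# Multi-region discrete-time SIR on a grid, with a simple travel-blocking switch.
import numpy as np

n = 10                       # grid of n x n regions
beta, gamma = 0.3, 0.1       # infection and removal rates (illustrative)
eps = 0.05                   # coupling to neighbouring regions
steps = 200
protected = (5, 5)           # cell whose vicinity travel is blocked
block = True

S = np.full((n, n), 0.999)
I = np.zeros((n, n)); I[0, 0] = 0.001   # epidemic starts in one region
R = np.zeros((n, n))

def neighbour_pressure(I):
    """Sum of infectious fractions in the four neighbouring regions."""
    P = np.zeros_like(I)
    P[1:, :] += I[:-1, :]; P[:-1, :] += I[1:, :]
    P[:, 1:] += I[:, :-1]; P[:, :-1] += I[:, 1:]
    return P

for _ in range(steps):
    P = neighbour_pressure(I)
    if block:
        P[protected] = 0.0           # block travel from neighbouring cells
    force = beta * S * (I + eps * P) # local + imported infection pressure
    new_inf = np.minimum(force, S)
    S, I, R = S - new_inf, I + new_inf - gamma * I, R + gamma * I

print("infected fraction in protected cell: %.4f" % I[protected])
print("mean infected fraction elsewhere:    %.4f" % I.mean())
```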

  1. Using SpaceClaim/TD Direct for Modeling Components with Complex Geometries for the Thermal Desktop-Based Advanced Stirling Radioisotope Generator Model

    Science.gov (United States)

    Fabanich, William A., Jr.

    2014-01-01

    SpaceClaim/TD Direct has been used extensively in the development of the Advanced Stirling Radioisotope Generator (ASRG) thermal model. This paper outlines the workflow for that aspect of the task and includes proposed best practices and lessons learned. The ASRG thermal model was developed to predict component temperatures and power output and to provide insight into the prime contractor's thermal modeling efforts. The insulation blocks, heat collectors, and cold side adapter flanges (CSAFs) were modeled with this approach. The model was constructed using mostly TD finite difference (FD) surfaces/solids. However, some complex geometry could not be reproduced with TD primitives while maintaining the desired degree of geometric fidelity. Using SpaceClaim permitted the import of original CAD files and enabled the defeaturing/repair of those geometries. TD Direct (a SpaceClaim add-on from CRTech) adds features that allowed the "mark-up" of that geometry. These so-called "mark-ups" control how finite element (FE) meshes are to be generated through the "tagging" of features (e.g. edges, solids, surfaces). These tags represent parameters that include: submodels, material properties, material orienters, optical properties, and radiation analysis groups. TD aliases were used for most tags to allow analysis to be performed with a variety of parameter values. "Domain-tags" were also attached to individual and groups of surfaces and solids to allow them to be used later within TD to populate objects like, for example, heaters and contactors. These tools allow the user to make changes to the geometry in SpaceClaim and then easily synchronize the mesh in TD without having to redefine the objects each time as one would if using TDMesher. The use of SpaceClaim/TD Direct helps simplify the process for importing existing geometries and in the creation of high fidelity FE meshes to represent complex parts. It also saves time and effort in the subsequent analysis.

  2. Using SpaceClaim/TD Direct for Modeling Components with Complex Geometries for the Thermal Desktop-Based Advanced Stirling Radioisotope Generator Model

    Science.gov (United States)

    Fabanich, William

    2014-01-01

    SpaceClaim/TD Direct has been used extensively in the development of the Advanced Stirling Radioisotope Generator (ASRG) thermal model. This paper outlines the workflow for that aspect of the task and includes proposed best practices and lessons learned. The ASRG thermal model was developed to predict component temperatures and power output and to provide insight into the prime contractor's thermal modeling efforts. The insulation blocks, heat collectors, and cold side adapter flanges (CSAFs) were modeled with this approach. The model was constructed using mostly TD finite difference (FD) surfaces/solids. However, some complex geometry could not be reproduced with TD primitives while maintaining the desired degree of geometric fidelity. Using SpaceClaim permitted the import of original CAD files and enabled the defeaturing/repair of those geometries. TD Direct (a SpaceClaim add-on from CRTech) adds features that allowed the "mark-up" of that geometry. These so-called "mark-ups" control how finite element (FE) meshes were generated and allowed the "tagging" of features (e.g. edges, solids, surfaces). These tags represent parameters that include: submodels, material properties, material orienters, optical properties, and radiation analysis groups. TD aliases were used for most tags to allow analysis to be performed with a variety of parameter values. "Domain-tags" were also attached to individual and groups of surfaces and solids to allow them to be used later within TD to populate objects like, for example, heaters and contactors. These tools allow the user to make changes to the geometry in SpaceClaim and then easily synchronize the mesh in TD without having to redefine these objects each time as one would if using TD Mesher. The use of SpaceClaim/TD Direct has helped simplify the process for importing existing geometries and in the creation of high fidelity FE meshes to represent complex parts. It has also saved time and effort in the subsequent analysis.

  3. Off-take Model of the SPACE Code and Its Validation

    International Nuclear Information System (INIS)

    Oh, Myung Taek; Park, Chan Eok; Sohn, Jong Joo

    2011-01-01

    Liquid entrainment and vapor pull-through models for a horizontal pipe have been implemented in the SPACE code. The SPACE model accounts for phase separation phenomena and computes the flux of mass and energy through an off-take attached to a horizontal pipe when stratified conditions occur in the horizontal pipe. This model is referred to as the off-take model. The importance of predicting the fluid conditions through an off-take in a small-break LOCA is well known. In this case, the occurrence of stratification can affect the break node void fraction and thus the break flow discharged from the primary system. In order to validate the off-take model newly developed for the SPACE code, a simulation of the HDU experiments has been performed. The main features of the off-take model and its application results are presented in this paper

  4. Parameter retrieval of chiral metamaterials based on the state-space approach.

    Science.gov (United States)

    Zarifi, Davoud; Soleimani, Mohammad; Abdolali, Ali

    2013-08-01

    This paper introduces an approach for the electromagnetic characterization of homogeneous chiral layers. The proposed method is based on the state-space approach and on the properties of a 4×4 state transition matrix. First, the forward-problem analysis through the state-space method is reviewed, and properties of the state transition matrix of a chiral layer are presented and proved as two theorems. The formulation of the proposed electromagnetic characterization method is then presented. In this method, scattering data for a linearly polarized plane wave normally incident on a homogeneous chiral slab are combined with the properties of the state transition matrix to yield a powerful characterization method. The main difference with respect to other well-established retrieval procedures based on the scattering parameters lies in the direct computation of the transfer matrix of the slab, as opposed to the conventional calculation of the propagation constant and impedance of the modes supported by the medium. The proposed approach avoids the nonlinearity of the problem, but it requires enough equations to fulfill the task; these are obtained from the properties of the state transition matrix. To demonstrate the applicability and validity of the method, the constitutive parameters of two well-known dispersive chiral metamaterial structures at microwave frequencies are retrieved. The results show that the proposed method is robust and reliable.

  5. Influence of Population Variation of Physiological Parameters in Computational Models of Space Physiology

    Science.gov (United States)

    Myers, J. G.; Feola, A.; Werner, C.; Nelson, E. S.; Raykin, J.; Samuels, B.; Ethier, C. R.

    2016-01-01

    The earliest manifestations of Visual Impairment and Intracranial Pressure (VIIP) syndrome become evident after months of spaceflight and include a variety of ophthalmic changes, including posterior globe flattening and distension of the optic nerve sheath. Prevailing evidence links the occurrence of VIIP to the cephalic fluid shift induced by microgravity and the subsequent pressure changes around the optic nerve and eye. Deducing the etiology of VIIP is challenging due to the wide range of physiological parameters that may be influenced by spaceflight and are required to address a realistic spectrum of physiological responses. Here, we report on the application of an efficient approach to interrogating physiological parameter space through computational modeling. Specifically, we assess the influence of uncertainty in input parameters for two models of VIIP syndrome: a lumped-parameter model (LPM) of the cardiovascular and central nervous systems, and a finite-element model (FEM) of the posterior eye, optic nerve head (ONH) and optic nerve sheath. Methods: To investigate the parameter space in each model, we employed Latin hypercube sampling partial rank correlation coefficient (LHSPRCC) strategies. LHS techniques outperform Monte Carlo approaches by enforcing efficient sampling across the entire range of all parameters. The PRCC method estimates the sensitivity of model outputs to these parameters while adjusting for the linear effects of all other inputs. The LPM analysis addressed uncertainties in 42 physiological parameters, such as initial compartmental volume and nominal compartment percentage of total cardiac output in the supine state, while the FEM evaluated the effects on biomechanical strain from uncertainties in 23 material and pressure parameters for the ocular anatomy. Results and Conclusion: The LPM analysis identified several key factors including high sensitivity to the initial fluid distribution. The FEM study found that intraocular pressure and
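
    The LHS/PRCC workflow itself is generic and can be sketched compactly; the example below samples a three-parameter space with a Latin hypercube and computes partial rank correlation coefficients against the output of a stand-in function. The parameter names, ranges, and the toy model are assumptions; the actual VIIP lumped-parameter and finite-element models are not reproduced.

```python
# Latin hypercube sampling + partial rank correlation coefficients (LHS/PRCC).
import numpy as np
from scipy.stats import qmc, rankdata

rng = np.random.default_rng(1)
names = ["p1", "p2", "p3"]
lower = np.array([0.5, 10.0, 0.1])
upper = np.array([1.5, 50.0, 0.9])

# Latin hypercube sample of the 3-parameter space
X = qmc.scale(qmc.LatinHypercube(d=3, seed=1).random(n=200), lower, upper)

def toy_model(x):                     # placeholder for the expensive model
    return x[:, 0] ** 2 + 0.1 * x[:, 1] + rng.normal(0, 0.05, len(x))

y = toy_model(X)

def prcc(X, y, j):
    """Partial rank correlation of input j with output y."""
    Xr, yr = np.apply_along_axis(rankdata, 0, X), rankdata(y)
    others = np.delete(Xr, j, axis=1)
    A = np.column_stack([others, np.ones(len(y))])
    res_x = Xr[:, j] - A @ np.linalg.lstsq(A, Xr[:, j], rcond=None)[0]
    res_y = yr - A @ np.linalg.lstsq(A, yr, rcond=None)[0]
    return np.corrcoef(res_x, res_y)[0, 1]

for j, name in enumerate(names):
    print(f"PRCC({name}) = {prcc(X, y, j):+.2f}")
```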

  6. In-Space Manufacturing Baseline Property Development

    Science.gov (United States)

    Stockman, Tom; Schneider, Judith; Prater, Tracie; Bean, Quincy; Werkheiser, Nicki

    2016-01-01

    The In-Space Manufacturing (ISM) project at NASA Marshall Space Flight Center currently operates a 3D FDM (fused deposition modeling) printer onboard the International Space Station. In order to enable utilization of this capability by designers, the project needs to establish characteristic material properties for materials produced using the process. This is difficult for additive manufacturing since standards and specifications do not yet exist for these technologies. Because of the limited availability of crew time, sample sizes are restricted, which in turn limits the application of traditional design-allowables approaches to developing a materials property database for designers. In this study, various approaches to the development of material databases were evaluated for use by designers of space systems who wish to leverage in-space manufacturing capabilities. The study focuses on alternative statistical techniques for baseline property development to support in-space manufacturing.

  7. Symbols, spaces and materiality: a transmission-based approach to Aegean Bronze Age ritual.

    OpenAIRE

    Briault, C.

    2005-01-01

    This thesis explores the transmission of ritual practices in the second millennium BC Aegean. In contrast to previous approaches, which often overlook gaps in the diachronic record, emphasising continuity in cult practice over very long timescales, it is argued here that through charting the spatial and temporal distributions of three broad material types (cult symbols, spaces and objects), it is possible to document the spread of cult practice over time and space, and, crucially, to monitor ...

  8. Contaminant ingress into multizone buildings: An analytical state-space approach

    KAUST Repository

    Parker, Simon; Coffey, Chris; Gravesen, Jens; Kirkpatrick, James; Ratcliffe, Keith; Lingard, Bryan; Nally, James

    2013-01-01

    The ingress of exterior contaminants into buildings is often assessed by treating the building interior as a single well-mixed space. Multizone modelling provides an alternative way of representing buildings that can estimate concentration time
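
    Although the record above is truncated, the underlying idea of treating a multizone building as a linear state-space system can be sketched as follows; the three-zone serial ventilation path, volumes, flow rate, and exterior concentration are invented for illustration and are not taken from the paper.

```python
# Contaminant ingress into a three-zone building as a linear state-space model
# dx/dt = A x + b, with a serial ventilation path ext -> zone1 -> zone2 -> zone3.
import numpy as np
from scipy.linalg import expm

V = np.array([200.0, 150.0, 100.0])   # zone volumes (m^3)
Q = 50.0                              # ventilation flow (m^3/h)
c_ext = 1.0                           # exterior concentration (arbitrary units)

A = np.array([[-Q / V[0],       0.0,       0.0],
              [ Q / V[1], -Q / V[1],       0.0],
              [      0.0,  Q / V[2], -Q / V[2]]])
b = np.array([Q / V[0] * c_ext, 0.0, 0.0])

x_ss = -np.linalg.solve(A, b)         # steady-state concentrations
x0 = np.zeros(3)                      # initially clean building

for t in [0.5, 1.0, 2.0, 4.0, 8.0]:   # hours
    x_t = x_ss + expm(A * t) @ (x0 - x_ss)
    print(f"t = {t:4.1f} h  zone concentrations:", np.round(x_t, 3))
```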

  9. Ensemble downscaling in coupled solar wind-magnetosphere modeling for space weather forecasting.

    Science.gov (United States)

    Owens, M J; Horbury, T S; Wicks, R T; McGregor, S L; Savani, N P; Xiong, M

    2014-06-01

    Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind "noise," which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical "downscaling" of solar wind model results prior to their use as input to a magnetospheric model. As the magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme. Key points: solar wind models must be downscaled in order to drive magnetospheric models; ensemble downscaling is more effective than deterministic downscaling; and the magnetosphere responds nonlinearly to small-scale solar wind fluctuations.
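
    A toy version of the downscaling step described above is sketched below: a synthetic "solar wind" series is smoothed with an 8 h filter to mimic model output, and an ensemble is built by adding noise resampled from the removed residuals (the simple PDF-based parameterization mentioned in the abstract). The synthetic series and all numbers are illustrative.

```python
# Ensemble downscaling sketch: smooth a series, then add back resampled
# small-scale residuals to create ensemble members.
import numpy as np

rng = np.random.default_rng(2)
dt_hours = 1.0
n = 24 * 30                                   # one month of hourly values
t = np.arange(n) * dt_hours

# synthetic "observed" solar wind speed: slow structure + small-scale noise
obs = 400 + 50 * np.sin(2 * np.pi * t / (27 * 24)) + rng.normal(0, 20, n)

window = int(8 / dt_hours)                    # 8 h boxcar filter
smooth = np.convolve(obs, np.ones(window) / window, mode="same")
residual = obs - smooth                       # small-scale structure removed

# ensemble members: smoothed series + resampled residuals
n_members = 20
ensemble = np.array([
    smooth + rng.choice(residual, size=n, replace=True)
    for _ in range(n_members)
])

print("obs std %.1f, smoothed std %.1f, ensemble-member std %.1f"
      % (obs.std(), smooth.std(), ensemble[0].std()))
```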

  10. Gamow-Teller response in the configuration space of a density-functional-theory-rooted no-core configuration-interaction model

    Science.gov (United States)

    Konieczka, M.; Kortelainen, M.; Satuła, W.

    2018-03-01

    Background: The atomic nucleus is a unique laboratory in which to study fundamental aspects of the electroweak interaction. This includes the question of the in-medium renormalization of the axial-vector current, which still lacks a satisfactory explanation. Study of the spin-isospin or Gamow-Teller (GT) response may provide valuable information both on the quenching of the axial-vector coupling constant and on nuclear structure and nuclear astrophysics. Purpose: We have performed a seminal calculation of the GT response by using the no-core configuration-interaction approach rooted in multireference density functional theory (DFT-NCCI). The model properly treats isospin and rotational symmetries and can be applied to calculate both the nuclear spectra and transition rates in atomic nuclei, irrespective of their mass and particle-number parity. Methods: The DFT-NCCI calculation proceeds as follows: First, one builds a configuration space by computing the (multi)particle-(multi)hole Slater determinants relevant to a given physical problem. Next, one applies the isospin and angular-momentum projections and performs the isospin and K mixing in order to construct a model space composed of linearly dependent states of good angular momentum. Finally, one mixes the projected states by solving the Hill-Wheeler-Griffin equation. Results: The method is applied to compute the GT strength distribution in selected N ≈ Z nuclei, including the p-shell 8Li and 8Be nuclei and the well-deformed sd-shell nucleus 24Mg. In order to demonstrate the flexibility of the approach, we also present a calculation of the superallowed GT β decay in doubly magic spherical 100Sn and the low-spin spectrum in 100In. Conclusions: It is demonstrated that the DFT-NCCI model is capable of capturing the GT response satisfactorily by using a relatively small configuration space, while simultaneously exhausting the GT sum rule. The model, due to its flexibility and broad range of applicability, may

  11. The Living With a Star Program Space Environment Testbed

    Science.gov (United States)

    Barth, Janet; Day, John H. (Technical Monitor)

    2001-01-01

    This viewgraph presentation describes the objective, approach, and scope of the Living With a Star (LWS) program at the Marshall Space Flight Center. Scientists involved in the project seek to refine the understanding of space weather and the role of solar variability in terrestrial climate change. Research and the development of improved analytic methods have led to increased predictive capabilities and the improvement of environment specification models. Specifically, the Space Environment Testbed (SET) project of LWS is responsible for the implementation of improved engineering approaches to observing solar effects on climate change. This responsibility includes technology development, ground test protocol development, and the development of a technology application model/engineering tool.

  12. “Space, the Final Frontier”: How Good are Agent-Based Models at Simulating Individuals and Space in Cities?

    Directory of Open Access Journals (Sweden)

    Alison Heppenstall

    2016-01-01

    Full Text Available Cities are complex systems, comprising many interacting parts. How we simulate and understand causality in urban systems is continually evolving. Over the last decade the agent-based modeling (ABM) paradigm has provided a new lens for understanding the effects of interactions of individuals and how, through such interactions, macro structures emerge, both in the social and physical environment of cities. However, this paradigm has been hindered by limited computational power and a lack of large fine-scale datasets. Within the last few years we have witnessed a massive increase in computational processing power and storage, combined with the onset of Big Data. Today geographers find themselves in a data-rich era. We now have access to a variety of data sources (e.g., social media, mobile phone data, etc.) that tell us how, and when, individuals are using urban spaces. These data raise several questions: can we effectively use them to understand and model cities as complex entities? How well have ABM approaches lent themselves to simulating the dynamics of urban processes? What has been, or will be, the influence of Big Data on increasing our ability to understand and simulate cities? What is the appropriate level of spatial analysis and time frame to model urban phenomena? Within this paper we discuss these questions using several examples of ABM applied to urban geography to begin a dialogue about the utility of ABM for urban modeling. The arguments that the paper raises are applicable across the wider research environment where researchers are considering using this approach.

  13. Linking Time and Space Scales in Distributed Hydrological Modelling - a case study for the VIC model

    Science.gov (United States)

    Melsen, Lieke; Teuling, Adriaan; Torfs, Paul; Zappa, Massimiliano; Mizukami, Naoki; Clark, Martyn; Uijlenhoet, Remko

    2015-04-01

    One of the famous paradoxes of the Greek philosopher Zeno of Elea (~450 BC) is the one with the arrow: If one shoots an arrow, and cuts its motion into such small time steps that at every step the arrow is standing still, the arrow is motionless, because a concatenation of non-moving parts does not create motion. Nowadays, this reasoning can be refuted easily, because we know that motion is a change in space over time, which thus by definition depends on both time and space. If one disregards time by cutting it into infinitely small steps, motion is also excluded. This example shows that time and space are linked and therefore hard to evaluate separately. As hydrologists we want to understand and predict the motion of water, which means we have to look both in space and in time. In hydrological models we can account for space by using spatially explicit models. With increasing computational power and increased data availability from e.g. satellites, it has become easier to apply models at a higher spatial resolution. Increasing the resolution of hydrological models is also labelled as one of the 'Grand Challenges' in hydrology by Wood et al. (2011) and Bierkens et al. (2014), who call for global modelling at hyperresolution (~1 km and smaller). A literature survey of 242 peer-reviewed articles in which the Variable Infiltration Capacity (VIC) model was used showed that the grid spacing at which the model is applied has decreased over the past 17 years: from 0.5-2 degrees when the model was first developed to 1/8 and even 1/32 degree nowadays. On the other hand, the literature survey showed that the time step at which the model is calibrated and/or validated has remained the same over the last 17 years: mainly daily or monthly. Klemeš (1983) stresses the fact that space and time scales are connected, and therefore downscaling the spatial scale would also imply downscaling of the temporal scale. Is it worth the effort of downscaling your model from 1 degree to 1

  14. Topological Schemas of Memory Spaces

    Science.gov (United States)

    Babichev, Andrey; Dabaghian, Yuri A.

    2018-01-01

    Hippocampal cognitive map—a neuronal representation of the spatial environment—is widely discussed in the computational neuroscience literature for decades. However, more recent studies point out that hippocampus plays a major role in producing yet another cognitive framework—the memory space—that incorporates not only spatial, but also non-spatial memories. Unlike the cognitive maps, the memory spaces, broadly understood as “networks of interconnections among the representations of events,” have not yet been studied from a theoretical perspective. Here we propose a mathematical approach that allows modeling memory spaces constructively, as epiphenomena of neuronal spiking activity and thus to interlink several important notions of cognitive neurophysiology. First, we suggest that memory spaces have a topological nature—a hypothesis that allows treating both spatial and non-spatial aspects of hippocampal function on equal footing. We then model the hippocampal memory spaces in different environments and demonstrate that the resulting constructions naturally incorporate the corresponding cognitive maps and provide a wider context for interpreting spatial information. Lastly, we propose a formal description of the memory consolidation process that connects memory spaces to the Morris' cognitive schemas-heuristic representations of the acquired memories, used to explain the dynamics of learning and memory consolidation in a given environment. The proposed approach allows evaluating these constructs as the most compact representations of the memory space's structure. PMID:29740306

  15. Topological Schemas of Memory Spaces

    Directory of Open Access Journals (Sweden)

    Andrey Babichev

    2018-04-01

    Full Text Available Hippocampal cognitive map—a neuronal representation of the spatial environment—is widely discussed in the computational neuroscience literature for decades. However, more recent studies point out that hippocampus plays a major role in producing yet another cognitive framework—the memory space—that incorporates not only spatial, but also non-spatial memories. Unlike the cognitive maps, the memory spaces, broadly understood as “networks of interconnections among the representations of events,” have not yet been studied from a theoretical perspective. Here we propose a mathematical approach that allows modeling memory spaces constructively, as epiphenomena of neuronal spiking activity and thus to interlink several important notions of cognitive neurophysiology. First, we suggest that memory spaces have a topological nature—a hypothesis that allows treating both spatial and non-spatial aspects of hippocampal function on equal footing. We then model the hippocampal memory spaces in different environments and demonstrate that the resulting constructions naturally incorporate the corresponding cognitive maps and provide a wider context for interpreting spatial information. Lastly, we propose a formal description of the memory consolidation process that connects memory spaces to the Morris' cognitive schemas-heuristic representations of the acquired memories, used to explain the dynamics of learning and memory consolidation in a given environment. The proposed approach allows evaluating these constructs as the most compact representations of the memory space's structure.

  16. The SPACE 1.0 model: a Landlab component for 2-D calculation of sediment transport, bedrock erosion, and landscape evolution

    Science.gov (United States)

    Shobe, Charles M.; Tucker, Gregory E.; Barnhart, Katherine R.

    2017-12-01

    Models of landscape evolution by river erosion are often either transport-limited (sediment is always available but may or may not be transportable) or detachment-limited (sediment must be detached from the bed but is then always transportable). While several models incorporate elements of, or transition between, transport-limited and detachment-limited behavior, most require that either sediment or bedrock, but not both, are eroded at any given time. Modeling landscape evolution over large spatial and temporal scales requires a model that can (1) transition freely between transport-limited and detachment-limited behavior, (2) simultaneously treat sediment transport and bedrock erosion, and (3) run in 2-D over large grids and be coupled with other surface process models. We present SPACE (stream power with alluvium conservation and entrainment) 1.0, a new model for simultaneous evolution of an alluvium layer and a bedrock bed based on conservation of sediment mass both on the bed and in the water column. The model treats sediment transport and bedrock erosion simultaneously, embracing the reality that many rivers (even those commonly defined as bedrock rivers) flow over a partially alluviated bed. SPACE improves on previous models of bedrock-alluvial rivers by explicitly calculating sediment erosion and deposition rather than relying on a flux-divergence (Exner) approach. The SPACE model is a component of the Landlab modeling toolkit, a Python-language library used to create models of Earth surface processes. Landlab allows efficient coupling between the SPACE model and components simulating basin hydrology, hillslope evolution, weathering, lithospheric flexure, and other surface processes. Here, we first derive the governing equations of the SPACE model from existing sediment transport and bedrock erosion formulations and explore the behavior of local analytical solutions for sediment flux and alluvium thickness. We derive steady-state analytical solutions for

  17. Space Environment Modelling with the Use of Artificial Intelligence Methods

    Science.gov (United States)

    Lundstedt, H.; Wintoft, P.; Wu, J.-G.; Gleisner, H.; Dovheden, V.

    1996-12-01

    Space-based technological systems are affected by space weather in many ways. Several severe failures of satellites have been reported at times of space storms. Our society also increasingly depends on satellites for communication, navigation, exploration, and research. Predictions of the conditions in the satellite environment have therefore become very important. We will here present predictions made with the use of artificial intelligence (AI) techniques, such as artificial neural networks (ANN) and hybrids of AI methods. We are developing a space weather model based on intelligent hybrid systems (IHS). The model consists of different forecast modules; each module predicts the space weather on a specific time-scale. The time-scales range from minutes to months, with fundamental time-scales of 1-5 minutes, 1-3 hours, 1-3 days, and 27 days. Solar and solar wind data are used as input data. From solar magnetic field measurements, either made on the ground at Wilcox Solar Observatory (WSO) at Stanford, or made from space by the satellite SOHO, solar wind parameters can be predicted and modelled with ANN and MHD models. Magnetograms from WSO are available on a daily basis. From SOHO, however, magnetograms will be available every 90 minutes. SOHO magnetograms as input to ANNs will therefore make it possible to even predict solar transient events. Geomagnetic storm activity can today be predicted with very high accuracy by means of ANN methods using solar wind input data. However, at present real-time solar wind data are only available during part of the day from the satellite WIND. With the launch of ACE in 1997, solar wind data will, on the other hand, be available 24 hours per day. The conditions of the satellite environment are not only disturbed at times of geomagnetic storms but also at times of intense solar radiation and highly energetic particles. These events are associated with increased solar activity. Predictions of these events are therefore

  18. On the structure of the space of geometric product-form models

    NARCIS (Netherlands)

    Bayer, Nimrod; Boucherie, Richardus J.

    2002-01-01

    This article deals with Markovian models that are defined on a finite-dimensional discrete state space and possess a stationary state distribution of product form. We view the space of such models as a mathematical object and explore its structure. We focus on models on an orthant Z_+^n, which are

  19. State-space approaches for modelling and control in financial engineering systems theory and machine learning methods

    CERN Document Server

    Rigatos, Gerasimos G

    2017-01-01

    The book conclusively solves problems associated with the control and estimation of nonlinear and chaotic dynamics in financial systems when these are described in the form of nonlinear ordinary differential equations. It then addresses problems associated with the control and estimation of financial systems governed by partial differential equations (e.g. the Black–Scholes partial differential equation (PDE) and its variants). Lastly, it offers an optimal solution to the problem of statistical validation of computational models and tools used to support financial engineers in decision making. The application of state-space models in financial engineering means that the heuristics and empirical methods currently in use in decision-making procedures for finance can be eliminated. It also allows methods of fault-free performance and optimality in the management of assets and capital, and methods assuring stability in the functioning of financial systems, to be established. Covering the following key are...

  20. SpaceWire model development technology for satellite architecture.

    Energy Technology Data Exchange (ETDEWEB)

    Eldridge, John M.; Leemaster, Jacob Edward; Van Leeuwen, Brian P.

    2011-09-01

    Packet switched data communications networks that use distributed processing architectures have the potential to simplify the design and development of new, increasingly more sophisticated satellite payloads. In addition, the use of reconfigurable logic may reduce the amount of redundant hardware required in space-based applications without sacrificing reliability. These concepts were studied using software modeling and simulation, and the results are presented in this report. Models of the commercially available, packet switched data interconnect SpaceWire protocol were developed and used to create network simulations of data networks containing reconfigurable logic with traffic flows for timing system distribution.

  1. Application of a computationally efficient geostatistical approach to characterizing variably spaced water-table data

    International Nuclear Information System (INIS)

    Quinn, J.J.

    1996-01-01

    Geostatistical analysis of hydraulic head data is useful in producing unbiased contour plots of head estimates and relative errors. However, at most sites being characterized, monitoring wells are generally present at different densities, with clusters of wells in some areas and few wells elsewhere. The problem that arises when kriging data at different densities is in achieving adequate resolution of the grid while maintaining computational efficiency and working within software limitations. For the site considered, 113 data points were available over a 14-mi² study area, including 57 monitoring wells within an area of concern of 1.5 mi². Variogram analyses of the data indicate a linear model with a negligible nugget effect. The geostatistical package used in the study allows a maximum grid of 100 by 100 cells. Two-dimensional kriging was performed for the entire study area with a 500-ft grid spacing, while the smaller zone was modeled separately with a 100-ft spacing. In this manner, grid cells for the dense area and the sparse area remained small relative to the well separation distances, and the maximum dimensions of the program were not exceeded. The spatial head results for the detailed zone were then nested into the regional output by use of a graphical, object-oriented database that performed the contouring of the geostatistical output. This study benefitted from the two-scale approach and from very fine geostatistical grid spacings relative to typical data separation distances. The combining of the sparse, regional results with those from the finer-resolution area of concern yielded contours that honored the actual data at every measurement location. The method applied in this study can also be used to generate reproducible, unbiased representations of other types of spatial data
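
    For readers unfamiliar with the underlying interpolation, the sketch below implements ordinary kriging with a linear, nugget-free variogram, the model form reported above, on a handful of synthetic head measurements. The well locations, head values, and variogram slope are invented, and the two-scale nesting itself is not reproduced.

```python
# Ordinary kriging with a linear variogram on synthetic water-table data.
import numpy as np

rng = np.random.default_rng(3)
wells = rng.uniform(0, 10000, size=(40, 2))          # well coordinates (ft)
heads = 600 - 0.002 * wells[:, 0] + rng.normal(0, 0.5, 40)

slope = 1e-4                                          # linear variogram slope
gamma = lambda h: slope * h                           # gamma(h), no nugget

def krige(x0):
    """Ordinary kriging estimate of head at point x0."""
    d = np.linalg.norm(wells - x0, axis=1)
    D = np.linalg.norm(wells[:, None, :] - wells[None, :, :], axis=2)
    n = len(wells)
    A = np.ones((n + 1, n + 1)); A[:n, :n] = gamma(D); A[n, n] = 0.0
    b = np.append(gamma(d), 1.0)
    w = np.linalg.solve(A, b)                         # weights + Lagrange mult.
    est = w[:n] @ heads
    var = w[:n] @ gamma(d) + w[n]                     # kriging variance
    return est, var

est, var = krige(np.array([5000.0, 5000.0]))
print(f"estimated head {est:.2f} ft, kriging std {np.sqrt(max(var, 0)):.2f} ft")
```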

  2. Innovative Approaches to Space-Based Manufacturing and Rapid Prototyping of Composite Materials

    Science.gov (United States)

    Hill, Charles S.

    2012-01-01

    The ability to deploy large habitable structures, construct, and service exploration vehicles in low earth orbit will be an enabling capability for continued human exploration of the solar system. It is evident that advanced manufacturing methods to fabricate replacement parts and re-utilize launch vehicle structural mass by converting it to different uses will be necessary to minimize costs and allow flexibility to remote crews engaged in space travel. Recent conceptual developments and the combination of inter-related approaches to low-cost manufacturing of composite materials and structures are described in context leading to the possibility of on-orbit and space-based manufacturing.

  3. Approaches and models of intercultural education

    Directory of Open Access Journals (Sweden)

    Iván Manuel Sánchez Fontalvo

    2013-10-01

    Full Text Available Building an intercultural society requires awareness in all social spheres, and education plays a leading role in this task. Its role is transcendental, since it must promote educational spaces that form people with the virtues and capacities to live together in multicultural and socially diverse (and sometimes unequal) contexts in an increasingly globalized and interconnected world, and to foster shared feelings of civic belonging to the neighborhood, city, region and country. Such education should also cultivate concern for, and critical judgement of, marginalization, poverty, misery and the inequitable distribution of wealth, which are causes of structural violence, together with a willingness to work for the welfare and transformation of these scenarios. On these premises, it is important to know the approaches and models of intercultural education that have been developed so far, analysing their impact on the educational contexts where they are applied.

  4. A mixture model-based approach to the clustering of microarray expression data.

    Science.gov (United States)

    McLachlan, G J; Bean, R W; Peel, D

    2002-03-01

    This paper introduces the software EMMIX-GENE that has been developed for the specific purpose of a model-based approach to the clustering of microarray expression data, in particular, of tissue samples on a very large number of genes. The latter is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. A feasible approach is provided by first selecting a subset of the genes relevant for the clustering of the tissue samples by fitting mixtures of t distributions to rank the genes in order of increasing size of the likelihood ratio statistic for the test of one versus two components in the mixture model. The imposition of a threshold on the likelihood ratio statistic used in conjunction with a threshold on the size of a cluster allows the selection of a relevant set of genes. However, even this reduced set of genes will usually be too large for a normal mixture model to be fitted directly to the tissues, and so the use of mixtures of factor analyzers is exploited to reduce effectively the dimension of the feature space of genes. The usefulness of the EMMIX-GENE approach for the clustering of tissue samples is demonstrated on two well-known data sets on colon and leukaemia tissues. For both data sets, relevant subsets of the genes are able to be selected that reveal interesting clusterings of the tissues that are either consistent with the external classification of the tissues or with background and biological knowledge of these sets. EMMIX-GENE is available at http://www.maths.uq.edu.au/~gjm/emmix-gene/
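
    The gene-screening step can be illustrated with the short sketch below, which ranks genes by a likelihood-ratio statistic comparing one- and two-component mixture fits over the tissue samples. Gaussian mixtures are used here as a convenient stand-in for the t mixtures of EMMIX-GENE, and the synthetic expression matrix and thresholds are assumptions made for illustration.

```python
# Gene screening by a one- vs two-component mixture likelihood-ratio statistic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
n_tissues, n_genes = 60, 200
X = rng.normal(size=(n_tissues, n_genes))
X[:30, :10] += 2.5            # ten genes that actually separate two groups

def lr_statistic(y):
    """-2 log LR for one vs two mixture components on a single gene."""
    y = y.reshape(-1, 1)
    ll1 = GaussianMixture(1, random_state=0).fit(y).score(y) * len(y)
    ll2 = GaussianMixture(2, n_init=3, random_state=0).fit(y).score(y) * len(y)
    return 2.0 * (ll2 - ll1)

stats = np.array([lr_statistic(X[:, g]) for g in range(n_genes)])
selected = np.argsort(stats)[::-1][:20]       # retain the top-ranked genes
print("top genes by LR statistic:", sorted(selected[:10].tolist()))
```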

  5. A Monte Carlo Approach to Modeling the Breakup of the Space Launch System EM-1 Core Stage with an Integrated Blast and Fragment Catalogue

    Science.gov (United States)

    Richardson, Erin; Hays, M. J.; Blackwood, J. M.; Skinner, T.

    2014-01-01

    The Liquid Propellant Fragment Overpressure Acceleration Model (L-FOAM) is a tool developed by Bangham Engineering Incorporated (BEi) that produces a representative debris cloud from an exploding liquid-propellant launch vehicle. Here it is applied to the Core Stage (CS) of the National Aeronautics and Space Administration (NASA) Space Launch System (SLS) launch vehicle. Probability Density Functions (PDFs) based on empirical data from rocket accidents and applicable tests, as well as SLS-specific geometry, are combined in a MATLAB script to create a unique fragment catalogue each time L-FOAM is run, tailored for a Monte Carlo approach to risk analysis. By accelerating the debris catalogue with the BEi blast model for liquid hydrogen / liquid oxygen explosions, the result is a fully integrated code that models the destruction of the CS at a given point in its trajectory and generates hundreds of individual fragment catalogues with initial imparted velocities. The BEi blast model provides the blast size (radius) and strength (overpressure) as probabilities based on empirical data and anchored with analytical work. The coupling of the L-FOAM catalogue with the BEi blast model is validated with a simulation of the Project PYRO S-IV destruct test. When running a Monte Carlo simulation, L-FOAM can accelerate all catalogues with the same blast (mean blast, 2σ blast, etc.), or vary the blast size and strength based on their respective probabilities. L-FOAM then propagates these fragments until impact with the earth. Results from L-FOAM include a description of each fragment (dimensions, weight, ballistic coefficient, type and initial location on the rocket), the imparted velocity from the blast, and impact data depending on the user's desired application. L-FOAM is applicable to both near-field (fragment impact on an escaping crew capsule) and far-field (fragment ground-impact footprint) safety considerations. The user is thus able to use statistics from a Monte Carlo
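
    The overall Monte Carlo idea can be sketched as below: draw a fragment catalogue from simple probability distributions, impart a blast velocity, and integrate each fragment ballistically (with drag) to ground impact. All distributions, constants, and the breakup altitude are placeholders and are not the L-FOAM or BEi blast-model values.

```python
# Monte Carlo fragment catalogue + ballistic propagation to ground impact.
import numpy as np

rng = np.random.default_rng(5)
g, rho_air = 9.81, 1.225
n_frag = 1000

mass = rng.lognormal(mean=1.0, sigma=1.0, size=n_frag)          # kg
area = 0.01 * mass ** (2.0 / 3.0)                               # m^2, crude
cd = rng.uniform(0.8, 1.5, n_frag)
beta = mass / (cd * area)                                       # ballistic coeff.

v_blast = rng.rayleigh(scale=80.0, size=n_frag)                 # m/s
theta = rng.uniform(0, np.pi / 2, n_frag)                       # elevation angle
vx, vz = v_blast * np.cos(theta), v_blast * np.sin(theta)
x, z = np.zeros(n_frag), np.full(n_frag, 10000.0)               # breakup at 10 km

dt, alive = 0.05, np.ones(n_frag, dtype=bool)
while alive.any():
    v = np.hypot(vx, vz)
    drag = 0.5 * rho_air * v / beta          # drag deceleration per unit speed
    vx -= drag * vx * dt
    vz -= (g + drag * vz) * dt
    x += vx * dt * alive
    z += vz * dt * alive
    alive &= z > 0.0                         # fragment is done once it lands

print("median ground range %.0f m, 99th percentile %.0f m"
      % (np.median(np.abs(x)), np.percentile(np.abs(x), 99)))
```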

  6. Effective use of integrated hydrological models in basin-scale water resources management: surrogate modeling approaches

    Science.gov (United States)

    Zheng, Y.; Wu, B.; Wu, X.

    2015-12-01

    Integrated hydrological models (IHMs) consider surface water and subsurface water as a unified system, and have been widely adopted in basin-scale water resources studies. However, due to IHMs' mathematical complexity and high computational cost, it is difficult to implement them in an iterative model evaluation process (e.g., Monte Carlo Simulation, simulation-optimization analysis, etc.), which diminishes their applicability for supporting decision-making in real-world situations. Our studies investigated how to effectively use complex IHMs to address real-world water issues via surrogate modeling. Three surrogate modeling approaches were considered, including 1) DYCORS (DYnamic COordinate search using Response Surface models), a well-established response surface-based optimization algorithm; 2) SOIM (Surrogate-based Optimization for Integrated surface water-groundwater Modeling), a response surface-based optimization algorithm that we developed specifically for IHMs; and 3) Probabilistic Collocation Method (PCM), a stochastic response surface approach. Our investigation was based on a modeling case study in the Heihe River Basin (HRB), China's second largest endorheic river basin. The GSFLOW (Coupled Ground-Water and Surface-Water Flow Model) model was employed. Two decision problems were discussed. One is to optimize, both in time and in space, the conjunctive use of surface water and groundwater for agricultural irrigation in the middle HRB region; and the other is to cost-effectively collect hydrological data based on a data-worth evaluation. Overall, our study results highlight the value of incorporating an IHM in making decisions of water resources management and hydrological data collection. An IHM like GSFLOW can provide great flexibility to formulating proper objective functions and constraints for various optimization problems. On the other hand, it has been demonstrated that surrogate modeling approaches can pave the path for such incorporation in real
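
    A bare-bones illustration of the surrogate idea follows: the "expensive model" is sampled a few times, an RBF response surface is fitted, and the optimization loop runs on the cheap surrogate instead of the model itself. The stand-in objective, the design size, and the use of differential evolution (rather than DYCORS or SOIM) are assumptions for illustration.

```python
# Surrogate-based optimization: fit an RBF response surface to a few samples
# of an expensive function, then optimise the surrogate.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

rng = np.random.default_rng(8)

def expensive_model(x):                     # placeholder for an IHM run
    return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2 + 0.1 * np.sin(8 * x[0])

# initial design: a handful of model evaluations
X = rng.uniform(0, 1, size=(25, 2))
y = np.array([expensive_model(x) for x in X])

surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")

# optimise the cheap surrogate; call the true model only at the optimum found
res = differential_evolution(lambda x: float(surrogate(x.reshape(1, -1))[0]),
                             bounds=[(0, 1), (0, 1)], seed=0)
print("surrogate optimum:", res.x, "true value there:", expensive_model(res.x))
```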

  7. Preliminary results from a four-working space, double-acting piston, Stirling engine controls model

    Science.gov (United States)

    Daniele, C. J.; Lorenzo, C. F.

    1980-01-01

    A four working space, double acting piston, Stirling engine simulation is being developed for controls studies. The development method is to construct two simulations, one for detailed fluid behavior, and a second model with simple fluid behaviour but containing the four working space aspects and engine inertias, validate these models separately, then upgrade the four working space model by incorporating the detailed fluid behaviour model for all four working spaces. The single working space (SWS) model contains the detailed fluid dynamics. It has seven control volumes in which continuity, energy, and pressure loss effects are simulated. Comparison of the SWS model with experimental data shows reasonable agreement in net power versus speed characteristics for various mean pressure levels in the working space. The four working space (FWS) model was built to observe the behaviour of the whole engine. The drive dynamics and vehicle inertia effects are simulated. To reduce calculation time, only three volumes are used in each working space and the gas temperatures are fixed (no energy equation). Comparison of the FWS model's predicted power with experimental data shows reasonable agreement. Since all four working spaces are simulated, the unique capabilities of the model are exercised to look at working fluid supply transients, short circuit transients, and piston ring leakage effects.

  8. The joint space-time statistics of macroweather precipitation, space-time statistical factorization and macroweather models

    International Nuclear Information System (INIS)

    Lovejoy, S.; Lima, M. I. P. de

    2015-01-01

    Over the range of time scales from about 10 days to 30–100 years, in addition to the familiar weather and climate regimes, there is an intermediate “macroweather” regime characterized by negative temporal fluctuation exponents: implying that fluctuations tend to cancel each other out so that averages tend to converge. We show theoretically and numerically that macroweather precipitation can be modeled by a stochastic weather-climate model (the Climate Extended Fractionally Integrated Flux model, CEFIF), first proposed for macroweather temperatures, and we show numerically that a four-parameter space-time CEFIF model can approximately reproduce eight or so empirical space-time exponents. In spite of this success, CEFIF is theoretically and numerically difficult to manage. We therefore propose a simplified stochastic model in which the temporal behavior is modeled as a fractional Gaussian noise but the spatial behaviour as a multifractal (climate) cascade: a spatial extension of the recently introduced ScaLIng Macroweather Model, SLIMM. Both the CEFIF and this spatial SLIMM model have a property often implicitly assumed by climatologists that climate statistics can be “homogenized” by normalizing them with the standard deviation of the anomalies. Physically, it means that the spatial macroweather variability corresponds to different climate zones that multiplicatively modulate the local, temporal statistics. This simplified macroweather model provides a framework for macroweather forecasting that exploits the system's long range memory and spatial correlations; for it, the forecasting problem has been solved. We test this factorization property and the model with the help of three centennial, global scale precipitation products that we analyze jointly in space and in time

  9. Model-free adaptive control optimization using a chaotic particle swarm approach

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR, Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)], E-mail: leandro.coelho@pucpr.br; Rodrigues Coelho, Antonio Augusto [Department of Automation and Systems, Federal University of Santa Catarina, Box 476, 88040-900 Florianopolis, Santa Catarina (Brazil)], E-mail: aarc@das.ufsc.br

    2009-08-30

    It is well known that conventional control theories are widely suited for applications where the processes can be reasonably described in advance. However, when the plant's dynamics are hard to characterize precisely or are subject to environmental uncertainties, one may encounter difficulties in applying the conventional controller design methodologies. Despite the difficulty in achieving high control performance, the fine tuning of controller parameters is a tedious task that always requires experts with knowledge in both control theory and process information. Nowadays, more and more studies have focused on the development of adaptive control algorithms that can be directly applied to complex processes whose dynamics are poorly modeled and/or have severe nonlinearities. In this context, this paper presents the design of a Model-Free Learning Adaptive Control (MFLAC) based on pseudo-gradient concepts and an optimization procedure using a Particle Swarm Optimization (PSO) approach with a constriction coefficient and Henon chaotic sequences (CPSOH). PSO is a stochastic global optimization technique inspired by the social behavior of bird flocking. PSO models the exploration of a problem space by a population of particles. Each particle in PSO has a randomized velocity associated with it, which moves it through the space of the problem. Since chaotic maps enjoy certainty, ergodicity and stochastic properties, the proposed CPSOH introduces chaotic mapping, which adds flexibility to particle movements in each iteration. The chaotic sequences also allow exploration at early stages and exploitation at later stages of the CPSOH search procedure. The motivation for applying the CPSOH approach is to overcome the limitation of the conventional MFLAC design, which cannot guarantee satisfactory control performance when the plant has different gains over its operational range and is tuned by trial and error by the user. Numerical results of the MFLAC with

  10. Model-free adaptive control optimization using a chaotic particle swarm approach

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos; Rodrigues Coelho, Antonio Augusto

    2009-01-01

    It is well known that conventional control theories are widely suited for applications where the processes can be reasonably described in advance. However, when the plant's dynamics are hard to characterize precisely or are subject to environmental uncertainties, one may encounter difficulties in applying the conventional controller design methodologies. Despite the difficulty in achieving high control performance, the fine tuning of controller parameters is a tedious task that always requires experts with knowledge in both control theory and process information. Nowadays, more and more studies have focused on the development of adaptive control algorithms that can be directly applied to complex processes whose dynamics are poorly modeled and/or have severe nonlinearities. In this context, this paper presents the design of a Model-Free Learning Adaptive Control (MFLAC) based on pseudo-gradient concepts and an optimization procedure using a Particle Swarm Optimization (PSO) approach with a constriction coefficient and Henon chaotic sequences (CPSOH). PSO is a stochastic global optimization technique inspired by the social behavior of bird flocking. PSO models the exploration of a problem space by a population of particles. Each particle in PSO has a randomized velocity associated with it, which moves it through the space of the problem. Since chaotic maps enjoy certainty, ergodicity and stochastic properties, the proposed CPSOH introduces chaotic mapping, which adds flexibility to particle movements in each iteration. The chaotic sequences also allow exploration at early stages and exploitation at later stages of the CPSOH search procedure. The motivation for applying the CPSOH approach is to overcome the limitation of the conventional MFLAC design, which cannot guarantee satisfactory control performance when the plant has different gains over its operational range and is tuned by trial and error by the user. Numerical results of the MFLAC with CPSOH
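
    The sketch below shows a PSO variant in which the usual uniform random factors are replaced by a chaotic sequence, in the spirit of the CPSOH approach; a logistic map is used here for brevity instead of the Henon map, and the sphere objective stands in for the controller-tuning problem.

```python
# PSO with a constriction-type coefficient driven by a chaotic (logistic) map.
import numpy as np

def sphere(x):                       # toy objective standing in for the plant
    return float(np.sum(x ** 2))

rng = np.random.default_rng(6)
n_part, dim, iters = 20, 5, 200
chi, c1, c2 = 0.729, 2.05, 2.05      # constriction-type coefficients

pos = rng.uniform(-5, 5, (n_part, dim))
vel = np.zeros((n_part, dim))
pbest, pbest_f = pos.copy(), np.array([sphere(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

z = 0.7                              # chaotic driver (logistic map state)
for _ in range(iters):
    for i in range(n_part):
        z = 4.0 * z * (1.0 - z)      # logistic map in its chaotic regime
        r1 = z
        z = 4.0 * z * (1.0 - z)
        r2 = z
        vel[i] = chi * (vel[i] + c1 * r1 * (pbest[i] - pos[i])
                               + c2 * r2 * (gbest - pos[i]))
        pos[i] += vel[i]
        f = sphere(pos[i])
        if f < pbest_f[i]:
            pbest[i], pbest_f[i] = pos[i].copy(), f
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best objective found: %.3e" % sphere(gbest))
```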

  11. A State-Based Modeling Approach for Efficient Performance Evaluation of Embedded System Architectures at Transaction Level

    Directory of Open Access Journals (Sweden)

    Anthony Barreteau

    2012-01-01

    Full Text Available Abstract models are necessary to assist system architects in the evaluation of hardware/software architectures and to cope with the ever-increasing complexity of embedded systems. Efficient methods are required to create reliable models of system architectures and to allow early performance evaluation and fast exploration of the design space. In this paper, we present a specific transaction level modeling approach for performance evaluation of hardware/software architectures. This approach relies on a generic execution model that requires little modeling effort. The created models are used to evaluate, by simulation, the expected processing and memory resources for various architectures. The proposed execution model relies on a specific computation method defined to improve the simulation speed of transaction level models. The benefits of the proposed approach are highlighted through two case studies. The first case study is a didactic example illustrating the modeling approach; in this example, a simulation speed-up by a factor of 7.62 is achieved by using the proposed computation method. The second case study concerns the analysis of a communication receiver supporting part of the physical layer of the LTE protocol. In this case study, architecture exploration is carried out in order to improve the allocation of processing functions.

  12. Overall feature of EAST operation space by using simple Core-SOL-Divertor model

    International Nuclear Information System (INIS)

    Hiwatari, R.; Hatayama, A.; Zhu, S.; Takizuka, T.; Tomita, Y.

    2005-01-01

    We have developed a simple Core-SOL-Divertor (C-S-D) model to investigate qualitatively the overall features of the operational space of the integrated core and edge plasma. To construct the simple C-S-D model, a simple core plasma model based on the ITER physics guidelines and a two-point SOL-divertor model are used. The simple C-S-D model is applied to the study of the EAST operational space with lower hybrid current drive experiments under various trade-offs among the basic plasma parameters. Effective methods for extending the operation space are also presented. As this study of the EAST operation space shows, the C-S-D model is a useful tool for understanding qualitatively the overall features of the plasma operation space. (author)

  13. Application of Interval Predictor Models to Space Radiation Shielding

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.; Norman, Ryan B.; Blattnig, Steve R.

    2016-01-01

    This paper develops techniques for predicting the uncertainty range of an output variable given input-output data. These models are called Interval Predictor Models (IPMs) because they yield an interval-valued function of the input. This paper develops IPMs having a radial basis structure. This structure enables the formal description of (i) the uncertainty in the model's parameters, (ii) the predicted output interval, and (iii) the probability that a future observation would fall in such an interval. In contrast to other metamodeling techniques, this probabilistic certificate of correctness does not require making any assumptions on the structure of the mechanism from which the data are drawn. Optimization-based strategies for calculating IPMs having minimal spread while containing all the data are developed. Constraints for bounding the minimum interval spread over the continuum of inputs, regulating the IPM's variation/oscillation, and centering its spread about a target point are used to prevent data overfitting. Furthermore, we develop an approach for using expert opinion during extrapolation. This metamodeling technique is illustrated using a radiation shielding application for space exploration. In this application, we use IPMs to describe the error incurred in predicting the flux of particles resulting from the interaction between a high-energy incident beam and a target.
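
    As a rough illustration of the optimization-based strategy described above, the following sketch fits a one-dimensional interval predictor with Gaussian radial basis functions by solving a linear program that minimizes the average interval spread while containing every observation. It is a simplified stand-in (the probabilistic certificate and the oscillation/centering constraints are omitted), and the data and function names are assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Minimal sketch (assumptions): 1-D data, Gaussian radial basis functions, and a plain
# linear program that minimizes the average interval spread while forcing the interval
# to contain every observation. The probabilistic certificate and the extra regularity
# constraints from the paper are omitted.

def rbf_features(x, centers, width):
    return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)  # (n, m), all >= 0

def fit_ipm(x, y, n_basis=8):
    centers = np.linspace(x.min(), x.max(), n_basis)
    width = (x.max() - x.min()) / n_basis
    Phi = rbf_features(x, centers, width)
    n, m = Phi.shape

    # Decision variables: [p_low (m), p_up (m)]; minimize sum_i (p_up - p_low)^T phi_i.
    c = np.concatenate([-Phi.sum(axis=0), Phi.sum(axis=0)])

    # Containment: p_low^T phi_i <= y_i and p_up^T phi_i >= y_i,
    # plus p_low <= p_up componentwise (keeps the band non-crossing since phi >= 0).
    A_ub = np.vstack([
        np.hstack([Phi, np.zeros((n, m))]),   #  p_low^T phi_i <=  y_i
        np.hstack([np.zeros((n, m)), -Phi]),  # -p_up^T  phi_i <= -y_i
        np.hstack([np.eye(m), -np.eye(m)]),   #  p_low - p_up  <=  0
    ])
    b_ub = np.concatenate([y, -y, np.zeros(m)])

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (2 * m))
    p_low, p_up = res.x[:m], res.x[m:]

    def predict(xq):
        Phi_q = rbf_features(np.atleast_1d(xq), centers, width)
        return Phi_q @ p_low, Phi_q @ p_up
    return predict

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 1.0, 40)
    y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(40)
    lower, upper = fit_ipm(x, y)(x)
    print("all data contained:", bool(np.all((lower <= y + 1e-6) & (y <= upper + 1e-6))))
```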

  14. A novel Generalized State-Space Averaging (GSSA) model for advanced aircraft electric power systems

    International Nuclear Information System (INIS)

    Ebrahimi, Hadi; El-Kishky, Hassan

    2015-01-01

    Highlights: • A study model is developed for aircraft electric power systems. • A novel GSSA model is developed for the interconnected power grid. • The system’s dynamics are characterized under various conditions. • The averaged results are compared and verified with the actual model. • The obtained measured values are validated against available aircraft standards. - Abstract: The growing complexity of Advanced Aircraft Electric Power Systems (AAEPS) has made conventional state-space averaging models inadequate for systems analysis and characterization. This paper presents a novel Generalized State-Space Averaging (GSSA) model for the analysis, control and characterization of AAEPS. The primary objective of this paper is to introduce a mathematically elegant and computationally simple model that reproduces the AAEPS behavior at the critical nodes of the electric grid, and to reduce some or all of the drawbacks (complexity, cost, simulation time, etc.) associated with sensor-based monitoring and the computer-aided design software simulations commonly used for AAEPS characterization. It is shown in this paper that the GSSA approach overcomes the limitations of the conventional state-space averaging method, which fails to predict the behavior of AC signals in circuit analysis. Unlike the conventional averaging method, the GSSA model presented in this paper includes both DC and AC components, capturing the key dynamic and steady-state characteristics of the aircraft electric systems. The developed model is then examined for visualization of the aircraft system and accuracy of computation under different loading scenarios. Through several case studies, the applicability and effectiveness of the GSSA method are verified by comparison with the actual real-time simulation model obtained from the Powersim 9 (PSIM9) software environment. The simulation results represent voltage, current and load power at the major nodes of the AAEPS. It has been demonstrated that
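
    The core relations behind generalized state-space averaging, as it is commonly presented in the power-electronics literature, are summarized below in generic notation (not taken from the paper): each state variable is represented by sliding-window Fourier coefficients, and the k = 0 term recovers conventional DC averaging while higher-order terms retain the AC content.

```latex
% Sliding-window (index-k) Fourier coefficient of a state variable x(t) over one period T:
\langle x \rangle_k(t) = \frac{1}{T}\int_{t-T}^{t} x(\tau)\, e^{-jk\omega_s \tau}\, d\tau ,
\qquad \omega_s = \frac{2\pi}{T}.

% Differentiation rule used to rewrite the state equations in terms of the coefficients;
% k = 0 gives conventional state-space averaging, |k| >= 1 keeps the AC components.
\frac{d}{dt}\langle x \rangle_k
  = \Big\langle \frac{dx}{dt} \Big\rangle_k - jk\omega_s \langle x \rangle_k .
```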

  15. Modeling, Analysis, and Optimization Issues for Large Space Structures

    Science.gov (United States)

    Pinson, L. D. (Compiler); Amos, A. K. (Compiler); Venkayya, V. B. (Compiler)

    1983-01-01

    Topics concerning the modeling, analysis, and optimization of large space structures are discussed, including structure-control interaction, structural and structural dynamics modeling, thermal analysis, testing, and design.

  16. Computation of External Quality Factors for RF Structures by Means of Model Order Reduction and a Perturbation Approach

    CERN Document Server

    Flisgen, Thomas; van Rienen, Ursula

    2016-01-01

    External quality factors are significant quantities for describing losses via waveguide ports in radio frequency resonators. The current contribution presents a novel approach to determine external quality factors by means of a two-step procedure: first, a state-space model for the lossless radio frequency structure is generated and its model order is reduced. Subsequently, a perturbation method is applied to the reduced model so that external losses are accounted for. The advantage of this approach is that the challenges of dealing with lossy systems are shifted to the reduced-order model, which significantly saves computational cost. The present paper provides a short overview of existing methods to compute external quality factors. Then, the novel approach is introduced and validated in terms of accuracy and computational time against commercial software.
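
    For context, the quantity being computed is the standard external quality factor, shown below in generic notation (the paper's specific reduced-order and perturbation formulas are not reproduced here).

```latex
% External quality factor of mode n with respect to a waveguide port: stored energy W_n
% relative to the power P_{ext,n} leaking through that port; it combines with the
% intrinsic Q_0 to give the loaded quality factor.
Q_{\mathrm{ext},n} = \frac{\omega_n W_n}{P_{\mathrm{ext},n}},
\qquad
\frac{1}{Q_{\mathrm{loaded},n}} = \frac{1}{Q_{0,n}} + \frac{1}{Q_{\mathrm{ext},n}} .
```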

  17. Systems Biology Approach and Mathematical Modeling for Analyzing Phase-Space Switch During Epithelial-Mesenchymal Transition.

    Science.gov (United States)

    Simeoni, Chiara; Dinicola, Simona; Cucina, Alessandra; Mascia, Corrado; Bizzarri, Mariano

    2018-01-01

    In this report, we present a viable strategy for the study of the Epithelial-Mesenchymal Transition (EMT) and its opposite, the Mesenchymal-Epithelial Transition (MET), by means of a Systems Biology approach combined with a suitable Mathematical Modeling analysis. Specifically, it is shown how the presence of a metastable state, identified at a mesoscopic level of description, is crucial for making possible the appearance of a phase transition mechanism in the framework of fast-slow dynamics for Ordinary Differential Equations (ODEs).

  18. Truncated Hilbert Space Approach for the 1+1D phi^4 Theory

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    (an informal seminar, not a regular string seminar) We used the massive analogue of the truncated conformal space approach to study the broken phase of the 1+1 dimensional scalar phi^4 model in finite volume, similarly to the work by S. Rychkov and L. Vitale. In our work, the finite size spectrum was determined numerically using an effective eigensolver routine, followed by a simple extrapolation in the cutoff energy. We analyzed both the periodic and antiperiodic sectors. The results were compared with semiclassical and Bethe-Yang results as well as with perturbation theory. We obtained the coupling dependence of the infinite volume breather and kink masses for moderate couplings. The results fit well with semiclassical and perturbative estimates, and confirm Mussardo's conjecture that at most two neutral excitations can exist in the spectrum. We believe that improving our method with the renormalization procedure of Rychkov et al. would enable us to measure further interesting quantities such as decay ra...

  19. THE MODEL OF REALIZATION OF INTEGRATIVE APPROACH TO THE WORK WITH PEDAGOGICALLY GIFTED STUDENTS

    Directory of Open Access Journals (Sweden)

    Golubova Anna Vasilievna

    2013-05-01

    Full Text Available The article examines the components and criteria of the pedagogical giftedness of students; the levels of its formation and the pedagogical conditions for its development are identified. Purpose: to describe a model for realizing an integrative approach to work with pedagogically gifted students in the educational space of a university. Methodology: theoretical-level methods for studying pedagogical phenomena, and empirical-level methods (observation, interviews, questionnaires, conversations, psychological tests, etc.). The results of the research show that the phased realization of an integrative approach to work with pedagogically gifted students in the educational space of a university raises the level of formation of pedagogical giftedness. The following pedagogical conditions for the formation of pedagogical giftedness are described: the creation of a pedagogically oriented creative environment; the promotion of a positive motivational set toward future professional and educational activities; and the involvement of future teachers in creative, professionally oriented learning and cognitive activity. Practical implications: the educational process of higher pedagogical educational institutions.

  20. An Online Causal Inference Framework for Modeling and Designing Systems Involving User Preferences: A State-Space Approach

    Directory of Open Access Journals (Sweden)

    Ibrahim Delibalta

    2017-01-01

    Full Text Available We provide a causal inference framework to model the effects of machine learning algorithms on user preferences. We then use this mathematical model to prove that the overall system can be tuned to alter those preferences in a desired manner. A user can be an online shopper or a social media user, exposed to digital interventions produced by machine learning algorithms. A user preference can be anything from inclination towards a product to a political party affiliation. Our framework uses a state-space model to represent user preferences as latent system parameters which can only be observed indirectly via online user actions such as a purchase activity or social media status updates, shares, blogs, or tweets. Based on these observations, machine learning algorithms produce digital interventions such as targeted advertisements or tweets. We model the effects of these interventions through a causal feedback loop, which alters the corresponding preferences of the user. We then introduce algorithms in order to estimate and later tune the user preferences to a particular desired form. We demonstrate the effectiveness of our algorithms through experiments in different scenarios.
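
    The feedback loop described above can be pictured with a toy linear-Gaussian state-space model: a scalar latent preference evolves as a random walk, is observed through noisy user actions, and an intervention proportional to the gap between the current estimate and a target feeds back into the state. The sketch below, tracked with a scalar Kalman filter, is only an illustration of that structure under assumed dynamics and noise levels; it is not the estimation or tuning algorithm from the paper.

```python
import numpy as np

# Toy sketch (assumptions): scalar latent preference theta_t, noisy observations of it,
# and a digital intervention u_t proportional to the gap between the estimate and a
# desired target. A scalar Kalman filter provides the estimate.

rng = np.random.default_rng(0)
a, b = 1.0, 0.05          # state transition and intervention gain
q, r = 0.01, 0.25         # process and observation noise variances
target = 2.0              # desired preference level
k_ctrl = 1.0              # controller gain for the intervention

theta = 0.0               # true latent preference
m, P = 0.0, 1.0           # filter mean and variance

for t in range(200):
    u = k_ctrl * (target - m)                                  # intervention from estimate
    theta = a * theta + b * u + rng.normal(0, np.sqrt(q))      # latent preference update
    y = theta + rng.normal(0, np.sqrt(r))                      # observed user action

    # Kalman predict / update
    m_pred = a * m + b * u
    P_pred = a * a * P + q
    K = P_pred / (P_pred + r)
    m = m_pred + K * (y - m_pred)
    P = (1 - K) * P_pred

print(f"final latent preference {theta:.2f}, estimate {m:.2f}, target {target}")
```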

  1. The Space Laser Business Model

    Science.gov (United States)

    2005-01-01

    Creating long-duration, high-powered lasers for satellites that can withstand the type of optical misalignment and damage dished out by the unforgiving environment of space is work that is unique to NASA. It is complicated, specific work, where each step forward is into uncharted territory. In the 1990s, as this technology was first being created, NASA gave free rein to a group of "laser jocks" to develop their own business model and supply the Space Agency with the technology it needed. It was still to be a part of NASA as a division of Goddard Space Flight Center, but would operate independently out of a remote office. The idea for this satellite laboratory was based on the Skunk Works concept at Lockheed Martin Corporation. In 1943, the aerospace firm (then known as the Lockheed Corporation), realizing that the type of advanced research it needed done could not be performed within the confines of a larger company, allowed a group of researchers and engineers to essentially run their own microbusiness without corporate oversight. The Skunk Works project, in Burbank, California, produced America's first jet fighter, the world's most successful spy plane (U-2), the first 3-times-the-speed-of-sound surveillance aircraft, and the F-117A Nighthawk Stealth Fighter. Boeing followed suit with its Phantom Works, an advanced research and development branch of the company that operates independently of the larger unit and is responsible for a great deal of its most cutting-edge research. NASA's version of this advanced business model was the Space Lidar Technology Center (SLTC), just south of Goddard, in College Park, Maryland. Established in 1998 under a Cooperative Agreement between Goddard and the University of Maryland's A. James Clark School of Engineering, it was a high-tech laser shop where a small group of specialists, never more than 20 employees, worked all hours of the day and night to create the cutting-edge technology the Agency required of them. Drs

  2. Recent trends in space mapping technology

    DEFF Research Database (Denmark)

    Bandler, John W.; Cheng, Qingsha S.; Hailu, Daniel

    2004-01-01

    We review recent trends in the art of Space Mapping (SM) technology for modeling and design of engineering devices and systems. The SM approach aims at achieving a satisfactory solution with a handful of computationally expensive so-called "fine" model evaluations. SM procedures iteratively update...

  3. Modelling an industrial anaerobic granular reactor using a multi-scale approach

    DEFF Research Database (Denmark)

    Feldman, Hannah; Flores Alsina, Xavier; Ramin, Pedram

    2017-01-01

    The objective of this paper is to show the results of an industrial project dealing with modelling of anaerobic digesters. A multi-scale mathematical approach is developed to describe reactor hydrodynamics, granule growth/distribution and microbial competition/inhibition for substrate/space within...... the biofilm. The main biochemical and physico-chemical processes in the model are based on the Anaerobic Digestion Model No 1 (ADM1) extended with the fate of phosphorus (P), sulfur (S) and ethanol (Et-OH). Wastewater dynamic conditions are reproduced and data frequency increased using the Benchmark...... simulations show the effects on the overall process performance when operational (pH) and loading (S:COD) conditions are modified. Lastly, the effect of intra-granular precipitation on the overall organic/inorganic distribution is assessed at: 1) different times; and, 2) reactor heights. Finally...

  4. A Situative Space Model for Mobile Mixed-Reality Computing

    DEFF Research Database (Denmark)

    Pederson, Thomas; Janlert, Lars-Erik; Surie, Dipak

    2011-01-01

    This article proposes a situative space model that links the physical and virtual realms and sets the stage for complex human-computer interaction defined by what a human agent can see, hear, and touch, at any given point in time.

  5. State-Space Modelling of Loudspeakers using Fractional Derivatives

    DEFF Research Database (Denmark)

    King, Alexander Weider; Agerkvist, Finn T.

    2015-01-01

    This work investigates the use of fractional order derivatives in modeling moving-coil loudspeakers. A fractional order state-space solution is developed, leading the way towards incorporating nonlinearities into a fractional order system. The method is used to calculate the response of a fractional harmonic oscillator, representing the mechanical part of a loudspeaker, showing the effect of the fractional derivative and its relationship to viscoelasticity. Finally, a loudspeaker model with a fractional order viscoelastic suspension and fractional order voice coil is fit to measurement data...
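
    The kind of operator such fractional-order models build on can be illustrated with the Grunwald-Letnikov approximation of a fractional derivative on a uniform time grid, sketched below for an assumed sample signal. The paper's fractional state-space loudspeaker model itself is not reproduced here.

```python
import numpy as np

# Minimal sketch (assumptions): Grunwald-Letnikov approximation of a fractional
# derivative of order alpha on a uniform grid, applied to an assumed test signal.

def gl_fractional_derivative(x, alpha, dt):
    """D^alpha x at each grid point via the Grunwald-Letnikov sum."""
    n = len(x)
    # Recursive weights: w_0 = 1, w_j = w_{j-1} * (1 - (alpha + 1) / j)
    w = np.ones(n)
    for j in range(1, n):
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    d = np.zeros(n)
    for k in range(n):
        # D^alpha x(t_k) ~ dt^(-alpha) * sum_j w_j * x(t_{k-j})
        d[k] = np.dot(w[:k + 1], x[k::-1]) / dt ** alpha
    return d

if __name__ == "__main__":
    dt = 1e-3
    t = np.arange(0.0, 1.0, dt)
    x = np.sin(2 * np.pi * 5 * t)
    half_derivative = gl_fractional_derivative(x, alpha=0.5, dt=dt)
    print(half_derivative[:5])
```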

  6. The standard model on non-commutative space-time

    International Nuclear Information System (INIS)

    Calmet, X.; Jurco, B.; Schupp, P.; Wohlgenannt, M.; Wess, J.

    2002-01-01

    We consider the standard model on a non-commutative space and expand the action in the non-commutativity parameter θ^{μν}. No new particles are introduced; the structure group is SU(3) x SU(2) x U(1). We derive the leading order action. At zeroth order the action coincides with the ordinary standard model. At leading order in θ^{μν} we find new vertices which are absent in the standard model on commutative space-time. The most striking features are couplings between quarks, gluons and electroweak bosons and many new vertices in the charged and neutral currents. We find that parity is violated in non-commutative QCD. The Higgs mechanism can be applied. QED is not deformed in the minimal version of the NCSM to the order considered. (orig.)
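
    The θ-expansion referred to above is generated by the Moyal-Weyl star product, written here in generic notation for orientation.

```latex
% Moyal-Weyl star product whose expansion in the non-commutativity parameter theta
% produces the O(theta) vertices discussed in the abstract:
(f \star g)(x) = f(x)\,g(x)
  + \frac{i}{2}\,\theta^{\mu\nu}\,\partial_\mu f(x)\,\partial_\nu g(x)
  + \mathcal{O}(\theta^{2}),
\qquad
[x^{\mu} \stackrel{\star}{,} x^{\nu}] = i\,\theta^{\mu\nu}.
```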

  7. Space-time modeling of timber prices

    Science.gov (United States)

    Mo Zhou; Joseph Buongriorno

    2006-01-01

    A space-time econometric model was developed for pine sawtimber prices of 21 geographically contiguous regions in the southern United States. The correlations between prices in neighboring regions helped predict future prices. The impulse response analysis showed that although southern pine sawtimber markets were not globally integrated, local supply and demand...

  8. Space Weather Models and Their Validation and Verification at the CCMC

    Science.gov (United States)

    Hesse, Michael

    2010-01-01

    The Community Coordinated Modeling Center (CCMC) is a US multi-agency activity with a dual mission. With equal emphasis, CCMC strives to provide science support to the international space research community through the execution of advanced space plasma simulations, and it endeavors to support the space weather needs of the US and partners. Space weather support involves a broad spectrum, from designing robust forecasting systems and transitioning them to forecasters, to providing space weather updates and forecasts to NASA's robotic mission operators. All of these activities have to rely on validation and verification of models and their products, so users and forecasters have the means to assign confidence levels to the space weather information. In this presentation, we provide an overview of space weather models resident at CCMC, as well as of validation and verification activities undertaken at CCMC or through the use of CCMC services.

  9. Phase transitions in de Sitter space

    Directory of Open Access Journals (Sweden)

    Alexander Vilenkin

    1983-10-01

    Full Text Available An effective potential in de Sitter space is calculated for a model of two interacting scalar fields in one-loop approximation and in a self-consistent approximation which takes into account an infinite set of diagrams. Various approaches to renormalization in de Sitter space are discussed. The results are applied to analyze the phase transition in the Hawking-Moss version of the inflationary universe scenario. Requiring that inflation is sufficiently large, we derive constraints on the parameters of the model.

  10. A semiclassical approach to many-body interference in Fock-space

    Energy Technology Data Exchange (ETDEWEB)

    Engl, Thomas

    2015-11-01

    Many-body systems are drawing ever more attention from physicists. Such an increase of interest often comes along with the development of new theoretical methods. In this thesis, a non-perturbative semiclassical approach is developed which allows one to study analytically many-body interference effects in both bosonic and fermionic Fock space and is expected to be applicable to many research areas in physics, ranging from Quantum Optics and Ultracold Atoms to Solid State Theory and maybe even High Energy Physics. After the derivation of the semiclassical approximation, which is valid in the limit of a large total number of particles, first applications manifesting the presence of many-body interference effects are shown. Some of them are confirmed numerically, thus verifying the semiclassical predictions. Among these results are coherent back-/forward-scattering in bosonic and fermionic Fock space as well as a many-body spin echo, to name only the two most important ones.

  11. Zeta-function regularization approach to finite temperature effects in Kaluza-Klein space-times

    International Nuclear Information System (INIS)

    Bytsenko, A.A.; Vanzo, L.; Zerbini, S.

    1992-01-01

    In the framework of the heat-kernel approach to zeta-function regularization, the one-loop effective potential at finite temperature for scalar and spinor fields on a Kaluza-Klein space-time of the form M_p x M_c^n, where M_p is p-dimensional Minkowski space-time, is evaluated. In particular, when the compact manifold is M_c^n = H^n/Γ, the Selberg trace formula associated with the discrete torsion-free group Γ of the n-dimensional Lobachevsky space H^n is used. An explicit representation for the thermodynamic potential valid for arbitrary temperature is found. As a result, a complete high temperature expansion is presented and the roles of zero modes and topological contributions are discussed

  12. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    Science.gov (United States)

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent regulations on the quality control of its products, because quality is critical for both the production process and consumer safety. According to the framework of "process analytical technology" (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near infrared spectroscopy (NIRS) combined with chemometrics has lately proved efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology, the "process spectrum", is proposed for selecting the calibration set, into which physical changes in the samples at each stage are algebraically incorporated. We also established a "model space" defined by Hotelling's T² and Q-residual statistics for outlier identification (inside/outside the defined space), in order to select objectively the factors to be used in calibration set construction. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for the implementation of this easy and fast methodology in the pharma industry. Copyright © 2015 Elsevier B.V. All rights reserved.
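
    The Hotelling's T² and Q statistics that define the "model space" above are conventionally computed from a principal component model of the calibration spectra; a generic sketch is given below. The data, the number of components and the percentile limits are assumptions for illustration, not the paper's actual construction.

```python
import numpy as np

# Generic sketch (assumptions): PCA on mean-centered calibration spectra, then
# Hotelling's T^2 (variation inside the model plane) and Q residuals (distance to the
# plane) for each sample. Limits are simple empirical percentiles here.

def pca_t2_q(X, n_components=3):
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T                     # projections on the PCs
    eigvals = (s[:n_components] ** 2) / (X.shape[0] - 1)  # PC variances
    t2 = np.sum(scores ** 2 / eigvals, axis=1)            # Hotelling's T^2
    resid = Xc - scores @ Vt[:n_components]               # part outside the model plane
    q = np.sum(resid ** 2, axis=1)                        # Q (squared prediction error)
    return t2, q

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    spectra = rng.standard_normal((50, 200))              # stand-in for NIR spectra
    t2, q = pca_t2_q(spectra)
    t2_lim, q_lim = np.percentile(t2, 95), np.percentile(q, 95)
    inside = (t2 <= t2_lim) & (q <= q_lim)
    print(f"{inside.sum()} of {len(inside)} samples inside the model space")
```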

  13. Space, time, and the third dimension (model error)

    Science.gov (United States)

    Moss, Marshall E.

    1979-01-01

    The space-time tradeoff of hydrologic data collection (the ability to substitute spatial coverage for temporal extension of records or vice versa) is controlled jointly by the statistical properties of the phenomena that are being measured and by the model that is used to meld the information sources. The control exerted on the space-time tradeoff by the model and its accompanying errors has seldom been studied explicitly. The technique, known as Network Analyses for Regional Information (NARI), permits such a study of the regional regression model that is used to relate streamflow parameters to the physical and climatic characteristics of the drainage basin. The NARI technique shows that model improvement is a viable and sometimes necessary means of improving regional data collection systems. Model improvement provides an immediate increase in the accuracy of regional parameter estimation and also increases the information potential of future data collection. Model improvement, which can only be measured in a statistical sense, cannot be quantitatively estimated prior to its achievement; thus an attempt to upgrade a particular model entails a certain degree of risk on the part of the hydrologist.

  14. Dynamic multibody modeling for tethered space elevators

    Science.gov (United States)

    Williams, Paul

    2009-08-01

    This paper presents a fundamental modeling strategy for dealing with powered and propelled bodies moving along space tethers. The tether is divided into a large number of discrete masses, which are connected by viscoelastic springs, and is subjected to the full range of forces expected in Earth orbit in a relatively simple manner. Two different models of the elevator dynamics are presented. In order to capture the effect of the elevator moving along the tether, the elevator dynamics are included as a separate body in both models. One model treats the elevator's motion dynamically, where propulsive and friction forces are applied to the elevator body. The second model treats the elevator's motion kinematically, where the distance along the tether is determined by adjusting the lengths of tether on either side of the elevator. The tether model is used to determine optimal configurations for the space elevator. A modal analysis of two different configurations is presented, which shows that the fundamental mode of oscillation is a pendular one around the anchor point with a period on the order of 160 h for the in-plane motion and 24 h for the out-of-plane motion. Numerical simulation results of the effects of the elevator moving along the cable are presented for different travel velocities and different elevator masses.

  15. Social Network Analysis and Nutritional Behavior: An Integrated Modeling Approach.

    Science.gov (United States)

    Senior, Alistair M; Lihoreau, Mathieu; Buhl, Jerome; Raubenheimer, David; Simpson, Stephen J

    2016-01-01

    Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent research combining state-space models of nutritional geometry with agent-based models (ABMs) shows how nutrient-targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First we show how nutritionally explicit ABMs that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly, the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition). Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interactions in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.

  16. Applying MDA to SDR for Space to Model Real-time Issues

    Science.gov (United States)

    Blaser, Tammy M.

    2007-01-01

    NASA space communications systems have the challenge of designing SDRs with highly-constrained Size, Weight and Power (SWaP) resources. A study is being conducted to assess the effectiveness of applying the MDA Platform-Independent Model (PIM) and one or more Platform-Specific Models (PSM) specifically to address NASA space domain real-time issues. This paper will summarize our experiences with applying MDA to SDR for Space to model real-time issues. Real-time issues to be examined, measured, and analyzed are: meeting waveform timing requirements and efficiently applying Real-time Operating System (RTOS) scheduling algorithms, applying safety control measures, and SWaP verification. Real-time waveform algorithms benchmarked with the worst case environment conditions under the heaviest workload will drive the SDR for Space real-time PSM design.

  17. A shared-world conceptual model for integrating space station life sciences telescience operations

    Science.gov (United States)

    Johnson, Vicki; Bosley, John

    1988-01-01

    Mental models of the Space Station and its ancillary facilities will be employed by users of the Space Station as they draw upon past experiences, perform tasks, and collectively plan for future activities. The operational environment of the Space Station will incorporate telescience, a new set of operational modes. To investigate properties of the operational environment, distributed users, and the mental models they employ to manipulate resources while conducting telescience, an integrating shared-world conceptual model of Space Station telescience is proposed. The model comprises distributed users and resources (active elements); agents who mediate interactions among these elements on the basis of intelligent processing of shared information; and telescience protocols which structure the interactions of agents as they engage in cooperative, responsive interactions on behalf of users and resources distributed in space and time. Examples from the life sciences are used to instantiate and refine the model's principles. Implications for transaction management and autonomy are discussed. Experiments employing the model are described which the authors intend to conduct using the Space Station Life Sciences Telescience Testbed currently under development at Ames Research Center.

  18. A Mathematical Approach to Establishing Constitutive Models for Geomaterials

    Directory of Open Access Journals (Sweden)

    Guang-hua Yang

    2013-01-01

    Full Text Available The mathematical foundation of the traditional elastoplastic constitutive theory for geomaterials is presented from the mathematical point of view; that is, the expression of the stress-strain relationship in principal stress/strain space is transformed to its expression in six-dimensional space. A new framework is then established according to the mathematical theory of vectors and tensors, which is applicable to establishing elastoplastic models both in strain space and in stress space. Traditional constitutive theories can be considered as its special cases. The framework also enables modification of traditional constitutive models.

  19. A parsimonious approach to modeling animal movement data.

    Directory of Open Access Journals (Sweden)

    Yann Tremblay

    Full Text Available Animal tracking is a growing field in ecology, and previous work has shown that simple speed filtering of tracking data is not sufficient and that improvement of tracking location estimates is possible. To date, this has required methods that are complicated and often time-consuming (state-space models), resulting in limited application of this technique and the potential for analysis errors due to poor understanding of the fundamental framework behind the approach. We describe and test an alternative and intuitive approach consisting of bootstrapping random walks biased by forward particles. The model uses recorded data accuracy estimates, and can assimilate other sources of data such as sea-surface temperature, bathymetry and/or physical boundaries. We tested our model using ARGOS and geolocation tracks of elephant seals that also carried GPS tags in addition to PTTs, enabling true validation. Among pinnipeds, elephant seals are extreme divers that spend little time at the surface, which considerably impacts the quality of both ARGOS and light-based geolocation tracks. Despite such low overall track quality, our model provided location estimates within 4.0, 5.5 and 12.0 km of the true location 50% of the time, and within 9, 10.5 and 20.0 km 90% of the time, for above-average, average and below-average elephant seal ARGOS track qualities, respectively. With geolocation data, 50% of errors were less than 104.8 km (<0.94 degrees), and 90% were less than 199.8 km (<1.80 degrees). Larger errors were due to a lack of sea-surface temperature gradients. In addition we show that our model is flexible enough to solve the obstacle avoidance problem by assimilating high resolution coastline data. This reduced the number of invalid on-land locations by almost an order of magnitude. The method is intuitive, flexible and efficient, promising extensive utilization in future research.

  20. On modeling human reliability in space flights - Redundancy and recovery operations

    Science.gov (United States)

    Aarset, M.; Wright, J. F.

    The reliability of humans is of paramount importance to the safety of space flight systems. This paper describes why 'back-up' operators might not be the best solution, and in some cases, might even degrade system reliability. The problem associated with human redundancy calls for special treatment in reliability analyses. The concept of Standby Redundancy is adopted, and psychological and mathematical models are introduced to improve the way such problems can be estimated and handled. In the past, human reliability has practically been neglected in most reliability analyses, and, when included, the humans have been modeled as a component and treated numerically the way technical components are. This approach is not wrong in itself, but it may lead to systematic errors if too simple analogies from the technical domain are used in the modeling of human behavior. In this paper redundancy in a man-machine system will be addressed. It will be shown how simplification from the technical domain, when applied to human components of a system, may give non-conservative estimates of system reliability.

  1. Modeling extreme (Carrington-type) space weather events using three-dimensional MHD code simulations

    Science.gov (United States)

    Ngwira, C. M.; Pulkkinen, A. A.; Kuznetsova, M. M.; Glocer, A.

    2013-12-01

    There is growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure and systems. In the last two decades, significant progress has been made towards the modeling of space weather events. Three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, and have played a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for existing global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a ground footprint comparable to (or larger than) that of the Carrington superstorm. Results are presented for an initial simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated ground induced geoelectric field to such extreme driving conditions. We also discuss the results and what they might mean for the accuracy of the simulations. The model is further tested using input data for an observed space weather event to verify the MHD model's consistency and to draw guidance for future work. This extreme space weather MHD model is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in earth conductors such as power transmission grids.

  2. Metric-based approach and tool for modeling the I and C system using Markov chains

    International Nuclear Information System (INIS)

    Butenko, Valentyna; Kharchenko, Vyacheslav; Odarushchenko, Elena; Butenko, Dmitriy

    2015-01-01

    Markov chains (MCs) are well known and widely applied in dependability and performability analysis of safety-critical systems because of their flexible representation of dependencies and synchronization among system components. There are a few roadblocks to greater application of MCs: accounting for additional system components increases the model state space and complicates analysis, and a user without deep numerical expertise may find it difficult to decide, among the variety of numerical methods, which is the most suitable and accurate for their application. Obtaining highly accurate and trusted modeling results thus becomes a nontrivial task. In this paper, we present a metric-based approach for selecting the applicable solution method, based on the analysis of an MC's stiffness, decomposability, sparsity and fragmentedness. Using this selection procedure, the modeler can verify results obtained earlier. The presented approach was implemented in the utility MSMC, which supports MC construction, metric-based analysis, shaping of recommendations and model solution. The model can be exported to well-known off-the-shelf mathematical packages for verification. The paper presents a case study of an industrial NPP I and C system manufactured by RPC Radiy, showing the application of the metric-based approach and the MSMC tool to the dependability and safety analysis of the RTS, and the procedure of results verification. (author)
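
    Two of the metrics mentioned above (stiffness and sparsity) and a steady-state solution are easy to illustrate on a tiny availability-style Markov chain; the sketch below is such an illustration with an assumed generator matrix, not the MSMC utility.

```python
import numpy as np

# Small sketch (assumptions): a 3-state availability-style continuous-time Markov chain
# (OK -> degraded -> failed, with repairs). We compute a stiffness ratio (eigenvalue
# spread of the generator), the sparsity, and the steady-state distribution.

Q = np.array([            # generator matrix, rows sum to zero
    [-0.02,  0.02,  0.00],
    [ 0.50, -0.51,  0.01],
    [ 0.00,  1.00, -1.00],
])

# Stiffness ratio: largest over smallest nonzero |eigenvalue| of Q
# (one eigenvalue is numerically zero, so it is skipped after sorting).
eig = np.sort(np.abs(np.linalg.eigvals(Q)))
stiffness = eig[-1] / eig[1]

# Sparsity: fraction of zero entries in Q.
sparsity = np.mean(Q == 0.0)

# Steady state: pi Q = 0 with sum(pi) = 1, solved as an augmented linear system.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(f"stiffness ratio {stiffness:.1f}, sparsity {sparsity:.2f}, steady state {np.round(pi, 4)}")
```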

  3. Modelling of Patterns in Space and Time

    CERN Document Server

    Murray, James

    1984-01-01

    This volume contains a selection of papers presented at the workshop "Modelling of Patterns in Space and Time", organized by the Sonderforschungsbereich 123, "Stochastische Mathematische Modelle", in Heidelberg, July 4-8, 1983. The main aim of this workshop was to bring together physicists, chemists, biologists and mathematicians for an exchange of ideas and results in modelling patterns. Since the mathematical problems arising depend only partially on the particular field of applications the interdisciplinary cooperation proved very useful. The workshop mainly treated phenomena showing spatial structures. The special areas covered were morphogenesis, growth in cell cultures, competition systems, structured populations, chemotaxis, chemical precipitation, space-time oscillations in chemical reactors, patterns in flames and fluids and mathematical methods. The discussions between experimentalists and theoreticians were especially interesting and effective. The editors hope that these proceedings reflect ...

  4. Resolving kinematic redundancy with constraints using the FSP (Full Space Parameterization) approach

    International Nuclear Information System (INIS)

    Pin, F.G.; Tulloch, F.A.

    1996-01-01

    A solution method is presented for the motion planning and control of kinematically redundant serial-link manipulators in the presence of motion constraints such as joint limits or obstacles. Given a trajectory for the end-effector, the approach utilizes the recently proposed Full Space Parameterization (FSP) method to generate a parameterized expression for the entire space of solutions of the unconstrained system. At each time step, a constrained optimization technique is then used to analytically find the specific joint motion solution that satisfies the desired task objective and all the constraints active during the time step. The method is applicable to systems operating in a priori known environments or in unknown environments with sensor-based obstacle detection. The derivation of the analytical solution is first presented for a general type of kinematic constraint and is then applied to the problem of motion planning for redundant manipulators with joint limits and obstacle avoidance. Sample results using planar and 3-D manipulators with various degrees of redundancy are presented to illustrate the efficiency and wide applicability of constrained motion planning using the FSP approach
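
    For orientation, the sketch below shows what resolving redundancy with a secondary objective looks like for a planar three-link arm, using the standard pseudoinverse plus null-space projection. It is explicitly not the Full Space Parameterization method of the paper, and the trajectory, gains and link lengths are assumptions made for illustration.

```python
import numpy as np

# Sketch (assumptions): a planar 3-link arm tracking a 2-D end-effector velocity, so one
# degree of redundancy remains. Redundancy is resolved with the standard pseudoinverse
# plus null-space projection of a posture-keeping gradient. This is NOT the FSP method.

L = np.array([1.0, 0.8, 0.5])          # link lengths

def jacobian(q):
    c = np.cumsum(q)                    # absolute link angles
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -np.sum(L[i:] * np.sin(c[i:]))
        J[1, i] =  np.sum(L[i:] * np.cos(c[i:]))
    return J

q = np.array([0.3, 0.4, 0.2])           # initial joint angles
q_mid = np.zeros(3)                      # "comfortable" mid-range posture
dt, k_null = 0.01, 0.5

for _ in range(200):
    xdot = np.array([0.05, 0.0])                     # desired end-effector velocity
    J = jacobian(q)
    J_pinv = np.linalg.pinv(J)
    null_proj = np.eye(3) - J_pinv @ J               # projector onto the Jacobian null space
    qdot_null = -k_null * (q - q_mid)                # secondary task: stay near mid-range
    qdot = J_pinv @ xdot + null_proj @ qdot_null
    q = q + dt * qdot

print("final joint angles:", np.round(q, 3))
```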

  5. Spinning particle approach to higher spin field theory

    International Nuclear Information System (INIS)

    Corradini, Olindo

    2011-01-01

    We briefly review the connection between higher-spin gauge field theories and supersymmetric spinning particle models. In this approach the higher spin equations of motion are linked to the first-class constraint algebra associated with the quantization of particle models. Here we consider a class of spinning particle models characterized by local O(N)-extended supersymmetry, since these models are known to provide an alternative approach to the geometric formulation of higher spin field theory. We describe the canonical quantization of the models in curved target space and discuss the obstructions that appear in the presence of an arbitrarily curved background. We then point out the special role that conformally flat spaces appear to have in such models and present a derivation of the higher-spin curvatures for maximally symmetric spaces.

  6. LADM AND INDOORGML FOR SUPPORT OF INDOOR SPACE IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    S. Zlatanova

    2016-10-01

    Full Text Available Guidance and security in large public buildings such as airports, museums and shopping malls requires much more information than traditional 2D methods offer. Therefore, 3D semantically rich models have been actively investigated with the aim to gather knowledge about the availability and accessibility of spaces. Spaces can be unavailable to specific users for plenty of reasons: the 3D geometry of spaces (too low, too narrow), the properties of the objects to be guided to a specific part of the building (walking, driving, flying), the status of the indoor environment (e.g. crowded, limited light, under reconstruction), property regulations (private areas), security considerations and so on. However, such information is not explicitly available in the existing 3D semantically rich models. IFC and CityGML are restricted to architectural building components and provide little to no means to describe such properties. IndoorGML has been designed to establish a generic approach for space identification, allowing a space subdivision and automatic creation of a network for route computation, but currently it also represents only spaces as they are defined by the architectural layout of the building. The Land Administration Domain Model (LADM) is currently the only available model that specifies spaces on the basis of ownership and rights of use. In this paper we compare the principles of IndoorGML and LADM, investigate their approaches to defining spaces and suggest options for linking the two types of spaces. We argue that the LADM space subdivision on the basis of properties and rights of use can be used to define semantically and geometrically available and accessible spaces and can therefore enrich the IndoorGML concept.

  7. The Japanese Medakafish (Oryzias latipes) as Animal Model for Space-related Bone Research

    Science.gov (United States)

    Renn, J.; Schaedel, M.; Elmasri, H.; Wagner, T.; Goerlich, R.; Furutani-Seiki, M.; Kondoh, H.; Schartl, M.; Winkler, C.

    Long-term space flight leads to bone loss due to reduced mechanical load. Animal models are needed to support the analysis of the underlying mechanisms at the molecular and cellular level that are presently largely unclear. For this, small laboratory fish offer many experimental advantages as in vivo models to study disease related processes. They produce large numbers of completely transparent embryos, are easy to keep under laboratory and space conditions and have relatively compact genomes. We are using the Japanese Medaka to characterize the genetic networks regulating bone formation and to study bone formation and remodeling under microgravity. We showed that despite the large evolutionary distance many known factors regulating bone formation are conserved between fish and humans. This includes osteoprotegerin (opg), a key regulator of bone resorption that is altered at the transcriptional level by simulated microgravity in mammals in vitro (Kanematsu et al., Bone 30, 2002). To monitor, how opg is regulated by altered gravity in vivo in fish and how fish react to microgravity, we isolated the Medaka opg regulatory region and produced transgenic fish that carry the green fluorescent protein reporter under the control of the Medaka opg promoter. This model will be useful to monitor gravity-induced changes at the molecular level in vivo. Fish also provide the opportunity to identify novel genes involved in bone formation by using large-scale mutagenesis screens. We have characterized several lines of mutant fish subjected to ENU mutagenesis that show morphological defects in the formation of the bone precursor cell compartment of the axial skeleton, the sclerotome. Using this genetic approach, the identification of the mutated genes is expected to reveal novel components of the genetic cascades that regulate bone formation. In an attempt to identify genes specifically expressed in the sclerotome in Medaka, we identified and characterized dmrt2, a gene that so far

  8. The standard model on non-commutative space-time

    Energy Technology Data Exchange (ETDEWEB)

    Calmet, X.; Jurco, B.; Schupp, P.; Wohlgenannt, M. [Sektion Physik, Universitaet Muenchen (Germany); Wess, J. [Sektion Physik, Universitaet Muenchen (Germany); Max-Planck-Institut fuer Physik, Muenchen (Germany)

    2002-03-01

    We consider the standard model on a non-commutative space and expand the action in the non-commutativity parameter θ^{μν}. No new particles are introduced; the structure group is SU(3) x SU(2) x U(1). We derive the leading order action. At zeroth order the action coincides with the ordinary standard model. At leading order in θ^{μν} we find new vertices which are absent in the standard model on commutative space-time. The most striking features are couplings between quarks, gluons and electroweak bosons and many new vertices in the charged and neutral currents. We find that parity is violated in non-commutative QCD. The Higgs mechanism can be applied. QED is not deformed in the minimal version of the NCSM to the order considered. (orig.)

  9. Global energy modeling - A biophysical approach

    Energy Technology Data Exchange (ETDEWEB)

    Dale, Michael

    2010-09-15

    This paper contrasts the standard economic approach to energy modelling with energy models using a biophysical approach. Neither of these approaches includes changing energy-returns-on-investment (EROI) due to declining resource quality or the capital intensive nature of renewable energy sources. Both of these factors will become increasingly important in the future. An extension to the biophysical approach is outlined which encompasses a dynamic EROI function that explicitly incorporates technological learning. The model is used to explore several scenarios of long-term future energy supply especially concerning the global transition to renewable energy sources in the quest for a sustainable energy system.

  10. Applying the system engineering approach to devise a master’s degree program in space technology in developing countries

    Science.gov (United States)

    Jazebizadeh, Hooman; Tabeshian, Maryam; Taheran Vernoosfaderani, Mahsa

    2010-11-01

    Although more than half a century has passed since space technology was first developed, developing countries are just beginning to enter the arena, focusing mainly on educating professionals. Space technology is an interdisciplinary science, is costly, and is developing at a fast pace. Moreover, a fruitful education system needs to remain dynamic if the quality of education is the main concern, making it a complicated system. This paper makes use of the System Engineering Approach and the experiences of developed countries in this area, while incorporating the needs of the developing countries, to devise a comprehensive program in space engineering at the Master's level. The needs of the developing countries as regards space technology education may broadly be put into two categories: to raise their knowledge of space technology, which requires hard work and teamwork skills, and to transfer and domesticate space technology while minimizing the costs and maximizing its effectiveness. The requirements of such a space education system, which include research facilities, courses, and student projects, are then defined using a model drawn from the space education systems of universities in North America and Europe, modified to include the above-mentioned needs. Three design concepts have been considered and synthesized through functional analysis. The first one is Modular and Detail Study, which helps students specialize in a particular area of space technology. The second is referred to as Integrated and Interdisciplinary Study, which focuses on understanding and development of space systems. Finally, the third concept, which has been chosen for the purpose of this study, is a combination of the other two, categorizing the required curriculum into seven modules, setting aside space applications. This helps students not only to specialize in one of these modules but also to get hands-on experience in a real space project through participation in summer group

  11. A multi-element cosmological model with a complex space-time topology

    Science.gov (United States)

    Kardashev, N. S.; Lipatova, L. N.; Novikov, I. D.; Shatskiy, A. A.

    2015-02-01

    Wormhole models with a complex topology having one entrance and two exits into the same space-time of another universe are considered, as well as models with two entrances from the same space-time and one exit to another universe. These models are used to build a model of a multi-sheeted universe (a multi-element model of the "Multiverse") with a complex topology. Spherical symmetry is assumed in all the models. A Reissner-Nordström black-hole model having no singularity beyond the horizon is constructed. The strength of the central singularity of the black hole is analyzed.
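
    For reference, the standard Reissner-Nordström line element (in units with G = c = 1) that such constructions start from is given below; the paper modifies the interior so that no singularity lies beyond the horizon.

```latex
% Standard Reissner-Nordstrom line element with mass M and charge Q (G = c = 1):
ds^{2} = -\left(1 - \frac{2M}{r} + \frac{Q^{2}}{r^{2}}\right) dt^{2}
       + \left(1 - \frac{2M}{r} + \frac{Q^{2}}{r^{2}}\right)^{-1} dr^{2}
       + r^{2}\left(d\theta^{2} + \sin^{2}\theta \, d\varphi^{2}\right).
```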

  12. Processing models for conflicting user requests in ubiquitous corporate smart spaces

    Directory of Open Access Journals (Sweden)

    Levonevskiy Dmitriy

    2018-01-01

    Full Text Available This paper considers the processing of conflicting user requests in ubiquitous corporate smart spaces. The formulated problem consists in the contradiction between the limited smart space resources available to perform conflicting user requests and the necessity of providing proper quality of service in corporate smart spaces. The principles of constructing the simulation model are described. The experiments were carried out based on a model of the SPIIRAS digital signage service. Several task management strategies are discussed and an assessment of their effectiveness is given. The research is aimed at improving the quality of service and user experience in human-computer interaction within corporate smart spaces.

  13. Preliminary Multi-Variable Parametric Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    This slide presentation reviews the creation of a preliminary multi-variable cost model for the contract costs of making a space telescope. It discusses the methodology for collecting the data, the definition of the statistical analysis methodology, single-variable model results, the testing of historical models, and an introduction to the multi-variable models.

  14. Modelling the diffusion-available pore space of an unaltered granitic rock matrix using a micro-DFN approach

    Science.gov (United States)

    Svensson, Urban; Löfgren, Martin; Trinchero, Paolo; Selroos, Jan-Olof

    2018-04-01

    In sparsely fractured rock, the ubiquitous heterogeneity of the matrix, which has been observed in different laboratory and in situ experiments, has been shown to have a significant influence on retardation mechanisms that are of importance for the safety of deep geological repositories for nuclear waste. Here, we propose a conceptualisation of a typical heterogeneous granitic rock matrix based on micro-Discrete Fracture Networks (micro-DFN). Different sets of fractures are used to represent grain-boundary pores as well as micro fractures that transect different mineral grains. The micro-DFN model offers a great flexibility in the way inter- and intra-granular space is represented as the different parameters that characterise each fracture set can be fine tuned to represent samples of different characteristics. Here, the parameters of the model have been calibrated against experimental observations from granitic rock samples taken at Forsmark (Sweden) and different variant cases have been used to illustrate how the model can be tied to rock samples with different attributes. Numerical through-diffusion simulations have been carried out to infer the bulk properties of the model as well as to compare the computed mass flux with the experimental data from an analogous laboratory experiment. The general good agreement between the model results and the experimental observations shows that the model presented here is a reliable tool for the understanding of retardation mechanisms occurring at the mm-scale in the matrix.

  15. Statistical learning modeling method for space debris photometric measurement

    Science.gov (United States)

    Sun, Wenjing; Sun, Jinqiu; Zhang, Yanning; Li, Haisen

    2016-03-01

    Photometric measurement is an important way to identify space debris, but present photometric measurement methods place many constraints on the star image and require complex image processing. To address these problems, a statistical learning modeling method for space debris photometric measurement is proposed based on the global consistency of the star image, and the statistical information of star images is used to eliminate measurement noise. First, the known stars in the star image are divided into training stars and testing stars. Then, the training stars are used in a least-squares fit to construct the photometric measurement model, and the testing stars are used to calculate the accuracy of the photometric measurement model. Experimental results show that the accuracy of the proposed photometric measurement model is about 0.1 magnitudes.
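
    The training/testing split and least-squares fit described above can be pictured with a toy calibration that maps instrumental magnitudes of catalog stars to known magnitudes through a zero point and a scale term; the sketch below uses synthetic data and is not the paper's model.

```python
import numpy as np

# Generic sketch (assumptions): catalog stars on the frame have known magnitudes and
# measured instrumental magnitudes; a linear model (zero point + scale term) is fit by
# least squares on the training stars and checked on the testing stars.

rng = np.random.default_rng(2)
true_zero_point, true_scale = 21.3, 1.02

catalog_mag = rng.uniform(8.0, 14.0, 30)                       # known star magnitudes
instrumental = (catalog_mag - true_zero_point) / true_scale + rng.normal(0, 0.03, 30)

train, test = np.arange(0, 20), np.arange(20, 30)              # training / testing stars

# Least-squares fit: catalog_mag ~ zero_point + scale * instrumental
A = np.column_stack([np.ones(train.size), instrumental[train]])
(zero_point, scale), *_ = np.linalg.lstsq(A, catalog_mag[train], rcond=None)

pred_test = zero_point + scale * instrumental[test]
rms = np.sqrt(np.mean((pred_test - catalog_mag[test]) ** 2))
print(f"zero point {zero_point:.2f}, scale {scale:.3f}, test RMS {rms:.3f} mag")

# A detected debris object with instrumental magnitude m_inst would then be assigned
# zero_point + scale * m_inst.
```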

  16. Quantum metric spaces as a model for pregeometry

    International Nuclear Information System (INIS)

    Alvarez, E.; Cespedes, J.; Verdaguer, E.

    1992-01-01

    A new arena for the dynamics of spacetime is proposed, in which the basic quantum variable is the two-point distance on a metric space. The scaling dimension (that is, the Kolmogorov capacity) in the neighborhood of each point then defines in a natural way a local concept of dimension. We study our model in the region of parameter space in which the resulting spacetime is not too different from a smooth manifold

  17. Space Colonization Using Space-Elevators from Phobos

    Science.gov (United States)

    Weinstein, Leonard M.

    2003-01-01

    A novel approach is examined for creating an industrial civilization beyond Earth. The approach would take advantage of the unique configuration of Mars and its moon Phobos to make a transportation system capable of raising mass from the surface of Mars to space at a low cost. Mars would be used as the primary location for support personnel and infrastructure. Phobos would be used as a source of raw materials for space-based activity, and as an anchor for tethered carbon-nanotube-based space-elevators. One space-elevator would terminate at the upper edge of Mars' atmosphere. Small craft would be launched from Mars' surface to rendezvous with the moving elevator tip and their payloads detached and raised with solar powered loop elevators to Phobos. Another space-elevator would be extended outward from Phobos to launch craft toward the Earth/Moon system or the asteroid belt. The outward tip would also be used to catch arriving craft. This approach would allow Mars to be colonized, and allow transportation of people and supplies from Mars to support the space industry. In addition, large quantities of material obtained from Phobos could be used to construct space habitats and also supply propellant and material for space industry in the Earth/Moon system as well as around Mars.

  18. Comparisons of Multilevel Modeling and Structural Equation Modeling Approaches to Actor-Partner Interdependence Model.

    Science.gov (United States)

    Hong, Sehee; Kim, Soyoung

    2018-01-01

    There are two modeling approaches commonly applicable to analyzing an actor-partner interdependence model: multilevel modeling (the hierarchical linear model) and structural equation modeling. This article explains how to use these two models to analyze an actor-partner interdependence model and how the two approaches work differently. As an empirical example, marital conflict data were used to analyze an actor-partner interdependence model. Multilevel modeling and structural equation modeling produced virtually identical estimates for a basic model. However, the structural equation modeling approach allowed more realistic assumptions about measurement errors and factor loadings, yielding better model fit indices.

  19. Autonomous Motion Learning for Intra-Vehicular Activity Space Robot

    Science.gov (United States)

    Watanabe, Yutaka; Yairi, Takehisa; Machida, Kazuo

    Space robots will be needed in future space missions. Many types of space robots have been developed so far, but Intra-Vehicular Activity (IVA) space robots that support human activities are particularly needed to reduce human risk in space. In this paper, we study a motion learning method for an IVA space robot with a multi-link mechanism. The advantage is that this space robot moves using the reaction forces of the multi-link mechanism and contact forces from the walls, like an astronaut space walking, rather than using propulsion. The control approach is based on reinforcement learning with the actor-critic algorithm. We demonstrate the effectiveness of this approach in simulation using a 5-link space robot model. First, we simulate the robot learning motion control, including a contact phase, in the two-dimensional case. Next, we simulate the robot learning motion control that changes the base attitude in the three-dimensional case.

  20. Multivariable Wind Modeling in State Space

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Pedersen, B. J.

    2011-01-01

    Turbulence of the incoming wind field is of paramount importance to the dynamic response of wind turbines. Hence reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper an empirical […] for the vector turbulence process incorporating its phase spectrum in one stage, and its results are compared with a conventional ARMA modeling method. […] the succeeding state space and ARMA modeling of the turbulence rely on the positive definiteness of the cross-spectral density matrix, the problem with the non-positive definiteness of such matrices is at first addressed and suitable treatments regarding it are proposed. From the adjusted positive definite cross […]
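
    As a much simplified illustration of the modeling building block involved, the sketch below fits an ARMA model to a scalar, synthetic turbulence-like series in Python; the paper's multivariate, cross-spectral-density-based procedure and phase-spectrum handling are not reproduced here, and all numerical values are assumptions.

```python
# Minimal sketch (scalar stand-in for the vector turbulence process, not the
# paper's method): fit an ARMA model to a synthetic fluctuating wind series.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)

# Synthetic fluctuating wind speed: an AR(2)-type process driven by white noise.
n = 2000
u = np.zeros(n)
for t in range(2, n):
    u[t] = 1.5 * u[t - 1] - 0.7 * u[t - 2] + rng.normal(0, 0.5)

# Fit an ARMA(2,1) model; statsmodels builds the state-space form internally.
res = ARIMA(u, order=(2, 0, 1)).fit()
print(res.params)                      # AR, MA and noise-variance estimates

# Generate a new realisation for dynamic response / safety analysis.
sim = res.simulate(nsimulations=n)
```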

  1. Path integral approach for superintegrable potentials on spaces of non-constant curvature. Pt. 2. Darboux spaces D{sub III} and D{sub IV}

    Energy Technology Data Exchange (ETDEWEB)

    Grosche, C. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Pogosyan, G.S. [Joint Inst. of Nuclear Research, Moscow (Russian Federation). Bogoliubov Lab. of Theoretical Physics]|[Guadalajara Univ., Jalisco (Mexico). Dept. de Matematicas CUCEI; Sissakian, A.N. [Joint Inst. of Nuclear Research, Moscow (Russian Federation). Bogoliubov Lab. of Theoretical Physics

    2006-08-15

    This is the second paper on the path integral approach to superintegrable systems on Darboux spaces, spaces of non-constant curvature. We analyze in the spaces D{sub III} and D{sub IV} five and four superintegrable potentials, respectively, which were first given by Kalnins et al. We are able to evaluate the path integral in most of the separating coordinate systems, leading to expressions for the Green functions, the discrete and continuous wave-functions, and the discrete energy spectra. In some cases, however, the discrete spectrum cannot be stated explicitly, because it is determined by a higher-order polynomial equation. We show that even the free motion in the Darboux space of type III can contain bound states, provided the boundary conditions are appropriate. We state the energy spectrum and the wave-functions, respectively. (orig.)

  2. Refinement of protein termini in template-based modeling using conformational space annealing.

    Science.gov (United States)

    Park, Hahnbeom; Ko, Junsu; Joo, Keehyoung; Lee, Julian; Seok, Chaok; Lee, Jooyoung

    2011-09-01

    The rapid increase in the number of experimentally determined protein structures in recent years enables us to obtain more reliable protein tertiary structure models than ever by template-based modeling. However, refinement of template-based models beyond the limit available from the best templates is still needed for understanding protein function in atomic detail. In this work, we develop a new method for protein terminus modeling that can be applied to refinement of models with unreliable terminus structures. The energy function for terminus modeling consists of both physics-based and knowledge-based potential terms with carefully optimized relative weights. Effective sampling of both the framework and terminus is performed using the conformational space annealing technique. This method has been tested on a set of termini derived from a nonredundant structure database and two sets of termini from the CASP8 targets. The performance of the terminus modeling method is significantly improved over our previous method that does not employ terminus refinement. It is also comparable or superior to the best server methods tested in CASP8. The success of the current approach suggests that similar strategy may be applied to other types of refinement problems such as loop modeling or secondary structure rearrangement. Copyright © 2011 Wiley-Liss, Inc.

  3. Engineering Risk Assessment of Space Thruster Challenge Problem

    Science.gov (United States)

    Mathias, Donovan L.; Mattenberger, Christopher J.; Go, Susie

    2014-01-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center utilizes dynamic models with linked physics-of-failure analyses to produce quantitative risk assessments of space exploration missions. This paper applies the ERA approach to the baseline and extended versions of the PSAM Space Thruster Challenge Problem, which investigates mission risk for a deep space ion propulsion system with time-varying thruster requirements and operations schedules. The dynamic mission is modeled using a combination of discrete and continuous-time reliability elements within the commercially available GoldSim software. Loss-of-mission (LOM) probability results are generated via Monte Carlo sampling performed by the integrated model. Model convergence studies are presented to illustrate the sensitivity of integrated LOM results to the number of Monte Carlo trials. A deterministic risk model was also built for the three baseline and extended missions using the Ames Reliability Tool (ART), and results are compared to the simulation results to evaluate the relative importance of mission dynamics. The ART model did a reasonable job of matching the simulation models for the baseline case, while a hybrid approach using offline dynamic models was required for the extended missions. This study highlighted that state-of-the-art techniques can adequately adapt to a range of dynamic problems.
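
    To illustrate the Monte Carlo sampling and convergence-study idea described above, here is a hedged Python sketch; it is not the ERA/GoldSim model, and the thruster count, failure rate and mission duration are hypothetical.

```python
# Hedged sketch: estimate loss-of-mission (LOM) probability by Monte Carlo
# sampling of random component failure times, and check how the estimate
# converges with the number of trials. All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

N_THRUSTERS = 4          # hypothetical ion thrusters; mission is lost if >1 fail
FAILURE_RATE = 1e-5      # failures per hour per thruster (assumed constant)
MISSION_HOURS = 30000.0  # hypothetical deep-space mission duration

def estimate_lom(n_trials):
    # Exponential failure times for each thruster in each trial.
    t_fail = rng.exponential(1.0 / FAILURE_RATE, size=(n_trials, N_THRUSTERS))
    failures = (t_fail < MISSION_HOURS).sum(axis=1)
    return (failures > 1).mean()       # LOM if more than one thruster is lost

# Convergence study: LOM estimate versus number of Monte Carlo trials.
for n in (1_000, 10_000, 100_000, 1_000_000):
    print(f"{n:>9d} trials -> P(LOM) = {estimate_lom(n):.4f}")
```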

  4. A Multi-Model Approach for System Diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad; Bækgaard, Mikkel Ask Buur

    2007-01-01

    A multi-model approach for system diagnosis is presented in this paper. The relation with fault diagnosis as well as performance validation is considered. The approach is based on testing a number of pre-described models and finding which one fits best. It is an active approach, i.e. an auxiliary input is applied to the system. The multi-model approach is applied to a wind turbine system.

  5. Traffic model for commercial payloads in the Materials Experiment Assembly (MEA). [market research in commercial space processing

    Science.gov (United States)

    Tietzel, F. A.

    1979-01-01

    One hundred individuals representing universities, technical institutes, government agencies, and industrial facilities were surveyed to determine potential commercial use of a self-contained, automated assembly for the space processing of materials during frequent shuttle flights in the 1981 to 1987 period. The approach used and the results of the study are summarized. A time-phased projection (traffic model) of commercial usage of the materials experiment assembly is provided.

  6. Personalized State-space Modeling of Glucose Dynamics for Type 1 Diabetes Using Continuously Monitored Glucose, Insulin Dose, and Meal Intake: An Extended Kalman Filter Approach.

    Science.gov (United States)

    Wang, Qian; Molenaar, Peter; Harsh, Saurabh; Freeman, Kenneth; Xie, Jinyu; Gold, Carol; Rovine, Mike; Ulbrecht, Jan

    2014-03-01

    An essential component of any artificial pancreas is the prediction of blood glucose levels as a function of exogenous and endogenous perturbations such as insulin dose, meal intake, physical activity, and emotional tone under natural living conditions. In this article, we present a new data-driven state-space dynamic model with time-varying coefficients that are used to explicitly quantify the time-varying patient-specific effects of insulin dose and meal intake on blood glucose fluctuations. Using the 3-variate time series of glucose level, insulin dose, and meal intake of an individual type 1 diabetic subject, we apply an extended Kalman filter (EKF) to estimate the time-varying coefficients of the patient-specific state-space model. We evaluate our empirical modeling using (1) the FDA-approved UVa/Padova simulator with 30 virtual patients and (2) clinical data of 5 type 1 diabetic patients under natural living conditions. Compared to a forgetting-factor-based recursive ARX model of the same order, the EKF model predictions have higher fit and significantly better temporal gain and J index, and are thus superior in the early detection of upward and downward trends in glucose. The EKF-based state-space model developed in this article is particularly suitable for model-based state-feedback control designs since the Kalman filter estimates the state variable of the glucose dynamics based on the measured glucose time series. In addition, since the model parameters are estimated in real time, this model is also suitable for adaptive control. © 2014 Diabetes Technology Society.
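
    The sketch below illustrates the core filtering idea in Python under stated assumptions: the time-varying coefficients of a one-step glucose predictor are treated as random-walk states and estimated recursively from the 3-variate time series. The predictor form, units, noise levels and synthetic data are all assumptions, not the authors' model; because this toy predictor is linear in its coefficients, the EKF reduces here to an ordinary Kalman filter.

```python
# Simplified sketch (all dynamics and parameters are assumptions): recursive
# estimation of time-varying coefficients of
#   g[k+1] = a*g[k] + b_I*insulin[k] + b_M*meal[k] + c + noise
import numpy as np

def filter_coefficients(glucose, insulin, meal, q=1e-5, r=4.0):
    """Recursively estimate [a, b_I, b_M, c] from measured data."""
    n = len(glucose) - 1
    theta = np.array([0.9, 0.0, 0.0, 0.0])    # initial coefficient guess
    P = np.eye(4) * 10.0                      # initial coefficient covariance
    Q = np.eye(4) * q                         # random-walk (process) noise
    history = np.zeros((n, 4))
    for k in range(n):
        P = P + Q                             # time update (random walk)
        h = np.array([glucose[k], insulin[k], meal[k], 1.0])
        y = glucose[k + 1] - h @ theta        # innovation
        s = h @ P @ h + r                     # innovation variance
        K = P @ h / s                         # Kalman gain
        theta = theta + K * y                 # measurement update
        P = P - np.outer(K, h @ P)
        history[k] = theta
    return history

# Synthetic 3-variate data generated from the same toy model.
rng = np.random.default_rng(3)
T = 1000
insulin = rng.random(T) * 0.5                 # hypothetical dose signal
meal = (rng.random(T) < 0.05) * 40.0          # occasional meals (g of carbs)
glucose = np.empty(T)
glucose[0] = 120.0
for k in range(T - 1):
    glucose[k + 1] = (0.9 * glucose[k] - 3.0 * insulin[k]
                      + 0.5 * meal[k] + 12.0 + rng.normal(0, 2.0))

estimates = filter_coefficients(glucose, insulin, meal)
print("final [a, b_I, b_M, c] estimate:", np.round(estimates[-1], 2))
```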

  7. Reliability models for Space Station power system

    Science.gov (United States)

    Singh, C.; Patton, A. D.; Kim, Y.; Wagner, H.

    1987-01-01

    This paper presents a methodology for the reliability evaluation of the Space Station power system. The two options considered are the photovoltaic system and the solar dynamic system. Reliability models for both of these options are described, along with the methodology for calculating the reliability indices.

  8. 3D printing the pterygopalatine fossa: a negative space model of a complex structure.

    Science.gov (United States)

    Bannon, Ross; Parihar, Shivani; Skarparis, Yiannis; Varsou, Ourania; Cezayirli, Enis

    2018-02-01

    The pterygopalatine fossa is one of the most complex anatomical regions to understand. It is poorly visualized in cadaveric dissection and most textbooks rely on schematic depictions. We describe our approach to creating a low-cost, 3D model of the pterygopalatine fossa, including its associated canals and foramina, using an affordable "desktop" 3D printer. We used open source software to create a volume render of the pterygopalatine fossa from axial slices of a head computerised tomography scan. These data were then exported to a 3D printer to produce an anatomically accurate model. The resulting 'negative space' model of the pterygopalatine fossa provides a useful and innovative aid for understanding the complex anatomical relationships of the pterygopalatine fossa. This model was designed primarily for medical students; however, it will also be of interest to postgraduates in ENT, ophthalmology, neurosurgery, and radiology. The technical process described may be replicated by other departments wishing to develop their own anatomical models whilst incurring minimal costs.

  9. A stochastic approach for model reduction and memory function design in hydrogeophysical inversion

    Science.gov (United States)

    Hou, Z.; Kellogg, A.; Terry, N.

    2009-12-01

    Geophysical (e.g., seismic, electromagnetic, radar) techniques and statistical methods are essential for research related to subsurface characterization, including monitoring subsurface flow and transport processes, oil/gas reservoir identification, etc. For deep subsurface characterization such as reservoir petroleum exploration, seismic methods have been widely used. Recently, electromagnetic (EM) methods have drawn great attention in the area of reservoir characterization. However, considering the enormous computational demand of seismic and EM forward modeling, having too many unknown parameters in the modeling domain is usually a serious problem. For shallow subsurface applications, the characterization can be very complicated considering the complexity and nonlinearity of flow and transport processes in the unsaturated zone. It is therefore warranted to reduce the dimension of the parameter space to a reasonable level. Another common concern is how to make the best use of time-lapse data with spatial-temporal correlations. This is even more critical when we try to monitor subsurface processes using geophysical data collected at different times. The normal practice is to obtain the inverse images individually. These images are not necessarily continuous or even reasonably related, because of the non-uniqueness of hydrogeophysical inversion. We propose to use a stochastic framework integrating the minimum-relative-entropy concept, quasi-Monte Carlo sampling techniques, and statistical tests. The approach allows efficient and sufficient exploration of all possibilities of model parameters and evaluation of their significance to geophysical responses. The analyses enable us to reduce the parameter space significantly. The approach can be combined with Bayesian updating, allowing us to treat the updated 'posterior' pdf as a memory function, which stores all the information up to date about the distributions of soil/field attributes/properties, then consider the

  10. Real Space Approach to CMB deboosting

    CERN Document Server

    Yoho, Amanda; Starkman, Glenn D.; Pereira, Thiago S.

    2013-01-01

    The effect of our Galaxy's motion through the Cosmic Microwave Background rest frame, which aberrates and Doppler shifts incoming photons measured by current CMB experiments, has been shown to produce mode-mixing in the multipole space temperature coefficients. However, multipole space determinations are subject to many difficulties, and a real-space analysis can provide a straightforward alternative. In this work we describe a numerical method for removing Lorentz-boost effects from real-space temperature maps. We show that to deboost a map so that one can accurately extract the temperature power spectrum requires calculating the boost kernel at a finer pixelization than one might naively expect. In idealized cases that allow for easy comparison to analytic results, we have confirmed that there is indeed mode mixing among the spherical harmonic coefficients of the temperature. We find that using a boost kernel calculated at Nside=8192 leads to a 1% bias in the binned boosted power spectrum at l~2000, while ...

  11. Modeling extreme "Carrington-type" space weather events using three-dimensional global MHD simulations

    Science.gov (United States)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Kuznetsova, Maria M.; Glocer, Alex

    2014-06-01

    There is a growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure. In the last two decades, significant progress has been made toward the first-principles modeling of space weather events, and three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, thereby playing a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for modern global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a Dst footprint comparable to the Carrington superstorm of September 1859, based on the estimate by Tsurutani et al. (2003). Results are presented for a simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated induced geoelectric field on the ground to such extreme driving conditions. The model setup is further tested using input data for the observed Halloween storm of October 2003 to verify the MHD model's consistency and to draw additional guidance for future work. This extreme space weather MHD model setup is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in ground-based conductor systems such as power transmission grids. Therefore, our ultimate goal is to explore the level of geoelectric fields that can be induced by an assumed storm of the reported magnitude, i.e., Dst ≈ -1600 nT.

  12. A new approach to reduce uncertainties in space radiation cancer risk predictions.

    Directory of Open Access Journals (Sweden)

    Francis A Cucinotta

    The prediction of space radiation induced cancer risk carries large uncertainties, with two of the largest being radiation quality and dose-rate effects. In risk models the ratio of the quality factor (QF) to the dose and dose-rate reduction effectiveness factor (DDREF) parameter is used to scale organ doses for cosmic ray protons and high charge and energy (HZE) particles to a hazard rate for γ-rays derived from human epidemiology data. In previous work, particle track structure concepts were used to formulate a space radiation QF function that depends on particle charge number Z and kinetic energy per atomic mass unit E. QF uncertainties were represented by subjective probability distribution functions (PDF) for the three QF parameters that describe its maximum value and shape parameters for the Z and E dependences. Here I report on an analysis of a maximum QF parameter and its uncertainty using mouse tumor induction data. Because experimental data for risks at low doses of γ-rays are highly uncertain, which impacts estimates of maximum values of relative biological effectiveness (RBEmax), I developed an alternate QF model, denoted QFγAcute, in which QFs are defined relative to higher acute γ-ray doses (0.5 to 3 Gy). The alternate model reduces the dependence of risk projections on the DDREF; however, a DDREF is still needed for risk estimates for high-energy protons and other primary or secondary sparsely ionizing space radiation components. Risk projections and upper confidence levels (CL) for space missions show a reduction of about 40% (CL ∼50%) using the QFγAcute model compared to the QFs based on RBEmax, and about 25% (CL ∼35%) compared to previous estimates. In addition, I discuss how a possible qualitative difference leading to increased tumor lethality for HZE particles compared to low-LET radiation and background tumors remains a large uncertainty in risk estimates.
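
    For orientation, the sketch below shows what a parametric quality-factor surface Q(Z, E) with a maximum value and shape parameters might look like in Python. The functional form and every parameter value are assumptions chosen only to illustrate the idea of a track-structure-style QF; this is not the published NASA model or the paper's fit.

```python
# Illustrative sketch only: a hypothetical QF(Z, E) that rises with the
# track-structure variable Z^2/beta^2, saturates near q_max, and falls off
# again for very densely ionising ions. All parameter values are assumptions.
import numpy as np

def beta(E):
    """Particle speed v/c from kinetic energy E in MeV per atomic mass unit."""
    m_u = 931.494                     # MeV per atomic mass unit
    gamma = 1.0 + E / m_u
    return np.sqrt(1.0 - 1.0 / gamma**2)

def quality_factor(Z, E, q_max=40.0, kappa=1000.0, m=3.0):
    x = Z**2 / beta(E)**2             # track-structure parameter
    p = (1.0 - np.exp(-x / kappa))**m # saturation (shape) term
    return 1.0 + (q_max - 1.0) * p * np.exp(-x / 5e4)

# Example: proton (Z=1) versus iron (Z=26) at 600 MeV/u.
for Z in (1, 26):
    print(f"Z={Z:2d}: Q = {quality_factor(Z, 600.0):.1f}")
```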

  13. Building spatio-temporal database model based on ontological approach using relational database environment

    International Nuclear Information System (INIS)

    Mahmood, N.; Burney, S.M.A.

    2017-01-01

    Everything in this world is bounded by space and time. Our daily activities are closely linked and related to other objects in our vicinity; hence our current location, the time (past, present and future) and the events through which we move as objects also affect our activities. Ontology development and its integration with databases are vital for a true understanding of complex systems involving both spatial and temporal dimensions. In this paper we propose a conceptual framework for building a spatio-temporal database model based on an ontological approach. We use a relational data model for modelling spatio-temporal data content and present our methodology for capturing spatio-temporal ontological aspects and transforming them into a spatio-temporal database model. We illustrate the implementation of our conceptual model through a case study of cultivated land parcels used for agriculture, exhibiting the spatio-temporal behaviour of agricultural land and related entities. Moreover, the framework provides a generic approach for designing spatio-temporal databases based on ontology. The proposed model is able to capture the ontological and, to some extent, epistemological commitments, to build a spatio-temporal ontology and to transform it into a spatio-temporal data model. Finally, we highlight existing and future research challenges. (author)

  14. Space weather: Modeling and forecasting ionospheric

    International Nuclear Information System (INIS)

    Calzadilla Mendez, A.

    2008-01-01

    Full text: Space weather is the set of phenomena and interactions that take place in the interplanetary medium. It is driven primarily by activity originating in the Sun and affects both the artificial satellites operating outside the protective cover of the Earth's atmosphere and the rest of the planets in the solar system. Among the phenomena of greatest relevance and impact on Earth are the auroras and geomagnetic storms; these are a direct result of irregularities in the flow of the solar wind and the interplanetary magnetic field. Given the high complexity of the physical phenomena involved (magnetic reconnection, particle entry and ionizing radiation into the atmosphere), one of the great scientific challenges today is to forecast the state of these plasma environments, whether the interplanetary medium, the magnetosphere or the ionosphere, given their importance to human activities such as radio communication, global positioning and navigation. This contribution briefly addresses some international ionospheric modeling methods and the contributions and current participation of the space group of the Institute of Geophysics and Astronomy (IGA) in ionospheric modeling and forecasting activities. (author)

  15. A Programmatic and Engineering Approach to the Development of a Nuclear Thermal Rocket for Space Exploration

    Science.gov (United States)

    Bordelon, Wayne J., Jr.; Ballard, Rick O.; Gerrish, Harold P., Jr.

    2006-01-01

    With the announcement of the Vision for Space Exploration on January 14, 2004, there has been a renewed interest in nuclear thermal propulsion. Nuclear thermal propulsion is a leading candidate for in-space propulsion for human Mars missions; however, the cost to develop a nuclear thermal rocket engine system is uncertain. Key to determining the engine development cost will be the engine requirements, the technology used in the development and the development approach. The engine requirements and technology selection have not yet been defined and await definition of the Mars architecture and vehicles. The paper discusses an engine development approach in light of top-level strategic questions and considerations for nuclear thermal propulsion and provides a suggested approach based on work conducted at the NASA Marshall Space Flight Center to support planning and requirements for the Prometheus Power and Propulsion Office. This work is intended to help support the development of a comprehensive strategy for nuclear thermal propulsion, to help reduce the uncertainty in the development cost estimate, and to help assess the potential value of and need for nuclear thermal propulsion for a human Mars mission.

  16. The Integrated Medical Model: A Risk Assessment and Decision Support Tool for Human Space Flight Missions

    Science.gov (United States)

    Kerstman, Eric L.; Minard, Charles; FreiredeCarvalho, Mary H.; Walton, Marlei E.; Myers, Jerry G., Jr.; Saile, Lynn G.; Lopez, Vilma; Butler, Douglas J.; Johnson-Throop, Kathy A.

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) and its use as a risk assessment and decision support tool for human space flight missions. The IMM is an integrated, quantified, evidence-based decision support tool useful to NASA crew health and mission planners. It is intended to assist in optimizing crew health, safety and mission success within the constraints of the space flight environment for in-flight operations. It uses ISS data to assist in planning for the Exploration Program and is not intended to assist in post-flight research. The IMM was used to update the Probabilistic Risk Assessment (PRA) for the purpose of updating forecasts for the conditions requiring evacuation (EVAC) or Loss of Crew Life (LOC) for the ISS. The IMM validation approach includes comparison with actual events and involves both qualitative and quantitative approaches. The results of these comparisons are reviewed. Another use of the IMM is to optimize the medical kits, taking into consideration the specific mission and the crew profile. An example of the use of the IMM to optimize the medical kits is reviewed.

  17. Integrated Space Asset Management Database and Modeling

    Science.gov (United States)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of data of many types related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing those data into useful technical information. Using the universal NORAD number as a unique identifier, the SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. The SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for

  18. Space - A unique environment for process modeling R&D

    Science.gov (United States)

    Overfelt, Tony

    1991-01-01

    Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space: joint ventures of universities, industries, and government agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.

  19. Awareness-based game-theoretic space resource management

    Science.gov (United States)

    Chen, Genshe; Chen, Huimin; Pham, Khanh; Blasch, Erik; Cruz, Jose B., Jr.

    2009-05-01

    Over recent decades, the space environment has become more complex, with a significant increase in space debris and a greater density of spacecraft, which poses great difficulties for efficient and reliable space operations. In this paper we present a Hierarchical Sensor Management (HSM) method for space operations based on (a) accommodating awareness modeling and updating and (b) collaboratively searching for and tracking space objects. The basic approach is described as follows. First, partition the relevant region of interest into distinct cells. Second, initialize and model the dynamics of each cell with awareness and object covariance according to prior information. Third, explicitly assign sensing resources to objects with user-specified requirements. Note that when an object responds intelligently to the sensing event, the sensor assigned to observe it may switch from time to time between a strong, active signal mode and a passive mode to maximize the total amount of information to be obtained over a multi-step time horizon and to avoid risks. Fourth, if all explicitly specified requirements are satisfied and there are still sensing resources available, we assign the additional sensing resources to objects without explicitly specified requirements via an information-based approach. Finally, sensor scheduling is applied to each sensor-object or sensor-cell pair according to the object type. We demonstrate our method on a realistic space resource management scenario using NASA's General Mission Analysis Tool (GMAT) for space object search and track with multiple space-borne observers.
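
    The following Python sketch illustrates, under stated assumptions, the information-based assignment step mentioned above: a fixed budget of sensor looks is greedily spent on the grid cells where the expected entropy reduction of a binary "awareness" state is largest. The grid size, detection probability, budget and update rule are hypothetical; this is not the authors' HSM implementation.

```python
# Hedged sketch: greedy, information-based allocation of sensor looks to cells.
import numpy as np

rng = np.random.default_rng(7)

def entropy(p):
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

# Awareness map: probability that each cell contains an object of interest.
awareness = rng.uniform(0.05, 0.95, size=(8, 8))
P_DETECT = 0.8          # hypothetical single-look detection probability
BUDGET = 10             # sensor looks available in this planning cycle

for look in range(BUDGET):
    prior_H = entropy(awareness)
    p_no_detect = 1.0 - awareness * P_DETECT
    # Posterior awareness if a look is spent and nothing is detected.
    posterior = awareness * (1 - P_DETECT) / p_no_detect
    # A detection resolves the cell (zero entropy); otherwise entropy(posterior) remains.
    expected_gain = prior_H - p_no_detect * entropy(posterior)
    cell = np.unravel_index(np.argmax(expected_gain), awareness.shape)
    # Simulate a "no detection" outcome and update that cell's awareness.
    awareness[cell] = posterior[cell]
    print(f"look {look + 1}: cell {cell}, expected gain {expected_gain[cell]:.2f} bits")
```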

  20. Modelling dendritic ecological networks in space: an integrated network perspective

    Science.gov (United States)

    Peterson, Erin E.; Ver Hoef, Jay M.; Isaak, Dan J.; Falke, Jeffrey A.; Fortin, Marie-Josée; Jordon, Chris E.; McNyset, Kristina; Monestiez, Pascal; Ruesch, Aaron S.; Sengupta, Aritra; Som, Nicholas; Steel, E. Ashley; Theobald, David M.; Torgersen, Christian E.; Wenger, Seth J.

    2013-01-01

    Dendritic ecological networks (DENs) are a unique form of ecological networks that exhibit a dendritic network topology (e.g. stream and cave networks or plant architecture). DENs have a dual spatial representation: as points within the network and as points in geographical space. Consequently, some analytical methods used to quantify relationships in other types of ecological networks, or in 2-D space, may be inadequate for studying the influence of structure and connectivity on ecological processes within DENs. We propose a conceptual taxonomy of network analysis methods that account for DEN characteristics to varying degrees and provide a synthesis of the different approaches within

  1. Physiology-based modelling approaches to characterize fish habitat suitability: Their usefulness and limitations

    Science.gov (United States)

    Teal, Lorna R.; Marras, Stefano; Peck, Myron A.; Domenici, Paolo

    2018-02-01

    Models are useful tools for predicting the impact of global change on species distribution and abundance. As ectotherms, fish are being challenged to adapt or track changes in their environment, either in time through a phenological shift or in space by a biogeographic shift. Past modelling efforts have largely been based on correlative Species Distribution Models, which use known occurrences of species across landscapes of interest to define sets of conditions under which species are likely to maintain populations. The practical advantages of this correlative approach are its simplicity and the flexibility in terms of data requirements. However, effective conservation management requires models that make projections beyond the range of available data. One way to deal with such an extrapolation is to use a mechanistic approach based on physiological processes underlying climate change effects on organisms. Here we illustrate two approaches for developing physiology-based models to characterize fish habitat suitability. (i) Aerobic Scope Models (ASM) are based on the relationship between environmental factors and aerobic scope (defined as the difference between maximum and standard (basal) metabolism). This approach is based on experimental data collected using a number of treatments that allow a function to be derived to predict aerobic metabolic scope from the stressor/environmental factor(s). This function is then integrated with environmental (oceanographic) data of current and future scenarios. For any given species, this approach allows habitat suitability maps to be generated at various spatiotemporal scales. The strength of the ASM approach lies in the estimation of relative performance when comparing, for example, different locations or different species. (ii) Dynamic Energy Budget (DEB) models are based on first principles including the idea that metabolism is organised in the same way within all animals. The (standard) DEB model aims to describe
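
    As a minimal illustration of the ASM idea, the Python sketch below turns hypothetical standard and maximum metabolic rate curves into an aerobic scope and a normalised habitat suitability index. The temperature responses and all coefficients are assumptions for illustration, not data for any real species.

```python
# Minimal Aerobic Scope Model sketch under stated assumptions.
import numpy as np

def smr(T):
    """Standard metabolic rate: exponential (Q10-like) increase with temperature."""
    return 0.5 * np.exp(0.08 * T)

def mmr(T):
    """Maximum metabolic rate: rises then collapses above a critical temperature."""
    return 4.0 * np.exp(0.05 * T) / (1.0 + np.exp(0.6 * (T - 28.0)))

T = np.linspace(0.0, 35.0, 200)
scope = np.clip(mmr(T) - smr(T), 0.0, None)   # aerobic scope, floored at zero
suitability = scope / scope.max()             # 0..1 habitat suitability index

T_opt = T[np.argmax(scope)]
print(f"optimal temperature ≈ {T_opt:.1f} °C, "
      f"suitability at 30 °C ≈ {np.interp(30.0, T, suitability):.2f}")
```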

  2. Towards the development of a 3D digital city model as a real extension of public urban spaces

    DEFF Research Database (Denmark)

    Tournay, Bruno

    […] it only serves as a tool in the analogue world. The model is a passive picture for contemplation. Another way of looking at a digital 3D model is to see it not as a virtual model of reality but as a real model that must fulfil real functions, and to design it as a space of transition between the local […] new approaches to communication and participation. Who controls the Electronic Neighbourhood? Just as in the analogue world, control of central places in the digital world is power. Finally, based on the experience gained in relation to the project, the paper outlines some guidelines for better […]

  3. Learning in Earth and Space Science: A Review of Conceptual Change Instructional Approaches

    Science.gov (United States)

    Mills, Reece; Tomas, Louisa; Lewthwaite, Brian

    2016-01-01

    In response to calls for research into effective instruction in the Earth and space sciences, and to identify directions for future research, this systematic review of the literature explores research into instructional approaches designed to facilitate conceptual change. In total, 52 studies were identified and analyzed. Analysis focused on the…

  4. The Design Space of the Embryonic Cell Cycle Oscillator.

    Science.gov (United States)

    Mattingly, Henry H; Sheintuch, Moshe; Shvartsman, Stanislav Y

    2017-08-08

    One of the main tasks in the analysis of models of biomolecular networks is to characterize the domain of the parameter space that corresponds to a specific behavior. Given the large number of parameters in most models, this is no trivial task. We use a model of the embryonic cell cycle to illustrate the approaches that can be used to characterize the domain of parameter space corresponding to limit cycle oscillations, a regime that coordinates periodic entry into and exit from mitosis. Our approach relies on geometric construction of bifurcation sets, numerical continuation, and random sampling of parameters. We delineate the multidimensional oscillatory domain and use it to quantify the robustness of periodic trajectories. Although some of our techniques explore the specific features of the chosen system, the general approach can be extended to other models of the cell cycle engine and other biomolecular networks. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
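
    To make the random-sampling idea concrete, here is a hedged Python sketch that uses a generic relaxation oscillator (FitzHugh-Nagumo) as a stand-in for the cell cycle engine: parameters are drawn at random, the system is integrated, and parameter sets whose long-time trajectories keep cycling are flagged. The parameter ranges and the oscillation test are assumptions, not the paper's model or procedure.

```python
# Hedged sketch: estimate the oscillatory fraction of a sampled parameter space.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(11)

def rhs(t, y, a, b, eps, I):
    v, w = y
    dv = v - v**3 / 3.0 - w + I
    dw = eps * (v + a - b * w)
    return [dv, dw]

def oscillates(params, t_end=300.0):
    sol = solve_ivp(rhs, (0.0, t_end), [0.1, 0.1], args=params,
                    t_eval=np.linspace(0.5 * t_end, t_end, 500), rtol=1e-6)
    v = sol.y[0]
    return (v.max() - v.min()) > 0.5          # sustained large-amplitude cycling

n_samples, hits = 100, 0
for _ in range(n_samples):
    params = (rng.uniform(0.3, 1.0),          # a
              rng.uniform(0.2, 1.2),          # b
              rng.uniform(0.02, 0.2),         # eps (time-scale separation)
              rng.uniform(0.0, 1.0))          # I  (constant drive)
    hits += oscillates(params)

print(f"oscillatory fraction of sampled parameter space: {hits / n_samples:.2f}")
```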

  5. An approach for generating trajectory-based dynamics which conserves the canonical distribution in the phase space formulation of quantum mechanics. II. Thermal correlation functions.

    Science.gov (United States)

    Liu, Jian; Miller, William H

    2011-03-14

    We show the exact expression of the quantum mechanical time correlation function in the phase space formulation of quantum mechanics. The trajectory-based dynamics that conserves the quantum canonical distribution, equilibrium Liouville dynamics (ELD), proposed in Paper I is then used to approximately evaluate the exact expression. It gives exact thermal correlation functions (even of nonlinear operators, i.e., nonlinear functions of position or momentum operators) in the classical, high-temperature, and harmonic limits. Various methods have been presented for the implementation of ELD. Numerical tests of the ELD approach in the Wigner or Husimi phase space have been made for a harmonic oscillator and two strongly anharmonic model problems; for each potential, autocorrelation functions of both linear and nonlinear operators have been calculated. The results suggest that ELD can be a potentially useful approach for describing quantum effects for complex systems in the condensed phase.

  6. Superfield Lax formalism of supersymmetric sigma model on symmetric spaces

    International Nuclear Information System (INIS)

    Saleem, U.; Hassan, M.

    2006-01-01

    We present a superfield Lax formalism of the superspace sigma model based on the target space G/H and show that a one-parameter family of flat superfield connections exists if the target space G/H is a symmetric space. The formalism has been related to the existence of an infinite family of local and non-local superfield conserved quantities. A few examples have been given to illustrate the results. (orig.)

  7. Mapping behavioral landscapes for animal movement: a finite mixture modeling approach

    Science.gov (United States)

    Tracey, Jeff A.; Zhu, Jun; Boydston, Erin E.; Lyren, Lisa M.; Fisher, Robert N.; Crooks, Kevin R.

    2013-01-01

    Because of its role in many ecological processes, movement of animals in response to landscape features is an important subject in ecology and conservation biology. In this paper, we develop models of animal movement in relation to objects or fields in a landscape. We take a finite mixture modeling approach in which the component densities are conceptually related to different choices for movement in response to a landscape feature, and the mixing proportions are related to the probability of selecting each response as a function of one or more covariates. We combine particle swarm optimization and an Expectation-Maximization (EM) algorithm to obtain maximum likelihood estimates of the model parameters. We use this approach to analyze data for movement of three bobcats in relation to urban areas in southern California, USA. A behavioral interpretation of the models revealed similarities and differences in bobcat movement response to urbanization. All three bobcats avoided urbanization by moving either parallel to urban boundaries or toward less urban areas as the proportion of urban land cover in the surrounding area increased. However, one bobcat, a male with a dispersal-like large-scale movement pattern, avoided urbanization at lower densities and responded strictly by moving parallel to the urban edge. The other two bobcats, which were both residents and occupied similar geographic areas, avoided urban areas using a combination of movements parallel to the urban edge and movement toward areas of less urbanization. However, the resident female appeared to exhibit greater repulsion at lower levels of urbanization than the resident male, consistent with empirical observations of bobcats in southern California. Using the parameterized finite mixture models, we mapped behavioral states to geographic space, creating a representation of a behavioral landscape. This approach can provide guidance for conservation planning based on analysis of animal movement data using
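
    The following Python sketch illustrates, in a heavily simplified form, the finite-mixture idea: an EM algorithm fits a two-component Gaussian mixture over turning angles, with components loosely interpreted as "move parallel to the urban edge" versus "turn away from it". The authors' model additionally links the mixing proportions to landscape covariates and combines EM with particle swarm optimisation; none of that, nor any real bobcat data, is reproduced here.

```python
# Hedged, simplified sketch: EM for a two-component Gaussian mixture of angles.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Synthetic turning angles (radians): mixture of two behavioural responses.
angles = np.concatenate([rng.normal(0.0, 0.3, 300),      # parallel to the edge
                         rng.normal(1.4, 0.4, 200)])     # turning away

# Initial guesses for (weight, mean, std) of the two components.
w = np.array([0.5, 0.5])
mu = np.array([-0.5, 0.5])
sd = np.array([1.0, 1.0])

for _ in range(100):                                     # EM iterations
    # E-step: responsibilities of each component for each observation.
    dens = np.vstack([w[k] * norm.pdf(angles, mu[k], sd[k]) for k in range(2)])
    resp = dens / dens.sum(axis=0)
    # M-step: re-estimate weights, means and standard deviations.
    nk = resp.sum(axis=1)
    w = nk / len(angles)
    mu = (resp @ angles) / nk
    sd = np.sqrt((resp * (angles - mu[:, None])**2).sum(axis=1) / nk)

print("weights:", np.round(w, 2), "means:", np.round(mu, 2), "stds:", np.round(sd, 2))
```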

  8. Modelling an industrial anaerobic granular reactor using a multi-scale approach.

    Science.gov (United States)

    Feldman, H; Flores-Alsina, X; Ramin, P; Kjellberg, K; Jeppsson, U; Batstone, D J; Gernaey, K V

    2017-12-01

    The objective of this paper is to show the results of an industrial project dealing with modelling of anaerobic digesters. A multi-scale mathematical approach is developed to describe reactor hydrodynamics, granule growth/distribution and microbial competition/inhibition for substrate/space within the biofilm. The main biochemical and physico-chemical processes in the model are based on the Anaerobic Digestion Model No 1 (ADM1) extended with the fate of phosphorus (P), sulfur (S) and ethanol (Et-OH). Dynamic wastewater conditions are reproduced and data frequency increased using the Benchmark Simulation Model No 2 (BSM2) influent generator. All models are tested using two plant data sets corresponding to different operational periods (#D1, #D2). Simulation results reveal that the proposed approach can satisfactorily describe the transformation of organics, nutrients and minerals, the production of methane, carbon dioxide and sulfide and the potential formation of precipitates within the bulk (average deviation between computer simulations and measurements for both #D1 and #D2 is around 10%). Model predictions suggest a stratified structure within the granule which is the result of: 1) applied loading rates, 2) mass transfer limitations and 3) specific (bacterial) affinity for substrate. Hence, inerts (X_I) and methanogens (X_ac) are situated in the inner zone, and their fraction decreases as the radius increases, favouring the presence of acidogens (X_su, X_aa, X_fa) and acetogens (X_c4, X_pro). Additional simulations show the effects on the overall process performance when operational (pH) and loading (S:COD) conditions are modified. Lastly, the effect of intra-granular precipitation on the overall organic/inorganic distribution is assessed at 1) different times and 2) different reactor heights. Finally, the possibilities and opportunities offered by the proposed approach for conducting engineering optimization projects are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Hamiltonian approach to the lattice massive Schwinger model

    International Nuclear Information System (INIS)

    Sidorov, A.V.; Zastavenko, L.G.

    1996-01-01

    The authors consider the limit e^2/m^2 ≪ 1 of the lattice massive Schwinger model, i.e., the lattice massive QED in two space-time dimensions, up to lowest order in the effective coupling constant e^2/m^2. Here, m is the fermion mass parameter and e is the electron charge. They compare their lattice QED model with the analogous continuous space and lattice space models (CSM and LSM), which do not take account of the zero momentum mode, z.m.m., of the vector potential. The difference is that (due to the extra z.m.m. degree of freedom) to every eigenstate of the CSM and LSM there corresponds a family of eigenstates of the authors' lattice QED with the parameter λ. They restrict their consideration to small values of the parameter λ. Then, the energies of the particle states of their lattice QED and the LSM coincide (in their approximation). In the infinite periodicity length limit the Hamiltonian of the authors' lattice QED (as well as the Hamiltonian of the LSM) possesses two different Hilbert spaces of eigenfunctions. Thus, in this limit the authors' lattice QED model (as well as the LSM) describes something like two connected, but different, worlds.

  10. TRILEX and GW+EDMFT approach to d-wave superconductivity in the Hubbard model

    Science.gov (United States)

    Vučičević, J.; Ayral, T.; Parcollet, O.

    2017-09-01

    We generalize the recently introduced TRILEX approach (TRiply irreducible local EXpansion) to superconducting phases. The method treats simultaneously Mott and spin-fluctuation physics using an Eliashberg theory supplemented by local vertex corrections determined by a self-consistent quantum impurity model. We show that, in the two-dimensional Hubbard model, at strong coupling, TRILEX yields a d-wave superconducting dome as a function of doping. Contrary to the standard cluster dynamical mean field theory (DMFT) approaches, TRILEX can capture d-wave pairing using only a single-site effective impurity model. We also systematically explore the dependence of the superconducting temperature on the bare dispersion at weak coupling, which shows a clear link between strong antiferromagnetic (AF) correlations and the onset of superconductivity. We identify a combination of hopping amplitudes particularly favorable to superconductivity at intermediate doping. Finally, we study within GW+EDMFT the low-temperature d-wave superconducting phase at strong coupling in a region of parameter space with reduced AF fluctuations.

  11. LADM and IndoorGML for Support of Indoor Space Identification

    Science.gov (United States)

    Zlatanova, S.; Van Oosterom, P. J. M.; Lee, J.; Li, K.-J.; Lemmen, C. H. J.

    2016-10-01

    Guidance and security in large public buildings such as airports, museums and shopping malls require much more information than traditional 2D methods offer. Therefore 3D semantically rich models have been actively investigated with the aim of gathering knowledge about the availability and accessibility of spaces. Spaces can be unavailable to specific users for many reasons: the 3D geometry of spaces (too low, too narrow), the properties of the objects to be guided to a specific part of the building (walking, driving, flying), the status of the indoor environment (e.g. crowded, limited light, under reconstruction), property regulations (private areas), security considerations and so on. However, such information is not explicitly available in existing 3D semantically rich models. IFC and CityGML are restricted to architectural building components and provide little to no means to describe such properties. IndoorGML has been designed to establish a generic approach for space identification, allowing a space subdivision and automatic creation of a network for route computation. But currently it also represents only spaces as they are defined by the architectural layout of the building. The Land Administration Domain Model is currently the only available model that specifies spaces on the basis of ownership and rights of use. In this paper we compare the principles of IndoorGML and LADM, investigate the approaches to defining spaces and suggest options for linking the two types of spaces. We argue that the LADM space subdivision on the basis of properties and rights of use can be used to define semantically and geometrically available and accessible spaces and can therefore enrich the IndoorGML concept.

  12. A Model of Representational Spaces in Human Cortex.

    Science.gov (United States)

    Guntupalli, J Swaroop; Hanke, Michael; Halchenko, Yaroslav O; Connolly, Andrew C; Ramadge, Peter J; Haxby, James V

    2016-06-01

    Current models of the functional architecture of human cortex emphasize areas that capture coarse-scale features of cortical topography but provide no account for population responses that encode information in fine-scale patterns of activity. Here, we present a linear model of shared representational spaces in human cortex that captures fine-scale distinctions among population responses with response-tuning basis functions that are common across brains and models cortical patterns of neural responses with individual-specific topographic basis functions. We derive a common model space for the whole cortex using a new algorithm, searchlight hyperalignment, and complex, dynamic stimuli that provide a broad sampling of visual, auditory, and social percepts. The model aligns representations across brains in occipital, temporal, parietal, and prefrontal cortices, as shown by between-subject multivariate pattern classification and intersubject correlation of representational geometry, indicating that structural principles for shared neural representations apply across widely divergent domains of information. The model provides a rigorous account for individual variability of well-known coarse-scale topographies, such as retinotopy and category selectivity, and goes further to account for fine-scale patterns that are multiplexed with coarse-scale topographies and carry finer distinctions. © The Author 2016. Published by Oxford University Press.
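
    The core alignment step behind such shared-space models can be sketched as follows in Python, under stated assumptions: two subjects' time-by-voxel response matrices for the same stimulus are brought into a common space with an orthogonal Procrustes transform. The searchlight aggregation and common-model derivation used in the paper are not reproduced here, and the data are synthetic.

```python
# Minimal hyperalignment-style sketch: orthogonal Procrustes alignment of two
# synthetic subjects that share latent responses but differ in topography.
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(9)

n_timepoints, n_voxels = 200, 50
shared = rng.normal(size=(n_timepoints, n_voxels))   # latent shared responses

def random_rotation(n):
    q, _ = np.linalg.qr(rng.normal(size=(n, n)))
    return q

# Each subject expresses the shared responses through its own orthogonal
# "topographic basis" plus measurement noise.
subj1 = shared @ random_rotation(n_voxels) + 0.1 * rng.normal(size=(n_timepoints, n_voxels))
subj2 = shared @ random_rotation(n_voxels) + 0.1 * rng.normal(size=(n_timepoints, n_voxels))

# Learn the orthogonal mapping that takes subject 1's space into subject 2's.
R, _ = orthogonal_procrustes(subj1, subj2)
aligned = subj1 @ R

corr_before = np.corrcoef(subj1.ravel(), subj2.ravel())[0, 1]
corr_after = np.corrcoef(aligned.ravel(), subj2.ravel())[0, 1]
print(f"between-subject correlation: {corr_before:.2f} -> {corr_after:.2f}")
```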

  13. A comprehensive approach to dark matter studies: exploration of simplified top-philic models

    Energy Technology Data Exchange (ETDEWEB)

    Arina, Chiara; Backović, Mihailo [Centre for Cosmology, Particle Physics and Phenomenology (CP3),Université catholique de Louvain, Chemin du Cyclotron 2, B-1348 Louvain-la-Neuve (Belgium); Conte, Eric [Groupe de Recherche de Physique des Hautes Énergies (GRPHE), Université de Haute-Alsace,IUT Colmar, F-68008 Colmar Cedex (France); Fuks, Benjamin [Sorbonne Universités, UPMC University Paris 06, UMR 7589, LPTHE, F-75005, Paris (France); CNRS, UMR 7589, LPTHE, F-75005, Paris (France); Guo, Jun [State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics,Chinese Academy of Sciences, Beijing 100190 (China); Institut Pluridisciplinaire Hubert Curien/Département Recherches Subatomiques,Université de Strasbourg/CNRS-IN2P3, F-67037 Strasbourg (France); Heisig, Jan [Institute for Theoretical Particle Physics and Cosmology, RWTH Aachen University,Sommerfeldstr. 16, D-52056 Aachen (Germany); Hespel, Benoît [Centre for Cosmology, Particle Physics and Phenomenology (CP3),Université catholique de Louvain, Chemin du Cyclotron 2, B-1348 Louvain-la-Neuve (Belgium); Krämer, Michael [Institute for Theoretical Particle Physics and Cosmology, RWTH Aachen University,Sommerfeldstr. 16, D-52056 Aachen (Germany); Maltoni, Fabio; Martini, Antony [Centre for Cosmology, Particle Physics and Phenomenology (CP3),Université catholique de Louvain, Chemin du Cyclotron 2, B-1348 Louvain-la-Neuve (Belgium); Mawatari, Kentarou [Laboratoire de Physique Subatomique et de Cosmologie, Université Grenoble-Alpes,CNRS/IN2P3, 53 Avenue des Martyrs, F-38026 Grenoble (France); Theoretische Natuurkunde and IIHE/ELEM, Vrije Universiteit Brussel andInternational Solvay Institutes, Pleinlaan 2, B-1050 Brussels (Belgium); Pellen, Mathieu [Universität Würzburg, Institut für Theoretische Physik und Astrophysik,Emil-Hilb-Weg 22, 97074 Würzburg (Germany); Vryonidou, Eleni [Centre for Cosmology, Particle Physics and Phenomenology (CP3),Université catholique de Louvain, Chemin du Cyclotron 2, B-1348 Louvain-la-Neuve (Belgium)

    2016-11-21

    Studies of dark matter lie at the interface of collider physics, astrophysics and cosmology. Constraining models featuring dark matter candidates entails the capability to provide accurate predictions for large sets of observables and compare them to a wide spectrum of data. We present a framework which, starting from a model Lagrangian, allows one to consistently and systematically make predictions, as well as to confront those predictions with a multitude of experimental results. As an application, we consider a class of simplified dark matter models where a scalar mediator couples only to the top quark and a fermionic dark sector (i.e. the simplified top-philic dark matter model). We study in detail the complementarity of relic density, direct/indirect detection and collider searches in constraining the multi-dimensional model parameter space, and efficiently identify regions where individual approaches to dark matter detection provide the most stringent bounds. In the context of collider studies of dark matter, we point out the complementarity of LHC searches in probing different regions of the model parameter space with final states involving top quarks, photons, jets and/or missing energy. Our study of dark matter production at the LHC goes beyond the tree-level approximation and we show examples of how higher-order corrections to dark matter production processes can affect the interpretation of the experimental results.

  14. Pose Space Surface Manipulation

    Directory of Open Access Journals (Sweden)

    Yusuke Yoshiyasu

    2012-01-01

    Example-based mesh deformation techniques produce natural and realistic shapes by learning the space of deformations from examples. However, skeleton-based methods cannot manipulate a global mesh structure naturally, whereas the mesh-based approaches based on a translational control do not allow the user to edit a local mesh structure intuitively. This paper presents an example-driven mesh editing framework that achieves both global and local pose manipulations. The proposed system is built with a surface deformation method based on a two-step linear optimization technique and achieves direct manipulations of a model surface using translational and rotational controls. With the translational control, the user can create a model in natural poses easily. The rotational control can adjust the local pose intuitively by bending and twisting. We encode example deformations with a rotation-invariant mesh representation which handles large rotations in examples. To incorporate example deformations, we infer a pose from the handle translations/rotations and perform pose space interpolation, thereby avoiding involved nonlinear optimization. With the two-step linear approach combined with the proposed multiresolution deformation method, we can edit models at interactive rates without losing important deformation effects such as muscle bulging.

  15. Kin-aesthetic Space-making

    DEFF Research Database (Denmark)

    Brabrand, Helle

    2016-01-01

    -Francois Lyotard’s Gestus , discussing the work-of-art as a sensuously expressed ‘torsion’ of space/ time/ matter, producing its own space/ time/ matter. Erin Brannigan in Dancefilm uses the gesture-model as well, and points to a hybrid practice where dance and film work on each other. Likewise Shaun Gallagher...... as well as their production of meaning. Concurrently the practice questions presentation/ representation and creator/ spectator relations. Gesture-models call for an understanding of the work-of-art as creating affordance; affordance in the sense that effects generated between embodied-enactive perception......’s How the Body Shapes the Mind forms part of the theoretical approach to motile kin-aesthetical forces of art-making, underlying this paper. In my practice I work with body- and space gestures, interchanging through a ‘third’ material, featured on screens. The hybrid production includes animated 2 and 3...

  16. Estimation of vegetation photosynthetic capacity from space-based measurements of chlorophyll fluorescence for terrestrial biosphere models.

    Science.gov (United States)

    Zhang, Yongguang; Guanter, Luis; Berry, Joseph A; Joiner, Joanna; van der Tol, Christiaan; Huete, Alfredo; Gitelson, Anatoly; Voigt, Maximilian; Köhler, Philipp

    2014-12-01

    Photosynthesis simulations by terrestrial biosphere models are usually based on the Farquhar model, in which the maximum rate of carboxylation (Vcmax) is a key control parameter of photosynthetic capacity. Even though Vcmax is known to vary substantially in space and time in response to environmental controls, it is typically parameterized in models with tabulated values associated with plant functional types. Remote sensing can be used to produce a spatially continuous and temporally resolved view of photosynthetic efficiency, but traditional vegetation observations based on spectral reflectance lack a direct link to plant photochemical processes. Alternatively, recent space-borne measurements of sun-induced chlorophyll fluorescence (SIF) can offer an observational constraint on photosynthesis simulations. Here, we show that top-of-canopy SIF measurements from space are sensitive to Vcmax at the ecosystem level, and present an approach to invert Vcmax from SIF data. We use the Soil-Canopy Observation of Photosynthesis and Energy balance (SCOPE) model to derive empirical relationships between seasonal Vcmax and SIF, which are used to solve the inverse problem. We evaluate our Vcmax estimation method at six agricultural flux tower sites in the midwestern US using space-based SIF retrievals. Our Vcmax estimates agree well with literature values for corn and soybean plants (average values of 37 and 101 μmol m⁻² s⁻¹, respectively) and show plausible seasonal patterns. The effect of the updated seasonally varying Vcmax parameterization on simulated gross primary productivity (GPP) is tested by comparing to simulations with fixed Vcmax values. Validation against flux tower observations demonstrates that simulations of GPP and light use efficiency improve significantly when our time-resolved Vcmax estimates from SIF are used, with R² for GPP comparisons increasing from 0.85 to 0.93, and for light use efficiency from 0.44 to 0.83. Our results support the use of…
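
    As a toy illustration of the inversion idea described above, the sketch below fits an empirical relationship between Vcmax values prescribed in forward-model runs and the SIF they produce, then inverts an observed SIF value through that fit. The linear form and all numbers are synthetic stand-ins, not the SCOPE-derived relationships used in the study.

      # Illustrative SIF -> Vcmax inversion: fit an empirical relationship on
      # forward-model output (synthetic here), then invert observed SIF.
      import numpy as np

      # Prescribed Vcmax (umol m-2 s-1) and the SIF each forward run produced.
      vcmax_grid  = np.array([20., 40., 60., 80., 100., 120.])
      sif_modeled = np.array([0.6, 1.0, 1.3, 1.55, 1.75, 1.9])   # synthetic values

      # Empirical relationship SIF = a * Vcmax + b, inverted analytically.
      a, b = np.polyfit(vcmax_grid, sif_modeled, 1)

      def vcmax_from_sif(sif_obs):
          """Invert the fitted empirical relationship for observed SIF values."""
          return (np.asarray(sif_obs) - b) / a

      print(vcmax_from_sif([0.9, 1.6]))   # estimated Vcmax for two observations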

  17. GARCH and Irregularly Spaced Data

    NARCIS (Netherlands)

    Meddahi, N.; Renault, E.; Werker, B.J.M.

    2003-01-01

    An exact discretization of continuous-time stochastic volatility processes observed at irregularly spaced times is used to give insight into how a coherent GARCH model can be specified for such data. The relation of our approach to those in the existing literature is studied.
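
    The following sketch conveys the flavour of a GARCH-type variance recursion for irregularly spaced observations, in which the persistence of the conditional variance depends on the time gap between observations, as an exact discretization of a continuous-time volatility process would imply. The exponential parameterization and the parameter values are assumptions for illustration only, not the specification derived in the paper.

      # Toy GARCH-style recursion with gap-dependent persistence. The
      # exponential decay in dt is an illustrative assumption, not the
      # paper's exact discretization.
      import numpy as np

      def irregular_garch_path(returns, dts, omega=0.05, alpha=0.1, kappa=0.5, h0=1.0):
          """Conditional variances h[i] for returns observed after gaps dts[i]."""
          h = np.empty(len(returns) + 1)
          h[0] = h0
          for i, (r, dt) in enumerate(zip(returns, dts)):
              beta_dt = np.exp(-kappa * dt)          # longer gap -> weaker memory
              h[i + 1] = omega * dt + alpha * r**2 + beta_dt * h[i]
          return h

      rng  = np.random.default_rng(0)
      dts  = rng.exponential(scale=1.0, size=10)     # irregular time gaps
      rets = rng.normal(scale=1.0, size=10)          # observed returns
      print(irregular_garch_path(rets, dts))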

  18. A Note on the Problem of Proper Time in Weyl Space-Time

    Science.gov (United States)

    Avalos, R.; Dahia, F.; Romero, C.

    2018-02-01

    We discuss the question of whether or not a general Weyl structure is a suitable mathematical model of space-time. This is an issue that has been debated since Weyl first formulated his unified field theory. We do not present the discussion from the point of view of a particular unification theory, but instead from a more general standpoint, in which the viability of such a structure as a model of space-time is investigated. Our starting point is the well-known axiomatic approach to space-time given by Ehlers, Pirani and Schild (EPS). In this framework, we carry out an exhaustive analysis of what is required for a consistent definition of proper time and show that such a definition leads to the prediction of the so-called "second clock effect". We take the view that if, based on experience, we were to reject space-time models predicting this effect, this could be incorporated as the last axiom in the EPS approach. Finally, we provide a proof that, in this case, we are led to a Weyl integrable space-time as the most general structure that would be suitable to model space-time.
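
    To spell out the reasoning behind the last step, here is a brief sketch in standard Weyl-geometry notation (signs and numerical factors differ between authors; this is not quoted from the paper). With non-metricity

      \nabla_\alpha g_{\mu\nu} = \sigma_\alpha\, g_{\mu\nu}
      \quad\Longrightarrow\quad
      \ell(\gamma) = \ell_0 \exp\!\Big( \tfrac{1}{2} \int_\gamma \sigma_\alpha\, dx^\alpha \Big),

    the calibration of a clock transported along a curve γ depends on the path, so two clocks reunited after following different paths generally disagree (the second clock effect). The effect vanishes for every closed loop precisely when ∮ σ_α dx^α = 0, i.e. when σ is exact (σ = dφ), which is the Weyl integrable case singled out in the abstract.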

  19. Activity markers and household space in Swahili urban contexts: An integrated geoarchaeological approach

    DEFF Research Database (Denmark)

    Wynne-Jones, Stephanie; Sulas, Federica

    …this paper draws from recent work at a Swahili urban site to illustrate the potential and challenges of an integrated geoarchaeological approach to the study of household space. The site of Songo Mnara (14th–16th c. AD) thrived as a Swahili stonetown off the coast of Tanzania. Here, our work has concentrated…

  20. Phase space approach to quantum dynamics

    International Nuclear Information System (INIS)

    Leboeuf, P.

    1991-03-01

    The Schroedinger equation for the time propagation of states of a quantised two-dimensional spherical phase space is replaced by the dynamics of a system of N particles lying in phase space. This is done through factorization formulae of analytic function theory arising in the coherent-state representation, the 'particles' being the zeros of the quantum state. For linear Hamiltonians, like a spin in a uniform magnetic field, the motion of the particles is classical. However, non-linear terms induce interactions between the particles. Their time propagation is studied, and it is shown that, in contrast to the integrable case, for chaotic maps the zeros tend to fill the whole phase space, as their classical counterparts do. (author) 13 refs., 3 figs
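
    As a concrete illustration of the "particles" in question, the sketch below computes the zeros of a spin state's coherent-state polynomial (its Majorana stars). One common convention is used for the polynomial coefficients; factors and signs vary in the literature, so this is illustrative rather than the paper's exact construction.

      # Zeros of a spin-j state's coherent-state polynomial (Majorana stars).
      # c[k] is the amplitude on |j, m = j - k>, k = 0..2j.
      import numpy as np
      from math import comb

      def coherent_state_zeros(c):
          """Return the 2j zeros of P(z) = sum_k c[k] * sqrt(C(2j, k)) * z^(2j - k)."""
          n = len(c) - 1                               # n = 2j
          coeffs = [c[k] * np.sqrt(comb(n, k)) for k in range(n + 1)]
          return np.roots(coeffs)                      # highest degree first

      # Example: a spin-1 state; its two zeros are the phase-space "particles".
      c = np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0)
      print(coherent_state_zeros(c))                   # roots of z^2 + 1, i.e. +/- i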