WorldWideScience

Sample records for general framework based

  1. A general framework for sensor-based human activity recognition.

    Science.gov (United States)

    Köping, Lukas; Shirahama, Kimiaki; Grzegorzek, Marcin

    2018-04-01

    Today's wearable devices like smartphones, smartwatches and intelligent glasses collect a large amount of data from their built-in sensors like accelerometers and gyroscopes. These data can be used to identify a person's current activity and in turn can be utilised for applications in the field of personal fitness assistants or elderly care. However, developing such systems is subject to certain restrictions: (i) since more and more new sensors will be available in the future, activity recognition systems should be able to integrate these new sensors with a small amount of manual effort and (ii) such systems should avoid high acquisition costs for computational power. We propose a general framework that achieves an effective data integration based on the following two characteristics: Firstly, a smartphone is used to gather and temporally store data from different sensors and transfer these data to a central server. Thus, various sensors can be integrated into the system as long as they have programming interfaces to communicate with the smartphone. The second characteristic is a codebook-based feature learning approach that can encode data from each sensor into an effective feature vector only by tuning a few intuitive parameters. In the experiments, the framework is realised as a real-time activity recognition system that integrates eight sensors from a smartphone, smartwatch and smartglasses, and its effectiveness is validated from different perspectives such as accuracies, sensor combinations and sampling rates. Copyright © 2018 Elsevier Ltd. All rights reserved.
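
    The codebook-based feature learning mentioned above can be illustrated with a short sketch: subsequences are cut from each sensor stream, clustered into codewords, and every recording is then encoded as a histogram of nearest-codeword assignments. The window size, step and number of codewords below are illustrative assumptions, not the authors' settings, and the code is a minimal stand-in rather than the system described in the paper.

```python
# Illustrative sketch of codebook-based feature encoding (not the authors' code):
# subsequences from a 1-D sensor stream are clustered into "codewords", and a
# recording is encoded as a histogram of nearest-codeword assignments.
import numpy as np
from sklearn.cluster import KMeans

def extract_windows(signal, win=32, step=16):
    """Slide a window over one sensor channel and stack the subsequences."""
    return np.array([signal[i:i + win] for i in range(0, len(signal) - win + 1, step)])

def build_codebook(training_signals, n_codewords=64, win=32, step=16):
    """Cluster subsequences from all training recordings into codewords."""
    windows = np.vstack([extract_windows(s, win, step) for s in training_signals])
    return KMeans(n_clusters=n_codewords, n_init=10, random_state=0).fit(windows)

def encode(signal, codebook, win=32, step=16):
    """Encode one recording as a normalized histogram over codeword assignments."""
    assignments = codebook.predict(extract_windows(signal, win, step))
    hist = np.bincount(assignments, minlength=codebook.n_clusters).astype(float)
    return hist / hist.sum()

# Example: a few toy accelerometer recordings -> one feature vector per recording.
rng = np.random.default_rng(0)
recordings = [rng.standard_normal(1000) for _ in range(5)]
cb = build_codebook(recordings)
features = np.array([encode(r, cb) for r in recordings])  # shape (5, 64)
```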

  2. A general framework for regularized, similarity-based image restoration.

    Science.gov (United States)

    Kheradmand, Amin; Milanfar, Peyman

    2014-12-01

    Any image can be represented as a function defined on a weighted graph, in which the underlying structure of the image is encoded in kernel similarity and associated Laplacian matrices. In this paper, we develop an iterative graph-based framework for image restoration based on a new definition of the normalized graph Laplacian. We propose a cost function, which consists of a new data fidelity term and regularization term derived from the specific definition of the normalized graph Laplacian. The normalizing coefficients used in the definition of the Laplacian and associated regularization term are obtained using fast symmetry preserving matrix balancing. This results in some desired spectral properties for the normalized Laplacian such as being symmetric, positive semidefinite, and returning a zero vector when applied to a constant image. Our algorithm consists of outer and inner iterations, where in each outer iteration, the similarity weights are recomputed using the previous estimate and the updated objective function is minimized using inner conjugate gradient iterations. This procedure improves the performance of the algorithm for image deblurring, where we do not have access to a good initial estimate of the underlying image. In addition, the specific form of the cost function allows us to carry out the spectral analysis for the solutions of the corresponding linear equations. Moreover, the proposed approach is general in the sense that we have shown its effectiveness for different restoration problems, including deblurring, denoising, and sharpening. Experimental results verify the effectiveness of the proposed algorithm on both synthetic and real examples.
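
    The outer/inner iteration structure described above can be sketched as follows: in each outer pass the similarity weights (and hence the graph Laplacian) are rebuilt from the current estimate, and the resulting quadratic objective is minimized with conjugate gradient. The sketch below uses a plain Gaussian photometric kernel on a 4-neighbour grid and a simple denoising fidelity term; the paper's symmetry-preserving matrix balancing, blur operator and normalization are omitted, so this is an assumption-laden illustration rather than the authors' algorithm.

```python
# Structural sketch of the outer/inner iterations described in the abstract.
# Weights use a simple Gaussian photometric kernel; the paper's
# symmetry-preserving balancing of the Laplacian is omitted for brevity.
import numpy as np
from scipy.sparse import identity, csr_matrix, diags
from scipy.sparse.linalg import cg

def similarity_laplacian(z, h=0.1):
    """4-neighbour graph Laplacian with Gaussian similarity weights on a 2-D image z."""
    H, W = z.shape
    idx = lambda i, j: i * W + j
    rows, cols, vals = [], [], []
    for i in range(H):
        for j in range(W):
            for di, dj in ((0, 1), (1, 0)):
                ni, nj = i + di, j + dj
                if ni < H and nj < W:
                    w = np.exp(-((z[i, j] - z[ni, nj]) ** 2) / (2 * h ** 2))
                    rows += [idx(i, j), idx(ni, nj)]
                    cols += [idx(ni, nj), idx(i, j)]
                    vals += [w, w]
    Wmat = csr_matrix((vals, (rows, cols)), shape=(H * W, H * W))
    return diags(np.asarray(Wmat.sum(axis=1)).ravel()) - Wmat

def restore(y, beta=0.5, outer_iters=5):
    """Minimize ||x - y||^2 + beta * x^T L x, rebuilding L from the latest estimate."""
    x = y.copy()
    for _ in range(outer_iters):                      # outer loop: update weights
        L = similarity_laplacian(x.reshape(y.shape))
        A = identity(y.size) + beta * L
        x, _ = cg(A, y.ravel())                       # inner loop: conjugate gradient
    return x.reshape(y.shape)

noisy = np.clip(0.5 * np.ones((16, 16)) + 0.1 * np.random.randn(16, 16), 0, 1)
clean_est = restore(noisy)
```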

  3. Fundamentals of sketch-based passwords a general framework

    CERN Document Server

    Riggan, Benjamin S; Wang, Cliff

    2015-01-01

    This SpringerBrief explores graphical password systems and examines novel drawing-based methods in terms of security, usability, and human-computer interaction. It provides a systematic approach for recognizing, comparing, and matching sketch-based passwords in the context of modern computing systems. The book offers both a security and usability analysis of the accumulative framework used for incorporating handwriting biometrics and a human-computer interaction performance analysis. The chapters offer new perspectives and experimental results regarding model uniqueness, recognition tolerance

  4. KNOWLEDGE SOCIETY, GENERAL FRAMEWORK FOR KNOWLEDGE BASED ECONOMY

    Directory of Open Access Journals (Sweden)

    Dragos CRISTEA

    2011-03-01

    Full Text Available This paper presents the relation between the knowledge society and the knowledge-based economy. We will identify the main pillars of the knowledge society and present their importance for the development of knowledge societies. Further, we will present two perspectives on knowledge societies, namely the science and learning perspectives, which directly affect knowledge-based economies. At the end, we will conclude by identifying some important questions that must be answered regarding this new social paradigm.

  5. Generalized frameworks for first-order evolution inclusions based on Yosida approximations

    Directory of Open Access Journals (Sweden)

    Ram U. Verma

    2011-04-01

    Full Text Available First, general frameworks for the first-order evolution inclusions are developed based on the A-maximal relaxed monotonicity, and then using the Yosida approximation the solvability of a general class of first-order nonlinear evolution inclusions is investigated. The role of the A-maximal relaxed monotonicity is significant in the sense that it not only empowers the first-order nonlinear evolution inclusions but also generalizes the existing Yosida approximations and their characterizations in the current literature.
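
    For orientation, the classical resolvent and Yosida approximation that the A-maximal relaxed monotone framework generalizes are (standard definitions on a Hilbert space; the paper replaces the resolvent by its generalized counterpart):

```latex
% Classical Yosida approximation of a maximal monotone operator M on a Hilbert space,
% with resolvent J^M_\lambda; the paper's framework generalizes both objects.
\[
  J^{M}_{\lambda} = (I + \lambda M)^{-1}, \qquad
  M_{\lambda} = \frac{1}{\lambda}\,\bigl(I - J^{M}_{\lambda}\bigr), \qquad \lambda > 0 .
\]
```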

  6. Generalized Yosida Approximations Based on Relatively A-Maximal m-Relaxed Monotonicity Frameworks

    Directory of Open Access Journals (Sweden)

    Heng-you Lan

    2013-01-01

    Full Text Available We introduce and study a new notion of relatively A-maximal m-relaxed monotonicity framework and discuss some properties of a new class of generalized relatively resolvent operators associated with the relatively A-maximal m-relaxed monotone operator and the new generalized Yosida approximations based on the relatively A-maximal m-relaxed monotonicity framework. Furthermore, we give some remarks to show that the theory of the new generalized relatively resolvent operator and Yosida approximations associated with relatively A-maximal m-relaxed monotone operators generalizes most of the existing notions on (relatively) maximal monotone mappings in Hilbert as well as Banach spaces and can be applied to study variational inclusion problems and first-order evolution equations as well as evolution inclusions.

  7. An Exemplar-Based Multi-View Domain Generalization Framework for Visual Recognition.

    Science.gov (United States)

    Niu, Li; Li, Wen; Xu, Dong; Cai, Jianfei

    2018-02-01

    In this paper, we propose a new exemplar-based multi-view domain generalization (EMVDG) framework for visual recognition by learning robust classifiers that are able to generalize well to an arbitrary target domain based on the training samples with multiple types of features (i.e., multi-view features). In this framework, we aim to address two issues simultaneously. First, the distribution of training samples (i.e., the source domain) is often considerably different from that of testing samples (i.e., the target domain), so the performance of the classifiers learnt on the source domain may drop significantly on the target domain. Moreover, the testing data are often unseen during the training procedure. Second, when the training data are associated with multi-view features, the recognition performance can be further improved by exploiting the relation among multiple types of features. To address the first issue, considering that it has been shown that fusing multiple SVM classifiers can enhance the domain generalization ability, we build our EMVDG framework upon exemplar SVMs (ESVMs), in which a set of ESVM classifiers are learnt with each one trained based on one positive training sample and all the negative training samples. When the source domain contains multiple latent domains, the learnt ESVM classifiers are expected to be grouped into multiple clusters. To address the second issue, we propose two approaches under the EMVDG framework based on the consensus principle and the complementary principle, respectively. Specifically, we propose an EMVDG_CO method by adding a co-regularizer to enforce the cluster structures of ESVM classifiers on different views to be consistent based on the consensus principle. Inspired by multiple kernel learning, we also propose another EMVDG_MK method by fusing the ESVM classifiers from different views based on the complementary principle. In addition, we further extend our EMVDG framework to exemplar-based multi-view domain
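
    The exemplar-SVM building block referred to above (one linear classifier per positive training sample against all negatives) can be sketched as below. The class weights and the max-fusion rule are illustrative assumptions; the clustering of classifiers and the co-regularized (EMVDG_CO) and multi-kernel (EMVDG_MK) fusion steps of the framework are not shown.

```python
# Minimal exemplar-SVM sketch: one linear SVM per positive training sample,
# trained against all negative samples (clustering/fusion steps omitted).
import numpy as np
from sklearn.svm import LinearSVC

def train_exemplar_svms(positives, negatives, w_pos=10.0, w_neg=0.01):
    """Return one classifier per positive exemplar; the single positive is
    penalized more heavily, as is usual in exemplar-SVM practice."""
    classifiers = []
    for exemplar in positives:
        X = np.vstack([exemplar[None, :], negatives])
        y = np.array([1] + [0] * len(negatives))
        clf = LinearSVC(C=1.0, class_weight={1: w_pos, 0: w_neg}, max_iter=10000)
        classifiers.append(clf.fit(X, y))
    return classifiers

def score(classifiers, x):
    """Fuse exemplar scores by a simple maximum (one of many possible rules)."""
    return max(clf.decision_function(x[None, :])[0] for clf in classifiers)

rng = np.random.default_rng(1)
pos = rng.normal(1.0, 0.3, size=(5, 8))    # toy positive exemplars
neg = rng.normal(-1.0, 0.3, size=(50, 8))  # toy negatives
esvms = train_exemplar_svms(pos, neg)
print(score(esvms, rng.normal(1.0, 0.3, size=8)))
```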

  8. A general science-based framework for dynamical spatio-temporal models

    Science.gov (United States)

    Wikle, C.K.; Hooten, M.B.

    2010-01-01

    Spatio-temporal statistical models are increasingly being used across a wide variety of scientific disciplines to describe and predict spatially-explicit processes that evolve over time. Correspondingly, in recent years there has been a significant amount of research on new statistical methodology for such models. Although descriptive models that approach the problem from the second-order (covariance) perspective are important, and innovative work is being done in this regard, many real-world processes are dynamic, and it can be more efficient in some cases to characterize the associated spatio-temporal dependence by the use of dynamical models. The chief challenge with the specification of such dynamical models has been related to the curse of dimensionality. Even in fairly simple linear, first-order Markovian, Gaussian error settings, statistical models are often overparameterized. Hierarchical models have proven invaluable in their ability to deal to some extent with this issue by allowing dependency among groups of parameters. In addition, this framework has allowed for the specification of science-based parameterizations (and associated prior distributions) in which classes of deterministic dynamical models (e.g., partial differential equations (PDEs), integro-difference equations (IDEs), matrix models, and agent-based models) are used to guide specific parameterizations. Most of the focus for the application of such models in statistics has been in the linear case. The problems mentioned above with linear dynamic models are compounded in the case of nonlinear models. In this sense, the need for coherent and sensible model parameterizations is not only helpful, it is essential. Here, we present an overview of a framework for incorporating scientific information to motivate dynamical spatio-temporal models. First, we illustrate the methodology with the linear case. We then develop a general nonlinear spatio-temporal framework that we call general quadratic

  9. A general CFD framework for fault-resilient simulations based on multi-resolution information fusion

    Science.gov (United States)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-10-01

    We develop a general CFD framework for multi-resolution simulations to target multiscale problems but also resilience in exascale simulations, where faulty processors may lead to gappy, in space-time, simulated fields. We combine approximation theory and domain decomposition together with statistical learning techniques, e.g. coKriging, to estimate boundary conditions and minimize communications by performing independent parallel runs. To demonstrate this new simulation approach, we consider two benchmark problems. First, we solve the heat equation (a) on a small number of spatial "patches" distributed across the domain, simulated by finite differences at fine resolution and (b) on the entire domain simulated at very low resolution, thus fusing multi-resolution models to obtain the final answer. Second, we simulate the flow in a lid-driven cavity in an analogous fashion, by fusing finite difference solutions obtained with fine and low resolution assuming gappy data sets. We investigate the influence of various parameters for this framework, including the correlation kernel, the size of a buffer employed in estimating boundary conditions, the coarseness of the resolution of auxiliary data, and the communication frequency across different patches in fusing the information at different resolution levels. In addition to its robustness and resilience, the new framework can be employed to generalize previous multiscale approaches involving heterogeneous discretizations or even fundamentally different flow descriptions, e.g. in continuum-atomistic simulations.
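
    As a toy stand-in for the statistical-learning step mentioned above, the sketch below fills gappy boundary values from a few coarse samples with single-fidelity Gaussian-process regression (ordinary Kriging); the paper uses coKriging across resolutions, so the kernel, noise level and data here are purely illustrative assumptions.

```python
# Stand-in for the statistical-learning step: estimate missing patch-boundary
# values from coarse, gappy samples with a Gaussian process (single-fidelity
# Kriging; the paper fuses resolutions with coKriging).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Coarse solution sampled at a few locations along a patch boundary (toy data).
x_coarse = np.linspace(0.0, 1.0, 8)[:, None]
u_coarse = np.sin(2 * np.pi * x_coarse).ravel() + 0.02 * np.random.randn(8)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2) + WhiteKernel(1e-4),
                              normalize_y=True).fit(x_coarse, u_coarse)

# Boundary-condition estimate (with uncertainty) at the fine-patch resolution.
x_fine = np.linspace(0.0, 1.0, 64)[:, None]
u_bc, u_std = gp.predict(x_fine, return_std=True)
```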

  10. A general framework for global asymptotic stability analysis of delayed neural networks based on LMI approach

    International Nuclear Information System (INIS)

    Cao Jinde; Ho, Daniel W.C.

    2005-01-01

    In this paper, global asymptotic stability is discussed for neural networks with time-varying delay. Several new criteria in matrix inequality form are given to ascertain the uniqueness and global asymptotic stability of the equilibrium point for neural networks with time-varying delay based on the Lyapunov method and the Linear Matrix Inequality (LMI) technique. The proposed LMI approach has the advantage of considering the difference between neuronal excitatory and inhibitory effects, and is also computationally efficient as it can be solved numerically using recently developed interior-point algorithms. In addition, the proposed results generalize and improve previous works. The obtained criteria also combine two existing conditions into one generalized condition in matrix form. An illustrative example is also given to demonstrate the effectiveness of the proposed results.
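
    The kind of computation referred to above, checking feasibility of a linear matrix inequality with an interior-point solver, can be illustrated with a deliberately simpler example: the ordinary Lyapunov LMI for a delay-free linear system, posed in CVXPY. This is not the paper's delay-dependent criterion; the system matrix and tolerance are assumptions.

```python
# Minimal LMI feasibility check with CVXPY (ordinary Lyapunov inequality for a
# delay-free linear system; the paper's delay-dependent LMIs are more involved).
import cvxpy as cp
import numpy as np

A = np.array([[-1.0, 0.5],
              [-0.3, -2.0]])      # toy system matrix, assumed stable
n = A.shape[0]
eps = 1e-6

P = cp.Variable((n, n), symmetric=True)
constraints = [P >> eps * np.eye(n),                    # P positive definite
               A.T @ P + P @ A << -eps * np.eye(n)]     # Lyapunov LMI
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()

print("LMI feasible (system certified stable):", problem.status == cp.OPTIMAL)
```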

  11. A general framework of automorphic inflation

    International Nuclear Information System (INIS)

    Schimmrigk, Rolf

    2016-01-01

    Automorphic inflation is an application of the framework of automorphic scalar field theory, based on the theory of automorphic forms and representations. In this paper the general framework of automorphic and modular inflation is described in some detail, with emphasis on the resulting stratification of the space of scalar field theories in terms of the group theoretic data associated to the shift symmetry, as well as the automorphic data that specifies the potential. The class of theories based on Eisenstein series provides a natural generalization of the model of j-inflation considered previously.

  12. A general framework of automorphic inflation

    Energy Technology Data Exchange (ETDEWEB)

    Schimmrigk, Rolf [Department of Physics, Indiana University at South Bend,1700 Mishawaka Ave. South Bend, IN 46634 (United States)

    2016-05-24

    Automorphic inflation is an application of the framework of automorphic scalar field theory, based on the theory of automorphic forms and representations. In this paper the general framework of automorphic and modular inflation is described in some detail, with emphasis on the resulting stratification of the space of scalar field theories in terms of the group theoretic data associated to the shift symmetry, as well as the automorphic data that specifies the potential. The class of theories based on Eisenstein series provides a natural generalization of the model of j-inflation considered previously.

  13. A general framework for intelligent recommender systems

    Directory of Open Access Journals (Sweden)

    Jose Aguilar

    2017-07-01

    Full Text Available In this paper, we propose a general framework for an intelligent recommender system that extends the concept of a knowledge-based recommender system. The intelligent recommender system exploits knowledge, learns, discovers new information, infers preferences and criticisms, among other things. For that, the framework of an intelligent recommender system is defined by the following components: knowledge representation paradigm, learning methods, and reasoning mechanisms. Additionally, it has five knowledge models about the different aspects that we can consider during a recommendation: users, items, domain, context and criticisms. The mix of the components exploits the knowledge, updates it and infers, among other things. In this work, we implement one intelligent recommender system based on this framework, using Fuzzy Cognitive Maps (FCMs). Next, we test the performance of the intelligent recommender system with specialized criteria linked to the utilization of the knowledge in order to test the versatility and performance of the framework.

  14. A Generalized DRM Architectural Framework

    Directory of Open Access Journals (Sweden)

    PATRICIU, V. V.

    2011-02-01

    Full Text Available The online digital goods distribution environment leads to the need for a system to protect digital intellectual property. Digital Rights Management (DRM) is the system born to protect and control distribution and use of those digital assets. The present paper is a review of the current state of DRM, focusing on architectural design, security technologies, and important DRM deployments. The paper primarily synthesizes DRM architectures within a general framework. We also present the DRM ecosystem to provide a better understanding of what is currently happening to content rights management from a technological point of view. This paper includes conclusions of several DRM initiative studies related to rights management systems, with the purpose of identifying and describing the most significant DRM architectural models. The basic functions and processes of the DRM solutions are identified.

  15. Development of a General Purpose Gamification Framework

    OpenAIRE

    Vea, Eivind

    2016-01-01

    This report describes the design and implementation of a general purpose gamification framework developed in JavaScript on the Meteor platform. Gamification is described as the use of game elements in non-game contexts. The purpose is to encourage and change user behaviour. Examples of existing gamification use cases and frameworks are described. A demo game shows how a general purpose framework can be used.

  16. A general framework for reasoning on inconsistency

    CERN Document Server

    Martinez, Maria Vanina; Subrahmanian, VS; Amgoud, Leila

    2013-01-01

    This SpringerBrief proposes a general framework for reasoning about inconsistency in a wide variety of logics, including inconsistency resolution methods that have not yet been studied.  The proposed framework allows users to specify preferences on how to resolve inconsistency when there are multiple ways to do so. This empowers users to resolve inconsistency in data leveraging both their detailed knowledge of the data as well as their application needs. The brief shows that the framework is well-suited to handle inconsistency in several logics, and provides algorithms to compute preferred opt

  17. A General Model of Sensitized Luminescence in Lanthanide-Based Coordination Polymers and Metal-Organic Framework Materials.

    Science.gov (United States)

    Einkauf, Jeffrey D; Clark, Jessica M; Paulive, Alec; Tanner, Garrett P; de Lill, Daniel T

    2017-05-15

    Luminescent lanthanide-containing coordination polymers and metal-organic frameworks hold great potential in many applications due to their distinctive spectroscopic properties. While the ability to design coordination polymers for specific functions is often mentioned as a major benefit bestowed on these compounds, the lack of a meaningful understanding of the luminescence in lanthanide coordination polymers remains a significant challenge toward functional design. Currently, the study of these compounds is based on the antenna effect as derived from molecular systems, where organic antennae are used to facilitate lanthanide-centered luminescence. This molecular-based approach does not take into account the unique features of extended network solids, particularly the formation of band structure. While guidelines for the antenna effect are well established, they require modification before being applied to coordination polymers. A series of nine coordination polymers with varying topologies and organic linkers were studied to investigate the accuracy of the antenna effect in coordination polymer systems. By comparing a molecular-based approach to a band-based one, it was determined that the band structure that occurs in aggregated organic solids needs to be considered when evaluating the luminescence of lanthanide coordination polymers.

  18. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...

  19. Evaluating the Generalization Value of Process-based Models in a Deep-in-time Machine Learning framework

    Science.gov (United States)

    Shen, C.; Fang, K.

    2017-12-01

    Deep Learning (DL) methods have made revolutionary strides in recent years. A core value proposition of DL is that abstract notions and patterns can be extracted purely from data, without the need for domain expertise. Process-based models (PBM), on the other hand, can be regarded as repositories of human knowledge or hypotheses about how systems function. Here, through computational examples, we argue that there is merit in integrating PBMs with DL due to the imbalance and lack of data in many situations, especially in hydrology. We trained a deep-in-time neural network, the Long Short-Term Memory (LSTM), to learn soil moisture dynamics from Soil Moisture Active Passive (SMAP) Level 3 product. We show that when PBM solutions are integrated into LSTM, the network is able to better generalize across regions. LSTM is able to better utilize PBM solutions than simpler statistical methods. Our results suggest PBMs have generalization value which should be carefully assessed and utilized. We also emphasize that when properly regularized, the deep network is robust and is of superior testing performance compared to simpler methods.

  20. Using Eight Key Questions as an Inquiry-Based Framework for Ethical Reasoning Issues in a General Education Earth Systems and Climate Change Course

    Science.gov (United States)

    Johnson, E. A.; Ball, T. C.

    2014-12-01

    An important objective in general education geoscience courses is to help students evaluate social and ethical issues based upon scientific knowledge. It can be difficult for instructors trained in the physical sciences to design effective ways of including ethical issues in large lecture courses where whole-class discussions are not practical. The Quality Enhancement Plan for James Madison University, "The Madison Collaborative: Ethical Reasoning in Action," (http://www.jmu.edu/mc/index.shtml) has identified eight key questions to be used as a framework for developing ethical reasoning exercises and evaluating student learning. These eight questions are represented by the acronym FOR CLEAR and are represented by the concepts of Fairness, Outcomes, Responsibilities, Character, Liberty, Empathy, Authority, and Rights. In this study, we use the eight key questions as an inquiry-based framework for addressing ethical issues in a 100-student general education Earth systems and climate change course. Ethical reasoning exercises are presented throughout the course and range from questions of personal behavior to issues regarding potential future generations and global natural resources. In the first few exercises, key questions are identified for the students and calibrated responses are provided as examples. By the end of the semester, students are expected to identify key questions themselves and justify their own ethical and scientific reasoning. Evaluation rubrics are customized to this scaffolding approach to the exercises. Student feedback and course data will be presented to encourage discussion of this and other approaches to explicitly incorporating ethical reasoning in general education geoscience courses.

  1. A General Framework for Reviewing Dictionaries

    DEFF Research Database (Denmark)

    Nielsen, Sandro

    2013-01-01

    in specific types of situations in the real (extra-lexicographic) world. I propose a basis for a framework that contains an outline of general theoretical and practical principles that underlie the true nature of dictionary reviews, and places the reviews in a lexicographic universe with the dictionary and lexicography at its centre. This seems to be in line with the modern understanding of lexicography as a separate academic discipline concerned with the compilation, design, evaluation and use of dictionaries. Moreover, a set of generally applicable principles may lead the discourse community to accept...

  2. A general framework for updating belief distributions.

    Science.gov (United States)

    Bissiri, P G; Holmes, C C; Walker, S G

    2016-11-01

    We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
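
    The central update of the paper can be stated compactly: the prior is updated through a loss function connecting the parameter to the data, and the usual posterior is recovered when the loss is the negative log-likelihood:

```latex
% General belief update of Bissiri, Holmes and Walker (stated in its simplest form):
% the prior \pi(\theta) is updated through a loss \ell(\theta, x) rather than a
% likelihood; choosing \ell(\theta, x) = -\log f(x \mid \theta) recovers Bayes' rule.
\[
  \pi(\theta \mid x) \;\propto\; \exp\{-\,\ell(\theta, x)\}\;\pi(\theta).
\]
```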

  3. Evaluating and optimizing the operation of the hydropower system in the Upper Yellow River: A general LINGO-based integrated framework.

    Science.gov (United States)

    Si, Yuan; Li, Xiang; Yin, Dongqin; Liu, Ronghua; Wei, Jiahua; Huang, Yuefei; Li, Tiejian; Liu, Jiahong; Gu, Shenglong; Wang, Guangqian

    2018-01-01

    The hydropower system in the Upper Yellow River (UYR), one of the largest hydropower bases in China, plays a vital role in the energy structure of the Qinghai Power Grid. Due to management difficulties, there is still considerable room for improvement in the joint operation of this system. This paper presents a general LINGO-based integrated framework to study the operation of the UYR hydropower system. The framework is easy to use for operators with little experience in mathematical modeling, takes full advantage of LINGO's capabilities (such as its solving capacity and multi-threading ability), and packs its three layers (the user layer, the coordination layer, and the base layer) together into an integrated solution that is robust and efficient and represents an effective tool for data/scenario management and analysis. The framework is general and can be easily transferred to other hydropower systems with minimal effort, and it can be extended as the base layer is enriched. The multi-objective model that represents the trade-off between power quantity (i.e., maximum energy production) and power reliability (i.e., firm output) of hydropower operation has been formulated. With equivalent transformations, the optimization problem can be solved by the nonlinear programming (NLP) solvers embedded in the LINGO software, such as the General Solver, the Multi-start Solver, and the Global Solver. Both simulation and optimization are performed to verify the model's accuracy and to evaluate the operation of the UYR hydropower system. A total of 13 hydropower plants currently in operation are involved, including two pivotal storage reservoirs on the Yellow River, which are the Longyangxia Reservoir and the Liujiaxia Reservoir. Historical hydrological data from multiple years (2000-2010) are provided as input to the model for analysis. The results are as follows. 1) Assuming that the reservoirs are all in operation (in fact, some reservoirs were not operational or did not
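
    One common way to write the power quantity/power reliability trade-off described above is sketched below, where P_i(q_{i,t}, v_{i,t}) is the output of plant i in period t as a function of turbine discharge q and storage v, I is inflow, s is spill, F is the firm output and Δt is the period length. The symbols and constraint set are illustrative assumptions, not the paper's exact formulation.

```latex
% Illustrative statement of the quantity/reliability trade-off: maximize total
% energy while guaranteeing a firm output F in every period, subject to the
% water-balance constraint (bound constraints on q, v, s omitted).
\[
  \max_{q,\,v,\,s}\; \sum_{t=1}^{T}\sum_{i=1}^{N} P_i\!\left(q_{i,t}, v_{i,t}\right)\Delta t
  \quad\text{s.t.}\quad
  \sum_{i=1}^{N} P_i\!\left(q_{i,t}, v_{i,t}\right) \ge F \;\;\forall t,
  \qquad
  v_{i,t+1} = v_{i,t} + \left(I_{i,t} - q_{i,t} - s_{i,t}\right)\Delta t .
\]
```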

  4. A general modeling framework for describing spatially structured population dynamics

    Science.gov (United States)

    Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan

    2017-01-01

    Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance
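
    A minimal sketch of the discrete-time update the framework describes is given below: each node carries an abundance, each directed edge a movement proportion, and one step applies local growth followed by redistribution along the edges. The growth rule, the three-node network and the numbers are illustrative assumptions.

```python
# Minimal sketch of one discrete time step in a network-structured population:
# local growth at each node, then redistribution of individuals along weighted,
# directed edges (edge weights = proportion of the node's population that moves).
import numpy as np

def step(abundance, growth_rate, move_prob):
    """abundance: (n,) node populations; growth_rate: (n,) per-step multipliers;
    move_prob: (n, n) row-stochastic matrix, move_prob[i, j] = share moving i -> j."""
    grown = abundance * growth_rate           # local demography at each node
    return grown @ move_prob                  # movement along directed edges

# Toy 3-node seasonal network (e.g., breeding, stopover, wintering sites).
N0 = np.array([100.0, 20.0, 5.0])
r = np.array([1.10, 1.00, 0.95])
M = np.array([[0.7, 0.3, 0.0],                # rows sum to 1: all individuals accounted for
              [0.0, 0.5, 0.5],
              [0.2, 0.0, 0.8]])

N = N0
for t in range(10):
    N = step(N, r, M)
print(N)
```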

  5. Generalized Boolean logic Driven Markov Processes: A powerful modeling framework for Model-Based Safety Analysis of dynamic repairable and reconfigurable systems

    International Nuclear Information System (INIS)

    Piriou, Pierre-Yves; Faure, Jean-Marc; Lesage, Jean-Jacques

    2017-01-01

    This paper presents a modeling framework that makes it possible to describe in an integrated manner the structure of the critical system to analyze, by using an enriched fault tree, the dysfunctional behavior of its components, by means of Markov processes, and the reconfiguration strategies that have been planned to ensure safety and availability, with Moore machines. This framework has been developed from BDMP (Boolean logic Driven Markov Processes), a previous framework for dynamic repairable systems. First, the contribution is motivated by pinpointing the limitations of BDMP to model complex reconfiguration strategies and the failures of the control of these strategies. The syntax and semantics of GBDMP (Generalized Boolean logic Driven Markov Processes) are then formally defined; in particular, an algorithm to analyze the dynamic behavior of a GBDMP model is developed. The modeling capabilities of this framework are illustrated on three representative examples. Last, qualitative and quantitative analyses of GBDMP models highlight the benefits of the approach.

  6. Evaluating and optimizing the operation of the hydropower system in the Upper Yellow River: A general LINGO-based integrated framework

    Science.gov (United States)

    Si, Yuan; Liu, Ronghua; Wei, Jiahua; Huang, Yuefei; Li, Tiejian; Liu, Jiahong; Gu, Shenglong; Wang, Guangqian

    2018-01-01

    The hydropower system in the Upper Yellow River (UYR), one of the largest hydropower bases in China, plays a vital role in the energy structure of the Qinghai Power Grid. Due to management difficulties, there is still considerable room for improvement in the joint operation of this system. This paper presents a general LINGO-based integrated framework to study the operation of the UYR hydropower system. The framework is easy to use for operators with little experience in mathematical modeling, takes full advantage of LINGO’s capabilities (such as its solving capacity and multi-threading ability), and packs its three layers (the user layer, the coordination layer, and the base layer) together into an integrated solution that is robust and efficient and represents an effective tool for data/scenario management and analysis. The framework is general and can be easily transferred to other hydropower systems with minimal effort, and it can be extended as the base layer is enriched. The multi-objective model that represents the trade-off between power quantity (i.e., maximum energy production) and power reliability (i.e., firm output) of hydropower operation has been formulated. With equivalent transformations, the optimization problem can be solved by the nonlinear programming (NLP) solvers embedded in the LINGO software, such as the General Solver, the Multi-start Solver, and the Global Solver. Both simulation and optimization are performed to verify the model’s accuracy and to evaluate the operation of the UYR hydropower system. A total of 13 hydropower plants currently in operation are involved, including two pivotal storage reservoirs on the Yellow River, which are the Longyangxia Reservoir and the Liujiaxia Reservoir. Historical hydrological data from multiple years (2000–2010) are provided as input to the model for analysis. The results are as follows. 1) Assuming that the reservoirs are all in operation (in fact, some reservoirs were not operational or did

  7. Evaluating and optimizing the operation of the hydropower system in the Upper Yellow River: A general LINGO-based integrated framework.

    Directory of Open Access Journals (Sweden)

    Yuan Si

    Full Text Available The hydropower system in the Upper Yellow River (UYR), one of the largest hydropower bases in China, plays a vital role in the energy structure of the Qinghai Power Grid. Due to management difficulties, there is still considerable room for improvement in the joint operation of this system. This paper presents a general LINGO-based integrated framework to study the operation of the UYR hydropower system. The framework is easy to use for operators with little experience in mathematical modeling, takes full advantage of LINGO's capabilities (such as its solving capacity and multi-threading ability), and packs its three layers (the user layer, the coordination layer, and the base layer) together into an integrated solution that is robust and efficient and represents an effective tool for data/scenario management and analysis. The framework is general and can be easily transferred to other hydropower systems with minimal effort, and it can be extended as the base layer is enriched. The multi-objective model that represents the trade-off between power quantity (i.e., maximum energy production) and power reliability (i.e., firm output) of hydropower operation has been formulated. With equivalent transformations, the optimization problem can be solved by the nonlinear programming (NLP) solvers embedded in the LINGO software, such as the General Solver, the Multi-start Solver, and the Global Solver. Both simulation and optimization are performed to verify the model's accuracy and to evaluate the operation of the UYR hydropower system. A total of 13 hydropower plants currently in operation are involved, including two pivotal storage reservoirs on the Yellow River, which are the Longyangxia Reservoir and the Liujiaxia Reservoir. Historical hydrological data from multiple years (2000-2010) are provided as input to the model for analysis. The results are as follows. 1) Assuming that the reservoirs are all in operation (in fact, some reservoirs were not operational

  8. CHI: A General Agent Communication Framework

    Energy Technology Data Exchange (ETDEWEB)

    Goldsmith, S.Y.; Phillips, L.R.; Spires, S.V.

    1998-12-17

    We have completed and exercised a communication framework called CHI (CLOS to HTML Interface) by which agents can communicate with humans. CHI follows HTTP (HyperText Transfer Protocol) and produces HTML (HyperText Markup Language) for use by WWW (World-Wide Web) browsers. CHI enables the rapid and dynamic construction of interface mechanisms. The essence of CHI is automatic registration of dynamically generated interface elements to named objects in the agent's internal environment. The agent can access information in these objects at will. State is preserved, so an agent can pursue branching interaction sequences, activate failure recovery behaviors, and otherwise act opportunistically to maintain a conversation. The CHI mechanism remains transparent in multi-agent, multi-user environments because of automatically generated unique identifiers built into the CHI mechanism. In this paper we discuss design, language, implementation, and extension issues, and, by way of illustration, examine the use of the general CHI/HCHI mechanism in a specific international electronic commerce system. We conclude that the CHI mechanism is an effective, efficient, and extensible means of the agent/human communication.

  9. A general framework for performance guaranteed green data center networking

    OpenAIRE

    Wang, Ting; Xia, Yu; Muppala, Jogesh; Hamdi, Mounir; Foufou, Sebti

    2014-01-01

    From the perspective of resource allocation and routing, this paper aims to save as much energy as possible in data center networks. We present a general framework, based on the blocking island paradigm, to try to maximize the network power conservation and minimize sacrifices of network performance and reliability. The bandwidth allocation mechanism together with power-aware routing algorithm achieve a bandwidth guaranteed tighter network. Besides, our fast efficient heuristics for allocatin...

  10. General framework and key technologies of national nuclear emergency system

    International Nuclear Information System (INIS)

    Yuan Feng; Li Xudong; Zhu Guangying; Song Yafeng; Zeng Suotian; Shen Lifeng

    2014-01-01

    Nuclear emergency response is an important safeguard for the sustainable development of nuclear energy and a significant part of national public crisis management. The paper explicitly defines the nuclear emergency system based on an analysis of the characteristics of nuclear emergencies. Through research on its structure and general framework, the general framework of the national nuclear emergency management system (NNEMS) is obtained, which is constructed in four parts: one integrative platform, six layers, eight applications and two systems. The paper then indicates that the architecture of the national emergency system should be laid out in three tiers, i.e. national, provincial and organizations with nuclear facilities, and describes the functions of the NNEMS across the nuclear emergency workflow. Finally, the paper discusses the key technologies that the NNEMS needs, such as WebGIS, auxiliary decision-making, digitalized preplans and the integration and use of resources, and analyzes the technical principles in detail. (authors)

  11. A partial differential equation-based general framework adapted to Rayleigh's, Rician's and Gaussian's distributed noise for restoration and enhancement of magnetic resonance image.

    Science.gov (United States)

    Yadav, Ram Bharos; Srivastava, Subodh; Srivastava, Rajeev

    2016-01-01

    The proposed framework is obtained by casting the noise removal problem into a variational framework. This framework automatically identifies the various types of noise present in the magnetic resonance image and filters them by choosing an appropriate filter. This filter includes two terms: the first term is a data likelihood term and the second term is a prior function. The first term is obtained by minimizing the negative log likelihood of the corresponding probability density functions: Gaussian or Rayleigh or Rician. Further, due to the ill-posedness of the likelihood term, a prior function is needed. This paper examines three partial differential equation based priors which include total variation based prior, anisotropic diffusion based prior, and a complex diffusion (CD) based prior. A regularization parameter is used to balance the trade-off between data fidelity term and prior. The finite difference scheme is used for discretization of the proposed method. The performance analysis and comparative study of the proposed method with other standard methods is presented for brain web dataset at varying noise levels in terms of peak signal-to-noise ratio, mean square error, structure similarity index map, and correlation parameter. From the simulation results, it is observed that the proposed framework with CD based prior is performing better in comparison to other priors in consideration.
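
    One branch of the framework described above, a Gaussian data-fidelity term combined with a total-variation prior, can be sketched as a simple gradient descent with a smoothed TV gradient. The Rician and Rayleigh likelihood terms and the anisotropic/complex-diffusion priors are not shown, and the step size and regularization weight are assumptions.

```python
# Minimal sketch: Gaussian fidelity + (smoothed) total-variation prior,
# minimized by explicit gradient descent on a 2-D image. The Rician/Rayleigh
# likelihoods and diffusion-based priors of the full framework are omitted.
import numpy as np

def tv_denoise(y, lam=0.1, step=0.2, iters=200, eps=1e-6):
    """Minimize 0.5*||x - y||^2 + lam * TV_eps(x) with finite differences."""
    x = y.copy()
    for _ in range(iters):
        gx = np.diff(x, axis=1, append=x[:, -1:])          # forward differences
        gy = np.diff(x, axis=0, append=x[-1:, :])
        mag = np.sqrt(gx ** 2 + gy ** 2 + eps)
        px, py = gx / mag, gy / mag                        # normalized gradient field
        div = (np.diff(px, axis=1, prepend=px[:, :1])      # backward-difference divergence
               + np.diff(py, axis=0, prepend=py[:1, :]))
        x = x - step * ((x - y) - lam * div)               # gradient step on the objective
    return x

noisy = np.random.rand(64, 64)
denoised = tv_denoise(noisy)
```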

  12. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Directory of Open Access Journals (Sweden)

    Qi Zheng

    2016-10-01

    Full Text Available Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.

  13. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Science.gov (United States)

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
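
    The exact model is defined in the paper and the linked repository; the sketch below only illustrates the general kind of calculation described: alignment scores of a read's candidate hits are converted into posterior probabilities and the best hit receives a Phred-scaled mapping quality. The score-to-log-odds scale and the quality cap are assumptions, not AlignerBoost's parameters.

```python
# Illustrative Bayesian-style mapping quality (not AlignerBoost's exact model):
# alignment scores of all candidate hits for one read are turned into posterior
# probabilities, and the best hit gets a Phred-scaled quality -10*log10(1 - p).
import numpy as np

def mapping_quality(alignment_scores, scale=1.0, max_q=60.0):
    """alignment_scores: higher is better; scale converts score units to log-odds."""
    s = np.asarray(alignment_scores, dtype=float) * scale
    post = np.exp(s - s.max())
    post /= post.sum()                             # posterior over candidate hits
    err = max(1.0 - post.max(), 10 ** (-max_q / 10))   # cap the reported quality
    return -10.0 * np.log10(err)

print(mapping_quality([60, 55, 40]))   # distinct best hit -> high quality
print(mapping_quality([60, 60]))       # two equally good hits -> ~3 (ambiguous)
```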

  14. A Generalized Cauchy Distribution Framework for Problems Requiring Robust Behavior

    Directory of Open Access Journals (Sweden)

    Carrillo, Rafael E.

    2010-01-01

    Full Text Available Statistical modeling is at the heart of many engineering problems. The importance of statistical modeling emanates not only from the desire to accurately characterize stochastic events, but also from the fact that distributions are the central models utilized to derive sample processing theories and methods. The generalized Cauchy distribution (GCD) family has a closed-form pdf expression across the whole family as well as algebraic tails, which makes it suitable for modeling many real-life impulsive processes. This paper develops a GCD theory-based approach that allows challenging problems to be formulated in a robust fashion. Notably, the proposed framework subsumes generalized Gaussian distribution (GGD) family-based developments, thereby guaranteeing performance improvements over traditional GCD-based problem formulation techniques. This robust framework can be adapted to a variety of applications in signal processing. As examples, we formulate four practical applications under this framework: (1) filtering for power line communications, (2) estimation in sensor networks with noisy channels, (3) reconstruction methods for compressed sensing, and (4) fuzzy clustering.

  15. Generalized eigenvalue based spectrum sensing

    KAUST Repository

    Shakir, Muhammad

    2012-01-01

    Spectrum sensing is one of the fundamental components in cognitive radio networks. In this chapter, a generalized spectrum sensing framework which is referred to as the Generalized Mean Detector (GMD) has been introduced. In this context, we generalize the detectors based on the eigenvalues of the received signal covariance matrix and transform the eigenvalue based spectrum sensing detectors, namely (i) the Eigenvalue Ratio Detector (ERD) and two newly proposed detectors which are referred to as (ii) the GEometric Mean Detector (GEMD) and (iii) the ARithmetic Mean Detector (ARMD), into a unified framework of generalized spectrum sensing. The foundation of the proposed framework is based on the calculation of exact analytical moments of the random variables of the decision threshold of the respective detectors. The decision threshold has been calculated in a closed form which is based on the approximation of the Cumulative Distribution Functions (CDFs) of the respective test statistics. In this context, we exchange the analytical moments of the two random variables of the respective test statistics with the moments of the Gaussian (or Gamma) distribution function. The performance of the eigenvalue based detectors is compared with several traditional detectors including the energy detector (ED) to validate the importance of the eigenvalue based detectors and the performance of the GEMD and the ARMD, particularly in realistic wireless cognitive radio networks. Analytical and simulation results show that the newly proposed detectors yield a considerable performance advantage in realistic spectrum sensing scenarios. Moreover, the presented results based on the proposed approximation approaches are in perfect agreement with the empirical results. © 2012 Springer Science+Business Media Dordrecht.
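
    The common ingredient of the detectors discussed above is a test statistic built from the eigenvalues of the received-signal sample covariance matrix. The sketch below computes three such ratio statistics (maximum-to-minimum, maximum-to-geometric-mean and maximum-to-arithmetic-mean) as stand-ins; the chapter's exact GMD/GEMD/ARMD statistics and their closed-form thresholds differ in detail.

```python
# Illustrative eigenvalue-based test statistics for spectrum sensing (stand-ins;
# the chapter's GMD/GEMD/ARMD statistics and closed-form thresholds differ in detail).
import numpy as np

def eigen_statistics(Y):
    """Y: (n_antennas, n_samples) received-signal matrix."""
    R = Y @ Y.conj().T / Y.shape[1]              # sample covariance matrix
    lam = np.sort(np.linalg.eigvalsh(R))[::-1]   # eigenvalues, descending
    return {
        "max_min_ratio": lam[0] / lam[-1],                            # ERD-style
        "max_geomean_ratio": lam[0] / np.exp(np.mean(np.log(lam))),   # GEMD-style
        "max_mean_ratio": lam[0] / lam.mean(),                        # ARMD-style
    }

rng = np.random.default_rng(0)
noise_only = rng.standard_normal((4, 1000))
signal = np.outer(rng.standard_normal(4), np.sin(0.1 * np.arange(1000)))
print(eigen_statistics(noise_only))           # ratios near 1 under H0 (noise only)
print(eigen_statistics(noise_only + signal))  # inflated ratios under H1 (signal present)
```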

  16. A Cloud Based Data Integration Framework

    OpenAIRE

    Jiang, Nan; Xu, Lai; Vrieze, Paul; Lim, Mian-Guan; Jarabo, Oscar

    2012-01-01

    Part 7: Cloud-Based Support; International audience; Virtual enterprise (VE) relies on resource sharing and collaboration across geographically dispersed and dynamically allied businesses in order to better respond to market opportunities. It is generally considered that effective data integration and management is crucial to realise the value of VE. This paper describes a cloud-based data integration framework that can be used for supporting VE to discover, explore and respond more emerging ...

  17. General description of the national regulatory framework

    International Nuclear Information System (INIS)

    Basurto C, J.

    2008-12-01

    Of the six levels of hierarchy in the Mexican legal system, from the Constitution down to the Mexican Official Norms, the first five are binding and contain mandatory requirements, while the sixth is not binding. The articles of the Mexican Constitution that deal with nuclear matters are 25, 27 and 28. At the same hierarchical level as the Constitution are the international treaties signed and ratified by our country, such as the Nuclear Safety Convention or the Convention on the Physical Protection of Nuclear Material. The treaty negotiation process consists of the adoption of a text, its authentication (the signature of the final, unalterable content), approval by the Senate and the submission of consent, which implies ratification and publication in the Federation Official Gazette. In the case of Mexican laws, the legislative process includes an initiative, analysis by the relevant committee, discussion, approval, sanction and, with it, publication and entry into force. The road can become very convoluted because some steps recur, returning to the same level several times. Regulations, whose purpose is to clarify, develop or explain the general principles contained in the laws so as to make their application more attainable, are subject to a similarly complicated process. First, a preliminary draft is developed by the competent authority, subject to revision and opinion by the respective institutions prior to submission to the Federal Executive. The final draft is then submitted to the latter, approved by the agencies involved and by the Federal Executive, and finally published in the Federation Official Gazette, from which point it takes effect. (Author)

  18. Set-based Tasks within the Singularity-robust Multiple Task-priority Inverse Kinematics Framework: General Formulation, Stability Analysis and Experimental Results

    Directory of Open Access Journals (Sweden)

    Signe Moe

    2016-04-01

    Full Text Available Inverse kinematics algorithms are commonly used in robotic systems to transform tasks to joint references, and several methods exist to ensure the achievement of several tasks simultaneously. The multiple task-priority inverse kinematics framework allows tasks to be considered in a prioritized order by projecting task velocities through the null spaces of higher-priority tasks. This paper extends this framework to handle set-based tasks, i.e. tasks with a range of valid values, in addition to equality tasks, which have a specific desired value. Examples of set-based tasks are joint limit and obstacle avoidance. The proposed method is proven to ensure asymptotic convergence of the equality task errors and the satisfaction of all high-priority set-based tasks. The practical implementation of the proposed algorithm is discussed, and experimental results are presented where a number of both set-based and equality tasks have been implemented on a 6-degree-of-freedom UR5, which is an industrial robotic arm from Universal Robots. The experiments validate the theoretical results and confirm the effectiveness of the proposed approach.
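
    The underlying equality-task priority step that the set-based extension builds on can be sketched as below: the secondary task velocity is projected into the null space of the primary task Jacobian via pseudoinverses. The Jacobians and task velocities are toy values; the set-based activation/deactivation logic and the stability analysis are the paper's contribution and are not shown.

```python
# The underlying two-task priority step the framework extends: the secondary
# task velocity is realized only in the null space of the primary task Jacobian.
import numpy as np

def priority_step(J1, v1, J2, v2):
    """Joint velocity realizing task 1 exactly and task 2 as well as possible."""
    J1_pinv = np.linalg.pinv(J1)
    N1 = np.eye(J1.shape[1]) - J1_pinv @ J1        # null-space projector of task 1
    dq = J1_pinv @ v1 + np.linalg.pinv(J2 @ N1) @ (v2 - J2 @ J1_pinv @ v1)
    return dq

J1 = np.array([[1.0, 0.0, 0.5],        # toy 2x3 primary-task Jacobian
               [0.0, 1.0, 0.2]])
J2 = np.array([[0.3, 0.1, 1.0]])       # toy 1x3 secondary-task Jacobian
dq = priority_step(J1, np.array([0.1, 0.0]), J2, np.array([0.05]))
print(dq, J1 @ dq)                     # J1 @ dq reproduces the primary task velocity
```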

  19. Calculating observables in inhomogeneous cosmologies. Part I: general framework

    Science.gov (United States)

    Hellaby, Charles; Walters, Anthony

    2018-02-01

    We lay out a general framework for calculating the variation of a set of cosmological observables, down the past null cone of an arbitrarily placed observer, in a given arbitrary inhomogeneous metric. The observables include redshift, proper motions, area distance and redshift-space density. Of particular interest are observables that are zero in the spherically symmetric case, such as proper motions. The algorithm is based on the null geodesic equation and the geodesic deviation equation, and it is tailored to creating a practical numerical implementation. The algorithm provides a method for tracking which light rays connect moving objects to the observer at successive times. Our algorithm is applied to the particular case of the Szekeres metric. A numerical implementation has been created and some results will be presented in a subsequent paper. Future work will explore the range of possibilities.
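
    For reference, the two equations the algorithm integrates are the null geodesic equation and the geodesic deviation equation, written here in one common sign convention (the paper specializes them to the Szekeres metric):

```latex
% Null geodesic and geodesic deviation equations (one common sign convention);
% k^\mu is the tangent to the ray and \xi^\mu the connecting (deviation) vector.
\[
  \frac{\mathrm{d}^{2} x^{\mu}}{\mathrm{d}\lambda^{2}}
    + \Gamma^{\mu}_{\;\alpha\beta}\,
      \frac{\mathrm{d} x^{\alpha}}{\mathrm{d}\lambda}\,
      \frac{\mathrm{d} x^{\beta}}{\mathrm{d}\lambda} = 0,
  \qquad
  \frac{\mathrm{D}^{2} \xi^{\mu}}{\mathrm{d}\lambda^{2}}
    = -\,R^{\mu}_{\;\alpha\nu\beta}\, k^{\alpha}\, \xi^{\nu}\, k^{\beta}.
\]
```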

  20. A General Solution Framework for Component-Commonality Problems

    Directory of Open Access Journals (Sweden)

    Nils Boysen

    2009-05-01

    Full Text Available Component commonality - the use of the same version of a component across multiple products - is being increasingly considered as a promising way to offer high external variety while retaining low internal variety in operations. However, increasing commonality has both positive and negative cost effects, so that optimization approaches are required to identify an optimal commonality level. As components influence to a greater or lesser extent nearly every process step along the supply chain, it is not surprising that a multitude of diverging commonality problems is being investigated in the literature, each developing a specific algorithm designed for the respective commonality problem being considered. The paper at hand aims at a general framework which is flexible and efficient enough to be applied to a wide range of commonality problems. Such a procedure based on a two-stage graph approach is presented and tested. Finally, the flexibility of the procedure is shown by customizing the framework to account for different types of commonality problems.

  1. Agent-Based Data Integration Framework

    Directory of Open Access Journals (Sweden)

    Łukasz Faber

    2014-01-01

    Full Text Available Combining data from diverse, heterogeneous sources while facilitating a unified access to it is an important (albeit difficult) task. There are various possibilities of performing it. In this publication, we propose and describe an agent-based framework dedicated to acquiring and processing distributed, heterogeneous data collected from diverse sources (e.g., the Internet, external software, relational, and document databases). Using this multi-agent-based approach in the aspects of the general architecture (the organization and management of the framework), we create a proof-of-concept implementation. The approach is presented using a sample scenario in which the system is used to search for personal and professional profiles of scientists.

  2. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    Science.gov (United States)

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed scheme has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as, for example, electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.

  3. A general framework for unambiguous detection of quantum states

    International Nuclear Information System (INIS)

    Eldar, Y.

    2004-01-01

    Full Text: The problem of detecting information stored in the state of a quantum system is a fundamental problem in quantum information theory. Several approaches have emerged to distinguish between a collection of non-orthogonal quantum states. We consider the problem of unambiguous detection, where we seek a measurement that with a certain probability returns an inconclusive result, but such that if the measurement returns an answer, then the answer is correct with probability 1. We begin by considering unambiguous discrimination between a set of linearly independent pure quantum states. We show that the design of the optimal measurement that minimizes the probability of an inconclusive result can be formulated as a semidefinite programming problem. Based on this formulation, we develop a set of necessary and sufficient conditions for an optimal quantum measurement. We show that the optimal measurement can be computed very efficiently in polynomial time by exploiting the many well-known algorithms for solving semidefinite programs, which are guaranteed to converge to the global optimum. Using the general conditions for optimality, we derive necessary and sufficient conditions so that the measurement that results in an equal probability of an inconclusive result for each one of the quantum states is optimal. We refer to this measurement as the equal-probability measurement (EPM). We then show that for any state set, the prior probabilities of the states can be chosen such that the EPM is optimal. Finally, we consider state sets with strong symmetry properties and equal prior probabilities for which the EPM is optimal. We next develop a general framework for unambiguous state discrimination between a collection of mixed quantum states, which can be applied to any number of states with arbitrary prior probabilities. In particular, we derive a set of necessary and sufficient conditions for an optimal measurement that minimizes the probability of an inconclusive result
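    The semidefinite-programming formulation mentioned in the abstract can be prototyped directly with an off-the-shelf solver. The snippet below is a hedged illustration for two equiprobable, linearly independent real pure states (it is not the author's code; the example states and the cvxpy modelling choices are ours):

```python
# Illustrative SDP for unambiguous discrimination of two real pure states,
# assuming equal priors; uses cvxpy with any SDP-capable solver.
import numpy as np
import cvxpy as cp

psi1 = np.array([1.0, 0.0])
psi2 = np.array([np.cos(0.4), np.sin(0.4)])       # overlap cos(0.4)
rho1, rho2 = np.outer(psi1, psi1), np.outer(psi2, psi2)

P1 = cp.Variable((2, 2), symmetric=True)          # POVM element answering "state 1"
P2 = cp.Variable((2, 2), symmetric=True)          # POVM element answering "state 2"
constraints = [
    P1 >> 0, P2 >> 0,
    np.eye(2) - P1 - P2 >> 0,                     # inconclusive element must be PSD
    cp.trace(P1 @ rho2) == 0,                     # unambiguity: never a wrong answer
    cp.trace(P2 @ rho1) == 0,
]
# maximize average success probability = minimize the inconclusive probability
objective = cp.Maximize(0.5 * cp.trace(P1 @ rho1) + 0.5 * cp.trace(P2 @ rho2))
problem = cp.Problem(objective, constraints)
problem.solve()
print("optimal success probability:", problem.value)    # approaches 1 - cos(0.4)
print("optimal inconclusive probability:", 1 - problem.value)
```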

  4. An SOA-based architecture framework

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Beisiegel, M.; Hee, van K.M.; König, D.; Stahl, C.

    2007-01-01

    We present a Service-Oriented Architecture (SOA)-based architecture framework. The architecture framework is designed to be close to industry standards, especially to the Service Component Architecture (SCA). The framework is language independent and the building blocks of each system, activities

  5. IAEA Director General Comments on Cooperation Framework with Iran

    International Nuclear Information System (INIS)

    2013-01-01

    Full text: The following are remarks by the Director General of the International Atomic Energy Agency, Yukiya Amano, at a News Conference after he signed a Joint Statement on a Framework for Cooperation with the Islamic Republic of Iran: ''The International Atomic Energy Agency and the Islamic Republic of Iran have just issued the Joint Statement on a Framework for Cooperation. ''Under the Framework, Iran and the IAEA will cooperate further with respect to verification activities to be undertaken by the IAEA to resolve all present and past issues. The practical measures contained in the Annex are substantive measures and will be implemented in three months starting from today. ''This is an important step forward to start with, but much more needs to be done. ''The outstanding issues that are not contained in the Annex to the Framework for Cooperation, including those in my previous reports to the Board of Governors, will be addressed in the subsequent steps under the Framework for Cooperation. ''The IAEA is firmly committed to resolving all outstanding issues through dialogue and cooperation. (IAEA)

  6. General framework for adsorption processes on dynamic interfaces

    International Nuclear Information System (INIS)

    Schmuck, Markus; Kalliadasis, Serafim

    2016-01-01

    We propose a novel and general variational framework modelling particle adsorption mechanisms on evolving immiscible fluid interfaces. A by-product of our thermodynamic approach is that we systematically obtain analytic adsorption isotherms for given equilibrium interfacial geometries. We validate computationally our mathematical methodology by demonstrating the fundamental properties of decreasing interfacial free energies by increasing interfacial particle densities and of decreasing surface pressure with increasing surface area. (paper)

  7. Deflected Mirage Mediation: A Phenomenological Framework for Generalized Supersymmetry Breaking

    International Nuclear Information System (INIS)

    Everett, Lisa L.; Kim, Ian-Woo; Ouyang, Peter; Zurek, Kathryn M.

    2008-01-01

    We present a general phenomenological framework for dialing between gravity mediation, gauge mediation, and anomaly mediation. The approach is motivated from recent developments in moduli stabilization, which suggest that gravity mediated terms can be effectively loop suppressed and thus comparable to gauge and anomaly mediated terms. The gauginos exhibit a mirage unification behavior at a ''deflected'' scale, and gluinos are often the lightest colored sparticles. The approach provides a rich setting in which to explore generalized supersymmetry breaking at the CERN Large Hadron Collider.

  8. Component-based framework for subsurface simulations

    International Nuclear Information System (INIS)

    Palmer, B J; Fang, Yilin; Hammond, Glenn; Gurumoorthi, Vidhya

    2007-01-01

    Simulations in the subsurface environment represent a broad range of phenomena covering an equally broad range of scales. Developing modelling capabilities that can integrate models representing different phenomena acting at different scales presents formidable challenges from both the algorithmic and the computer science perspectives. This paper will describe the development of an integrated framework that will be used to combine different models into a single simulation. Initial work has focused on creating two frameworks, one for performing smooth particle hydrodynamics (SPH) simulations of fluid systems, the other for performing grid-based continuum simulations of reactive subsurface flow. The SPH framework is based on a parallel code developed for doing pore-scale simulations, while the continuum grid-based framework is based on the STOMP (Subsurface Transport Over Multiple Phases) code developed at PNNL. Future work will focus on combining the frameworks together to perform multiscale, multiphysics simulations of reactive subsurface flow.

  9. A Framework for Enterprise Operating Systems Based on Zachman Framework

    Science.gov (United States)

    Ostadzadeh, S. Shervin; Rahmani, Amir Masoud

    Nowadays, the Operating System (OS) isn't only the software that runs your computer. In the typical information-driven organization, the operating system is part of a much larger platform for applications and data that extends across the LAN, WAN and Internet. An OS cannot be an island unto itself; it must work with the rest of the enterprise. Enterprise-wide applications require an Enterprise Operating System (EOS). Enterprise operating systems have brought about an inevitable tendency for organizations to organize their information activities in a comprehensive way. In this respect, Enterprise Architecture (EA) has proven to be the leading option for development and maintenance of enterprise operating systems. EA clearly provides a thorough outline of the whole information system comprising an enterprise. To establish such an outline, a logical framework needs to be laid upon the entire information system. The Zachman Framework (ZF) has been widely accepted as a standard scheme for identifying and organizing descriptive representations that have prominent roles in enterprise-wide system development. In this paper, we propose a framework based on ZF for enterprise operating systems. The presented framework helps developers to design and justify completely integrated business, IT systems, and operating systems, which results in an improved project success rate.

  10. Generalized min-max bound-based MRI pulse sequence design framework for wide-range T1 relaxometry: A case study on the tissue specific imaging sequence.

    Directory of Open Access Journals (Sweden)

    Yang Liu

    Full Text Available This paper proposes a new design strategy for optimizing MRI pulse sequences for T1 relaxometry. The design strategy optimizes the pulse sequence parameters to minimize the maximum variance of unbiased T1 estimates over a range of T1 values using the Cramér-Rao bound. In contrast to prior sequences optimized for a single nominal T1 value, the optimized sequence using our bound-based strategy achieves improved precision and accuracy for a broad range of T1 estimates within a clinically feasible scan time. The optimization combines the downhill simplex method with a simulated annealing process. To show the effectiveness of the proposed strategy, we optimize the tissue specific imaging (TSI sequence. Preliminary Monte Carlo simulations demonstrate that the optimized TSI sequence yields improved precision and accuracy over the popular driven-equilibrium single-pulse observation of T1 (DESPOT1 approach for normal brain tissues (estimated T1 700-2000 ms at 3.0T. The relative mean estimation error (MSE for T1 estimation is less than 1.7% using the optimized TSI sequence, as opposed to less than 7.0% using DESPOT1 for normal brain tissues. The optimized TSI sequence achieves good stability by keeping the MSE under 7.0% over larger T1 values corresponding to different lesion tissues and the cerebrospinal fluid (up to 5000 ms. The T1 estimation accuracy using the new pulse sequence also shows improvement, which is more pronounced in low SNR scenarios.
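    As a hedged, heavily simplified illustration of the bound-based idea (not the TSI sequence itself), one can minimize the worst-case Cramér-Rao bound of T1 over a target range for a toy two-parameter inversion-recovery model, using the same downhill-simplex optimizer the abstract mentions; the signal model, noise level and starting values are invented:

```python
# Toy min-max Cramér-Rao bound design: choose inversion times (TIs) so that
# the worst-case CRB of T1 over a target range is minimized. The signal model
# S(TI) = a * (1 - 2*exp(-TI/T1)) and the noise level are illustrative only.
import numpy as np
from scipy.optimize import minimize

T1_RANGE = np.linspace(700.0, 2000.0, 14)   # ms, range we want covered
SIGMA = 0.02                                 # assumed Gaussian noise std
A = 1.0                                      # assumed signal scale

def crb_t1(tis, t1):
    """CRB of T1 for unknown (a, T1) given inversion times `tis`."""
    e = np.exp(-tis / t1)
    d_a = 1.0 - 2.0 * e                      # dS/da
    d_t1 = -2.0 * A * e * tis / t1**2        # dS/dT1
    J = np.stack([d_a, d_t1], axis=1)        # Jacobian, one row per TI
    fim = J.T @ J / SIGMA**2                 # Fisher information matrix
    return np.linalg.inv(fim)[1, 1]          # variance bound on T1

def worst_case(log_tis):
    tis = np.exp(log_tis)                    # positive TIs via log-parametrization
    return max(crb_t1(tis, t1) for t1 in T1_RANGE)

res = minimize(worst_case, x0=np.log([300.0, 1000.0, 3000.0]),
               method="Nelder-Mead")         # downhill simplex
print("optimized TIs (ms):", np.exp(res.x).round(1))
print("worst-case T1 std (ms):", np.sqrt(res.fun).round(1))
```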

  11. Heartbeat-based error diagnosis framework for distributed embedded systems

    Science.gov (United States)

    Mishra, Swagat; Khilar, Pabitra Mohan

    2012-01-01

    Distributed Embedded Systems have significant applications in the automobile industry as steer-by-wire, fly-by-wire and brake-by-wire systems. In this paper, we provide a general framework for fault detection in a distributed embedded real-time system. We use heartbeat monitoring, checkpointing and model-based redundancy to design a scalable framework that takes care of task scheduling, temperature control and diagnosis of faulty nodes in a distributed embedded system. This helps in diagnosing and shutting down faulty actuators before the system becomes unsafe. The framework is designed and tested using a new simulation model consisting of virtual nodes working on a message passing system.
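    A minimal sketch of the heartbeat idea (illustrative only; the node names and timeout values are invented, not taken from the paper) could look like this:

```python
# Minimal heartbeat-based fault detector: a node is flagged faulty when its
# last heartbeat is older than `timeout` seconds. Names/timeouts are examples.
import time

class HeartbeatMonitor:
    def __init__(self, nodes, timeout=1.0):
        now = time.monotonic()
        self.timeout = timeout
        self.last_seen = {node: now for node in nodes}

    def heartbeat(self, node):
        """Called whenever a heartbeat message arrives from `node`."""
        self.last_seen[node] = time.monotonic()

    def faulty_nodes(self):
        now = time.monotonic()
        return [n for n, t in self.last_seen.items() if now - t > self.timeout]

monitor = HeartbeatMonitor(["brake_ecu", "steer_ecu", "engine_ecu"], timeout=0.5)
monitor.heartbeat("brake_ecu")
time.sleep(0.6)                    # brake and engine ECUs now miss the deadline
monitor.heartbeat("steer_ecu")
print(monitor.faulty_nodes())      # ['brake_ecu', 'engine_ecu']
```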

  12. Multi-attribute utility theory. Toward a more general framework

    International Nuclear Information System (INIS)

    Beaudoin, F.; Munier, B.; Serquin, Y.; Ecole Normale Superieure, 94 - Cachan

    1997-12-01

    Optimizing maintenance programs for nuclear power plants is a difficult task. Beyond the reliability of the systems at hand, one has to consider several conflicting objectives such as safety, availability, maintenance costs and personal exposure to radiation, all under risk. Multi-Attribute Utility Theory is a widely used framework to cope with such problems. This procedure is, however, based on a set of axioms which imply an expected utility treatment of risk. It has been shown elsewhere that the risk structure to be considered in such cases does not correspond to behavior consistent with such a treatment of risk, but would rather correspond to a rank-dependent evaluation type of model. The question raised is then how to use a multi-attribute scheme of preferences under such conditions. (author)
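    For orientation, the rank-dependent evaluation the abstract alludes to replaces the linear probability weighting of expected utility with a weighting of decumulative probabilities; in standard textbook notation (not the report's own), for outcomes ordered from worst to best:

```latex
% Expected utility vs. rank-dependent utility (standard textbook forms).
% Outcomes are ordered x_1 \le \dots \le x_n with probabilities p_i, and w is
% an increasing probability-weighting function with w(0)=0 and w(1)=1.
EU(X)  = \sum_{i=1}^{n} p_i\, u(x_i),
\qquad
RDU(X) = \sum_{i=1}^{n}
  \left[ w\!\left(\sum_{j=i}^{n} p_j\right)
       - w\!\left(\sum_{j=i+1}^{n} p_j\right) \right] u(x_i).
```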

  13. Model-based safety architecture framework for complex systems

    NARCIS (Netherlands)

    Schuitemaker, Katja; Rajabali Nejad, Mohammadreza; Braakhuis, J.G.; Podofillini, Luca; Sudret, Bruno; Stojadinovic, Bozidar; Zio, Enrico; Kröger, Wolfgang

    2015-01-01

    The shift to transparency and rising need of the general public for safety, together with the increasing complexity and interdisciplinarity of modern safety-critical Systems of Systems (SoS) have resulted in a Model-Based Safety Architecture Framework (MBSAF) for capturing and sharing architectural

  14. Framework for Railway Phase-based Planning

    DEFF Research Database (Denmark)

    Li, Rui; Landex, Alex; Nielsen, Otto Anker

    This article defines a phase-based framework to guide railway maintenance and renewal project planning at the strategic level, using life-cycle cost (LCC) comparisons, covering maintenance and renewal spending, to evaluate the project alternatives; the comparison can identify the most cost-efficient solution in the long run and therefore reduce the overall costs. The framework evaluates the project options from a larger LCC scope: the costs from train operating companies and passengers, together with the maintenance and renewal costs from Infrastructure Managers, are included in the calculation. The framework simplifies the planning processes and the LCC calculation, so that alternatives can be evaluated and compared. A case study is introduced in the article to demonstrate how the framework works to compare timber sleepers and concrete sleepers at the strategic planning level. Two Life Cycle Cost oriented policies are discussed to illustrate that high-quality track is a necessity to improve cost efficiency...

  15. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  16. Non-self-adjoint Hamiltonians defined by generalized Riesz bases

    Energy Technology Data Exchange (ETDEWEB)

    Inoue, H., E-mail: h-inoue@math.kyushu-u.ac.jp [Graduate School of Mathematics, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka 819-0395 (Japan); Takakura, M., E-mail: mayumi@fukuoka-u.ac.jp [Department of Applied Mathematics, Fukuoka University, Fukuoka 814-0180 (Japan)

    2016-08-15

    Bagarello, Inoue, and Trapani [J. Math. Phys. 55, 033501 (2014)] investigated some operators defined by the Riesz bases. These operators connect with quasi-Hermitian quantum mechanics, and its relatives. In this paper, we introduce a notion of generalized Riesz bases which is a generalization of Riesz bases and investigate some operators defined by the generalized Riesz bases by changing the frameworks of the operators defined in the work of Bagarello, Inoue, and Trapani.

  17. A Framework for Argumentation-Based Negotiation

    OpenAIRE

    Sierra, C.; Jennings, N. R.; Noriega, P.; Parsons, S.

    1997-01-01

    Many autonomous agents operate in domains in which the cooperation of their fellow agents cannot be guaranteed. In such domains negotiation is essential to persuade others of the value of co-operation. This paper describes a general framework for negotiation in which agents exchange proposals backed by arguments which summarise the reasons why the proposals should be accepted. The argumentation is persuasive because the exchanges are able to alter the mental state of the agents involved. The f...

  18. Generalizing Backtrack-Free Search: A Framework for Search-Free Constraint Satisfaction

    Science.gov (United States)

    Jonsson, Ari K.; Frank, Jeremy

    2000-01-01

    Tractable classes of constraint satisfaction problems are of great importance in artificial intelligence. Identifying and taking advantage of such classes can significantly speed up constraint problem solving. In addition, tractable classes are utilized in applications where strict worst-case performance guarantees are required, such as constraint-based plan execution. In this work, we present a formal framework for search-free (backtrack-free) constraint satisfaction. The framework is based on general procedures, rather than specific propagation techniques, and thus generalizes existing techniques in this area. We also relate search-free problem solving to the notion of decision sets and use the result to provide a constructive criterion that is sufficient to guarantee search-free problem solving.

  19. A motion sensing-based framework for robotic manipulation.

    Science.gov (United States)

    Deng, Hao; Xia, Zeyang; Weng, Shaokui; Gan, Yangzhou; Fang, Peng; Xiong, Jing

    2016-01-01

    To date, outside of controlled environments, robots normally perform manipulation tasks in cooperation with human operators. This pattern requires robot operators to undergo extensive technical training for the varied teach-pendant operating systems. Motion sensing technology, which enables human-machine interaction in a novel and natural interface using gestures, has crucially inspired us to adopt this user-friendly and straightforward operation mode for robotic manipulation. Thus, in this paper, we present a motion sensing-based framework for robotic manipulation, which recognizes gesture commands captured from a motion sensing input device and drives the actions of robots. For compatibility, a general hardware interface layer was also developed in the framework. Simulation and physical experiments have been conducted for preliminary validation. The results have shown that the proposed framework is an effective approach for general robotic manipulation with motion sensing control.

  20. Restful API Architecture Based on Laravel Framework

    Science.gov (United States)

    Chen, Xianjun; Ji, Zhoupeng; Fan, Yu; Zhan, Yongsong

    2017-10-01

    Web services have been an industry-standard technology for message communication and integration between heterogeneous systems. The RESTful API has become the mainstream web service development paradigm after SOAP, but how to construct RESTful APIs effectively remains a research hotspot. This paper presents a development model for RESTful API construction based on the PHP language and the LARAVEL framework. The key technical problems that need to be solved during the construction of a RESTful API are discussed, and implementation details based on LARAVEL are given.

  1. Context-Aware Usage-Based Grid Authorization Framework

    Institute of Scientific and Technical Information of China (English)

    CUI Yongquan; HONG Fan; FU Cai

    2006-01-01

    Due to the inherent heterogeneity, multi-domain characteristics and highly dynamic nature of grids, authorization is a critical concern in grid computing. This paper proposes a general authorization and access control architecture, grid usage control (GUCON), for grid computing. It is based on the next-generation access control mechanism, the usage control (UCON) model. The GUCON framework dynamically grants and adapts permissions to the subject based on a set of contextual information collected from the system environment, while retaining authorization by evaluating access requests based on subject attributes, object attributes and requests. In general, the GUCON model provides very flexible approaches to adapting to dynamic security requests. The GUCON model is being implemented in our experimental prototype.
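    Purely as an illustration of the attribute-plus-context evaluation style described above (not the GUCON implementation), a policy check might combine subject attributes, object attributes and contextual information like this; all attribute names and rules are invented:

```python
# Toy usage-control style check: a request is granted only if subject/object
# attributes satisfy the policy AND the current context does.
from datetime import datetime

def authorize(subject, obj, action, context):
    rules = [
        subject.get("role") in obj.get("allowed_roles", []),   # subject attribute
        action in obj.get("allowed_actions", []),              # request attribute
        context.get("vo") == obj.get("vo"),                    # same virtual org.
        8 <= context.get("hour", datetime.now().hour) < 20,    # contextual rule
    ]
    return all(rules)

grid_job = {"allowed_roles": ["researcher"], "allowed_actions": ["submit"],
            "vo": "physics"}
print(authorize({"role": "researcher"}, grid_job, "submit",
                {"vo": "physics", "hour": 14}))   # True
print(authorize({"role": "researcher"}, grid_job, "submit",
                {"vo": "physics", "hour": 23}))   # False: outside allowed hours
```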

  2. A model-based risk management framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune

    2002-08-15

    The ongoing research activity addresses these issues through two co-operative activities. The first is the IST-funded research project CORAS, where Institutt for energiteknikk takes part as responsible for the work package for Risk Analysis. The main objective of the CORAS project is to develop a framework to support risk assessment of security-critical systems. The second, called the Halden Open Dependability Demonstrator (HODD), is established in cooperation between Oestfold University College, local companies and HRP. The objective of HODD is to provide an open-source test bed for testing, teaching and learning about risk analysis methods, risk analysis tools, and fault tolerance techniques. The Inverted Pendulum Control System (IPCON), whose main task is to keep a pendulum balanced and controlled, is the first system that has been established. In order to make a risk assessment one needs to know what a system does, or is intended to do. Furthermore, the risk assessment requires correct descriptions of the system, its context and all relevant features. A basic assumption is that a precise model of this knowledge, based on formal or semi-formal descriptions, such as UML, will facilitate a systematic risk assessment. It is also necessary to have a framework to integrate the different risk assessment methods. The experiences so far support this hypothesis. This report presents CORAS and the CORAS model-based risk management framework, including a preliminary guideline for model-based risk assessment. The CORAS framework for model-based risk analysis offers a structured and systematic approach to identify and assess security issues of ICT systems. From the initial assessment of IPCON, we also believe that the framework is applicable in a safety context. Further work on IPCON, as well as the experiences from the CORAS trials, will provide insight and feedback for further improvements. (Author)

  3. A General Probabilistic Forecasting Framework for Offshore Wind Power Fluctuations

    DEFF Research Database (Denmark)

    Trombe, Pierre-Julien; Pinson, Pierre; Madsen, Henrik

    2012-01-01

    Accurate wind power forecasts highly contribute to the integration of wind power into power systems. The focus of the present study is on large-scale offshore wind farms and the complexity of generating accurate probabilistic forecasts of wind power fluctuations at time-scales of a few minutes. At these time-scales, fluctuations are characterized by highly volatile dynamics which are difficult to capture and predict. Due to the lack of adequate on-site meteorological observations to relate these dynamics to meteorological phenomena, we propose a general model formulation based on a statistical approach and historical wind power measurements only. We introduce an advanced Markov Chain Monte Carlo (MCMC) estimation method to account for the different features observed in an empirical time series of wind power: autocorrelation, heteroscedasticity and regime-switching. The model we propose is an extension of Markov-Switching Autoregressive (MSAR) models...

  4. Deflected Mirage Mediation: A Framework for Generalized Supersymmetry Breaking

    International Nuclear Information System (INIS)

    Kim, Ian-Woo

    2008-01-01

    We present a model of supersymmetry breaking in which the contributions from gravity/modulus, anomaly, and gauge mediation are all comparable. We term this scenario 'deflected mirage mediation', which is a generalization of the KKLT-motivated mirage mediation scenario to include gauge mediated contributions. These contributions deflect the gaugino mass unification scale and alter the pattern of soft parameters at low energies. Competitive gauge-mediated terms can naturally appear within phenomenological models based on the KKLT setup by the stabilization of the gauge singlet field responsible for the masses of the messenger fields. We analyze the renormalization group evolution of the supersymmetry breaking terms and the resulting low energy mass spectra.

  5. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick Start Guide (Revision 1)

    Science.gov (United States)

    2017-06-01

    ARL-CR-0816, US Army Research Laboratory, June 2017: Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick-Start Guide (Revision 1).

  6. A General Framework for Setting Quantitative Population Objectives for Wildlife Conservation

    Directory of Open Access Journals (Sweden)

    Kristen E. Dybala

    2017-03-01

    Full Text Available https://doi.org/10.15447/sfews.2017v15iss1art8 Quantitative population objectives are necessary to successfully achieve conservation goals of secure or robust wildlife populations. However, existing methods for setting quantitative population objectives commonly require extensive species-specific population viability data, which are often unavailable, or are based on estimates of historical population sizes, which may no longer represent feasible objectives. Conservation practitioners require an alternative, science-based method for setting long-term quantitative population objectives. We reviewed conservation biology literature to develop a general conceptual framework that represents conservation biology principles and identifies key milestones a population would be expected to pass in the process of becoming a recovered or robust population. We then synthesized recent research to propose general hypotheses for the orders of magnitude at which most populations would be expected to reach each milestone. The framework is structured as a hierarchy of four population sizes, ranging from very small populations at increased risk of inbreeding depression and extirpation (< 1,000 adults) to large populations with minimized risk of extirpation (> 50,000 adults), along with additional modifiers describing steeply declining and resilient populations. We also discuss the temporal and geographic scales at which this framework should be applied. To illustrate the application of this framework to conservation planning, we outline our use of the framework to set long-term population objectives for a multi-species regional conservation plan, and discuss additional considerations in applying this framework to other systems. This general framework provides a transparent, science-based method by which conservation practitioners and stakeholders can agree on long-term population objectives of an appropriate magnitude, particularly when the alternative approaches are

  7. CYBER FORENSICS COMPETENCY-BASED FRAMEWORK - A REVIEW

    OpenAIRE

    Elfadil Sabeil; Azizah Bt Abdul Manaf; Zuraini Ismail; Mohamed Abas

    2011-01-01

    Lack of Cyber Forensics experts is a huge challenge facing the world today. It comes due to the infancy of Cyber Forensics training or education. The multidisciplinary nature of Cyber Forensics has proliferated into diverse training programmes, from a handful of days' workshop to a Postgraduate programme in Cyber Forensics. Consequently, this paper concentrates on analyzing Cyber Forensics training programmes in terms of a Competency-Based Framework. The study proves that Cyber Forensics training or education h...

  8. Rich preference-based argumentation frameworks

    OpenAIRE

    Amgoud , Leila; Vesic , Srdjan

    2014-01-01

    An argumentation framework is seen as a directed graph whose nodes are arguments and arcs are attacks between the arguments. Acceptable sets of arguments, called extensions, are computed using a semantics. Existing semantics are solely based on the attacks and do not take into account other important criteria like the intrinsic strengths of arguments. The contribution of this paper is threefold. First, we study how preferences issued from differences in strengths of a...

  9. A General Probabilistic Forecasting Framework for Offshore Wind Power Fluctuations

    Directory of Open Access Journals (Sweden)

    Henrik Madsen

    2012-03-01

    Full Text Available Accurate wind power forecasts highly contribute to the integration of wind power into power systems. The focus of the present study is on large-scale offshore wind farms and the complexity of generating accurate probabilistic forecasts of wind power fluctuations at time-scales of a few minutes. Such complexity is addressed from three perspectives: (i) the modeling of a nonlinear and non-stationary stochastic process; (ii) the practical implementation of the model we proposed; (iii) the gap between working on synthetic data and real world observations. At time-scales of a few minutes, offshore fluctuations are characterized by highly volatile dynamics which are difficult to capture and predict. Due to the lack of adequate on-site meteorological observations to relate these dynamics to meteorological phenomena, we propose a general model formulation based on a statistical approach and historical wind power measurements only. We introduce an advanced Markov Chain Monte Carlo (MCMC) estimation method to account for the different features observed in an empirical time series of wind power: autocorrelation, heteroscedasticity and regime-switching. The model we propose is an extension of Markov-Switching Autoregressive (MSAR) models with Generalized AutoRegressive Conditional Heteroscedastic (GARCH) errors in each regime to cope with the heteroscedasticity. Then, we analyze the predictive power of our model on a one-step ahead exercise of time series sampled over 10 min intervals. Its performances are compared to state-of-the-art models and highlight the interest of including a GARCH specification for density forecasts.
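    To make the model class concrete, here is a hedged toy simulation of a two-regime Markov-switching AR(1) process with GARCH(1,1) errors in each regime; all coefficient values are invented placeholders, not the fitted values reported in the paper:

```python
# Toy simulation of a 2-regime Markov-Switching AR(1) with GARCH(1,1) errors.
import numpy as np

rng = np.random.default_rng(0)
T = 1000
P = np.array([[0.98, 0.02],      # regime transition matrix
              [0.05, 0.95]])
phi   = [0.95, 0.70]             # AR(1) coefficient per regime
omega = [1e-4, 5e-4]             # GARCH constants per regime
alpha = [0.10, 0.20]             # ARCH coefficients
beta  = [0.85, 0.70]             # GARCH coefficients

y = np.zeros(T)
states = np.zeros(T, dtype=int)
regime, sigma2, eps = 0, omega[0], 0.0
for t in range(1, T):
    regime = rng.choice(2, p=P[regime])                      # regime switching
    sigma2 = omega[regime] + alpha[regime] * eps**2 + beta[regime] * sigma2
    eps = np.sqrt(sigma2) * rng.standard_normal()            # heteroscedastic error
    y[t] = phi[regime] * y[t - 1] + eps                      # autocorrelation
    states[t] = regime

print("sample std per regime:", y[states == 0].std(), y[states == 1].std())
```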

  10. Porter's contribution to more general and dynamic strategy frameworks

    NARCIS (Netherlands)

    F.A.J. van den Bosch (Frans)

    1997-01-01

    Introduction. Understanding why firms are successful is a very basic question in strategy both from a practitioner and a research perspective. In the strategy and management literature, however, we are confronted with different analytical frameworks, applicable at different levels

  11. National Service Frameworks and UK general practitioners: street-level bureaucrats at work?

    Science.gov (United States)

    Checkland, Kath

    2004-11-01

    This paper argues that the past decade has seen significant changes in the nature of medical work in general practice in the UK. Increasing pressure to use normative clinical guidelines and the move towards explicit quantitative measures of performance together have the potential to alter the way in which health care is delivered to patients. Whilst it is possible to view these developments from the well-established sociological perspectives of deprofessionalisation and proletarianisation, this paper takes a view of general practice as work, and uses the ideas of Lipsky to analyse practice-level responses to some of these changes. In addition to evidence-based clinical guidelines, National Service Frameworks, introduced by the UK government in 1997, also specify detailed models of service provision that health care providers are expected to follow. As part of a larger study examining the impact of National Service Frameworks in general practice, the response of three practices to the first four NSFs were explored. The failure of NSFs to make a significant impact is compared to the practices' positive responses to purely clinical guidelines such as those developed by the British Hypertension Society. Lipsky's concept of public service workers as 'street-level bureaucrats' is discussed and used as a framework within which to view these findings.

  12. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A Version Management System (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities to be performed, such as construction and creation of versions, identification of differences between versions, conflict detection and merging. Traditional VMS systems are file-based and consider software systems as a set of text files. File-based VMS systems are not adequate for performing software configuration management activities, such as version control, on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise while using models as the central artifact. The goal of this work is to present a generic model-based VMS framework which can be used to overcome the problems of traditional file-based VMS systems and provide model versioning services. (author)

  13. General framework and basis of decommissioning of nuclear facilities

    International Nuclear Information System (INIS)

    Santiago, J. L.; Martin, N.; Correa, C.

    2013-01-01

    This article summarizes the legal framework defining the strategies, the main activities and the basic responsibilities and roles of the various agents involved in the decommissioning of nuclear facilities in Spain. It also describes briefly the most relevant projects and activities already completed and/or currently ongoing, which have positioned Spain within the small group of countries having integrated and proven experience and know-how in this particular field. (Author)

  14. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick-Start Guide

    Science.gov (United States)

    2016-03-01

    The GIFT Account allows users to log into GIFT Cloud, manage their personal storage in GIFT Cloud, download GIFT Local, and access resources. (ARL-CR-0796, US Army Research Laboratory, March 2016.)

  15. Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework

    Science.gov (United States)

    Chen, Huilin; Chen, Jinsong

    2016-01-01

    Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…

  16. An Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) from an Author’s Perspective

    Science.gov (United States)

    2014-12-01

    Report by Robert A Sottilare and Keith W Brawner, US Army Research Laboratory, December 2014.

  17. The strategic marketing planning – General Framework for Customer Segmentation

    Directory of Open Access Journals (Sweden)

    Alina Elena OPRESCU

    2014-03-01

    Full Text Available Any approach that involves the use of an organisation's strategic resources requires a responsible attitude, a behaviour that enables the organisation to properly integrate itself into the dynamics of the business environment. This article addresses, in a synthetic manner, the issues involved in integrating customer segmentation into strategic marketing planning. As an essential activity for any organisation wishing to optimise its response to the market, customer segmentation will fully benefit from the framework provided by strategic marketing planning. Being a sequential process, it not only allows time optimisation of the entire marketing activity but also improves the accuracy of the strategic planning and its stages.

  18. Assessment of information impacts in power system security against malicious attacks in a general framework

    International Nuclear Information System (INIS)

    Bompard, E.; Napoli, R.; Xue, F.

    2009-01-01

    In the analysis of power system security, a new concern related to possible malicious attacks has recently attracted much attention. Coordination among different transmission system operators (TSO) in an interconnected power system to counteract such attacks has become an important problem. This paper presents a general framework for describing the physical, cyber and decision-making aspects of the problem and their interrelations; within this framework, an analytic tool for the assessment of information impacts in handling on-line security after a malicious attack is proposed and discussed. The model is based on socially rational multi-agent systems, and the equilibrium of a fictitious play is considered to analyze the impacts of various levels of information available to the interconnected system operators on the outcomes of the decision-making process under attack. A 34-bus test system, with 3 systems interconnected by tie-lines, is presented to illustrate the model and compare the impacts of different information scenarios.

  19. Assessment of information impacts in power system security against malicious attacks in a general framework

    Energy Technology Data Exchange (ETDEWEB)

    Bompard, E. [Dipartimento di Ingegneria Elettrica, Politecnico di Torino, I-10129 Torino (Italy)], E-mail: ettore.bompard@polito.it; Napoli, R.; Xue, F. [Dipartimento di Ingegneria Elettrica, Politecnico di Torino, I-10129 Torino (Italy)

    2009-06-15

    In the analysis of power system security, a new concern related to possible malicious attacks has recently attracted much attention. Coordination among different transmission system operators (TSO) in an interconnected power system to counteract such attacks has become an important problem. This paper presents a general framework for describing the physical, cyber and decision-making aspects of the problem and their interrelations; within this framework, an analytic tool for the assessment of information impacts in handling on-line security after a malicious attack is proposed and discussed. The model is based on socially rational multi-agent systems, and the equilibrium of a fictitious play is considered to analyze the impacts of various levels of information available to the interconnected system operators on the outcomes of the decision-making process under attack. A 34-bus test system, with 3 systems interconnected by tie-lines, is presented to illustrate the model and compare the impacts of different information scenarios.

  20. Regional frameworks applied to hydrology: can landscape-based frameworks capture the hydrologic variability?

    Science.gov (United States)

    R. McManamay; D. Orth; C. Dolloff; E. Frimpong

    2011-01-01

    Regional frameworks have been used extensively in recent years to aid in broad-scale management. Widely used landscape-based regional frameworks, such as hydrologic landscape regions (HLRs) and physiographic provinces, may provide predictive tools of hydrologic variability. However, hydrologic-based regional frameworks, created using only streamflow data, are also...

  1. Implementation of a PETN failure model using ARIA's general chemistry framework

    Energy Technology Data Exchange (ETDEWEB)

    Hobbs, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA’s new phase change model. This memo briefly describes the model, implementation, and validation.

  2. Generalized framework for context-specific metabolic model extraction methods

    Directory of Open Access Journals (Sweden)

    Semidán Robaina Estévez

    2014-09-01

    Full Text Available Genome-scale metabolic models are increasingly applied to investigate the physiology not only of simple prokaryotes, but also eukaryotes, such as plants, characterized with compartmentalized cells of multiple types. While genome-scale models aim at including the entirety of known metabolic reactions, mounting evidence has indicated that only a subset of these reactions is active in a given context, including: developmental stage, cell type, or environment. As a result, several methods have been proposed to reconstruct context-specific models from existing genome-scale models by integrating various types of high-throughput data. Here we present a mathematical framework that puts all existing methods under one umbrella and provides the means to better understand their functioning, highlight similarities and differences, and to help users in selecting a most suitable method for an application.

  3. General framework for fluctuating dynamic density functional theory

    Science.gov (United States)

    Durán-Olivencia, Miguel A.; Yatsyshin, Peter; Goddard, Benjamin D.; Kalliadasis, Serafim

    2017-12-01

    We introduce a versatile bottom-up derivation of a formal theoretical framework to describe (passive) soft-matter systems out of equilibrium subject to fluctuations. We provide a unique connection between the constituent-particle dynamics of real systems and the time evolution equation of their measurable (coarse-grained) quantities, such as local density and velocity. The starting point is the full Hamiltonian description of a system of colloidal particles immersed in a fluid of identical bath particles. Then, we average out the bath via Zwanzig’s projection-operator techniques and obtain the stochastic Langevin equations governing the colloidal-particle dynamics. Introducing the appropriate definition of the local number and momentum density fields yields a generalisation of the Dean-Kawasaki (DK) model, which resembles the stochastic Navier-Stokes description of a fluid. Nevertheless, the DK equation still contains all the microscopic information and, for that reason, does not represent the dynamical law of observable quantities. We address this controversial feature of the DK description by carrying out a nonequilibrium ensemble average. Adopting a natural decomposition into local-equilibrium and nonequilibrium contributions, where the former is related to a generalised version of the canonical distribution, we finally obtain the fluctuating-hydrodynamic equation governing the time-evolution of the mesoscopic density and momentum fields. Along the way, we outline the connection between the ad hoc energy functional introduced in previous DK derivations and the free-energy functional from classical density-functional theory. The resultant equation has the structure of a dynamical density-functional theory (DDFT) with an additional fluctuating force coming from the random interactions with the bath. We show that our fluctuating DDFT formalism corresponds to a particular version of the fluctuating Navier-Stokes equations, originally derived by Landau and Lifshitz.
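    For orientation, the kind of equation such a framework leads to has the familiar stochastic-DDFT form; the expression below is the standard overdamped form from the literature, not necessarily the authors' exact notation:

```latex
% Generic form of a fluctuating (stochastic) DDFT / Dean-Kawasaki-type
% equation for the density field rho(r,t); F is a free-energy functional,
% Gamma a mobility and eta a Gaussian white-noise vector field. Quoted as a
% standard reference form only.
\frac{\partial \rho(\mathbf{r},t)}{\partial t}
  = \nabla \cdot \left[ \Gamma\,\rho(\mathbf{r},t)\,
      \nabla \frac{\delta F[\rho]}{\delta \rho(\mathbf{r},t)} \right]
  + \nabla \cdot \left[ \sqrt{2 k_{B} T\, \Gamma\, \rho(\mathbf{r},t)}\;
      \boldsymbol{\eta}(\mathbf{r},t) \right].
```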

  4. DEEP: a general computational framework for predicting enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2014-11-05

    Transcription regulation in multicellular eukaryotes is orchestrated by a number of DNA functional elements located at gene regulatory regions. Some regulatory regions (e.g. enhancers) are located far away from the gene they affect. Identification of distal regulatory elements is a challenge for the bioinformatics research. Although existing methodologies increased the number of computationally predicted enhancers, performance inconsistency of computational models across different cell-lines, class imbalance within the learning sets and ad hoc rules for selecting enhancer candidates for supervised learning, are some key questions that require further examination. In this study we developed DEEP, a novel ensemble prediction framework. DEEP integrates three components with diverse characteristics that streamline the analysis of enhancer's properties in a great variety of cellular conditions. In our method we train many individual classification models that we combine to classify DNA regions as enhancers or non-enhancers. DEEP uses features derived from histone modification marks or attributes coming from sequence characteristics. Experimental results indicate that DEEP performs better than four state-of-the-art methods on the ENCODE data. We report the first computational enhancer prediction results on FANTOM5 data where DEEP achieves 90.2% accuracy and 90% geometric mean (GM) of specificity and sensitivity across 36 different tissues. We further present results derived using in vivo-derived enhancer data from VISTA database. DEEP-VISTA, when tested on an independent test set, achieved GM of 80.1% and accuracy of 89.64%. DEEP framework is publicly available at http://cbrc.kaust.edu.sa/deep/.
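    The ensemble idea (many individual classifiers whose outputs are combined into an enhancer / non-enhancer call) can be sketched with generic tooling; the snippet below is a hedged illustration on synthetic features, not the DEEP pipeline or its data:

```python
# Hedged sketch of an ensemble enhancer/non-enhancer classifier on synthetic
# histone-mark-like features; not the DEEP implementation or its datasets.
import numpy as np
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n, d = 2000, 20                       # regions x (made-up) chromatin features
X = rng.normal(size=(n, d))
y = (X[:, :5].sum(axis=1) + 0.5 * rng.normal(size=n) > 0).astype(int)  # toy labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
    ],
    voting="soft",                    # combine predicted class probabilities
)
ensemble.fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, ensemble.predict(X_te)))
```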

  5. DEEP: a general computational framework for predicting enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Kalnis, Panos; Bajic, Vladimir B.

    2014-01-01

    Transcription regulation in multicellular eukaryotes is orchestrated by a number of DNA functional elements located at gene regulatory regions. Some regulatory regions (e.g. enhancers) are located far away from the gene they affect. Identification of distal regulatory elements is a challenge for the bioinformatics research. Although existing methodologies increased the number of computationally predicted enhancers, performance inconsistency of computational models across different cell-lines, class imbalance within the learning sets and ad hoc rules for selecting enhancer candidates for supervised learning, are some key questions that require further examination. In this study we developed DEEP, a novel ensemble prediction framework. DEEP integrates three components with diverse characteristics that streamline the analysis of enhancer's properties in a great variety of cellular conditions. In our method we train many individual classification models that we combine to classify DNA regions as enhancers or non-enhancers. DEEP uses features derived from histone modification marks or attributes coming from sequence characteristics. Experimental results indicate that DEEP performs better than four state-of-the-art methods on the ENCODE data. We report the first computational enhancer prediction results on FANTOM5 data where DEEP achieves 90.2% accuracy and 90% geometric mean (GM) of specificity and sensitivity across 36 different tissues. We further present results derived using in vivo-derived enhancer data from VISTA database. DEEP-VISTA, when tested on an independent test set, achieved GM of 80.1% and accuracy of 89.64%. DEEP framework is publicly available at http://cbrc.kaust.edu.sa/deep/.

  6. LOCAL TEXTURE DESCRIPTION FRAMEWORK FOR TEXTURE BASED FACE RECOGNITION

    Directory of Open Access Journals (Sweden)

    R. Reena Rose

    2014-02-01

    Full Text Available Texture descriptors have an important role in recognizing face images. However, almost all the existing local texture descriptors use nearest neighbors to encode a texture pattern around a pixel. But in face images, most of the pixels have characteristics similar to those of their nearest neighbors, because skin covers a large area of a face and the skin tone in neighboring regions is the same. Therefore this paper presents a general framework called the Local Texture Description Framework that uses only eight pixels located at a certain distance from the reference pixel, arranged either circularly or elliptically. Local texture description can be done using the foundation of any existing local texture descriptor. In this paper, the performance of the proposed framework is verified with three existing local texture descriptors, Local Binary Pattern (LBP), Local Texture Pattern (LTP) and Local Tetra Patterns (LTrPs), for the five issues viz. facial expression, partial occlusion, illumination variation, pose variation and general recognition. Five benchmark databases, JAFFE, Essex, Indian faces, AT&T and Georgia Tech, are used for the experiments. Experimental results demonstrate that even with a smaller number of patterns, the proposed framework could achieve higher recognition accuracy than that of the base models.
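    A minimal sketch of the underlying idea, sampling eight neighbors at a chosen radius around each pixel and encoding them LBP-style into an 8-bit code (illustrative only; not the authors' implementation, and neighbor coordinates are simply rounded rather than interpolated):

```python
# LBP-style encoding with 8 neighbors sampled at radius r around each pixel.
import numpy as np

def local_code(img, r=3):
    """Return an 8-bit code per interior pixel: bit i is 1 if the i-th
    sampled neighbor at radius r is >= the center pixel."""
    h, w = img.shape
    angles = 2 * np.pi * np.arange(8) / 8
    offsets = [(int(round(r * np.sin(a))), int(round(r * np.cos(a)))) for a in angles]
    codes = np.zeros((h - 2 * r, w - 2 * r), dtype=np.uint8)
    center = img[r:h - r, r:w - r]
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[r + dy:h - r + dy, r + dx:w - r + dx]
        codes |= ((neighbor >= center).astype(np.uint8) << bit)
    return codes

img = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
hist = np.bincount(local_code(img, r=3).ravel(), minlength=256)  # texture histogram
print(hist[:8])
```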

  7. The quality and outcomes framework: QOF - transforming general practice

    National Research Council Canada - National Science Library

    Gillam, Stephen; Siriwardena, Aloysius Niroshan

    2011-01-01

    ... comprehensive scheme of its kind in the world. Champions claim the QOF advances the quality of primary care; detractors fear the end of general practice as we know it. The introduction of the QOF provides a unique opportunity for research, analysis and reflection. This book is the first comprehensive analysis of the impact of the QOF, examining the claims and counter-claims ...

  8. Request for All - Generalized Request Framework for PhEDEx

    CERN Document Server

    Huang, C-H; Ratnikova, N.; Sanchez-Hernandez, A.; Zhang, X.; Magini, N.

    2014-01-01

    PhEDEx has been serving the CMS community since 2004 as the data broker. Every PhEDEx operation is initiated by a request, such as a request to move data, a request to delete data, and so on. A request has its own life cycle, including creation, approval, notification, and bookkeeping, and the details depend on its type. Currently, only two kinds of requests, transfer and deletion, are fully integrated in PhEDEx. They are tailored specifically to the operations workflows. Serving a new type of request generally means a fair amount of development work. After several years of operation, we have gathered enough experience to rethink the request handling in PhEDEx. The Generalized Request project is set to abstract such experience and come up with a request system which is not tied to the current workflows yet is general enough to accommodate current and future requests. The challenges are dealing with the different stages in a request's life cycle, the complexity of the approval process and the complexity of the ability and auth...

  9. On MDA - SOA based Intercloud Interoperability framework

    Directory of Open Access Journals (Sweden)

    Tahereh Nodehi

    2013-01-01

    Full Text Available Cloud computing has been one of the latest technologies which assures reliable delivery of on-demand computing services over the Internet. Cloud service providers have established geographically distributed data centers and computing resources, which are available online as a service. The clouds operated by different service providers working together in collaboration can open up lots more spaces for innovative scenarios with a huge amount of resources provisioning on demand. However, current cloud systems do not support intercloud interoperability. This paper is thus motivated to address Intercloud Interoperability by analyzing different methodologies that have been applied to resolve various scenarios of interoperability. The Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) methods have been used to address interoperability in various scenarios, which also opens up spaces to address intercloud interoperability by making use of these well-accepted methodologies. The focus of this document is to show that Intercloud Interoperability can be supported through a Model Driven approach and Service Oriented systems. Moreover, the current state of the art in Intercloud, and the concept and benefits of MDA and SOA, are discussed in the paper. At the same time this paper also proposes a generic architecture for an MDA-SOA based framework, which can be useful for developing applications which will require intercloud interoperability. The paper justifies the usability of the framework by a use-case scenario for dynamic workload migration among heterogeneous clouds.

  10. Beyond heat baths II: framework for generalized thermodynamic resource theories

    Science.gov (United States)

    Yunger Halpern, Nicole

    2018-03-01

    Thermodynamics, which describes vast systems, has been reconciled with small scales, relevant to single-molecule experiments, in resource theories. Resource theories have been used to model exchanges of energy and information. Recently, particle exchanges were modeled; and an umbrella family of thermodynamic resource theories was proposed to model diverse baths, interactions, and free energies. This paper motivates and details the family’s structure and prospective applications. How to model electrochemical, gravitational, magnetic, and other thermodynamic systems is explained. Szilárd’s engine and Landauer’s Principle are generalized, as resourcefulness is shown to be convertible not only between information and gravitational energy, but also among diverse degrees of freedom. Extensive variables are associated with quantum operators that might fail to commute, introducing extra nonclassicality into thermodynamic resource theories. An early version of this paper partially motivated the later development of noncommutative thermalization. This generalization expands the theories’ potential for modeling realistic systems with which small-scale statistical mechanics might be tested experimentally.

  11. Testing general relativity with compact-body orbits: a modified Einstein–Infeld–Hoffmann framework

    Science.gov (United States)

    Will, Clifford M.

    2018-04-01

    We describe a general framework for analyzing orbits of systems containing compact objects (neutron stars or black holes) in a class of Lagrangian-based alternative theories of gravity that also admit a global preferred reference frame. The framework is based on a modified Einstein–Infeld–Hoffmann (EIH) formalism developed by Eardley and by Will, generalized to include the possibility of Lorentz-violating, preferred-frame effects. It uses a post-Newtonian N-body Lagrangian with arbitrary parameters that depend on the theory of gravity and on ‘sensitivities’ that encode the effects of the bodies’ internal structure on their motion. We determine the modified EIH parameters for the Einstein-Æther and Khronometric vector-tensor theories of gravity. We find the effects of motion relative to a preferred universal frame on the orbital parameters of binary systems containing neutron stars, such as a class of ultra-circular pulsar-white dwarf binaries; the amplitudes of the effects depend upon ‘strong-field’ preferred-frame parameters α̂₁ and α̂₂, which we relate to the fundamental modified EIH parameters. We also determine the amplitude of the ‘Nordtvedt effect’ in a triple system containing the pulsar J0337+1715 in terms of the modified EIH parameters.

  12. Health state evaluation of an item: A general framework and graphical representation

    International Nuclear Information System (INIS)

    Jiang, R.; Jardine, A.K.S.

    2008-01-01

    This paper presents a general theoretical framework to evaluate the health state of an item based on condition monitoring information. The item's health state is defined in terms of its relative health level and overall health level. The former is evaluated based on the relative magnitude of the composite covariate and the latter is evaluated using a fractile life of the residual life distribution at the decision instant. In addition, a method is developed to graphically represent the degradation model, the failure threshold model, and the observation history of the composite covariate. As a result, the health state of the monitored item can be intuitively presented and the evaluated result can be subsequently used in a condition-based maintenance optimization decision model, which is amenable to computer modeling. A numerical example is included to illustrate the proposed approach and its appropriateness.
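
    The "fractile life of the residual life distribution" used for the overall health level can be made concrete with a small numerical sketch. The Weibull life model and all parameter values below are illustrative assumptions, not the paper's case data.

      import math

      # Hypothetical Weibull life distribution: survival S(t) = exp(-(t/eta)**beta).
      beta, eta = 2.5, 1000.0     # shape and scale (hours), illustrative values only

      def residual_life_fractile(t_now, q):
          """q-fractile of the remaining life given survival up to t_now:
          solve P(T <= t_now + x | T > t_now) = q for x."""
          s_now = math.exp(-(t_now / eta) ** beta)          # current survival probability
          target = (1.0 - q) * s_now                        # survival level at the fractile
          t_q = eta * (-math.log(target)) ** (1.0 / beta)   # invert the Weibull survival
          return t_q - t_now

      # Overall health level read off as, e.g., the 10%-fractile of residual life
      # at the decision instant.
      print(residual_life_fractile(t_now=600.0, q=0.10))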

  13. A framework for grouping nanoparticles based on their measurable characteristics.

    Science.gov (United States)

    Sayes, Christie M; Smith, P Alex; Ivanov, Ivan V

    2013-01-01

    There is a need to take a broader look at nanotoxicological studies. Eventually, the field will demand that some generalizations be made. To begin to address this issue, we posed a question: are metal colloids on the nanometer-size scale a homogeneous group? In general, most people can agree that the physicochemical properties of nanomaterials can be linked and related to their induced toxicological responses. The focus of this study was to determine how a set of selected physicochemical properties of five specific metal-based colloidal materials on the nanometer-size scale - silver, copper, nickel, iron, and zinc - could be used as nanodescriptors that facilitate the grouping of these metal-based colloids. The example of the framework pipeline processing provided in this paper shows the utility of specific statistical and pattern recognition techniques in grouping nanoparticles based on experimental data about their physicochemical properties. Interestingly, the results of the analyses suggest that a seemingly homogeneous group of nanoparticles could be separated into sub-groups depending on interdependencies observed in their nanodescriptors. These particles represent an important category of nanomaterials that are currently mass produced. Each has been reputed to induce toxicological and/or cytotoxicological effects. Here, we propose an experimental methodology coupled with mathematical and statistical modeling that can serve as a prototype for a rigorous framework that aids in the ability to group nanomaterials together and to facilitate the subsequent analysis of trends in data based on quantitative modeling of nanoparticle-specific structure-activity relationships. The computational part of the proposed framework is rather general and can be applied to other groups of nanomaterials as well.

  14. A General Framework for Analyzing, Characterizing, and Implementing Spectrally Modulated, Spectrally Encoded Signals

    National Research Council Canada - National Science Library

    Roberts, Marcus L

    2006-01-01

    .... Research is rapidly progressing in SDR hardware and software venues, but current CR-based SDR research lacks the theoretical foundation and analytic framework to permit efficient implementation...

  15. A general framework for predicting delayed responses of ecological communities to habitat loss.

    Science.gov (United States)

    Chen, Youhua; Shen, Tsung-Jen

    2017-04-20

    Although the biodiversity crisis at different spatial scales has been well recognised, the phenomena of extinction debt and immigration credit in a cross-scale context are, at best, unclear. Based on two community patterns, the regional species abundance distribution (SAD) and the spatial abundance distribution (SAAD), Kitzes and Harte (2015) presented a macroecological framework for predicting post-disturbance delayed extinction patterns in the entire ecological community. In this study, we further expand this basic framework to predict diverse time-lagged effects of habitat destruction on local communities. Specifically, our generalisation of KH's model can address questions that could not be answered previously: (1) How many species are subject to delayed extinction in a local community when habitat is destroyed in other areas? (2) How do rare or endemic species contribute to the extinction debt or immigration credit of the local community? (3) How will species differ between two local areas? From the demonstrations using two SAD models (single-parameter lognormal and logseries), the predicted patterns of the debt, credit, and change in the fraction of unique species can vary, but show consistent trends that depend on several factors. The general framework deepens the understanding of the theoretical effects of habitat loss on community dynamic patterns in local samples.
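
    A minimal numerical sketch of the kind of delayed-extinction bookkeeping such frameworks perform is given below. It uses the simple random-placement survival probability 1 - (1 - a)^n for a species of regional abundance n when a fraction a of habitat remains; the logseries parameters are invented, and this is a generic illustration rather than the Chen and Shen generalisation itself.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical regional community: logseries species abundance distribution.
      S, p = 2000, 0.999                      # number of species, logseries parameter
      abundances = rng.logseries(p, size=S)   # one regional abundance per species

      def expected_survivors(abundances, area_fraction):
          """Random-placement expectation: a species of abundance n keeps at least one
          individual in the remaining fraction a of habitat with prob. 1 - (1 - a)**n."""
          a = area_fraction
          return np.sum(1.0 - (1.0 - a) ** abundances)

      for a in (0.5, 0.2, 0.05):
          debt = S - expected_survivors(abundances, a)
          print(f"area fraction {a:>4}: expected extinction debt = {debt:.1f} species")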

  16. A generalized coherence framework for detecting and characterizing nonlinear interactions in the nervous system

    NARCIS (Netherlands)

    Yang, Y.; Solis Escalante, T.; van der Helm, F.C.T.; Schouten, A.C.

    2016-01-01

    Objective: This paper introduces a generalized coherence framework for detecting and characterizing nonlinear interactions in the nervous system, namely cross-spectral coherence (CSC). CSC can detect different types of nonlinear interactions including harmonic and intermodulation coupling as present

  17. A general maximum entropy framework for thermodynamic variational principles

    International Nuclear Information System (INIS)

    Dewar, Roderick C.

    2014-01-01

    Minimum free energy principles are familiar in equilibrium thermodynamics, as expressions of the second law. They also appear in statistical mechanics as variational approximation schemes, such as the mean-field and steepest-descent approximations. These well-known minimum free energy principles are here unified and extended to any system analyzable by MaxEnt, including non-equilibrium systems. The MaxEnt Lagrangian associated with a generic MaxEnt distribution p defines a generalized potential Ψ for an arbitrary probability distribution p-hat, such that Ψ is a minimum at (p-hat) = p. Minimization of Ψ with respect to p-hat thus constitutes a generic variational principle, and is equivalent to minimizing the Kullback-Leibler divergence between p-hat and p. Illustrative examples of min–Ψ are given for equilibrium and non-equilibrium systems. An interpretation of changes in Ψ is given in terms of the second law, although min–Ψ itself is an intrinsic variational property of MaxEnt that is distinct from the second law
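
    In the notation of the abstract, the variational statement can be sketched in LaTeX as follows; the constraint functions f_k, multipliers λ_k and partition function Z are generic symbols assumed for the illustration, not quoted from the paper.

      % generalized potential for a trial distribution \hat p, given the MaxEnt solution p
      \Psi[\hat p] \;=\; \sum_i \hat p_i \ln \hat p_i \;+\; \sum_k \lambda_k \sum_i \hat p_i\, f_k(i)
                   \;=\; -\ln Z \;+\; D_{\mathrm{KL}}\!\left(\hat p \,\middle\|\, p\right),
      \qquad p_i \;=\; \frac{e^{-\sum_k \lambda_k f_k(i)}}{Z},
      % so that \Psi[\hat p] \ge -\ln Z, with equality exactly at \hat p = p.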

  18. A general maximum entropy framework for thermodynamic variational principles

    Energy Technology Data Exchange (ETDEWEB)

    Dewar, Roderick C., E-mail: roderick.dewar@anu.edu.au [Research School of Biology, The Australian National University, Canberra ACT 0200 (Australia)

    2014-12-05

    Minimum free energy principles are familiar in equilibrium thermodynamics, as expressions of the second law. They also appear in statistical mechanics as variational approximation schemes, such as the mean-field and steepest-descent approximations. These well-known minimum free energy principles are here unified and extended to any system analyzable by MaxEnt, including non-equilibrium systems. The MaxEnt Lagrangian associated with a generic MaxEnt distribution p defines a generalized potential Ψ for an arbitrary probability distribution p-hat, such that Ψ is a minimum at (p-hat) = p. Minimization of Ψ with respect to p-hat thus constitutes a generic variational principle, and is equivalent to minimizing the Kullback-Leibler divergence between p-hat and p. Illustrative examples of min–Ψ are given for equilibrium and non-equilibrium systems. An interpretation of changes in Ψ is given in terms of the second law, although min–Ψ itself is an intrinsic variational property of MaxEnt that is distinct from the second law.

  19. General Framework for Evaluating Password Complexity and Strength

    OpenAIRE

    Sahin, Cem S.; Lychev, Robert; Wagner, Neal

    2015-01-01

    Although it is common for users to select bad passwords that can be easily cracked by attackers, password-based authentication remains the most widely-used method. To encourage users to select good passwords, enterprises often enforce policies. Such policies have been proven to be ineffectual in practice. Accurate assessment of a password's resistance to cracking attacks is still an unsolved problem, and our work addresses this challenge. Although the best way to determine how difficult it ma...

  20. A flexible framework for process-based hydraulic and water ...

    Science.gov (United States)

    Background: Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. While popular, GI models are generally relatively simplistic. However, GI model predictions are being relied upon by many municipalities and State/Local agencies to make decisions about grey vs. green infrastructure improvement planning. Adding complexity to GI modeling frameworks may preclude their use in simpler urban planning situations. Therefore, the goal here was to develop a sophisticated, yet flexible tool that could be used by design engineers and researchers to capture and explore the effect of design factors and properties of the media used on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool that is capable of simulating GI system components and specific biophysical processes affecting contaminants, such as reactions and particle-associated transport, accurately while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. Framework Features: The process-based model framework developed here can be used to model a diverse range of GI practices such as green roofs, retention ponds, bioretention, infiltration trenches, permeable pavement and

  1. An optimization-based framework for anisotropic simplex mesh adaptation

    Science.gov (United States)

    Yano, Masayuki; Darmofal, David L.

    2012-09-01

    We present a general framework for anisotropic h-adaptation of simplex meshes. Given a discretization and any element-wise, localizable error estimate, our adaptive method iterates toward a mesh that minimizes the error for a given number of degrees of freedom. Utilizing mesh-metric duality, we consider a continuous optimization problem of the Riemannian metric tensor field that provides an anisotropic description of element sizes. First, our method performs a series of local solves to survey the behavior of the local error function. This information is then synthesized using an affine-invariant tensor manipulation framework to reconstruct an approximate gradient of the error function with respect to the metric tensor field. Finally, we perform gradient descent in the metric space to drive the mesh toward optimality. The method is first demonstrated to produce optimal anisotropic meshes minimizing the L2 projection error for a pair of canonical problems containing a singularity and a singular perturbation. The effectiveness of the framework is then demonstrated in the context of output-based adaptation for the advection-diffusion equation using a high-order discontinuous Galerkin discretization and the dual-weighted residual (DWR) error estimate. The method presented provides a unified framework for optimizing both the element size and anisotropy distribution using an a posteriori error estimate and enables efficient adaptation of anisotropic simplex meshes for high-order discretizations.
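
    A toy one-dimensional analogue of the metric-optimization idea is sketched below: projected gradient descent on a log mesh-size field under a fixed degrees-of-freedom budget. The error model, grid, and step-size schedule are assumptions made for illustration; the paper's affine-invariant tensor machinery and DWR estimates are not reproduced.

      import numpy as np

      x = np.linspace(0.0, 1.0, 41)
      dx = x[1] - x[0]
      curv = 1.0 + 100.0 * np.exp(-200.0 * (x - 0.5) ** 2)   # |u''|, peaked at x = 0.5
      N = 200.0                                               # degrees-of-freedom budget

      def rescale(log_h):
          """Re-impose the budget sum(dx/h) = N exactly by a uniform rescaling of h."""
          h = np.exp(log_h)
          return np.log(h * (np.sum(dx / h) / N))

      # Error model: per-slab contribution |u''| * h * dx; minimizing it under the DOF
      # constraint equidistributes error, giving h proportional to |u''|**(-1/2).
      log_h = rescale(np.full_like(x, np.log(5 * dx)))
      for it in range(2000):
          h = np.exp(log_h)
          g_err = curv * h * dx                               # dE/d(log h_i)
          g_con = -dx / h                                     # d(constraint)/d(log h_i)
          g_tan = g_err - g_con * (g_err @ g_con) / (g_con @ g_con)  # tangential part
          step = 0.2 / (1.0 + it / 200.0)
          log_h = rescale(log_h - step * g_tan / np.abs(g_tan).max())

      print("optimized error:", np.sum(curv * np.exp(log_h) * dx))
      print("smallest h sits at x =", x[np.argmin(log_h)])   # expected near 0.5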

  2. Exact marginality in open string field theory. A general framework

    International Nuclear Information System (INIS)

    Kiermaier, M.

    2007-07-01

    We construct analytic solutions of open bosonic string field theory for any exactly marginal deformation in any boundary conformal field theory when properly renormalized operator products of the marginal operator are given. We explicitly provide such renormalized operator products for a class of marginal deformations which include the deformations of flat D-branes in flat backgrounds by constant massless modes of the gauge field and of the scalar fields on the D-branes, the cosine potential for a space-like coordinate, and the hyperbolic cosine potential for the time-like coordinate. In our construction we use integrated vertex operators, which are closely related to finite deformations in boundary conformal field theory, while previous analytic solutions were based on unintegrated vertex operators. We also introduce a modified star product to formulate string field theory around the deformed background. (orig.)

  3. A Generalized Framework for Modeling Next Generation 911 Implementations.

    Energy Technology Data Exchange (ETDEWEB)

    Kelic, Andjelka; Aamir, Munaf Syed; Kelic, Andjelka; Jrad, Ahmad M.; Mitchell, Roger

    2018-02-01

    This document summarizes the current state of Sandia 911 modeling capabilities and then addresses key aspects of Next Generation 911 (NG911) architectures for expansion of existing models. Analysis of three NG911 implementations was used to inform heuristics, associated key data requirements, and assumptions needed to capture NG911 architectures in the existing models. Modeling of NG911 necessitates careful consideration of its complexity and the diversity of implementations. Draft heuristics for constructing NG911 models are presented based on the analysis along with a summary of current challenges and ways to improve future NG911 modeling efforts. We found that NG911 relies on Enhanced 911 (E911) assets such as 911 selective routers to route calls originating from traditional telephony service, which are a majority of 911 calls. We also found that the diversity and transitional nature of NG911 implementations necessitates significant and frequent data collection to ensure that adequate models are available for crisis action support.

  4. towards a theory-based multi-dimensional framework for assessment in mathematics: The "SEA" framework

    Science.gov (United States)

    Anku, Sitsofe E.

    1997-09-01

    Using the reform documents of the National Council of Teachers of Mathematics (NCTM) (NCTM, 1989, 1991, 1995), a theory-based multi-dimensional assessment framework (the "SEA" framework) which should help expand the scope of assessment in mathematics is proposed. This framework uses a context based on mathematical reasoning and has components that comprise mathematical concepts, mathematical procedures, mathematical communication, mathematical problem solving, and mathematical disposition.

  5. Quantum generalized observables framework for psychological data: a case of preference reversals in US elections

    Science.gov (United States)

    Khrennikova, Polina; Haven, Emmanuel

    2017-10-01

    Politics is regarded as a vital area of public choice theory, and it relies strongly on the assumptions of voters' rationality and, as such, stability of preferences. However, recent opinion polls and real election outcomes in the USA have shown that voters often engage in `ticket splitting', by exhibiting contrasting party support in Congressional and Presidential elections (cf. Khrennikova 2014 Phys. Scripta T163, 014010 (doi:10.1088/0031-8949/2014/T163/014010); Khrennikova & Haven 2016 Phil. Trans. R. Soc. A 374, 20150106 (doi:10.1098/rsta.2015.0106); Smith et al. 1999 Am. J. Polit. Sci. 43, 737-764 (doi:10.2307/2991833)). Such types of preference reversals cannot be mathematically captured via the formula of total probability, thus showing that voters' decision making is at variance with the classical probabilistic information processing framework. In recent work, we have shown that quantum probability describes well the violation of Bayesian rationality in statistical data of voting in US elections, through the so-called interference effects of probability amplitudes. This paper proposes a novel generalized observables framework of voting behaviour, by using the statistical data collected and analysed in previous studies by Khrennikova (Khrennikova 2015 Lect. Notes Comput. Sci. 8951, 196-209) and Khrennikova & Haven (Khrennikova & Haven 2016 Phil. Trans. R. Soc. A 374, 20150106 (doi:10.1098/rsta.2015.0106)). This framework aims to overcome the main problems associated with the quantum probabilistic representation of psychological data, namely the non-double stochasticity of transition probability matrices. We develop a simplified construction of generalized positive operator valued measures by formulating special non-orthonormal bases with respect to these operators. This article is part of the themed issue `Second quantum revolution: foundational questions'.
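
    The idea of building a positive-operator-valued measure (POVM) from a non-orthonormal basis can be illustrated with a few lines of linear algebra. The three "answer" states and the density matrix below are invented for the example, and the symmetrized construction with M^{-1/2} is a standard textbook one, not claimed to be the authors' specific operators.

      import numpy as np

      v = [np.array([1.0, 0.0]),
           np.array([0.5,  np.sqrt(3) / 2]),
           np.array([0.5, -np.sqrt(3) / 2])]        # three non-orthogonal outcome states

      M = sum(np.outer(a, a) for a in v)
      w, U = np.linalg.eigh(M)
      M_inv_sqrt = U @ np.diag(w ** -0.5) @ U.T     # M^{-1/2}

      E = [M_inv_sqrt @ np.outer(a, a) @ M_inv_sqrt for a in v]   # POVM elements
      assert np.allclose(sum(E), np.eye(2))         # completeness: sum_k E_k = identity

      rho = np.array([[0.7, 0.2],                   # hypothetical "belief" state
                      [0.2, 0.3]])
      probs = [float(np.trace(rho @ Ek)) for Ek in E]
      print(probs, sum(probs))                      # non-negative, summing to 1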

  6. SIX SIGMA FRAMEWORKS: AN ANALYSIS BASED ON ROGERS’ DIFFUSION OF INNOVATION THEORY

    Directory of Open Access Journals (Sweden)

    Kifayah Amar

    2012-06-01

    Full Text Available This paper attempts to analyze frameworks related to Six Sigma and Lean Six Sigma. The basis for analyzing the frameworks is the diffusion of innovation theory. Several criteria were used to analyze the frameworks, e.g. relative advantage, compatibility, complexity, trialability, observability, communication channels, nature of the social system/culture and extent of change agent. Based on the framework analysis, there is only one framework that fits Rogers’ theory on diffusion of innovation. The framework is a Lean Six Sigma framework which consists of elements such as owner/manager commitment and involvement, employee involvement, training, culture change and external support. Even though the elements are similar to those of other Six Sigma frameworks, they put more attention on culture change and external support. Generally speaking, culture change and external support are the most important elements for the implementation of Six Sigma or other soft approaches, particularly for small organizations.

  7. SIX SIGMA FRAMEWORKS: AN ANALYSIS BASED ON ROGERS’ DIFFUSION OF INNOVATION THEORY

    Directory of Open Access Journals (Sweden)

    Kifayah Amar

    2012-06-01

    Full Text Available This paper attempts to analyze frameworks related to Six Sigma and Lean Six Sigma. The basis for analyzing the frameworks is the diffusion of innovation theory. Several criteria were used to analyze the frameworks, e.g. relative advantage, compatibility, complexity, trialability, observability, communication channels, nature of the social system/culture and extent of change agent. Based on the framework analysis, there is only one framework that fits Rogers’ theory on diffusion of innovation. The framework is a Lean Six Sigma framework which consists of elements such as owner/manager commitment and involvement, employee involvement, training, culture change and external support. Even though the elements are similar to those of other Six Sigma frameworks, they put more attention on culture change and external support. Generally speaking, culture change and external support are the most important elements for the implementation of Six Sigma or other soft approaches, particularly for small organizations.

  8. Generalized eigenvalue based spectrum sensing

    KAUST Repository

    Shakir, Muhammad; Alouini, Mohamed-Slim

    2012-01-01

    of the decision threshold of the respective detectors. The decision threshold has been calculated in a closed form which is based on the approximation of Cumulative Distribution Functions (CDFs) of the respective test statistics. In this context, we exchange

  9. The Nature Index: A General Framework for Synthesizing Knowledge on the State of Biodiversity

    Science.gov (United States)

    Certain, Grégoire; Skarpaas, Olav; Bjerke, Jarle-Werner; Framstad, Erik; Lindholm, Markus; Nilsen, Jan-Erik; Norderhaug, Ann; Oug, Eivind; Pedersen, Hans-Christian; Schartau, Ann-Kristin; van der Meeren, Gro I.; Aslaksen, Iulie; Engen, Steinar; Garnåsjordet, Per-Arild; Kvaløy, Pål; Lillegård, Magnar; Yoccoz, Nigel G.; Nybø, Signe

    2011-01-01

    The magnitude and urgency of the biodiversity crisis is widely recognized within scientific and political organizations. However, a lack of integrated measures for biodiversity has greatly constrained the national and international response to the biodiversity crisis. Thus, integrated biodiversity indexes will greatly facilitate information transfer from science toward other areas of human society. The Nature Index framework samples scientific information on biodiversity from a variety of sources, synthesizes this information, and then transmits it in a simplified form to environmental managers, policymakers, and the public. The Nature Index optimizes information use by incorporating expert judgment, monitoring-based estimates, and model-based estimates. The index relies on a network of scientific experts, each of whom is responsible for one or more biodiversity indicators. The resulting set of indicators is supposed to represent the best available knowledge on the state of biodiversity and ecosystems in any given area. The value of each indicator is scaled relative to a reference state, i.e., a predicted value assessed by each expert for a hypothetical undisturbed or sustainably managed ecosystem. Scaled indicator values can be aggregated or disaggregated over different axes representing spatiotemporal dimensions or thematic groups. A range of scaling models can be applied to allow for different ways of interpreting the reference states, e.g., optimal situations or minimum sustainable levels. Statistical testing for differences in space or time can be implemented using Monte-Carlo simulations. This study presents the Nature Index framework and details its implementation in Norway. The results suggest that the framework is a functional, efficient, and pragmatic approach for gathering and synthesizing scientific knowledge on the state of biodiversity in any marine or terrestrial ecosystem and has general applicability worldwide. PMID:21526118
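
    The scaling-and-aggregation step can be sketched in a few lines of Python; the indicator names, observed values, reference states, and weights below are invented for illustration and are not taken from the Norwegian implementation.

      # Each indicator: (observed value, reference-state value, area/thematic weight).
      indicators = {
          "breeding seabirds":   (5400.0, 12000.0, 0.3),
          "kelp cover":          (0.62,   0.80,    0.5),
          "river invertebrates": (118.0,  140.0,   0.2),
      }

      def scaled(value, reference):
          """Scale an indicator relative to its reference state, capped at 1."""
          return min(value / reference, 1.0)

      total_weight = sum(w for _, _, w in indicators.values())
      nature_index = sum(w * scaled(v, ref) for v, ref, w in indicators.values()) / total_weight
      print(round(nature_index, 3))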

  10. Developing a framework of, and quality indicators for, general practice management in Europe.

    NARCIS (Netherlands)

    Engels, Y.M.P.; Campbell, S.M.; Dautzenberg, M.G.H.; Hombergh, P. van den; Brinkmann, H.; Szecsenyi, J.; Falcoff, H.; Seuntjens, L.; Kuenzi, B.; Grol, R.P.T.M.

    2005-01-01

    OBJECTIVES: To develop a framework for general practice management made up of quality indicators shared by six European countries. METHODS: Two-round postal Delphi questionnaire in the setting of general practice in Belgium, France, Germany, The Netherlands, Switzerland and the United Kingdom. Six

  11. Transaction-Based Building Controls Framework, Volume 1: Reference Guide

    Energy Technology Data Exchange (ETDEWEB)

    Somasundaram, Sriram [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pratt, Robert G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Akyol, Bora A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fernandez, Nicholas [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Foster, Nikolas AF [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Katipamula, Srinivas [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mayhorn, Ebony T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Somani, Abhishek [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Steckley, Andrew C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Taylor, Zachary T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    This document proposes a framework concept to achieve the objectives of raising buildings’ efficiency and energy savings potential, benefitting building owners and operators. We call it a transaction-based framework, wherein mutually-beneficial and cost-effective market-based transactions can be enabled between multiple players across different domains. Transaction-based building controls are one part of the transactional energy framework. While these controls realize benefits by enabling automatic, market-based intra-building efficiency optimizations, the transactional energy framework provides similar benefits using the same market-based structure, yet on a larger scale and beyond just buildings, to society at large.

  12. Axiomatic Quantum Field Theory in Terms of Operator Product Expansions: General Framework, and Perturbation Theory via Hochschild Cohomology

    Directory of Open Access Journals (Sweden)

    Stefan Hollands

    2009-09-01

    Full Text Available In this paper, we propose a new framework for quantum field theory in terms of consistency conditions. The consistency conditions that we consider are ''associativity'' or ''factorization'' conditions on the operator product expansion (OPE of the theory, and are proposed to be the defining property of any quantum field theory. Our framework is presented in the Euclidean setting, and is applicable in principle to any quantum field theory, including non-conformal ones. In our framework, we obtain a characterization of perturbations of a given quantum field theory in terms of a certain cohomology ring of Hochschild-type. We illustrate our framework by the free field, but our constructions are general and apply also to interacting quantum field theories. For such theories, we propose a new scheme to construct the OPE which is based on the use of non-linear quantized field equations.

  13. The evidence base for school inspection frameworks

    NARCIS (Netherlands)

    Scheerens, Jaap; Ehren, Melanie Catharina Margaretha; Ehren, Melanie C.M.

    2016-01-01

    This chapter describes how Inspectorates of Education operationalize different inspection goals (control, improvement, and liaison) in their inspection indicator frameworks. The chapter provides an overview and examples of the indicators used across a number of countries and how these are

  14. An Improved Generalized Predictive Control in a Robust Dynamic Partial Least Square Framework

    Directory of Open Access Journals (Sweden)

    Jin Xin

    2015-01-01

    Full Text Available To tackle the sensitivity to outliers in system identification, a new robust dynamic partial least squares (PLS) model based on an outlier detection method is proposed in this paper. An improved radial basis function network (RBFN) is adopted to construct the predictive model from the input and output datasets, and a hidden Markov model (HMM) is applied to detect the outliers. After the outliers are removed, a more robust dynamic PLS model is obtained. In addition, an improved generalized predictive control (GPC) with tuning weights under the dynamic PLS framework is proposed to deal with the interaction caused by model mismatch. The results of two simulations demonstrate the effectiveness of the proposed method.

  15. A constitutive model for magnetostriction based on thermodynamic framework

    International Nuclear Information System (INIS)

    Ho, Kwangsoo

    2016-01-01

    This work presents a general framework for the continuum-based formulation of dissipative materials with magneto–mechanical coupling in the viewpoint of irreversible thermodynamics. The thermodynamically consistent model developed for the magnetic hysteresis is extended to include the magnetostrictive effect. The dissipative and hysteretic response of magnetostrictive materials is captured through the introduction of internal state variables. The evolution rate of magnetostrictive strain as well as magnetization is derived from thermodynamic and dissipative potentials in accordance with the general principles of thermodynamics. It is then demonstrated that the constitutive model is competent to describe the magneto-mechanical behavior by comparing simulation results with the experimental data reported in the literature. - Highlights: • A thermodynamically consistent model is proposed to describe the magneto-mechanical coupling effect. • Internal state variables are introduced to capture the dissipative material response. • The evolution rate of the magnetostrictive strain is derived through thermodynamic and dissipation potentials.

  16. Generic adaptation framework for unifying adaptive web-based systems

    NARCIS (Netherlands)

    Knutov, E.

    2012-01-01

    The Generic Adaptation Framework (GAF) research project first and foremost creates a common formal framework for describing current and future adaptive hypermedia (AHS) and adaptive web-based systems in general. It provides a commonly agreed upon taxonomy and a reference model that encompasses the

  17. Understanding general practice: a conceptual framework developed from case studies in the UK NHS.

    Science.gov (United States)

    Checkland, Kath

    2007-01-01

    General practice in the UK is undergoing a period of rapid and profound change. Traditionally, research into the effects of change on general practice has tended to regard GPs as individuals or as members of a professional group. To understand the impact of change, general practices should also be considered as organisations. To use the organisational studies literature to build a conceptual framework of general practice organisations, and to test and develop this empirically using case studies of change in practice. This study used the implementation of National Service Frameworks (NSFs) and the new General Medical Services (GMS) contract as incidents of change. In-depth, qualitative case studies. The design was iterative: each case study was followed by a review of the theoretical ideas. The final conceptual framework was the result of the dynamic interplay between theory and empirical evidence. Five general practices in England, selected using purposeful sampling. Semi-structured interviews with all clinical and managerial personnel in each practice, participant and nonparticipant observation, and examination of documents. A conceptual framework was developed that can be used to understand how and why practices respond to change. This framework enabled understanding of observed reactions to the introduction of NSFs and the new GMS contract. Important factors for generating responses to change included the story that the practice members told about their practice, beliefs about what counted as legitimate work, the role played by the manager, and previous experiences of change. Viewing general practices as small organisations has generated insights into factors that influence responses to change. Change tends to occur from the bottom up and is determined by beliefs about organisational reality. The conceptual framework suggests some questions that can be asked of practices to explain this internal reality.

  18. Generalized nuclear Fukui functions in the framework of spin-polarized density-functional theory

    International Nuclear Information System (INIS)

    Chamorro, E.; Proft, F. de; Geerlings, P.

    2005-01-01

    An extension of Cohen's nuclear Fukui function is presented in the spin-polarized framework of density-functional theory (SP-DFT). The resulting new nuclear Fukui function indices Φ_Nα and Φ_Sα are intended to be the natural descriptors for the responses of the nuclei to changes involving charge transfer at constant multiplicity and also the spin polarization at constant number of electrons. These generalized quantities allow us to gain new insights within a perturbative scheme based on DFT. Calculations of the electronic and nuclear SP-DFT quantities are presented within a Kohn-Sham framework of chemical reactivity for a sample of molecules, including H2O, H2CO, and some simple nitrenes (NX) and phosphinidenes (PX), with X=H, Li, F, Cl, OH, SH, NH2, and PH2. Results have been interpreted in terms of chemical bonding in the context of Berlin's theorem, which provides a separation of the molecular space into binding and antibinding regions

  19. Deep graphs—A general framework to represent and analyze heterogeneous complex systems across scales

    Science.gov (United States)

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, a great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks and other intricacies. However, existing network representations still lack crucial features in order to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., network of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to the network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise, yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of
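
    The core bookkeeping (nodes and edges carrying arbitrary properties, partitioned into supernodes and superedges whose properties are aggregates) can be sketched with plain dictionaries. Property names, the grouping key, and the aggregation rules below are illustrative assumptions, not the authors' software interface.

      from collections import defaultdict

      nodes = {
          0: {"type": "station", "region": "north", "value": 3.2},
          1: {"type": "station", "region": "north", "value": 1.1},
          2: {"type": "station", "region": "south", "value": 2.5},
          3: {"type": "buoy",    "region": "south", "value": 0.7},
      }
      edges = {(0, 1): {"weight": 1.0}, (1, 2): {"weight": 0.5}, (2, 3): {"weight": 2.0}}

      # Supernodes: group nodes by a property and aggregate their values.
      groups = defaultdict(list)
      for n, props in nodes.items():
          groups[props["region"]].append(n)
      supernodes = {g: {"n": len(ms), "value_sum": sum(nodes[m]["value"] for m in ms)}
                    for g, ms in groups.items()}

      # Superedges: aggregate all edges running between two different groups.
      group_of = {n: nodes[n]["region"] for n in nodes}
      superedges = defaultdict(float)
      for (a, b), props in edges.items():
          if group_of[a] != group_of[b]:
              superedges[frozenset((group_of[a], group_of[b]))] += props["weight"]

      print(supernodes)
      print(dict(superedges))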

  20. A general framework of persistence strategies for biological systems helps explain domains of life

    Directory of Open Access Journals (Sweden)

    Liudmila S Yafremava

    2013-02-01

    Full Text Available The nature and cause of the division of organisms in superkingdoms is not fully understood. Assuming that environment shapes physiology, here we construct a novel theoretical framework that helps identify general patterns of organism persistence. This framework is based on Jacob von Uexküll’s organism-centric view of the environment and James G. Miller’s view of organisms as matter-energy-information processing molecular machines. Three concepts describe an organism's environmental niche: scope, umwelt and gap. Scope denotes the entirety of environmental events and conditions to which the organism is exposed during its lifetime. Umwelt encompasses an organism's perception of these events. The gap is the organism's blind spot, the scope that is not covered by umwelt. These concepts bring organisms of different complexity to a common ecological denominator. Ecological and physiological data suggest organisms persist using three strategies: flexibility, robustness and economy. All organisms use umwelt information to flexibly adapt to environmental change. They implement robustness against environmental perturbations within the gap generally through redundancy and reliability of internal constituents. Both flexibility and robustness improve survival. However, they also incur metabolic matter-energy processing costs, which otherwise could have been used for growth and reproduction. Lineages evolve unique tradeoff solutions among strategies in the space of what we call a persistence triangle. Protein domain architecture and other evidence support the preferential use of flexibility and robustness properties. Archaea and Bacteria gravitate toward the triangle’s economy vertex, with Archaea biased toward robustness. Eukarya trade economy for survivability. Protista occupy a saddle manifold separating akaryotes from multicellular organisms. Plants and the more flexible Fungi share an economic stratum, and Metazoa are locked in a positive feedback

  1. New framework of NGN web-based management system

    Science.gov (United States)

    Nian, Zhou; Jie, Yin; Qian, Mao

    2007-11-01

    This paper introduces the basic concepts and key technologies of Ajax and some popular frameworks in the J2EE architecture, and tries to integrate all the frameworks into a new framework. Developers can build web applications much more conveniently by using this framework, and the web applications can provide a more friendly and interactive platform to end users. At last an example is given to explain how to use the new framework to build a web-based management system for the softswitch network.

  2. A General Framework for Portfolio Theory—Part I: Theory and Various Models

    Directory of Open Access Journals (Sweden)

    Stanislaus Maier-Paape

    2018-05-01

    Full Text Available Utility and risk are two often competing measurements of investment success. We show that the efficient trade-off between these two measurements for investment portfolios happens, in general, on a convex curve in the two-dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz (1959) and the capital market pricing model of Sharpe (1964) are special cases of our general framework when the risk measure is taken to be the standard deviation and the utility function is the identity mapping. Using our general framework, we also recover and extend the results in Rockafellar et al. (2006), which were already an extension of the capital market pricing model to allow for the use of more general deviation measures. This generalized capital asset pricing model also applies, e.g., when an approximation of the maximum drawdown is considered as a risk measure. Furthermore, the consideration of a general utility function allows for going beyond the “additive” performance measure to a “multiplicative” one of cumulative returns by using the log utility. As a result, the growth optimal portfolio theory of Lintner (1965) and the leverage space portfolio theory of Vince (2009) can also be understood and enhanced under our general framework. Thus, this general framework allows a unification of several important existing portfolio theories and goes far beyond. For simplicity of presentation, we phrase everything for a finite underlying probability space and a one-period market model, but generalizations to more complex structures are straightforward.
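
    The efficient trade-off described above can be sketched as a constrained optimization; the symbols (admissible set A, return vector R, utility u, risk measure r) are generic notation assumed for the illustration, not quoted from the paper.

      % best attainable expected utility at risk level at most s
      \gamma(s) \;=\; \sup_{x \in A} \Big\{\, \mathbb{E}\!\left[ u\!\left( x^{\top} R \right) \right]
                      \;:\; r\!\left( x^{\top} R \right) \le s \,\Big\},
      % the graph of s \mapsto \gamma(s) traces the efficient trade-off curve; with u the
      % identity and r the standard deviation it reduces to the classical Markowitz frontier.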

  3. SQL Collaborative Learning Framework Based on SOA

    Science.gov (United States)

    Armiati, S.; Awangga, RM

    2018-04-01

    The research focuses on designing a collaborative learning-oriented framework for fulfilment of services in teaching SQL on Oracle 10g. The framework builds on a foundation of academic fulfilment services performed by a working-unit layer in collaboration with Program Studi Manajemen Informatika. The design phase defines which collaboration models and which information technology are proposed for Program Studi Manajemen Informatika, using a collaboration framework inspired by the stages of modelling a Service Oriented Architecture (SOA). The stages begin with analyzing subsystems; this activity determines the subsystems involved, their dependencies, and the workflow between them. After the services are identified, the second phase designs the component specifications, detailing the components implemented in each service, including the data, rules, services, configurable profiles, and variations. The third stage allocates the services, assigning each service and its components to the subsystems that have been identified. The implemented framework contributes teaching guides and an application architecture that can be used as a basis for realizing an increase in service quality by applying information technology.

  4. The evidence base for school inspection frameworks

    NARCIS (Netherlands)

    Scheerens, Jaap; Ehren, Melanie Catharina Margaretha

    2015-01-01

    This article describes how Inspectorates of Education operationalize different inspection goals (control, improvement, liaison) in their inspection indicator frameworks. The paper provides an overview and examples of the indicators used across a number of countries and how these are incorporated in

  5. Policy implementation in practice: the case of national service frameworks in general practice.

    Science.gov (United States)

    Checkland, Kath; Harrison, Stephen

    2004-10-01

    National Service Frameworks are an integral part of the government's drive to 'modernise' the NHS, intended to standardise both clinical care and the design of the services used to deliver that clinical care. This article uses evidence from qualitative case studies in three general practices to illustrate the difficulties associated with the implementation of such top-down guidelines and models of service. In these studies it was found that, while there had been little explicit activity directed at implementation overall, the National Service Framework for coronary heart disease had in general fared better than that for older people. Gunn's notion of 'perfect implementation' is used to make sense of the findings.

  6. A Resource Based Framework for Planning and Replanning

    NARCIS (Netherlands)

    Van der Krogt, R.P.J.; De Weerdt, M.M.; Witteveen, C.

    2003-01-01

    We discuss a rigorous unifying framework for both planning and replanning, extending an existing logic-based approach to resource-based planning. The primitive concepts in this Action Resource Framework (ARF) are actions and resources. Actions consume and produce resources. Plans are structures

  7. Mississippi Curriculum Framework for General Drafting (Program CIP: 48.0101--Drafting, General). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for two secondary-level courses in drafting: drafting I and II. Presented…

  8. A general framework for a collaborative water quality knowledge and information network.

    Science.gov (United States)

    Dalcanale, Fernanda; Fontane, Darrell; Csapo, Jorge

    2011-03-01

    Increasing knowledge about the environment has brought about a better understanding of the complexity of the issues, and more information publicly available has resulted into a steady shift from centralized decision making to increasing levels of participatory processes. The management of that information, in turn, is becoming more complex. One of the ways to deal with the complexity is the development of tools that would allow all players, including managers, researchers, educators, stakeholders and the civil society, to be able to contribute to the information system, in any level they are inclined to do so. In this project, a search for the available technology for collaboration, methods of community filtering, and community-based review was performed and the possible implementation of these tools to create a general framework for a collaborative "Water Quality Knowledge and Information Network" was evaluated. The main goals of the network are to advance water quality education and knowledge; encourage distribution and access to data; provide networking opportunities; allow public perceptions and concerns to be collected; promote exchange of ideas; and, give general, open, and free access to information. A reference implementation was made available online and received positive feedback from the community, which also suggested some possible improvements.

  9. Interference Calculus A General Framework for Interference Management and Network Utility Optimization

    CERN Document Server

    Schubert, Martin

    2012-01-01

    This book develops a mathematical framework for modeling and optimizing interference-coupled multiuser systems. At the core of this framework is the concept of general interference functions, which provides a simple means of characterizing interdependencies between users. The entire analysis builds on the two core axioms scale-invariance and monotonicity. The proposed network calculus has its roots in power control theory and wireless communications. It adds theoretical tools for analyzing the typical behavior of interference-coupled networks. In this way it complements existing game-theoretic approaches. The framework should also be viewed in conjunction with optimization theory. There is a fruitful interplay between the theory of interference functions and convex optimization theory. By jointly exploiting the properties of interference functions, it is possible to design algorithms that outperform general-purpose techniques that only exploit convexity. The title “network calculus” refers to the fact tha...
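
    The two core axioms named in the description can be written compactly; the exact formulation in the book may differ, so the lines below should be read as a generic sketch for an interference function I_k of the power vector p.

      \text{(monotonicity)} \quad p \ge p' \;\Rightarrow\; I_k(p) \ge I_k(p'), \qquad
      \text{(scale invariance)} \quad I_k(\alpha p) \;=\; \alpha\, I_k(p) \quad \text{for } \alpha > 0 .
      % SINR constraints then take the form p_k / I_k(p) \ge \gamma_k, and power-control
      % iterations p_k^{(n+1)} = \gamma_k I_k(p^{(n)}) are analyzed using these two properties.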

  10. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    Energy Technology Data Exchange (ETDEWEB)

    Alioli, Simone [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Nason, Paolo [INFN, Milano-Bicocca (Italy); Oleari, Carlo [INFN, Milano-Bicocca (Italy); Milano-Bicocca Univ. (Italy); Re, Emanuele [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology

    2010-02-15

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. Aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized and a description of what a user should provide in order to use it. (orig.)

  11. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    International Nuclear Information System (INIS)

    Alioli, Simone; Nason, Paolo; Oleari, Carlo; Re, Emanuele

    2010-02-01

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. Aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized and a description of what a user should provide in order to use it. (orig.)

  12. A general theoretical framework for decoherence in open and closed systems

    International Nuclear Information System (INIS)

    Castagnino, Mario; Fortin, Sebastian; Laura, Roberto; Lombardi, Olimpia

    2008-01-01

    A general theoretical framework for decoherence is proposed, which encompasses formalisms originally devised to deal just with open or closed systems. The conditions for decoherence are clearly stated and the relaxation and decoherence times are compared. Finally, the spin-bath model is developed in detail from the new perspective

  13. General System Theory: Toward a Conceptual Framework for Science and Technology Education for All.

    Science.gov (United States)

    Chen, David; Stroup, Walter

    1993-01-01

    Suggests using general system theory as a unifying theoretical framework for science and technology education for all. Five reasons are articulated: the multidisciplinary nature of systems theory, the ability to engage complexity, the capacity to describe system dynamics, the ability to represent the relationship between microlevel and…

  14. Generalized phase retrieval algorithm based on information measures

    OpenAIRE

    Shioya, Hiroyuki; Gohara, Kazutoshi

    2006-01-01

    An iterative phase retrieval algorithm based on the maximum entropy method (MEM) is presented. Introducing a new generalized information measure, we derive a novel class of algorithms which includes the conventionally used error reduction algorithm and a MEM-type iterative algorithm which is presented for the first time. These different phase retrieval methods are unified on the basis of the framework of information measures used in information theory.
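
    For context, the conventional error-reduction iteration that the paper's information-measure framework generalizes can be sketched as follows: a standard Gerchberg-Saxton/Fienup-style loop with a known Fourier magnitude and object-domain support. The MEM-based variant itself is not reproduced here, and the signal and constraints are invented for the example.

      import numpy as np

      rng = np.random.default_rng(1)

      n = 64
      true = np.zeros(n)
      true[20:40] = rng.random(20)                   # unknown object with known support
      support = np.zeros(n, dtype=bool)
      support[20:40] = True
      magnitude = np.abs(np.fft.fft(true))           # measured Fourier magnitude

      g = rng.random(n)                              # random initial guess
      for _ in range(500):
          G = np.fft.fft(g)
          G = magnitude * np.exp(1j * np.angle(G))   # impose the measured magnitude
          g = np.real(np.fft.ifft(G))
          g[~support] = 0.0                          # impose object-domain support
          g = np.clip(g, 0.0, None)                  # and non-negativity

      print("Fourier-magnitude residual:", np.linalg.norm(np.abs(np.fft.fft(g)) - magnitude))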

  15. Goal Orientations of General Chemistry Students via the Achievement Goal Framework

    Science.gov (United States)

    Lewis, Scott E.

    2018-01-01

    The Achievement Goal Framework describes students' goal orientations as: task-based, focusing on the successful completion of the task; self-based, evaluating performance relative to one's own past performance; or other-based, evaluating performance relative to the performance of others. Goal orientations have been used to explain student success…

  16. An XML-based framework for personalized health management.

    Science.gov (United States)

    Lee, Hiye-Ja; Park, Seung-Hun; Jeong, Byeong-Soo

    2006-01-01

    This paper proposes a framework for personalized health management. In this framework, XML technology is used for representing and managing the health information and knowledge. Major components of the framework are Health Management Prescription (HMP) Expert System and Health Information Repository. The HMP Expert System generates a HMP efficiently by using XML-based templates. Health Information Repository provides integrated health information and knowledge for personalized health management by using XML and relational database together.
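
    The template-driven generation of a prescription can be sketched with the Python standard library's XML tools; the element names and profile values are invented for the example and do not reproduce the paper's HMP schema.

      import xml.etree.ElementTree as ET

      template = """
      <hmp>
        <subject><name/><age/></subject>
        <exercise><type/><minutesPerDay/></exercise>
      </hmp>
      """
      profile = {"name": "J. Doe", "age": "52", "type": "walking", "minutesPerDay": "30"}

      root = ET.fromstring(template.strip())
      for element in root.iter():
          if element.tag in profile:                 # fill template slots from the profile
              element.text = profile[element.tag]

      print(ET.tostring(root, encoding="unicode"))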

  17. AGAMA: Action-based galaxy modeling framework

    Science.gov (United States)

    Vasiliev, Eugene

    2018-05-01

    The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).

  18. Resource Based Multi Agent Plan Merging : Framework and application

    NARCIS (Netherlands)

    De Weerdt, M.M.; Van der Krogt, R.P.J.; Witteveen, C.

    2003-01-01

    We discuss a resource-based planning framework where agents are able to merge plans by exchanging resources. In this framework, plans are specified as structured objects composed of resource consuming and resource producing processes (actions). A plan itself can also be conceived as a process

  19. An Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) from a Researcher’s or Analyst’s Perspective

    Science.gov (United States)

    2014-12-01

    An Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) from a Researcher's or Analyst's Perspective, by Robert A. Sottilare and Anne M. Sinatra (2014).

  20. A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.

    Science.gov (United States)

    Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao

    2018-05-23

    The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.
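
    The "dynamically select suitable algorithms" step can be illustrated by a small dispatch table keyed on the task objective. The objective names and the two example policies below are assumptions made for illustration, not the algorithms proposed in the paper.

      def earliest_deadline_first(tasks):
          return sorted(tasks, key=lambda t: t["deadline"])

      def shortest_job_first(tasks):
          return sorted(tasks, key=lambda t: t["runtime"])

      DISPATCH = {"latency": earliest_deadline_first, "throughput": shortest_job_first}

      def schedule(tasks, objective):
          """Pick the scheduling policy that matches the task objective."""
          return DISPATCH[objective](tasks)

      tasks = [{"id": 1, "deadline": 9, "runtime": 4},
               {"id": 2, "deadline": 5, "runtime": 7},
               {"id": 3, "deadline": 7, "runtime": 2}]
      print([t["id"] for t in schedule(tasks, "latency")])     # -> [2, 3, 1]
      print([t["id"] for t in schedule(tasks, "throughput")])  # -> [3, 1, 2]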

  1. A Framework for IT-based Design Tools

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    The thesis presents a new approach to developing design tools that can be integrated, by presenting a framework consisting of a set of guidelines for design tools, an integration and communication scheme, and a set of design tool schemes. This framework has been based on analysis of requirements to integrated design environments, and analysis of engineering design and design problem solving methods. The developed framework has been tested by applying it to the development of prototype design tools for realistic design scenarios.

  2. Developing a theoretical framework for complex community-based interventions.

    Science.gov (United States)

    Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana

    2014-01-01

    Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms how the independent variables will lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.

  3. Towards a Theory-Based Framework for Assessing the ...

    African Journals Online (AJOL)

    The theory-based framework attempts to capture ESD's complexity in terms of the ... projects in teacher education institutions in Botswana, a brief description of the ..... How the 'four pillars of learning' relate to education for sustainable human.

  4. A Framework for Autonomous Trajectory-Based Operations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed is a framework for autonomous Traffic Flow Management (TFM) under Trajectory Based Operations (TBO) for Unmanned Aerial Systems (UAS). The...

  5. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    an integrated function modelling framework, which specifically aims at relating between the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis...... of the functionality of a system. The article further presents the application of the framework based on a product example. Finally, an empirical study in industry is presented. Therein, feedback on the potential of the proposed framework to support interdisciplinary design practice as well as on areas of further...

  6. Sustainable development based energy policy making frameworks, a critical review

    International Nuclear Information System (INIS)

    Meyar-Naimi, H.; Vaez-Zadeh, S.

    2012-01-01

    This paper, in the first step, presents an overview of the origination and formulation of the sustainable development (SD) concept and the related policy making frameworks. The frameworks include Pressure–State–Response (PSR), Driving Force–State–Response (DSR), Driving Force–Pressure–State–Impact–Response (DPSIR), Driving Force–Pressure–State–Effect–Action (DPSEA) and Driving Force-Pressure-State-Exposure-Effect-Action (DPSEEA). In this regard, 40 case studies using the reviewed frameworks reported during 1994–2011 are surveyed. Then, their application area and application intensity are investigated. It is concluded that PSR has the highest application intensity, while DPSEA and DPSEEA have the lowest. Moreover, using the Analytical Hierarchy Process (AHP) with a set of criteria, it is shown that PSR and DPSIR have the highest and lowest priorities. Finally, the shortcomings of the frameworks' applications are discussed. The paper is helpful in selecting appropriate policy making frameworks and presents some hints for future research in the area for developing more comprehensive models, especially for sustainable electric energy policy making. - Highlights: ► The origination and formulation of sustainable development (SD) concept is reviewed. ► SD based frameworks (PSR, DSR, DPSIR, DPSEA and DPSEEA) are also reviewed. ► Then, the frameworks application area and intensity in recent years are investigated. ► Finally, the SD concept and the SD based frameworks are criticized. ► It will be helpful for developing more comprehensive energy policy making models.

  7. ENGAGE: A Game Based Learning and Problem Solving Framework

    Science.gov (United States)

    2012-07-13

    Progress, Status and Management Report (Monthly Progress, Task 1 Month 4) for ENGAGE: A Game Based Learning and Problem Solving Framework, Popović; reporting period 6/1/2012 – 6/30/2012; contract number N/A. Dissemination venues listed: Gamification Summit 2012, Mensa Colloquium 2012.2: Social and Video Games, Seattle Science Festival, TED Salon Vancouver.

  8. LGBTQ relationally based positive psychology: An inclusive and systemic framework.

    Science.gov (United States)

    Domínguez, Daniela G; Bobele, Monte; Coppock, Jacqueline; Peña, Ezequiel

    2015-05-01

    Positive psychologists have contributed to our understandings of how positive emotions and flexible cognition enhance resiliency. However, positive psychologists' research has been slow to address the relational resources and interactions that help nonheterosexual families overcome adversity. Addressing overlooked lesbian, gay, bisexual, transgender, or queer (LGBTQ) and systemic factors in positive psychology, this article draws on family resilience literature and LGBTQ literature to theorize a systemic positive psychology framework for working with nonheterosexual families. We developed the LGBTQ relationally based positive psychology framework that integrates positive psychology's strengths-based perspective with the systemic orientation of Walsh's (1996) family resilience framework along with the cultural considerations proposed by LGBTQ family literature. We theorize that the LGBTQ relationally based positive psychology framework takes into consideration the sociopolitical adversities impacting nonheterosexual families and sensitizes positive psychologists, including those working in organized care settings, to the systemic interactions of same-sex loving relationships. (c) 2015 APA, all rights reserved).

  9. Health services research evaluation principles. Broadening a general framework for evaluating health information technology.

    Science.gov (United States)

    Sockolow, P S; Crawford, P R; Lehmann, H P

    2012-01-01

    Our forthcoming national experiment in increased health information technology (HIT) adoption funded by the American Recovery and Reinvestment Act of 2009 will require a comprehensive approach to evaluating HIT. The limited quality of HIT evaluation studies to date reveals a need for broader evaluation frameworks, as narrow frameworks restrict the generalizability of findings and the depth of lessons learned. The objective was to develop an informatics evaluation framework for health information technology (HIT) integrating components of health services research (HSR) evaluation and informatics evaluation to address identified shortcomings in available HIT evaluation frameworks. A systematic literature review updated and expanded the exhaustive review by Ammenwerth and deKeizer (AdK). From retained studies, criteria were elicited and organized into classes within a framework. The resulting Health Information Technology Research-based Evaluation Framework (HITREF) was used to guide clinician satisfaction survey construction, multi-dimensional analysis of data, and interpretation of findings in an evaluation of a vanguard community health care EHR. The updated review identified 128 electronic health record (EHR) evaluation studies and seven evaluation criteria not in AdK: EHR Selection/Development/Training; Patient Privacy Concerns; Unintended Consequences/Benefits; Functionality; Patient Satisfaction with EHR; Barriers/Facilitators to Adoption; and Patient Satisfaction with Care. HITREF was used productively and was a complete evaluation framework which included all themes that emerged. We can recommend to future EHR evaluators that they consider adding a complete, research-based HIT evaluation framework, such as HITREF, to their evaluation tools suite to monitor HIT challenges as the federal government strives to increase HIT adoption.

  10. A general multiscale framework for the emergent effective elastodynamics of metamaterials

    Science.gov (United States)

    Sridhar, A.; Kouznetsova, V. G.; Geers, M. G. D.

    2018-02-01

    This paper presents a general multiscale framework towards the computation of the emergent effective elastodynamics of heterogeneous materials, to be applied for the analysis of acoustic metamaterials and phononic crystals. The generality of the framework is exemplified by two key characteristics. First, the underlying formalism relies on the Floquet-Bloch theorem to derive a robust definition of scales and scale separation. Second, unlike most homogenization approaches that rely on a classical volume average, a generalized homogenization operator is defined with respect to a family of particular projection functions. This yields a generalized macro-scale continuum, instead of the classical Cauchy continuum. This enables (in a micromorphic sense) to homogenize the rich dispersive behavior resulting from both Bragg scattering and local resonance. For an arbitrary unit cell, the homogenization projection functions are constructed using the Floquet-Bloch eigenvectors obtained in the desired frequency regime at select high symmetry points, which effectively resolves the emergent phenomena dominating that regime. Furthermore, a generalized Hill-Mandel condition is proposed that ensures power consistency between the homogenized and full-scale model. A high-order spatio-temporal gradient expansion is used to localize the multiscale problem leading to a series of recursive unit cell problems giving the appropriate micro-mechanical corrections. The developed multiscale method is validated against standard numerical Bloch analysis of the dispersion spectra of example unit cells encompassing multiple high-order branches generated by local resonance and/or Bragg scattering.

  11. Towards a Cloud Based Smart Traffic Management Framework

    Science.gov (United States)

    Rahimi, M. M.; Hakimpour, F.

    2017-09-01

    Traffic big data has brought many opportunities for traffic management applications. However, several challenges such as heterogeneity, storage, management, processing and analysis of traffic big data may hinder their efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud computing based framework addresses the technical challenges of efficient and real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we have used OpenStreetMap (OSM) real trajectories and the road network in a distributed environment. Our evaluation results indicate that the data import speed of this framework exceeds 8000 records per second when the dataset size approaches 5 million records. We also evaluate the data retrieval performance of our proposed framework; the retrieval speed exceeds 15000 records per second at the same dataset size. We have also evaluated the scalability and performance of the proposed framework by parallelising a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  12. TOWARDS A CLOUD BASED SMART TRAFFIC MANAGEMENT FRAMEWORK

    Directory of Open Access Journals (Sweden)

    M. M. Rahimi

    2017-09-01

    Full Text Available Traffic big data has brought many opportunities for traffic management applications. However, several challenges such as heterogeneity, storage, management, processing and analysis of traffic big data may hinder their efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud computing based framework addresses the technical challenges of efficient and real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we have used OpenStreetMap (OSM) real trajectories and the road network in a distributed environment. Our evaluation results indicate that the data import speed of this framework exceeds 8000 records per second when the dataset size approaches 5 million records. We also evaluate the data retrieval performance of our proposed framework; the retrieval speed exceeds 15000 records per second at the same dataset size. We have also evaluated the scalability and performance of the proposed framework by parallelising a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  13. A general framework for the evaluation of genetic association studies using multiple marginal models

    DEFF Research Database (Denmark)

    Kitsche, Andreas; Ritz, Christian; Hothorn, Ludwig A.

    2016-01-01

    OBJECTIVE: In this study, we present a simultaneous inference procedure as a unified analysis framework for genetic association studies. METHODS: The method is based on the formulation of multiple marginal models that reflect different modes of inheritance. The basic advantage of this methodology...

  14. Human factors of transitions in automated driving : A general framework and literature survey

    NARCIS (Netherlands)

    Lu, Z.; Happee, R.; Cabrall, C.D.D.; Kyriakidis, M.; de Winter, J.C.F.

    2016-01-01

    The topic of transitions in automated driving is becoming important now that cars are automated to ever greater extents. This paper proposes a theoretical framework to support and align human factors research on transitions in automated driving. Driving states are defined based on the allocation of

  15. CORECLUSTER: A Degeneracy Based Graph Clustering Framework

    OpenAIRE

    Giatsidis, Christos; Malliaros, Fragkiskos; Thilikos, Dimitrios M.; Vazirgiannis, Michalis

    2014-01-01

    International audience; Graph clustering or community detection constitutes an important task for investigating the internal structure of graphs, with a plethora of applications in several domains. Traditional tools for graph clustering, such as spectral methods, typically suffer from high time and space complexity. In this article, we present CoreCluster, an efficient graph clustering framework based on the concept of graph degeneracy, that can be used along with any known graph clustering...

  16. Communicative automata based programming. Society Framework

    Directory of Open Access Journals (Sweden)

    Andrei Micu

    2015-10-01

    Full Text Available One of the aims of this paper is to present a new programming paradigm based on paradigms intensively used in the IT industry. Implementing these techniques can improve code quality through modularization, not only in terms of the entities used by a program but also in terms of the states through which they pass. Another aspect addressed in this paper is that, in the development of software applications, the transition from design to source code is a very expensive step in terms of effort and time. Diagrams can hide important details for the sake of simplicity of understanding, which can lead to incorrect or incomplete implementations. To improve this process, communicative automaton based programming introduces an intermediate step: modelling diagrams are first mapped to communicative automata, and code is then written for each automaton. We show how the transition from one step to the next becomes much easier and more intuitive.
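
    The paper's Society Framework itself is not reproduced in this record; as a rough illustration of the underlying idea, a communicative automaton can be coded as a state machine whose transitions fire on received messages (all states and messages below are hypothetical).

```python
# Rough illustration (not the authors' Society Framework): a communicative
# automaton as a state machine whose transitions are triggered by messages.
class OrderAutomaton:
    TRANSITIONS = {
        ("idle", "order_received"): "processing",
        ("processing", "payment_ok"): "shipped",
        ("processing", "payment_failed"): "cancelled",
    }

    def __init__(self):
        self.state = "idle"

    def receive(self, message: str) -> str:
        # look up the transition for (current state, message); ignore unknown messages
        self.state = self.TRANSITIONS.get((self.state, message), self.state)
        return self.state

automaton = OrderAutomaton()
for msg in ["order_received", "payment_ok"]:
    print(msg, "->", automaton.receive(msg))   # idle -> processing -> shipped
```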

  17. An empirically based conceptual framework for fostering meaningful patient engagement in research.

    Science.gov (United States)

    Hamilton, Clayon B; Hoens, Alison M; Backman, Catherine L; McKinnon, Annette M; McQuitty, Shanon; English, Kelly; Li, Linda C

    2018-02-01

    Patient engagement in research (PEIR) is promoted to improve the relevance and quality of health research, but has little conceptualization derived from empirical data. To address this issue, we sought to develop an empirically based conceptual framework for meaningful PEIR founded on a patient perspective. We conducted a qualitative secondary analysis of in-depth interviews with 18 patient research partners from a research centre-affiliated patient advisory board. Data analysis involved three phases: identifying the themes, developing a framework and confirming the framework. We coded and organized the data, and abstracted, illustrated, described and explored the emergent themes using thematic analysis. Directed content analysis was conducted to derive concepts from 18 publications related to PEIR to supplement, confirm or refute, and extend the emergent conceptual framework. The framework was reviewed by four patient research partners on our research team. Participants' experiences of working with researchers were generally positive. Eight themes emerged: procedural requirements, convenience, contributions, support, team interaction, research environment, feel valued and benefits. These themes were interconnected and formed a conceptual framework to explain the phenomenon of meaningful PEIR from a patient perspective. This framework, the PEIR Framework, was endorsed by the patient research partners on our team. The PEIR Framework provides guidance on aspects of PEIR to address for meaningful PEIR. It could be particularly useful when patient-researcher partnerships are led by researchers with little experience of engaging patients in research. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  18. A general framework and review of scatter correction methods in cone beam CT. Part 2: Scatter estimation approaches

    International Nuclear Information System (INIS)

    Ruehrnschopf, Ernst-Peter; Klingenbeck, Klaus

    2011-01-01

    The main components of scatter correction procedures are scatter estimation and a scatter compensation algorithm. This paper completes a previous paper where a general framework for scatter compensation was presented under the prerequisite that a scatter estimation method is already available. In the current paper, the authors give a systematic review of the variety of scatter estimation approaches. Scatter estimation methods are based on measurements, mathematical-physical models, or combinations of both. For completeness they present an overview of measurement-based methods, but the main topic is the theoretically more demanding models, such as analytical, Monte-Carlo, and hybrid models. Further classifications are 3D image-based and 2D projection-based approaches. The authors present a system-theoretic framework, which allows one to proceed top-down from a general 3D formulation, by successive approximations, to efficient 2D approaches. A widely useful method is the beam-scatter-kernel superposition approach. Together with the review of standard methods, the authors discuss their limitations and how to take into account the issues of object dependency, spatial variance, deformation of scatter kernels, and external and internal absorbers. Open questions for further investigations are indicated. Finally, the authors comment on some special issues and applications, such as bow-tie filter, offset detector, truncated data, and dual-source CT.

  19. A framework for AI-based nuclear design support system

    International Nuclear Information System (INIS)

    Furuta, Kazuo; Kondo, Shunsuke

    1991-01-01

    Nowadays many computer programs are being developed and used for the analytic tasks in nuclear reactor design, but experienced designers are still responsible for most of the synthetic tasks which are not amenable to algorithmic computer processes. Artificial intelligence (AI) is a promising technology to deal with these intractable tasks in design. In development of AI-based design support systems, it is desirable to choose a comprehensive framework based on the scientific theory of design. In this work a framework for AI-based design support systems for nuclear reactor design will be proposed based on an exploration model of design. The fundamental architectures of this framework will be described especially on knowledge representation, context management and design planning. (author)

  20. Framework for AI-based nuclear reactor design support system

    International Nuclear Information System (INIS)

    Furuta, Kazuo; Kondo, Shunsuke

    1992-01-01

    Nowadays many computer programs are being developed and used for the analytic tasks in nuclear reactor design, but experienced designers are still responsible for most of the synthetic tasks which are not amenable to algorithmic computer processes. Artificial intelligence (AI) is a promising technology to deal with these intractable tasks in design. In development of AI-based design support systems, it is desirable to choose a comprehensive framework based on the scientific theory of design. In this work a framework for AI-based design support systems for nuclear reactor design will be proposed based on an explorative abduction model of design. The fundamental architectures of this framework will be described especially on knowledge representation, context management and design planning. (author)

  1. On effectiveness of network sensor-based defense framework

    Science.gov (United States)

    Zhang, Difan; Zhang, Hanlin; Ge, Linqiang; Yu, Wei; Lu, Chao; Chen, Genshe; Pham, Khanh

    2012-06-01

    Cyber attacks are increasing in frequency, impact, and complexity, which demonstrate extensive network vulnerabilities with the potential for serious damage. Defending against cyber attacks calls for the distributed collaborative monitoring, detection, and mitigation. To this end, we develop a network sensor-based defense framework, with the aim of handling network security awareness, mitigation, and prediction. We implement the prototypical system and show its effectiveness on detecting known attacks, such as port-scanning and distributed denial-of-service (DDoS). Based on this framework, we also implement the statistical-based detection and sequential testing-based detection techniques and compare their respective detection performance. The future implementation of defensive algorithms can be provisioned in our proposed framework for combating cyber attacks.
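
    The statistical and sequential-testing detectors are not specified in this record; as one plausible instance of sequential change detection on a per-interval traffic statistic (not necessarily the authors' exact technique), a one-sided CUSUM test can be sketched as follows.

```python
# One common sequential detection scheme (illustrative only): a one-sided CUSUM
# on a per-interval packet-rate statistic, alarming on a sustained upward shift
# such as a DDoS ramp-up.
def cusum_detect(samples, baseline, slack, threshold):
    """Return the index at which the cumulative positive drift exceeds threshold."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - baseline - slack))  # accumulate only upward deviations
        if s > threshold:
            return i  # alarm raised at this interval
    return None

rates = [100, 102, 98, 101, 150, 180, 220, 260]   # packets/s per interval (toy data)
print(cusum_detect(rates, baseline=100, slack=5, threshold=100))  # alarms at index 5
```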

  2. A trait-based framework for stream algal communities.

    Science.gov (United States)

    Lange, Katharina; Townsend, Colin Richard; Matthaei, Christoph David

    2016-01-01

    The use of trait-based approaches to detect effects of land use and climate change on terrestrial plant and aquatic phytoplankton communities is increasing, but such a framework is still needed for benthic stream algae. Here we present a conceptual framework of morphological, physiological, behavioural and life-history traits relating to resource acquisition and resistance to disturbance. We tested this approach by assessing the relationships between multiple anthropogenic stressors and algal traits at 43 stream sites. Our "natural experiment" was conducted along gradients of agricultural land-use intensity (0-95% of the catchment in high-producing pasture) and hydrological alteration (0-92% streamflow reduction resulting from water abstraction for irrigation) as well as related physicochemical variables (total nitrogen concentration and deposited fine sediment). Strategic choice of study sites meant that agricultural intensity and hydrological alteration were uncorrelated. We studied the relationships of seven traits (with 23 trait categories) to our environmental predictor variables using general linear models and an information-theoretic model-selection approach. Life form, nitrogen fixation and spore formation were key traits that showed the strongest relationships with environmental stressors. Overall, FI (farming intensity) exerted stronger effects on algal communities than hydrological alteration. The large-bodied, non-attached, filamentous algae that dominated under high farming intensities have limited dispersal abilities but may cope with unfavourable conditions through the formation of spores. Antagonistic interactions between FI and flow reduction were observed for some trait variables, whereas no interactions occurred for nitrogen concentration and fine sediment. Our conceptual framework was well supported by tests of ten specific hypotheses predicting effects of resource supply and disturbance on algal traits. Our study also shows that investigating a

  3. Constructing rule-based models using the belief functions framework

    NARCIS (Netherlands)

    Almeida, R.J.; Denoeux, T.; Kaymak, U.; Greco, S.; Bouchon-Meunier, B.; Coletti, G.; Fedrizzi, M.; Matarazzo, B.; Yager, R.R.

    2012-01-01

    Abstract. We study a new approach to regression analysis. We propose a new rule-based regression model using the theoretical framework of belief functions. For this purpose we use the recently proposed Evidential c-means (ECM) to derive rule-based models solely from data. ECM allocates, for each

  4. Black hole based tests of general relativity

    International Nuclear Information System (INIS)

    Yagi, Kent; Stein, Leo C

    2016-01-01

    General relativity has passed all solar system experiments and neutron star based tests, such as binary pulsar observations, with flying colors. A more exotic arena for testing general relativity is in systems that contain one or more black holes. Black holes are the most compact objects in the Universe, providing probes of the strongest-possible gravitational fields. We are motivated to study strong-field gravity since many theories give large deviations from general relativity only at large field strengths, while recovering the weak-field behavior. In this article, we review how one can probe general relativity and various alternative theories of gravity by using electromagnetic waves from a black hole with an accretion disk, and gravitational waves from black hole binaries. We first review model-independent ways of testing gravity with electromagnetic/gravitational waves from a black hole system. We then focus on selected examples of theories that extend general relativity in rather simple ways. Some important characteristics of general relativity include (but are not limited to) (i) only tensor gravitational degrees of freedom, (ii) the graviton is massless, (iii) no quadratic or higher curvatures in the action, and (iv) the theory is four-dimensional. Altering a characteristic leads to a different extension of general relativity: (i) scalar–tensor theories, (ii) massive gravity theories, (iii) quadratic gravity, and (iv) theories with large extra dimensions. Within each theory, we describe black hole solutions, their properties, and current and projected constraints on each theory using black hole based tests of gravity. We close this review by listing some of the open problems in model-independent tests and within each specific theory. (paper)

  5. A likelihood-based framework for the analysis of discussion threads

    NARCIS (Netherlands)

    Gómez, Vincenc; Kappen, Hilbert J.; Litvak, Nelli; Kaltenbrunner, Andreas

    2013-01-01

    Online discussion threads are conversational cascades in the form of posted messages that can be generally found in social systems that comprise many-to-many interaction such as blogs, news aggregators or bulletin board systems. We propose a framework based on generative models of growing trees to

  6. Research on machine learning framework based on random forest algorithm

    Science.gov (United States)

    Ren, Qiong; Cheng, Hui; Han, Hai

    2017-03-01

    With the continuous development of machine learning, industry and academia have released many machine learning frameworks based on distributed computing platforms, and these have been widely used. However, existing machine learning frameworks are limited by the machine learning algorithms themselves, for example by sensitivity to parameter choices, interference from noise, and a high barrier to use. This paper introduces the research background of machine learning frameworks and, working with the random forest algorithm commonly used for classification, puts forward the research objectives and content, proposes an improved adaptive random forest algorithm (referred to as ARF), and, on the basis of ARF, designs and implements the machine learning framework.
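
    The improved ARF algorithm is not reproduced in this record; the sketch below only shows the plain random forest baseline that such a framework would adapt, using scikit-learn (assumed available) on a toy dataset.

```python
# Baseline sketch of a standard random forest (this is not the paper's improved
# ARF algorithm); uses scikit-learn on the bundled iris dataset.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# n_estimators is exactly the kind of parameter a user would normally have to tune
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```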

  7. A general framework for modeling growth and division of mammalian cells.

    Science.gov (United States)

    Gauthier, John H; Pohl, Phillip I

    2011-01-06

    Modeling the cell-division cycle has been practiced for many years. As time has progressed, this work has gone from understanding the basic principles to addressing distinct biological problems, e.g., the nature of the restriction point, how checkpoints operate, the nonlinear dynamics of the cell cycle, the effect of localization, etc. Most models consist of coupled ordinary differential equations developed by the researchers, restricted to deal with the interactions of a limited number of molecules. In the future, cell-cycle modeling--and indeed all modeling of complex biologic processes--will increase in scope and detail. A framework for modeling complex cell-biologic processes is proposed here. The framework is based on two constructs: one describing the entire lifecycle of a molecule and the second describing the basic cellular machinery. Use of these constructs allows complex models to be built in a straightforward manner that fosters rigor and completeness. To demonstrate the framework, an example model of the mammalian cell cycle is presented that consists of several hundred differential equations of simple mass action kinetics. The model calculates energy usage, amino acid and nucleotide usage, membrane transport, RNA synthesis and destruction, and protein synthesis and destruction for 33 proteins to give an in-depth look at the cell cycle. The framework presented here addresses how to develop increasingly descriptive models of complex cell-biologic processes. The example model of cellular growth and division constructed with the framework demonstrates that large structured models can be created with the framework, and these models can generate non-trivial descriptions of cellular processes. Predictions from the example model include those at both the molecular level--e.g., Wee1 spontaneously reactivates--and at the system level--e.g., pathways for timing-critical processes must shut down redundant pathways. A future effort is to automatically estimate
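
    The paper's several-hundred-equation model is not reproduced in this record; the sketch below only illustrates the simple mass-action ODE style the abstract refers to, for a hypothetical two-state protein, integrated with SciPy.

```python
# Minimal sketch (not the paper's 33-protein model): a toy mass-action system for
# one protein that is synthesised, degraded, activated and inactivated.
from scipy.integrate import solve_ivp

k_syn, k_deg, k_act, k_inact = 1.0, 0.1, 0.5, 0.2  # hypothetical rate constants

def rhs(t, y):
    x_inactive, x_active = y
    # mass-action kinetics: synthesis, degradation, activation, inactivation
    dxi = k_syn - k_deg * x_inactive - k_act * x_inactive + k_inact * x_active
    dxa = k_act * x_inactive - k_inact * x_active - k_deg * x_active
    return [dxi, dxa]

sol = solve_ivp(rhs, (0.0, 100.0), [0.0, 0.0])
print(sol.y[:, -1])  # near-steady-state levels of the inactive/active forms
```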

  8. Plasduino: An inexpensive, general-purpose data acquisition framework for educational experiments

    International Nuclear Information System (INIS)

    Baldini, L.

    2014-01-01

    Based on the Arduino development platform, plasduino is an open source data acquisition framework specifically designed for educational physics experiments. The source code, schematics and documentation are in the public domain under a GPL license and the system, streamlined for low cost and ease of use, can be replicated on the scale of a typical didactic lab with minimal effort. We describe the basic architecture of the system and illustrate its potential with some real-life examples.

  9. Plasduino: An inexpensive, general-purpose data acquisition framework for educational experiments

    Energy Technology Data Exchange (ETDEWEB)

    Baldini, L. [Universita' di Pisa and INFN Sez. di Pisa, Pisa (Italy)]

    2014-07-15

    Based on the Arduino development platform, plasduino is an open source data acquisition framework specifically designed for educational physics experiments. The source code, schematics and documentation are in the public domain under a GPL license and the system, streamlined for low cost and ease of use, can be replicated on the scale of a typical didactic lab with minimal effort. We describe the basic architecture of the system and illustrate its potential with some real-life examples.

  10. Generalized Framework and Algorithms for Illustrative Visualization of Time-Varying Data on Unstructured Meshes

    Energy Technology Data Exchange (ETDEWEB)

    Alexander S. Rattner; Donna Post Guillen; Alark Joshi

    2012-12-01

    Photo- and physically-realistic techniques are often insufficient for visualization of simulation results, especially for 3D and time-varying datasets. Substantial research efforts have been dedicated to the development of non-photorealistic and illustration-inspired visualization techniques for compact and intuitive presentation of such complex datasets. While these efforts have yielded valuable visualization results, a great deal of work has been duplicated across studies, as individual research groups often develop purpose-built platforms. Additionally, interoperability between illustrative visualization software is limited due to specialized processing and rendering architectures employed in different studies. In this investigation, a generalized framework for illustrative visualization is proposed, and implemented in marmotViz, a ParaView plugin, enabling its use on a variety of computing platforms with various data file formats and mesh geometries. Detailed descriptions of the region-of-interest identification and feature-tracking algorithms incorporated into this tool are provided. Additionally, implementations of multiple illustrative effect algorithms are presented to demonstrate the use and flexibility of this framework. By providing a framework and useful underlying functionality, the marmotViz tool can act as a springboard for future research in the field of illustrative visualization.

  11. A Scalable Policy and SNMP Based Network Management Framework

    Institute of Scientific and Technical Information of China (English)

    LIU Su-ping; DING Yong-sheng

    2009-01-01

    Traditional SNMP-based network management cannot deal with the task of managing large-scale distributed networks, while policy-based management is one of the effective solutions in network and distributed systems management. However, cross-vendor hardware compatibility is one of the limitations of policy-based management. Devices in current networks mostly support SNMP rather than the Common Open Policy Service (COPS) protocol. By analyzing traditional network management and policy-based network management, a scalable network management framework is proposed. It combines the Internet Engineering Task Force (IETF) framework for policy-based management with SNMP-based network management. By interpreting and translating policy decisions into SNMP messages, policies can be executed on traditional SNMP-based devices.
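
    The record does not show how the policy-to-SNMP translation works in detail; the sketch below illustrates the kind of mapping such a translator performs, producing (OID, value) pairs that a real deployment would hand to an SNMP library for a SET operation. The policy schema and the mapping table are hypothetical (the ifAdminStatus OID shown is the standard MIB-II object).

```python
# Sketch of translating a high-level policy decision into SNMP-style varbinds
# (hypothetical policy schema; no real SNMP stack is invoked here).
POLICY_TO_OID = {
    # mapping from policy (target type, attribute) to a writable MIB object
    ("interface", "admin_status"): "1.3.6.1.2.1.2.2.1.7",   # ifAdminStatus
}

def policy_to_varbinds(decision):
    """Turn one policy decision dict into a list of (oid, value) pairs."""
    base_oid = POLICY_TO_OID[(decision["target_type"], decision["attribute"])]
    oid = f"{base_oid}.{decision['index']}"          # append the instance index
    return [(oid, decision["value"])]

decision = {"target_type": "interface", "attribute": "admin_status",
            "index": 3, "value": 2}                  # 2 = administratively down
print(policy_to_varbinds(decision))
```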

  12. Integrated evaluation framework. Based on the logical framework approach for project cycle management

    International Nuclear Information System (INIS)

    1996-11-01

    This Integrated Evaluation Framework (IEF) was developed by TC Evaluation with the aim of presenting in a comprehensive manner the logic of thinking used when evaluating projects and programmes. Thus, in the first place, the intended audience for this report are evaluation officers, so that when applying the evaluation procedures and check lists, data can be organized following a systematic and logical scheme and conclusions can be derived ''objectively''. The value of such a framework for reporting on performance and in providing a quality reference for disbursements represents one of its major advantages. However, when developing and applying the IEF, it was realized that a Logical Framework Approach (LFA), like the one upon which the IEF is based, needs to be followed throughout the project life cycle, from the Country Programme Framework planning stage, through project design and implementation. Then, the helpful consequences flow into project design quality and smooth implementation. It is only in such an environment that meaningful and consistent evaluation can take place. Therefore the main audience for this report are Agency staff involved in planning, designing and implementing TC projects as well as their counterparts in Member States. In this understanding, the IEF was subjected to review by a consultants meeting, which included both external consultants and Agency staff. This Consultants Review Meeting encouraged the Secretariat to further adopt the LFA into the TC management process

  13. A Profile-Based Framework for Factorial Similarity and the Congruence Coefficient.

    Science.gov (United States)

    Hartley, Anselma G; Furr, R Michael

    2017-01-01

    We present a novel profile-based framework for understanding factorial similarity in the context of exploratory factor analysis in general, and for understanding the congruence coefficient (a commonly used index of factor similarity) specifically. First, we introduce the profile-based framework articulating factorial similarity in terms of 3 intuitive components: general saturation similarity, differential saturation similarity, and configural similarity. We then articulate the congruence coefficient in terms of these components, along with 2 additional profile-based components, and we explain how these components resolve ambiguities that can be, and are, found when using the congruence coefficient. Finally, we present secondary analyses revealing that profile-based components of factorial similarity are indeed linked to experts' actual evaluations of factorial similarity. Overall, the profile-based approach we present offers new insights into the ways in which researchers can examine factor similarity and holds the potential to enhance researchers' ability to understand the congruence coefficient.
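
    For reference, the congruence coefficient between two factor-loading vectors x and y is the standard cosine-type index (a textbook definition, not a result of this article):

        \phi(x, y) = \frac{\sum_i x_i y_i}{\sqrt{\left(\sum_i x_i^2\right)\left(\sum_i y_i^2\right)}}

    Because this is a cosine, it is insensitive to rescaling either vector, which is one way a single value can mask, for example, differences in general saturation; the profile-based components are intended to separate such aspects.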

  14. Toward a model framework of generalized parallel componential processing of multi-symbol numbers.

    Science.gov (United States)

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-05-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining and investigating a sign-decade compatibility effect for the comparison of positive and negative numbers, which extends the unit-decade compatibility effect in 2-digit number processing. Then, we evaluated whether the model is capable of accounting for previous findings in negative number processing. In a magnitude comparison task, in which participants had to single out the larger of 2 integers, we observed a reliable sign-decade compatibility effect with prolonged reaction times for incompatible (e.g., -97 vs. +53; in which the number with the larger decade digit has the smaller, i.e., negative polarity sign) as compared with sign-decade compatible number pairs (e.g., -53 vs. +97). Moreover, an analysis of participants' eye fixation behavior corroborated our model of parallel componential processing of multi-symbol numbers. These results are discussed in light of concurrent theoretical notions about negative number processing. On the basis of the present results, we propose a generalized integrated model framework of parallel componential multi-symbol processing. (c) 2015 APA, all rights reserved).

  15. Selected aspects of proposed new EU general data protection legal framework and the Croatian perspective

    Directory of Open Access Journals (Sweden)

    Nina GUMZEJ

    2013-12-01

    Full Text Available Proposed new EU general data protection legal framework profoundly affects a large number of day-to-day business operations of organizations processing personal data and calls for significant effort on their part toward the necessary legal-regulatory compliance. In this paper the author examines key legislative developments towards this new EU frame and impact for the Republic of Croatia as the youngest EU Member State. Following introductory overview, legal analysis of draft EU General Data Protection Regulation as proposed by the European Commission and recently adopted amendments by the European Parliament mainly focuses on selected solutions impacting national data protection supervisory authorities. This is complemented with examination of relevant sources of EU law, including the case law of the Court of Justice of the European Union. Assessment of results of this research is next made with respect to prospects of the data protection legal framework of the Republic of Croatia. The paper is concluded with the author’s critical overview of analyzed EU proposals impacting national data protection supervisory authorities in light of EU pivotal goals, and de lege ferenda proposals to timely address identified obstacles towards more adequate enforcement of data protection legislation in Croatia.

  16. A general framework for thermodynamically consistent parameterization and efficient sampling of enzymatic reactions.

    Directory of Open Access Journals (Sweden)

    Pedro Saa

    2015-04-01

    Full Text Available Kinetic models provide the means to understand and predict the dynamic behaviour of enzymes upon different perturbations. Despite their obvious advantages, classical parameterizations require large amounts of data to fit their parameters. Particularly, enzymes displaying complex reaction and regulatory (allosteric mechanisms require a great number of parameters and are therefore often represented by approximate formulae, thereby facilitating the fitting but ignoring many real kinetic behaviours. Here, we show that full exploration of the plausible kinetic space for any enzyme can be achieved using sampling strategies provided a thermodynamically feasible parameterization is used. To this end, we developed a General Reaction Assembly and Sampling Platform (GRASP capable of consistently parameterizing and sampling accurate kinetic models using minimal reference data. The former integrates the generalized MWC model and the elementary reaction formalism. By formulating the appropriate thermodynamic constraints, our framework enables parameterization of any oligomeric enzyme kinetics without sacrificing complexity or using simplifying assumptions. This thermodynamically safe parameterization relies on the definition of a reference state upon which feasible parameter sets can be efficiently sampled. Uniform sampling of the kinetics space enabled dissecting enzyme catalysis and revealing the impact of thermodynamics on reaction kinetics. Our analysis distinguished three reaction elasticity regions for common biochemical reactions: a steep linear region (0> ΔGr >-2 kJ/mol, a transition region (-2> ΔGr >-20 kJ/mol and a constant elasticity region (ΔGr <-20 kJ/mol. We also applied this framework to model more complex kinetic behaviours such as the monomeric cooperativity of the mammalian glucokinase and the ultrasensitive response of the phosphoenolpyruvate carboxylase of Escherichia coli. In both cases, our approach described appropriately not only
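
    GRASP's exact constraint set is not given in this record, but for a linear (single-cycle) enzyme mechanism a standard way thermodynamic feasibility constrains the elementary rate constants is the generalized Haldane relationship:

        \prod_i \frac{k_{+i}}{k_{-i}} = K_{\mathrm{eq}} = \exp\!\left(-\frac{\Delta G_r^{\circ}}{RT}\right)

    so that, once a reference state and the reaction Gibbs energy are fixed, only rate-constant combinations consistent with this product constraint are thermodynamically admissible for sampling.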

  17. Content sensitivity based access control framework for Hadoop

    Directory of Open Access Journals (Sweden)

    T.K. Ashwin Kumar

    2017-11-01

    Full Text Available Big data technologies have seen tremendous growth in recent years. They are widely used in both industry and academia. In spite of such exponential growth, these technologies lack adequate measures to protect data from misuse/abuse. Corporations that collect data from multiple sources are at risk of liabilities due to the exposure of sensitive information. In the current implementation of Hadoop, only file-level access control is feasible. Providing users with the ability to access data based on the attributes in a dataset or the user's role is complicated because of the sheer volume and multiple formats (structured, unstructured and semi-structured) of data. In this paper, we propose an access control framework, which enforces access control policies dynamically based on the sensitivity of the data. This framework enforces access control policies by harnessing the data context, usage patterns and information sensitivity. Information sensitivity changes over time with the addition and removal of datasets, which can lead to modifications in access control decisions. The proposed framework accommodates these changes. The proposed framework is automated to a large extent as the data itself determines the sensitivity with minimal user intervention. Our experimental results show that the proposed framework is capable of enforcing access control policies on non-multimedia datasets with minimal overhead.
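
    The record leaves the policy mechanics abstract; the following toy sketch (all sensitivity labels, clearances and the redaction rule are hypothetical, not the paper's actual engine) shows the kind of decision a sensitivity-driven filter makes at the attribute level.

```python
# Toy sketch of a sensitivity-driven access decision (labels and clearances
# hypothetical): return only the fields the user is cleared to see.
SENSITIVITY = {"name": 1, "zip_code": 2, "diagnosis": 3, "ssn": 4}  # per-attribute levels

def filter_record(record, user_clearance):
    """Keep only the fields whose sensitivity level does not exceed the clearance."""
    return {field: value for field, value in record.items()
            if SENSITIVITY.get(field, 0) <= user_clearance}

record = {"name": "A. Smith", "zip_code": "55455", "diagnosis": "flu", "ssn": "xxx-xx-1234"}
print(filter_record(record, user_clearance=2))   # only name and zip_code survive
```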

  18. Advances in heuristically based generalized perturbation theory

    International Nuclear Information System (INIS)

    Gandini, A.

    1994-01-01

    A distinctive feature of the heuristically based generalized perturbation theory methodology is the systematic use of importance conservation concepts. As is well known, this use leads to fundamental reciprocity relationships, whereas the alternative variational and differential approaches make consistent use of adjoint functions and their properties. The equivalence between the importance and the adjoint functions has been demonstrated in important cases. There are some instances, however, in which the commonly known operators governing the adjoint function are not adequate. In this paper, ways proposed to generalize these rules, as adopted with the heuristic generalized perturbation theory methodology, are illustrated. When applied to the neutron/nuclide field characterizing the core evolution in a power reactor system, in which an intensive control variable (ρ) is also defined, these rules lead to an orthogonality relationship connected to this same control variable. A set of ρ-mode eigenfunctions may be correspondingly defined, and an extended concept of reactivity (generalizing that commonly associated with the multiplication factor) is proposed as more directly indicative of the controllability of a critical reactor system. (author). 25 refs

  19. Framework, process and tool for managing technology-based assets

    CSIR Research Space (South Africa)

    Kfir, R

    2000-10-01

    Full Text Available ) and the intellectual property (IP) of the organisation, The study describes a framework linking the core processes supporting the management of technology-based assets and offerings with other organisational elements such as leadership, strategy, and culture. Specific...

  20. A Conceptual Framework for Web-Based Learning Design

    Science.gov (United States)

    Alomyan, Hesham

    2017-01-01

    The purpose of this paper is to provide a coherent framework to present the relationship between individual differences and web-based learning. Two individual difference factors have been identified for investigation within the present paper: Cognitive style and prior knowledge. The importance of individual differences is reviewed and previous…

  1. Verification Based on Set-Abstraction Using the AIF Framework

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander

    The AIF framework is a novel method for analyzing advanced security protocols, web services, and APIs, based on a new abstract interpretation method. It consists of the specification language AIF and a translation/abstraction process that produces a set of first-order Horn clauses. These can...

  2. The ETLMR MapReduce-Based ETL Framework

    DEFF Research Database (Denmark)

    Xiufeng, Liu; Thomsen, Christian; Pedersen, Torben Bach

    2011-01-01

    This paper presents ETLMR, a parallel Extract--Transform--Load (ETL) programming framework based on MapReduce. It has built-in support for high-level ETL-specific constructs including star schemas, snowflake schemas, and slowly changing dimensions (SCDs). ETLMR gives both high programming...
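
    ETLMR's actual constructs are not shown in this record; as a rough illustration of the map/reduce split such an ETL flow relies on (function names and the source schema are hypothetical), a map phase can emit dimension members keyed by their natural key and a reduce phase can deduplicate them and assign surrogate keys.

```python
# Rough illustration of a MapReduce-style dimension load (hypothetical names;
# not ETLMR's API): map emits (natural_key, attributes), reduce deduplicates
# members and assigns surrogate keys.
from collections import defaultdict
from itertools import count

def map_phase(source_rows):
    for row in source_rows:
        yield row["product_code"], {"name": row["product_name"], "brand": row["brand"]}

def reduce_phase(mapped):
    grouped = defaultdict(list)
    for key, attrs in mapped:
        grouped[key].append(attrs)
    surrogate = count(1)
    # keep one member per natural key and give it a surrogate key
    return {key: {"sk": next(surrogate), **versions[0]} for key, versions in grouped.items()}

rows = [{"product_code": "P1", "product_name": "Widget", "brand": "Acme"},
        {"product_code": "P1", "product_name": "Widget", "brand": "Acme"},
        {"product_code": "P2", "product_name": "Gadget", "brand": "Acme"}]
print(reduce_phase(map_phase(rows)))
```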

  3. Semantic-based framework for personalised ambient media

    NARCIS (Netherlands)

    Aroyo, L.M.; Bellekens, P.; Björkman, M.; Houben, G.J

    2008-01-01

    The paper proposes a semantic-based metadata framework for personalised interaction with TV media in a connected home context. Our approach allows the current home media centres to go beyond the simple concept of electronic programme guides and to offer the users a personalised media experience in

  4. A community-based framework for aquatic ecosystem models

    NARCIS (Netherlands)

    Trolle, D.; Hamilton, D.P.; Hipsey, M.R.; Bolding, K.; Bruggeman, J.; Mooij, W.M.; Janse, J.H.; Nielsen, A.; Jeppesen, E.; Elliott, J.A.; Makler-Pick, V.; Petzoldt, T.; Rinke, K.; Flindt, M.R.; Arhonditsis, G.B.; Gal, G.; Bjerring, R.; Tominaga, K.; Hoen, 't J.; Downing, A.S.; Marques, D.M.; Fragoso, C.R.; Sondergaard, M.; Hanson, P.C.

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a

  5. Creating a nursing strategic planning framework based on evidence.

    Science.gov (United States)

    Shoemaker, Lorie K; Fischer, Brenda

    2011-03-01

    This article describes an evidence-informed strategic planning process and framework used by a Magnet-recognized public health system in California. This article includes (1) an overview of the organization and its strategic planning process, (2) the structure created within nursing for collaborative strategic planning and decision making, (3) the strategic planning framework developed based on the organization's balanced scorecard domains and the new Magnet model, and (4) the process undertaken to develop the nursing strategic priorities. Outcomes associated with the structure, process, and key initiatives are discussed throughout the article. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Developing a framework of, and quality indicators for, general practice management in Europe.

    Science.gov (United States)

    Engels, Yvonne; Campbell, Stephen; Dautzenberg, Maaike; van den Hombergh, Pieter; Brinkmann, Henrik; Szécsényi, Joachim; Falcoff, Hector; Seuntjens, Luc; Kuenzi, Beat; Grol, Richard

    2005-04-01

    To develop a framework for general practice management made up of quality indicators shared by six European countries. Two-round postal Delphi questionnaire in the setting of general practice in Belgium, France, Germany, The Netherlands, Switzerland and the United Kingdom. Six national expert panels, each consisting of 10 members, primarily primary care practitioners and experts in the field of quality in primary care participated in the study. The main outcome measures were: (a) a European framework with indicators for the organization of primary care; and (b) ratings of the face validity of the usefulness of the indicators by expert panels in six countries. Agreement was reached about a definition of practice management across five domains (infrastructure, staff, information, finance, and quality and safety), and a common set of indicators for the organization of general practice. The panellist response rate was 95%. Sixty-two indicators (37%) were rated face valid by all six panels. Examples include out of hours service, accessibility, the content of doctors' bags and staff involvement in quality improvement. No indicators were rated invalid by all six panels. It proved to be possible to develop a European set of indicators for assessing the quality of practice management, despite the differences in health care systems and cultures in the six different countries. These indicators will now be used in a quality assessment procedure of practice management in nine European countries. While organizational indicators are part of the new GMS contract in the UK, this research shows that many practice management issues within primary care are also of relevance in other European countries.

  7. Corporate Innovation Management Framework Based On Design Thinking

    DEFF Research Database (Denmark)

    Efeoglu, Arkin; Møller, Charles; Serie, Michel

    2014-01-01

    This paper proposes use of an Innovation Management Framework through the roll-out of Design Thinking in a multi-national company by applying adequately framed organizational governance. An Innovation Management Framework based on the principles of Design Thinking provides central pillars...... that not only ensure effective governance. The elevation both from Innovation Management to foster Design Thinking and vice versa the Design Thinking characteristics that strengthen the corporate innovativeness through governance is in focus. With the latter in mind, this paper therefore looks at Innovation...... to be governed. An Innovation Management Framework with principles of DT may avoid uncoordinated innovation capabilities. Ultimately innovation will not be an R&D topic anymore but become part of every employee’s job, irrespective of his or her position. In a previous paper DT characteristics were evaluated...

  8. Nature-based supportive care opportunities: a conceptual framework.

    Science.gov (United States)

    Blaschke, Sarah; O'Callaghan, Clare C; Schofield, Penelope

    2018-03-22

    Given preliminary evidence for positive health outcomes related to contact with nature for cancer populations, research is warranted to ascertain possible strategies for incorporating nature-based care opportunities into oncology contexts as additional strategies for addressing multidimensional aspects of cancer patients' health and recovery needs. The objective of this study was to consolidate existing research related to nature-based supportive care opportunities and generate a conceptual framework for discerning relevant applications in the supportive care setting. Drawing on research investigating nature-based engagement in oncology contexts, a two-step analytic process was used to construct a conceptual framework for guiding nature-based supportive care design and future research. Concept analysis methodology generated new representations of understanding by extracting and synthesising salient concepts. Newly formulated concepts were transposed to findings from related research about patient-reported and healthcare expert-developed recommendations for nature-based supportive care in oncology. Five theoretical concepts (themes) were formulated describing patients' reasons for engaging with nature and the underlying needs these interactions address. These included: connecting with what is genuinely valued, distancing from the cancer experience, meaning-making and reframing the cancer experience, finding comfort and safety, and vital nurturance. Eight shared patient and expert recommendations were compiled, which address the identified needs through nature-based initiatives. Eleven additional patient-reported recommendations attend to beneficial and adverse experiential qualities of patients' nature-based engagement and complete the framework. The framework outlines salient findings about helpful nature-based supportive care opportunities for ready access by healthcare practitioners, designers, researchers and patients themselves. © Article author(s) (or their

  9. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard
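
    As a minimal sketch of the kind of simulation loop an agent-based framework schedules (illustrative only, not the GRAMS reference model itself): each time step, every agent observes its neighbours and updates its internal state.

```python
# Minimal agent-based simulation loop (illustrative; not the GRAMS reference model):
# agents repeatedly nudge a toy "opinion" state toward the neighbourhood average.
import random

class Agent:
    def __init__(self, agent_id):
        self.agent_id = agent_id
        self.opinion = random.random()      # toy internal state

    def step(self, neighbours):
        # move this agent's state halfway toward the neighbourhood average
        avg = sum(n.opinion for n in neighbours) / len(neighbours)
        self.opinion += 0.5 * (avg - self.opinion)

random.seed(0)
agents = [Agent(i) for i in range(10)]
for t in range(20):                          # discrete time steps
    for agent in agents:
        agent.step([n for n in agents if n is not agent])
print(min(a.opinion for a in agents), max(a.opinion for a in agents))  # states converge
```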

  10. A Comparison between Linear IRT Observed-Score Equating and Levine Observed-Score Equating under the Generalized Kernel Equating Framework

    Science.gov (United States)

    Chen, Haiwen

    2012-01-01

    In this article, linear item response theory (IRT) observed-score equating is compared under a generalized kernel equating framework with Levine observed-score equating for nonequivalent groups with anchor test design. Interestingly, these two equating methods are closely related despite being based on different methodologies. Specifically, when…
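
    For orientation, the classical linear observed-score equating function (a textbook identity, not a result of this article) maps a score x on form X to the scale of form Y via

        l_Y(x) = \mu_Y + \frac{\sigma_Y}{\sigma_X}\,(x - \mu_X)

    and, roughly speaking, both the linear IRT observed-score method and the Levine observed-score method can be viewed as different ways of estimating the synthetic-population means and standard deviations entering this formula under the nonequivalent-groups anchor test design.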

  11. Conceptual Framework for Agent-Based Modeling of Customer-Oriented Supply Networks

    OpenAIRE

    Solano-Vanegas, Clara; Carrillo-Ramos, Angela; Montoya-Torres, Jairo

    2015-01-01

    Part 3: Collaboration Frameworks; Supply Networks (SN) are complex systems involving the interaction of different actors, very often, with different objectives and goals. Among the different existing modeling approaches, agent-based systems can properly represent the autonomous behavior of SN links and, simultaneously, observe the general response of the system as a result of individual actions. Most of research using agent-based modeling in SN focuses on production is...

  12. A Generalized Decision Framework Using Multi-objective Optimization for Water Resources Planning

    Science.gov (United States)

    Basdekas, L.; Stewart, N.; Triana, E.

    2013-12-01

    Colorado Springs Utilities (CSU) is currently engaged in an Integrated Water Resource Plan (IWRP) to address the complex planning scenarios, across multiple time scales, currently faced by CSU. The modeling framework developed for the IWRP uses a flexible data-centered Decision Support System (DSS) with a MODSIM-based modeling system to represent the operation of the current CSU raw water system coupled with a state-of-the-art multi-objective optimization algorithm. Three basic components are required for the framework, which can be implemented for planning horizons ranging from seasonal to interdecadal. First, a water resources system model is required that is capable of reasonable system simulation to resolve performance metrics at the appropriate temporal and spatial scales of interest. The system model should be an existing simulation model, or one developed during the planning process with stakeholders, so that 'buy-in' has already been achieved. Second, a hydrologic scenario tool(s) capable of generating a range of plausible inflows for the planning period of interest is required. This may include paleo informed or climate change informed sequences. Third, a multi-objective optimization model that can be wrapped around the system simulation model is required. The new generation of multi-objective optimization models do not require parameterization which greatly reduces problem complexity. Bridging the gap between research and practice will be evident as we use a case study from CSU's planning process to demonstrate this framework with specific competing water management objectives. Careful formulation of objective functions, choice of decision variables, and system constraints will be discussed. Rather than treating results as theoretically Pareto optimal in a planning process, we use the powerful multi-objective optimization models as tools to more efficiently and effectively move out of the inferior decision space. The use of this framework will help CSU

  13. A general-purpose process modelling framework for marine energy systems

    International Nuclear Information System (INIS)

    Dimopoulos, George G.; Georgopoulou, Chariklia A.; Stefanatos, Iason C.; Zymaris, Alexandros S.; Kakalis, Nikolaos M.P.

    2014-01-01

    Highlights: • Process modelling techniques applied in marine engineering. • Systems engineering approaches to manage the complexity of modern ship machinery. • General purpose modelling framework called COSSMOS. • Mathematical modelling of conservation equations and related chemical – transport phenomena. • Generic library of ship machinery component models. - Abstract: High fuel prices, environmental regulations and current shipping market conditions require ships to operate in a more efficient and greener way. These drivers lead to the introduction of new technologies, fuels, and operations, increasing the complexity of modern ship energy systems. As a means to manage this complexity, in this paper we present the introduction of systems engineering methodologies in marine engineering via the development of a general-purpose process modelling framework for ships named DNV COSSMOS. Shifting the focus from components – the standard approach in shipping – to systems widens the space for optimal design and operation solutions. The associated computer implementation of COSSMOS is a platform that models, simulates and optimises integrated marine energy systems with respect to energy efficiency, emissions, safety/reliability and costs, under both steady-state and dynamic conditions. DNV COSSMOS can be used in assessment and optimisation of design and operation problems in existing vessels, new builds as well as new technologies. The main features and our modelling approach are presented and key capabilities are illustrated via two studies on the thermo-economic design and operation optimisation of a combined cycle system for large bulk carriers, and the transient operation simulation of an electric marine propulsion system

  14. A Cluster-Based Framework for the Security of Medical Sensor Environments

    Science.gov (United States)

    Klaoudatou, Eleni; Konstantinou, Elisavet; Kambourakis, Georgios; Gritzalis, Stefanos

    The adoption of Wireless Sensor Networks (WSNs) in the healthcare sector poses many security issues, mainly because medical information is considered particularly sensitive. The security mechanisms employed are expected to be more efficient in terms of energy consumption and scalability in order to cope with the constrained capabilities of WSNs and patients’ mobility. Towards this goal, cluster-based medical WSNs can substantially improve efficiency and scalability. In this context, we have proposed a general framework for cluster-based medical environments on top of which security mechanisms can rely. This framework fully covers the varying needs of both in-hospital environments and environments formed ad hoc for medical emergencies. In this paper, we further elaborate on the security of our proposed solution. We specifically focus on key establishment mechanisms and investigate the group key agreement protocols that can best fit in our framework.

  15. Generalization of information-based concepts in forecast verification

    Science.gov (United States)

    Tödter, J.; Ahrens, B.

    2012-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are briefly reviewed, and then the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN is illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.
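
    As a minimal illustration of the two scores discussed above (not code from the paper), the Python sketch below computes the Brier Score and the Ignorance Score for a set of binary probability forecasts. The function names and the clipping constant are assumptions made here for the example.

        import numpy as np

        def brier_score(p, o):
            """Mean squared difference between forecast probabilities p and binary outcomes o."""
            p, o = np.asarray(p, float), np.asarray(o, float)
            return np.mean((p - o) ** 2)

        def ignorance_score(p, o, eps=1e-12):
            """Mean negative log2 probability assigned to the outcome that actually occurred."""
            p, o = np.asarray(p, float), np.asarray(o, float)
            p_obs = np.where(o == 1, p, 1.0 - p)          # probability given to the observed category
            return -np.mean(np.log2(np.clip(p_obs, eps, 1.0)))

        # toy example: three binary forecasts and their outcomes
        probs = [0.9, 0.2, 0.6]
        outcomes = [1, 0, 0]
        print(brier_score(probs, outcomes), ignorance_score(probs, outcomes))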

  16. A Framework for Agent-based Human Interaction Support

    Directory of Open Access Journals (Sweden)

    Axel Bürkle

    2008-10-01

    Full Text Available In this paper we describe an agent-based infrastructure for multimodal perceptual systems which aims at developing and realizing computer services that are delivered to humans in an implicit and unobtrusive way. The framework presented here supports the implementation of human-centric context-aware applications providing non-obtrusive assistance to participants in events such as meetings, lectures, conferences and presentations taking place in indoor "smart spaces". We emphasize the design and implementation of an agent-based framework that supports "pluggable" service logic in the sense that the service developer can concentrate on coding the service logic independently of the underlying middleware. Furthermore, we give an example of the architecture's ability to support the cooperation of multiple services in a meeting scenario using an intelligent connector service and a semantic web oriented travel service.

  17. Offset Free Tracking Predictive Control Based on Dynamic PLS Framework

    Directory of Open Access Journals (Sweden)

    Jin Xin

    2017-10-01

    Full Text Available This paper develops an offset free tracking model predictive control based on a dynamic partial least square (PLS) framework. First, a state space model is used as the inner model of PLS to describe the dynamic system, where a subspace identification method is used to identify the inner model. Based on the obtained model, multiple independent model predictive control (MPC) controllers are designed. Due to the decoupling character of PLS, these controllers run separately, which is suitable for a distributed control framework. In addition, the increment of the inner model output is considered in the cost function of MPC, which introduces integral action into the controller. Hence, offset-free tracking performance is guaranteed. The results of an industry background simulation demonstrate the effectiveness of the proposed method.

  18. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through...... a literature survey, we document the growing importance of numerical aquatic ecosystem models while also noting the difficulties, up until now, of the aquatic scientific community to make significant advances in these models during the past two decades. Through a common forum for aquatic ecosystem modellers we...... aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental, (iv...

  19. An Attribute Based Access Control Framework for Healthcare System

    Science.gov (United States)

    Afshar, Majid; Samet, Saeed; Hu, Ting

    2018-01-01

    Nowadays, access control is an indispensable part of the Personal Health Record and provides for its confidentiality by enforcing policies and rules that ensure only authorized users gain access to requested resources in the system. In other words, access control means protecting patient privacy in healthcare systems. Attribute-Based Access Control (ABAC) is a new access control model that can be used instead of traditional types of access control such as Discretionary Access Control, Mandatory Access Control, and Role-Based Access Control. During the last five years, ABAC has seen applications in both academic research and industry. Using the attributes of users and resources, ABAC makes a decision for each access request. In this paper, we propose an ABAC framework for healthcare systems. We use an ABAC engine for rendering and enforcing healthcare policies. Moreover, we handle emergency situations in this framework.
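
    A minimal sketch of an attribute-based access decision in Python. The function name, the example policy rules (including the emergency "break-glass" rule) and the attribute names are invented here for illustration; they are not the engine or policy language used in the paper.

        def abac_decide(user, resource, action, context):
            """Grant access when at least one policy rule matches the combined attributes."""
            policies = [
                # treating physicians may read records of their own patients
                lambda u, r, a, c: a == "read" and u["role"] == "physician"
                                   and u["id"] in r["care_team"],
                # anyone may read during a declared emergency (break-glass rule)
                lambda u, r, a, c: a == "read" and c.get("emergency", False),
            ]
            return any(rule(user, resource, action, context) for rule in policies)

        # toy usage
        user = {"id": "u17", "role": "physician"}
        record = {"patient": "p03", "care_team": {"u17", "u22"}}
        print(abac_decide(user, record, "read", {}))                                          # True: treating physician
        print(abac_decide({"id": "u99", "role": "nurse"}, record, "read", {"emergency": True}))  # True: emergency override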

  20. Defining Smart City. A Conceptual Framework Based on Keyword Analysis

    Directory of Open Access Journals (Sweden)

    Farnaz Mosannenzadeh

    2014-05-01

    Full Text Available “Smart city” is a concept that has been the subject of increasing attention in urban planning and governance during recent years. The first step in creating Smart Cities is to understand the concept. However, a brief review of literature shows that the concept of Smart City is the subject of controversy. Thus, the main purpose of this paper is to provide a conceptual framework to define Smart City. To this aim, an extensive literature review was done. Then, a keyword analysis of the literature was performed against the main research questions (why, what, who, when, where, how) and based on three main domains involved in the policy decision making process and Smart City plan development: Academic, Industrial and Governmental. This resulted in a conceptual framework for Smart City. The result clarifies the definition of Smart City, while providing a framework to define each of Smart City’s sub-systems. Moreover, urban authorities can apply this framework in Smart City initiatives in order to recognize their main goals, main components, and key stakeholders.

  1. An extended framework for adaptive playback-based video summarization

    Science.gov (United States)

    Peker, Kadir A.; Divakaran, Ajay

    2003-11-01

    In our previous work, we described an adaptive fast playback framework for video summarization where we changed the playback rate using the motion activity feature so as to maintain a constant "pace." This method provides an effective way of skimming through video, especially when the motion is not too complex and the background is mostly still, such as in surveillance video. In this paper, we present an extended summarization framework that, in addition to motion activity, uses semantic cues such as face or skin color appearance, speech and music detection, or other domain dependent semantically significant events to control the playback rate. The semantic features we use are computationally inexpensive and can be computed in compressed domain, yet are robust, reliable, and have a wide range of applicability across different content types. The presented framework also allows for adaptive summaries based on preference, for example, to include more dramatic vs. action elements, or vice versa. The user can switch at any time between the skimming and the normal playback modes. The continuity of the video is preserved, and complete omission of segments that may be important to the user is avoided by using adaptive fast playback instead of skipping over long segments. The rule-set and the input parameters can be further modified to fit a certain domain or application. Our framework can be used by itself, or as a subsequent presentation stage for a summary produced by any other summarization technique that relies on generating a sub-set of the content.
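
    A minimal sketch of the constant-pace idea described above, assuming a hypothetical playback_rate helper and illustrative rate limits: low-activity segments are skimmed quickly, while segments carrying semantic cues (e.g., detected faces or speech) are kept near normal speed. This is only a simplified rendering of the paper's rule set.

        def playback_rate(motion_activity, target_pace, semantic_cue=False,
                          min_rate=1.0, max_rate=8.0):
            """Speed up low-activity segments, slow down for high activity or semantic cues."""
            rate = target_pace / max(motion_activity, 1e-6)   # constant-pace rule
            if semantic_cue:                                  # e.g. face or speech detected
                rate = min(rate, 1.5)                         # keep such segments near normal speed
            return max(min_rate, min(rate, max_rate))

        # toy usage: a low-activity segment is skimmed fast, a high-activity one plays near normal speed
        print(playback_rate(0.2, target_pace=1.0), playback_rate(1.3, target_pace=1.0))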

  2. Obfuscation Framework Based on Functionally Equivalent Combinatorial Logic Families

    Science.gov (United States)

    2008-03-01

  3. Arcade: A Web-Java Based Framework for Distributed Computing

    Science.gov (United States)

    Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.

  4. Molecularly Imprinted Polymer/Metal Organic Framework Based Chemical Sensors

    Directory of Open Access Journals (Sweden)

    Zhenzhong Guo

    2016-10-01

    Full Text Available The present review describes recent advances in the concept of molecular imprinting using metal organic frameworks (MOF for development of chemical sensors. Two main strategies regarding the fabrication, performance and applications of recent sensors based on molecularly imprinted polymers associated with MOF are presented: molecularly imprinted MOF films and molecularly imprinted core-shell nanoparticles using MOF as core. The associated transduction modes are also discussed. A brief conclusion and future expectations are described herein.

  5. A closed-loop based framework for design requirement management

    DEFF Research Database (Denmark)

    Zhang, Zhinan; Li, Xuemeng; Liu, Zelin

    2014-01-01

    management from product lifecycle, and requirement and requirement management lifecycle views. This paper highlights the importance of requirement lifecycle management and aims at closing the requirement information loop in product lifecycle. Then, it addresses the requirement management in engineering...... design field with focusing on the dynamics nature and incomplete nature of requirements. Finally, a closed-loop based framework is proposed for requirement management in engineering design....

  6. Quantification of Technology Innovation Usinga Risk-Based Framework

    OpenAIRE

    Gerard E. Sleefe

    2010-01-01

    There is significant interest in achieving technology innovation through new product development activities. It is recognized, however, that traditional project management practices focused only on performance, cost, and schedule attributes, can often lead to risk mitigation strategies that limit new technology innovation. In this paper, a new approach is proposed for formally managing and quantifying technology innovation. This approach uses a risk-based framework that s...

  7. General framework for real-time implementation of balancing services market outcome

    NARCIS (Netherlands)

    Virag, A.; Jokic, A.; Bosch, van den P.P.J.

    2012-01-01

    Load frequency control (LFC) is widely used for the real-time balancing of connected power systems. With the deregulation of power system markets, it is necessary to adapt the current LFC to support market-based operation. In this paper we propose a general balancing tool which covers both,

  8. Making a Map of Science: General Systems Theory as a Conceptual Framework for Tertiary Science Education.

    Science.gov (United States)

    Gulyaev, Sergei A.; Stonyer, Heather R.

    2002-01-01

    Develops an integrated approach based on the use of general systems theory (GST) and the concept of 'mapping' scientific knowledge to provide students with tools for a more holistic understanding of science. Uses GST as the core methodology for understanding science and its complexity. Discusses the role of scientific community in producing…

  9. A Framework for a General Purpose Intelligent Control System for Particle Accelerators. Phase II Final Report

    International Nuclear Information System (INIS)

    Westervelt, Robert; Klein, William; Kroupa, Michael; Olsson, Eric; Rothrock, Rick

    1999-01-01

    Vista Control Systems, Inc. has developed a portable system for intelligent accelerator control. The design is general in scope and is thus configurable to a wide range of accelerator facilities and control problems. The control system employs a multi-layer organization in which knowledge-based decision making is used to dynamically configure lower level optimization and control algorithms

  10. A knowledge-based system framework for real-time monitoring applications

    International Nuclear Information System (INIS)

    Heaberlin, J.O.; Robinson, A.H.

    1989-01-01

    A real-time environment presents a challenge for knowledge-based systems for process monitoring with on-line data acquisition in nuclear power plants. These applications are typically data intensive. This, coupled with the dynamic nature of events on which problematic decisions are based, requires the development of techniques fundamentally different from those generally employed. Traditional approaches involve knowledge management techniques developed for static data, the majority of which is elicited directly from the user in a consultation environment. Inference mechanisms are generally noninterruptible, requiring all appropriate rules to be fired before new data can be accommodated. As a result, traditional knowledge-based applications in real-time environments have inherent problems in dealing with the time dependence of both the data and the solution process. For example, potential problems include obtaining a correct solution too late to be of use or focusing computing resources on problems that no longer exist. A knowledge-based system framework, the real-time framework (RTF), has been developed that can accommodate the time dependencies and resource trade-offs required for real-time process monitoring applications. This framework provides real-time functionality by using generalized problem-solving goals and control strategies that are modifiable during system operation and capable of accommodating feedback for redirection of activities

  11. Online Data Monitoring Framework Based on Histogram Packaging in Network Distributed Data Acquisition Systems

    International Nuclear Information System (INIS)

    Konno, T; Ishitsuka, M; Kuze, M; Cabrera, A; Sakamoto, Y

    2011-01-01

    The online monitor framework is a new general software framework for online data monitoring. It provides a way to collect information from online systems, including data acquisition, and to display it to shifters far from the experimental site. 'Monitor Server', the core system in this framework, gathers the monitoring information from the online subsystems; the information is handled as collections of histograms named 'Histogram Packages'. Monitor Server broadcasts the histogram packages to 'Monitor Viewers', the graphical user interfaces of the framework. We developed two types of viewers with different technologies: Java and web browser. We adopted XML-based files for the configuration of GUI components in the windows and of graphical objects on the canvases, so Monitor Viewer creates its GUIs automatically from the configuration files. This monitoring framework has been developed for the Double Chooz reactor neutrino oscillation experiment in France, but it can be extended for general use in other experiments. This document reports the structure of the online monitor framework with some examples from its adaptation to the Double Chooz experiment.

  12. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  13. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  14. An openstack-based flexible video transcoding framework in live

    Science.gov (United States)

    Shi, Qisen; Song, Jianxin

    2017-08-01

    With the rapid development of the mobile live-streaming business, transcoding HD video is often a challenge for mobile devices due to their limited processing capability and bandwidth-constrained network connections. For live service providers, it is wasteful to deploy a large number of dedicated transcoding servers, because some of them are idle at times. To deal with this issue, this paper proposes an OpenStack-based flexible transcoding framework that achieves real-time video adaptation for mobile devices and uses computing resources efficiently. To this end, we introduce a method of video stream splitting and VM resource scheduling based on access pressure prediction, where the access pressure is forecast by an AR model.
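
    A minimal sketch of the access-pressure forecasting step, assuming a plain least-squares AR(p) fit in Python with hypothetical function names and toy data; the paper's actual predictor and scheduling logic are not specified here.

        import numpy as np

        def fit_ar(series, p=3):
            """Least-squares fit of an AR(p) model; returns coefficients (newest lag first)."""
            x = np.asarray(series, float)
            X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
            y = x[p:]
            coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coeffs

        def forecast_next(series, coeffs):
            """One-step-ahead forecast from the most recent p observations."""
            p = len(coeffs)
            recent = np.asarray(series, float)[-1:-p - 1:-1]   # newest observation first
            return float(np.dot(coeffs, recent))

        # toy usage: predict the next access pressure from a short history of request counts
        history = [120, 132, 151, 149, 163, 178, 185, 190]
        coeffs = fit_ar(history, p=3)
        print(forecast_next(history, coeffs))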

  15. Competitive advantage: an analytical framework based on entrepreneurship

    Institute of Scientific and Technical Information of China (English)

    LIU Zhibiao

    2006-01-01

    This article examines the role and effect of entrepreneurship within the theoretical framework of the resource-based view (RBV). It advances a competitive advantage theory based on entrepreneurship by demonstrating the distinctiveness of entrepreneurship. The distinctive cognitive competence of entrepreneurs provides them with person-specific assets, which determine both the competence to explore new business opportunities and the competence to integrate resources for risk-taking activities. The characteristics of an intangible resource such as entrepreneurship, namely its distinctiveness, limited competition, and incomplete factor mobility, are among the most important sources of enterprises' competitive advantage in the strategic management theory of RBV.

  16. A VGI data integration framework based on linked data model

    Science.gov (United States)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses geographic data integration and sharing for multiple online VGI data sets. We propose a semantics-enabled framework for a cooperative application environment over online VGI sources to solve a target class of geospatial problems. Based on linked data technologies, one of the core components of the semantic web, we construct relationship links among geographic features distributed across diverse VGI platforms using linked data modeling methods, deploy these semantics-enabled entities on the web, and eventually form an interconnected geographic data network that supports cooperative geospatial applications across multiple VGI data sources. The mapping and transformation from VGI sources to an RDF linked data model is presented to guarantee a uniform data representation model across different online social geographic data sources. We propose a mixed strategy that combines spatial distance similarity and feature name similarity as the measure for comparing and matching geographic features in the various VGI data sets, and we focus on how to apply Markov logic networks to interlink representations of the same entity across different VGI-based linked data sets. The automatic generation of a co-reference object identification model from geographic linked data is discussed in more detail. The result is a large geographic linked data network spanning loosely coupled VGI web sites. The experiments built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
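
    A minimal sketch of the mixed matching strategy, combining a spatial distance similarity with a name similarity. The weights, distance scale and threshold are assumptions for illustration, and the Markov logic network interlinking step of the framework is not shown.

        import math
        from difflib import SequenceMatcher

        def name_similarity(a, b):
            """String similarity of two feature names, in [0, 1]."""
            return SequenceMatcher(None, a.lower(), b.lower()).ratio()

        def distance_similarity(p1, p2, scale_m=100.0):
            """Spatial similarity decaying with Euclidean distance (coordinates in metres)."""
            d = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
            return math.exp(-d / scale_m)

        def match_score(f1, f2, w_space=0.6, w_name=0.4):
            """Weighted combination of spatial and name similarity for candidate co-references."""
            return (w_space * distance_similarity(f1["xy"], f2["xy"])
                    + w_name * name_similarity(f1["name"], f2["name"]))

        # toy features from two hypothetical VGI sources
        a = {"name": "Central Park Cafe", "xy": (1200.0, 850.0)}
        b = {"name": "Cafe Central Park", "xy": (1215.0, 842.0)}
        print(match_score(a, b) > 0.7)   # treat as the same feature above a chosen threshold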

  17. A Simulation-as-a-Service Framework Facilitating WebGIS-Based Installation Planning

    Science.gov (United States)

    Zheng, Z.; Chang, Z. Y.; Fei, Y. F.

    2017-09-01

    Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for properly deploying facilities in space and configuring their functions, so that they form a cohesive and supportive system that meets users' operational needs. Based on a requirement analysis, we propose a framework that combines GIS and agent simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, agent-based simulation runs as a service on the server and exposes basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service is able to utilize various kinds of geoprocessing services in the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits, such as ease of use, on-demand availability, shared understanding, and improved performance. Finally, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.

  18. A collaborative knowledge management framework for supply chains: A UML-based model approach

    Directory of Open Access Journals (Sweden)

    Jorge Esteban Hernández

    2008-12-01

    Full Text Available In the most general case, collaborative activities imply a distributed decision-making process involving several supply chain nodes. In this paper, based on a literature review and on the deficiencies of existing proposals, a UML-based collaborative knowledge management framework is proposed. In addition, this proposal synthesizes existing knowledge, and it not only fills, but enriches, each component with the modeller’s own knowledge.

  19. A KPI framework for process-based benchmarking of hospital information systems.

    Science.gov (United States)

    Jahn, Franziska; Winter, Alfred

    2011-01-01

    Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest to benchmark HIS based on clinical documentation processes and their outcome. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.

  20. Micromechanics based framework with second-order damage tensors

    Science.gov (United States)

    Desmorat, R.; Desmorat, B.; Olive, M.; Kolev, B.

    2018-05-01

    The harmonic product of tensors, leading to the concept of harmonic factorization, has been defined in a previous work (Olive et al., 2017). In the practical case of 3D crack density measurements on thin- or thick-walled structures, this mathematical tool allows us to factorize the harmonic (irreducible) part of the fourth-order damage tensor as a harmonic square: an exact harmonic square in 2D, and a harmonic square over the set of so-called mechanically accessible directions for measurements in the 3D case. The corresponding micromechanics framework based on second-order, instead of fourth-order, damage tensors is derived. An illustrative example shows how the proposed framework allows for the modeling of the so-called hydrostatic sensitivity up to high damage levels.

  1. Community-Based Rural Tourism: A Proposed Sustainability Framework

    Directory of Open Access Journals (Sweden)

    Kayat Kalsom

    2014-01-01

    Full Text Available Many tourism projects run by communities in rural areas are labelled as Community-Based Rural Tourism (CBRT), a type of more ‘responsible’ tourism that contributes to sustainable development. However, a framework is needed to enable planners and managers to understand its criteria and thus ensure that CBRT products fulfil the sustainability requirement. This paper presents findings from a review of previous writings on this topic. Findings from an analysis of the criteria of a sustainable CBRT product are discussed. It is found that, in order to play a role in sustainable development, a CBRT product must focus on competitive management, resource conservation, and benefit creation for the community. These three elements need to be supported, in turn, by community involvement and commitment. The proposed conceptual framework for a sustainable CBRT product can serve as a basis for further research on CBRT and offers both theoretical and practical implications.

  2. An Ontology-Based Framework for Modeling User Behavior

    DEFF Research Database (Denmark)

    Razmerita, Liana

    2011-01-01

    and classifies its users according to their behavior. The user ontology is the backbone of OntobUMf and has been designed according to the Information Management System Learning Information Package (IMS LIP). The user ontology includes a Behavior concept that extends IMS LIP specification and defines...... characteristics of the users interacting with the system. Concrete examples of how OntobUMf is used in the context of a Knowledge Management (KM) System are provided. This paper discusses some of the implications of ontology-based user modeling for semantically enhanced KM and, in particular, for personal KM....... The results of this research may contribute to the development of other frameworks for modeling user behavior, other semantically enhanced user modeling frameworks, or other semantically enhanced information systems....

  3. A Bivariate Generalized Linear Item Response Theory Modeling Framework to the Analysis of Responses and Response Times.

    Science.gov (United States)

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-01-01

    A generalized linear modeling framework to the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times that are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.
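
    One illustrative special case of such a specification, written here under assumed notation rather than taken from the article: a two-parameter logistic measurement model for the responses, a lognormal measurement model for the response times, and a linear cross-relation linking the ability and speed factors.

        % illustrative B-GLIRT-style special case (notation assumed for this example)
        P(X_{pi} = 1 \mid \theta_p) = \mathrm{logit}^{-1}\bigl( \alpha_i (\theta_p - \beta_i) \bigr)
        \ln T_{pi} = \nu_i - \lambda_i \, f(\theta_p, \tau_p) + \varepsilon_{pi},
            \qquad \varepsilon_{pi} \sim \mathcal{N}(0, \sigma_i^2)
        f(\theta_p, \tau_p) = \tau_p + \rho \, \theta_p

    Setting \rho to zero decouples the two measurement models, while replacing the linear term in f by an interaction or a curvilinear term gives the other variants mentioned above.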

  4. A Flexible Nonlinear Modelling Framework for Nonstationary Generalized Extreme Value Analysis in Hydrology and Climatology

    Science.gov (United States)

    Cannon, A. J.

    2009-12-01

    Parameters in a Generalized Extreme Value (GEV) distribution are specified as a function of covariates using a conditional density network (CDN), which is a probabilistic extension of the multilayer perceptron neural network. If the covariate is time, or is dependent on time, then the GEV-CDN model can be used to perform nonlinear, nonstationary GEV analysis of hydrological or climatological time series. Due to the flexibility of the neural network architecture, the model is capable of representing a wide range of nonstationary relationships. Model parameters are estimated by generalized maximum likelihood, an approach that is tailored to the estimation of GEV parameters from geophysical time series. Model complexity is identified using the Bayesian information criterion and the Akaike information criterion with small sample size correction. Monte Carlo simulations are used to validate GEV-CDN performance on four simple synthetic problems. The model is then demonstrated on precipitation data from southern California, a series that exhibits nonstationarity due to interannual/interdecadal climatic variability. A hierarchy of models can be defined by adjusting three aspects of the GEV-CDN model architecture: (i) by specifying either a linear or a nonlinear hidden-layer activation function; (ii) by adjusting the number of hidden-layer nodes; or (iii) by disconnecting weights leading to output-layer nodes. To illustrate, five GEV-CDN models are shown here in order of increasing complexity for the case of a single covariate, which, in this case, is assumed to be time. The shape parameter is assumed to be constant in all models, although this is not a requirement of the GEV-CDN framework.
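
    A rough Python sketch of the underlying idea, with hypothetical function names: GEV location and scale parameters produced by a small one-hidden-layer network of a covariate, with the shape held constant, fitted by minimizing the negative log-likelihood. This is only loosely patterned on GEV-CDN; it uses ordinary maximum likelihood and a generic optimizer rather than the generalized maximum likelihood and model-selection machinery described above.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import genextreme

        def gev_cdn_nll(params, t, y, n_hidden=2):
            """Negative log-likelihood of a GEV whose location and scale depend on a
            covariate t through a one-hidden-layer network; the shape is held constant."""
            k = n_hidden
            W1, b1 = params[:k], params[k:2 * k]              # hidden-layer weights and biases
            w_mu, b_mu = params[2 * k:3 * k], params[3 * k]   # output weights for location
            w_sig, b_sig = params[3 * k + 1:4 * k + 1], params[4 * k + 1]
            shape = params[4 * k + 2]                         # constant GEV shape (xi)
            h = np.tanh(np.outer(t, W1) + b1)                 # hidden-layer activations
            mu = h @ w_mu + b_mu
            sigma = np.exp(h @ w_sig + b_sig)                 # exponential link keeps the scale positive
            # scipy's shape argument c is the negative of the usual GEV shape parameter xi
            return -np.sum(genextreme.logpdf(y, c=-shape, loc=mu, scale=sigma))

        # toy usage: synthetic extremes whose location drifts with the covariate (e.g. time)
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 50)
        y = rng.gumbel(loc=10 + 5 * t, scale=2.0)
        x0 = np.r_[rng.normal(0.0, 0.1, 10), 0.0]             # start with a near-zero shape
        res = minimize(gev_cdn_nll, x0, args=(t, y), method="Nelder-Mead")
        print(res.fun)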

  5. Gravitational wave tests of general relativity with the parameterized post-Einsteinian framework

    International Nuclear Information System (INIS)

    Cornish, Neil; Sampson, Laura; Yunes, Nicolas; Pretorius, Frans

    2011-01-01

    Gravitational wave astronomy has tremendous potential for studying extreme astrophysical phenomena and exploring fundamental physics. The waves produced by binary black hole mergers will provide a pristine environment in which to study strong-field dynamical gravity. Extracting detailed information about these systems requires accurate theoretical models of the gravitational wave signals. If gravity is not described by general relativity, analyses that are based on waveforms derived from Einstein's field equations could result in parameter biases and a loss of detection efficiency. A new class of ''parameterized post-Einsteinian'' waveforms has been proposed to cover this eventuality. Here, we apply the parameterized post-Einsteinian approach to simulated data from a network of advanced ground-based interferometers and from a future space-based interferometer. Bayesian inference and model selection are used to investigate parameter biases, and to determine the level at which departures from general relativity can be detected. We find that in some cases the parameter biases from assuming the wrong theory can be severe. We also find that gravitational wave observations will beat the existing bounds on deviations from general relativity derived from the orbital decay of binary pulsars by a large margin across a wide swath of parameter space.

  6. skeleSim: an extensible, general framework for population genetic simulation in R

    Science.gov (United States)

    Parobek, Christian M.; Archer, Frederick I.; DePrenger-Levin, Michelle E.; Hoban, Sean M.; Liggins, Libby; Strand, Allan E.

    2016-01-01

    Simulations are a key tool in molecular ecology for inference and forecasting, as well as for evaluating new methods. Due to growing computational power and a diversity of software with different capabilities, simulations are becoming increasingly powerful and useful. However, the widespread use of simulations by geneticists and ecologists is hindered by difficulties in understanding these softwares’ complex capabilities, composing code and input files, a daunting bioinformatics barrier, and a steep conceptual learning curve. skeleSim (an R package) guides users in choosing appropriate simulations, setting parameters, calculating genetic summary statistics, and organizing data output, in a reproducible pipeline within the R environment. skeleSim is designed to be an extensible framework that can ‘wrap’ around any simulation software (inside or outside the R environment) and be extended to calculate and graph any genetic summary statistics. Currently, skeleSim implements coalescent and forward-time models available in the fastsimcoal2 and rmetasim simulation engines to produce null distributions for multiple population genetic statistics and marker types, under a variety of demographic conditions. skeleSim is intended to make simulations easier while still allowing full model complexity to ensure that simulations play a fundamental role in molecular ecology investigations. skeleSim can also serve as a teaching tool: demonstrating the outcomes of stochastic population genetic processes; teaching general concepts of simulations; and providing an introduction to the R environment with a user-friendly graphical user interface (using shiny). PMID:27736016

  7. The Reinvention of General Relativity: A Historiographical Framework for Assessing One Hundred Years of Curved Space-time.

    Science.gov (United States)

    Blum, Alexander; Lalli, Roberto; Renn, Jürgen

    2015-09-01

    The history of the theory of general relativity presents unique features. After its discovery, the theory was immediately confirmed and rapidly changed established notions of space and time. The further implications of general relativity, however, remained largely unexplored until the mid 1950s, when it came into focus as a physical theory and gradually returned to the mainstream of physics. This essay presents a historiographical framework for assessing the history of general relativity by taking into account in an integrated narrative intellectual developments, epistemological problems, and technological advances; the characteristics of post-World War II and Cold War science; and newly emerging institutional settings. It argues that such a framework can help us understand this renaissance of general relativity as a result of two main factors: the recognition of the untapped potential of general relativity and an explicit effort at community building, which allowed this formerly disparate and dispersed field to benefit from the postwar changes in the scientific landscape.

  8. Communicating Climate Change through ICT-Based Visualization: Towards an Analytical Framework

    Directory of Open Access Journals (Sweden)

    Björn-Ola Linnér

    2013-11-01

    Full Text Available The difficulties in communicating climate change science to the general public are often highlighted as one of the hurdles for support of enhanced climate action. The advances of interactive visualization using information and communication technology (ICT are claimed to be a game-changer in our ability to communicate complex issues. However, new analytical frameworks are warranted to analyse the role of such technologies. This paper develops a novel framework for analyzing the content, form, context and relevance of ICT-based visualization of climate change, based on insights from literature on climate change communication. Thereafter, we exemplify the analytical framework by applying it to a pilot case of ICT-based climate visualization in a GeoDome. Possibilities to use affordable advanced ICT-based visualization devices in science and policy communication are rapidly expanding. We thus see wider implications and applications of the analytical framework not only for other ICT environments but also other issue areas in sustainability communication.

  9. Assessing excellence in translational cancer research: a consensus based framework.

    Science.gov (United States)

    Rajan, Abinaya; Caldas, Carlos; van Luenen, Henri; Saghatchian, Mahasti; van Harten, Wim H

    2013-10-29

    It takes several years on average to translate basic research findings into clinical research and eventually deliver patient benefits. An expert-based excellence assessment can help improve this process by: identifying high performing Comprehensive Cancer Centres; best practices in translational cancer research; improving the quality and efficiency of the translational cancer research process. This can help build networks of excellent Centres by aiding focused partnerships. In this paper we report on a consensus building exercise that was undertaken to construct an excellence assessment framework for translational cancer research in Europe. We used mixed methods to reach consensus: a systematic review of existing translational research models critically appraised for suitability in performance assessment of Cancer Centres; a survey among European stakeholders (researchers, clinicians, patient representatives and managers) to score a list of potential excellence criteria, a focus group with selected representatives of survey participants to review and rescore the excellence criteria; an expert group meeting to refine the list; an open validation round with stakeholders and a critical review of the emerging framework by an independent body: a committee formed by the European Academy of Cancer Sciences. The resulting excellence assessment framework has 18 criteria categorized in 6 themes. Each criterion has a number of questions/sub-criteria. Stakeholders favoured using qualitative excellence criteria to evaluate the translational research "process" rather than quantitative criteria or judging only the outputs. Examples of criteria include checking if the Centre has mechanisms that can be rated as excellent for: involvement of basic researchers and clinicians in translational research (quality of supervision and incentives provided to clinicians to do a PhD in translational research) and well designed clinical trials based on ground-breaking concepts (innovative

  10. A microeconomic-based framework for green cooperative communications

    KAUST Repository

    Bonello, Nicholas

    2011-05-01

    The global concern on environmental issues is placing its mark on all sectors of the society, including of course, the wireless telecommunications industry. In this paper, we develop a microeconomic-based green radio framework in which a network operator shifts some of its responsibilities towards the intended destination to one of these inactive end-users with the specific intention of reducing the transmitted power of its source nodes and the incurred costs. Our results disclose up to 70% energy savings and up to 65% cost savings over the direct transmission benchmarker, when selecting from approximately 50 eligible relays in the scenario portrayed by the UMTS Rel. 99 standard. © 2011 IEEE.

  11. A framework of region-based dynamic image fusion

    Institute of Scientific and Technical Information of China (English)

    WANG Zhong-hua; QIN Zheng; LIU Yu

    2007-01-01

    A new framework for region-based dynamic image fusion is proposed. First, target detection is applied to dynamic images (image sequences) to segment the images into target and background regions. Different fusion rules are then employed in the different regions so that target information is preserved as much as possible. In addition, a steerable non-separable wavelet frame transform is used for multi-resolution analysis, so the system achieves favorable orientation selectivity and shift invariance. Compared with other image fusion methods, experimental results showed that the proposed method has better target recognition capabilities and preserves clear background information.
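
    A much simplified, pixel-domain sketch of region-dependent fusion rules (the paper operates on steerable non-separable wavelet frame coefficients instead); the function name, the max-absolute rule for target regions and the averaging rule for the background are assumptions chosen for illustration.

        import numpy as np

        def region_based_fuse(img_a, img_b, target_mask):
            """Fuse two registered images with region-dependent rules: keep the stronger
            response inside detected target regions, average elsewhere."""
            img_a = img_a.astype(float)
            img_b = img_b.astype(float)
            take_a = np.abs(img_a) >= np.abs(img_b)
            target = np.where(take_a, img_a, img_b)        # max-absolute rule for targets
            background = 0.5 * (img_a + img_b)             # averaging rule for background
            return np.where(target_mask, target, background)

        # toy usage with random data and a rectangular "target" region
        rng = np.random.default_rng(1)
        a, b = rng.normal(size=(64, 64)), rng.normal(size=(64, 64))
        mask = np.zeros((64, 64), dtype=bool)
        mask[20:40, 20:40] = True
        fused = region_based_fuse(a, b, mask)
        print(fused.shape)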

  12. A microeconomic-based framework for green cooperative communications

    KAUST Repository

    Bonello, Nicholas; Aissa, Sonia

    2011-01-01

    The global concern on environmental issues is placing its mark on all sectors of the society, including of course, the wireless telecommunications industry. In this paper, we develop a microeconomic-based green radio framework in which a network operator shifts some of its responsibilities towards the intended destination to one of these inactive end-users with the specific intention of reducing the transmitted power of its source nodes and the incurred costs. Our results disclose up to 70% energy savings and up to 65% cost savings over the direct transmission benchmarker, when selecting from approximately 50 eligible relays in the scenario portrayed by the UMTS Rel. 99 standard. © 2011 IEEE.

  13. cMsg - A general purpose, publish-subscribe, interprocess communication implementation and framework

    International Nuclear Information System (INIS)

    Timmer, C; Abbott, D; Gyurjyan, V; Heyes, G; Jastrzembski, E; Wolin, E

    2008-01-01

    cMsg is software used to send and receive messages in the Jefferson Lab online and runcontrol systems. It was created to replace the several IPC software packages in use with a single API. cMsg is asynchronous in nature, running a callback for each message received. However, it also includes synchronous routines for convenience. On the framework level, cMsg is a thin API layer in Java, C, or C++ that can be used to wrap most message-based interprocess communication protocols. The top layer of cMsg uses this same API and multiplexes user calls to one of many such wrapped protocols (or domains) based on a URL-like string which we call a Uniform Domain Locator or UDL. One such domain is a complete implementation of a publish-subscribe messaging system using network communications and written in Java (user APIs in C and C++ too). This domain is built in a way which allows it to be used as a proxy server to other domains (protocols). Performance is excellent allowing the system not only to be used for messaging but also as a data distribution system

  14. A Generalized Robust Minimization Framework for Low-Rank Matrix Recovery

    Directory of Open Access Journals (Sweden)

    Wen-Ze Shao

    2014-01-01

    Full Text Available This paper considers the problem of recovering low-rank matrices which are heavily corrupted by outliers or large errors. To improve the robustness of existing recovery methods, the problem is solved by formulating it as a generalized nonsmooth nonconvex minimization functional via exploiting the Schatten p-norm (0 < p ≤ 1) and the Lq (0 < q ≤ 1) seminorm. Two numerical algorithms are provided based on the augmented Lagrange multiplier (ALM) and accelerated proximal gradient (APG) methods as well as efficient root-finder strategies. Experimental results demonstrate that the proposed generalized approach is more inclusive and effective compared with state-of-the-art methods, either convex or nonconvex.
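
    A minimal numpy sketch of the nonconvex objective only (the ALM and APG solvers and the root-finder strategies are not reproduced here); the function name, the choice p = q = 0.5 and the toy data are assumptions for illustration.

        import numpy as np

        def schatten_p_lq_objective(X, E, p=0.5, q=0.5, lam=1.0):
            """Nonconvex objective ||X||_{S_p}^p + lam * ||E||_q^q used to score a candidate
            decomposition of the data D into a low-rank part X and an error part E."""
            sv = np.linalg.svd(X, compute_uv=False)        # singular values of the low-rank term
            schatten = np.sum(sv ** p)                     # Schatten-p quasi-norm to the p-th power
            lq = np.sum(np.abs(E) ** q)                    # elementwise Lq seminorm to the q-th power
            return schatten + lam * lq

        # toy usage: evaluate the objective for two candidate splits of D
        rng = np.random.default_rng(2)
        L = rng.normal(size=(30, 5)) @ rng.normal(size=(5, 30))       # rank-5 ground truth
        S = np.zeros((30, 30))
        S[rng.random((30, 30)) < 0.05] = 10                           # sparse large errors
        D = L + S
        print(schatten_p_lq_objective(L, D - L), schatten_p_lq_objective(D, np.zeros_like(D)))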

  15. CoP Sensing Framework on Web-Based Environment

    Science.gov (United States)

    Mustapha, S. M. F. D. Syed

    The Web technologies and Web applications have shown similar high growth rate in terms of daily usages and user acceptance. The Web applications have not only penetrated in the traditional domains such as education and business but have also encroached into areas such as politics, social, lifestyle, and culture. The emergence of Web technologies has enabled Web access even to the person on the move through PDAs or mobile phones that are connected using Wi-Fi, HSDPA, or other communication protocols. These two phenomena are the inducement factors toward the need of building Web-based systems as the supporting tools in fulfilling many mundane activities. In doing this, one of the many focuses in research has been to look at the implementation challenges in building Web-based support systems in different types of environment. This chapter describes the implementation issues in building the community learning framework that can be supported on the Web-based platform. The Community of Practice (CoP) has been chosen as the community learning theory to be the case study and analysis as it challenges the creativity of the architectural design of the Web system in order to capture the presence of learning activities. The details of this chapter describe the characteristics of the CoP to understand the inherent intricacies in modeling in the Web-based environment, the evidences of CoP that need to be traced automatically in a slick manner such that the evidence-capturing process is unobtrusive, and the technologies needed to embrace a full adoption of Web-based support system for the community learning framework.

  16. A NEW APPROACH OF CONCEPTUAL FRAMEWORK FOR GENERAL PURPOSE FINANCIAL REPORTING BY PUBLIC SECTOR ENTITIES

    Directory of Open Access Journals (Sweden)

    Nistor Cristina

    2011-12-01

    Full Text Available The importance of accounting in the modern economy is obvious, which is why high-level bodies in the European Union and elsewhere deal with the organization and functioning of accounting as a fundamental component of business (Nistor C., 2009). The mission of the International Federation of Accountants (IFAC) is to serve the public interest, strengthen the worldwide accountancy profession and contribute to the development of strong international economies by initiating and encouraging high-quality professional standards, promoting convergence towards these international standards, and discussing issues of public interest for which international experience is highly relevant (IFAC, 2011). Currently, the concepts related to financial reporting in the public sector are developed through IPSAS references. Many of today's IPSAS are based on international accounting standards (IAS/IFRS), to the extent that these are relevant to the requirements of the public sector. Today's IPSAS are therefore based on the concepts and definitions of the IASB's conceptual framework, with changes where necessary for the specific approach of the public sector. This study presents the brief draft statement under discussion by the leadership of IFAC in collaboration with other organizations and groups that develop financial reporting requirements for the public sector. We then highlight the importance of the project and its degree of acceptance, as reflected in the comments received. By combining qualitative and quantitative research, the study seeks to demonstrate the necessity and usefulness of a common conceptual framework for international accounting standards (in this case for the public sector), starting from their emergence, the bodies involved in their foundation, the content of the standards, and the experience of different countries. The results have direct implications for the Romanian public accounting system, given that the reference of the international implementation and reporting is

  17. Experiential Knowledge Complements an LCA-Based Decision Support Framework

    Directory of Open Access Journals (Sweden)

    Heng Yi Teah

    2015-09-01

    Full Text Available A shrimp farmer in Taiwan practices innovation through trial-and-error for better income and a better environment, but such farmer-based innovation sometimes fails because the biological mechanism is unclear. Systematic field experimentation and laboratory research are often too costly, and simulating ground conditions is often too challenging. To solve this dilemma, we propose a decision support framework that explicitly utilizes farmer experiential knowledge through a participatory approach to alternatively estimate prospective change in shrimp farming productivity, and to co-design options for improvement. Data obtained from the farmer enable us to quantitatively analyze the production cost and greenhouse gas (GHG) emission with a life cycle assessment (LCA) methodology. We used semi-quantitative graphical representations of indifference curves and mixing triangles to compare and show better options for the farmer. Our results empower the farmer to make decisions more systematically and reliably based on the frequency of heterotrophic bacteria application and the revision of feed input. We argue that experiential knowledge may be less accurate due to its dependence on varying levels of farmer experience, but this knowledge is a reasonable alternative for immediate decision-making. More importantly, our developed framework advances the scope of LCA application to support practically important yet scientifically uncertain cases.
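
    The cost-and-emissions comparison described above can be outlined with a small script. All names and figures below are hypothetical placeholders rather than the farm data of the study; the sketch only shows how per-unit cost and GHG indicators might be aggregated and compared for two management options.

```python
# Minimal sketch of an LCA-style comparison of two farming options.
# All figures are hypothetical placeholders, not data from the study.

options = {
    "baseline": {"feed_kg": 1.8, "electricity_kwh": 2.5, "bacteria_doses": 0},
    "revised_feed_plus_bacteria": {"feed_kg": 1.5, "electricity_kwh": 2.5, "bacteria_doses": 2},
}

# Assumed unit costs (USD) and emission factors (kg CO2-eq) per input, per kg of shrimp.
unit_cost = {"feed_kg": 1.2, "electricity_kwh": 0.10, "bacteria_doses": 0.30}
emission_factor = {"feed_kg": 2.0, "electricity_kwh": 0.5, "bacteria_doses": 0.1}

def assess(inputs):
    cost = sum(qty * unit_cost[k] for k, qty in inputs.items())
    ghg = sum(qty * emission_factor[k] for k, qty in inputs.items())
    return cost, ghg

for name, inputs in options.items():
    cost, ghg = assess(inputs)
    print(f"{name}: {cost:.2f} USD/kg, {ghg:.2f} kg CO2-eq/kg")
```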

  18. Context-Based Pedagogy: A Framework From Experience.

    Science.gov (United States)

    Kantar, Lina D

    2016-07-01

    Attempts to transform teaching practice are inadvertently subjected to several hurdles, mostly attributed to the lack of a guiding framework. This study aimed at unraveling the conceptual underpinnings of context-based pedagogy, which is perceived as the pedagogy that prepares professionals for future practice. Through focus group interviews, data were collected from 16 nursing students who had case studies as the main instructional format in three major courses. The participants were divided into three focus groups, and interview questions were based on three educational parameters: the learning environment, instructional format, and instructional process. Initial findings revealed an array of classroom activities that characterize each parameter. An in-depth analysis of these activities converged on four concepts: (a) dynamic learning environment, (b) realism, (c) thinking dispositions, and (d) professional formation. These concepts constitute a beginning framework for educators and curriculum leaders that can be used to integrate cases in the curriculum and to facilitate the contextualization of knowledge. [J Nurs Educ. 2016;55(7):391-395.]. Copyright 2016, SLACK Incorporated.

  19. From single-shot towards general work extraction in a quantum thermodynamic framework

    International Nuclear Information System (INIS)

    Gemmer, Jochen; Anders, Janet

    2015-01-01

    This paper considers work extraction from a quantum system to a work storage system (or weight) following Horodecki and Oppenheim (2013 Nat. Commun. 4 2059). An alternative approach is here developed that relies on the comparison of subspace dimensions without a need to introduce thermo-majorization used previously. Optimal single shot work for processes where a weight transfers from (a) a single energy level to another single energy level is then re-derived. In addition we discuss the final state of the system after work extraction and show that the system typically ends in its thermal state, while there are cases where the system is only close to it. The work of formation in the single level transfer setting is also re-derived. The approach presented now allows the extension of the single shot work concept to work extraction (b) involving multiple final levels of the weight. A key conclusion here is that the single shot work for case (a) is appropriate only when a resonance of a particular energy is required. When wishing to identify ‘work extraction’ with finding the weight in a specific available energy or any higher energy a broadening of the single shot work concept is required. As a final contribution we consider transformations of the system that (c) result in general weight state transfers. Introducing a transfer-quantity allows us to formulate minimum requirements for transformations to be at all possible in a thermodynamic framework. We show that choosing the free energy difference of the weight as the transfer-quantity one recovers various single shot results including single level transitions (a), multiple final level transitions (b), and recent results on restricted sets of multi-level to multi-level weight transfers. (paper)
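
    For orientation, the transfer quantity mentioned at the end of the abstract can be illustrated with standard free-energy bookkeeping (our notation, not the paper's derivation): the free-energy gain of the weight is bounded by the free-energy loss of the system.

```latex
% Illustrative free-energy bookkeeping for the transfer quantity (not the paper's derivation).
\[
F(\rho) \;=\; \operatorname{Tr}(H\rho) \;-\; k_{\mathrm B} T\, S(\rho),
\qquad
W \;\equiv\; \Delta F_{\mathrm{weight}} \;\le\; -\,\Delta F_{\mathrm{system}} .
\]
```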

  20. The Justice Dimension of Sustainability: A Systematic and General Conceptual Framework

    Directory of Open Access Journals (Sweden)

    Klara Helene Stumpf

    2015-06-01

    Full Text Available We discuss how the normative dimension of sustainability can be captured in terms of justice. We (i) identify the core characteristics of the concept of sustainability and discuss underlying ethical, ontological and epistemological assumptions; (ii) introduce a general conceptual structure of justice for the analysis and comparison of different conceptions of justice; and (iii) employ this conceptual structure to determine the specific characteristics and challenges of justice in the context of sustainability. We demonstrate that sustainability raises specific and partly new challenges of justice regarding the community of justice, the judicandum, the informational base, the principles, and the instruments of justice.

  1. Knowledge management for systems biology a general and visually driven framework applied to translational medicine

    Directory of Open Access Journals (Sweden)

    Falciani Francesco

    2011-03-01

    Full Text Available Abstract Background To enhance our understanding of complex biological systems like diseases we need to put all of the available data into context and use this to detect relations, patterns and rules which allow predictive hypotheses to be defined. Life science has become a data-rich science with information about the behaviour of millions of entities like genes, chemical compounds, diseases, cell types and organs, which are organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype - phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist but so far none has proven entirely satisfactory. Results To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain specific knowledge representation models based on specific objects and their relations supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions We generate the first semantically integrated COPD-specific public knowledge base and find that for the integration of clinical and experimental data with pre-existing knowledge the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects. The knowledgebase enables the

  2. General framework for studying the dynamics of folded and nonfolded proteins by NMR relaxation spectroscopy and MD simulation

    NARCIS (Netherlands)

    Prompers, J.J.; Brüschweiler, R.

    2002-01-01

    A general framework is presented for the interpretation of NMR relaxation data of proteins. The method, termed isotropic reorientational eigenmode dynamics (iRED), relies on a principal component analysis of the isotropically averaged covariance matrix of the lattice functions of the spin

  3. Metal-organic frameworks based membranes for liquid separation.

    Science.gov (United States)

    Li, Xin; Liu, Yuxin; Wang, Jing; Gascon, Jorge; Li, Jiansheng; Van der Bruggen, Bart

    2017-11-27

    Metal-organic frameworks (MOFs) represent a fascinating class of solid crystalline materials which can be self-assembled in a straightforward manner by the coordination of metal ions or clusters with organic ligands. Owing to their intrinsic porous characteristics, unique chemical versatility and abundant functionalities, MOFs have received substantial attention for diverse industrial applications, including membrane separation. Exciting research activities ranging from fabrication strategies to separation applications of MOF-based membranes have appeared. Inspired by the marvelous achievements of MOF-based membranes in gas separations, liquid separations are also being explored for the purpose of constructing continuous MOF membranes or MOF-based mixed matrix membranes. Although these are in an emerging stage of vigorous development, most efforts are directed towards improving the liquid separation efficiency with well-designed MOF-based membranes. Therefore, as an increasing trend in membrane separation, the field of MOF-based membranes for liquid separation is highlighted in this review. The criteria for judicious selection of MOFs in fabricating MOF-based membranes are given. Special attention is paid to rational design strategies for MOF-based membranes, along with the latest application progress in the area of liquid separations, such as pervaporation, water treatment, and organic solvent nanofiltration. Moreover, some attractive dual-function applications of MOF-based membranes in the removal of micropollutants, degradation, and antibacterial activity are also reviewed. Finally, we define the remaining challenges and future opportunities in this field. This Tutorial Review provides an overview and outlook for MOF-based membranes for liquid separations. Further development of MOF-based membranes for liquid separation must consider the demands of strict separation standards and environmental safety for industrial application.

  4. Metal–organic frameworks based membranes for liquid separation

    KAUST Repository

    Li, Xin

    2017-11-07

    Metal-organic frameworks (MOFs) represent a fascinating class of solid crystalline materials which can be self-assembled in a straightforward manner by the coordination of metal ions or clusters with organic ligands. Owing to their intrinsic porous characteristics, unique chemical versatility and abundant functionalities, MOFs have received substantial attention for diverse industrial applications, including membrane separation. Exciting research activities ranging from fabrication strategies to separation applications of MOF-based membranes have appeared. Inspired by the marvelous achievements of MOF-based membranes in gas separations, liquid separations are also being explored for the purpose of constructing continuous MOF membranes or MOF-based mixed matrix membranes. Although these are in an emerging stage of vigorous development, most efforts are directed towards improving the liquid separation efficiency with well-designed MOF-based membranes. Therefore, as an increasing trend in membrane separation, the field of MOF-based membranes for liquid separation is highlighted in this review. The criteria for judicious selection of MOFs in fabricating MOF-based membranes are given. Special attention is paid to rational design strategies for MOF-based membranes, along with the latest application progress in the area of liquid separations, such as pervaporation, water treatment, and organic solvent nanofiltration. Moreover, some attractive dual-function applications of MOF-based membranes in the removal of micropollutants, degradation, and antibacterial activity are also reviewed. Finally, we define the remaining challenges and future opportunities in this field. This Tutorial Review provides an overview and outlook for MOF-based membranes for liquid separations. Further development of MOF-based membranes for liquid separation must consider the demands of strict separation standards and environmental safety for industrial application.

  5. Behavior Assessment in Children Following Hospital-Based General Anesthesia versus Office-Based General Anesthesia

    Directory of Open Access Journals (Sweden)

    LaQuia A. Vinson

    2016-08-01

    Full Text Available The purpose of this study was to determine if differences in behavior exist following dental treatment under hospital-based general anesthesia (HBGA) or office-based general anesthesia (OBGA), in the percentage of patients exhibiting positive behavior and in the mean Frankl scores at recall visits. This retrospective study examined records of a pediatric dental office over a 4-year period. Patients presenting before 48 months of age for an initial exam who were diagnosed with early childhood caries were included in the study. Following an initial exam, patients were treated under HBGA or OBGA. Patients were followed to determine their behavior at 6-, 12- and 18-month recall appointments. Fifty-four patients received treatment under HBGA and 26 were treated under OBGA. OBGA patients were significantly more likely to exhibit positive behavior at the 6- and 12-month recall visits (p = 0.038 and p = 0.029, respectively). Clinicians should consider future behavior when determining general anesthesia treatment modalities in children with early childhood caries presenting to their office.

  6. Base Map Framework Submission for Clay County, AR, USA

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...

  7. Base Map Framework Submission for Jackson County, AR, USA

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...

  8. Base Map Framework Submission for Randolph County, AR, USA

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...

  9. A generalized adjoint framework for sensitivity and global error estimation in time-dependent nuclear reactor simulations

    International Nuclear Information System (INIS)

    Stripling, H.F.; Anitescu, M.; Adams, M.L.

    2013-01-01

    Highlights: ► We develop an abstract framework for computing the adjoint to the neutron/nuclide burnup equations posed as a system of differential algebraic equations. ► We validate use of the adjoint for computing both sensitivity to uncertain inputs and for estimating global time discretization error. ► Flexibility of the framework is leveraged to add heat transfer physics and compute its adjoint without a reformulation of the adjoint system. ► Such flexibility is crucial for high performance computing applications. -- Abstract: We develop a general framework for computing the adjoint variable to nuclear engineering problems governed by a set of differential–algebraic equations (DAEs). The nuclear engineering community has a rich history of developing and applying adjoints for sensitivity calculations; many such formulations, however, are specific to a certain set of equations, variables, or solution techniques. Any change or addition to the physics model would require a reformulation of the adjoint problem and substantial difficulties in its software implementation. In this work we propose an abstract framework that allows for the modification and expansion of the governing equations, leverages the existing theory of adjoint formulation for DAEs, and results in adjoint equations that can be used to efficiently compute sensitivities for parametric uncertainty quantification. Moreover, as we justify theoretically and demonstrate numerically, the same framework can be used to estimate global time discretization error. We first motivate the framework and show that the coupled Bateman and transport equations, which govern the time-dependent neutronic behavior of a nuclear reactor, may be formulated as a DAE system with a power constraint. We then use a variational approach to develop the parameter-dependent adjoint framework and apply existing theory to give formulations for sensitivity and global time discretization error estimates using the adjoint
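
    For orientation only, the following sketch recalls the classical ODE-form adjoint sensitivity equations, a simplification of the DAE setting treated in the paper (notation ours, assuming a parameter-independent initial state):

```latex
% Illustrative ODE-form adjoint sensitivity equations (a simplification of the DAE framework).
\[
\dot y = f(y,p,t), \qquad G(p) = \int_{0}^{T} g(y,p,t)\,\mathrm{d}t,
\]
\[
\dot\lambda^{\mathsf T} \;=\; -\,\lambda^{\mathsf T}\frac{\partial f}{\partial y} \;-\; \frac{\partial g}{\partial y},
\qquad \lambda(T) = 0,
\qquad
\frac{\mathrm{d}G}{\mathrm{d}p} \;=\; \int_{0}^{T}\!\left(\frac{\partial g}{\partial p} \;+\; \lambda^{\mathsf T}\frac{\partial f}{\partial p}\right)\mathrm{d}t .
\]
```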

  10. An Example-Based Brain MRI Simulation Framework.

    Science.gov (United States)

    He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L

    2015-02-21

    The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms such as image segmentation, due to the lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from the hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of the MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on the statistical model of training data. Results show that the example-based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more than simulations produced by a physics-based model.
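
    A toy version of the patch-based regression idea is sketched below; it is hypothetical and not the authors' implementation. It learns a mapping from anatomical-label patches to MR intensities with an off-the-shelf regressor and then predicts intensities for a new label image; the label volumes are random stand-ins.

```python
# Toy sketch of example-based intensity synthesis from anatomical labels (not the authors' code).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def extract_patches(label_img, radius=1):
    """Collect flattened cubic patches of labels around every interior voxel."""
    r = radius
    patches, centers = [], []
    for x in range(r, label_img.shape[0] - r):
        for y in range(r, label_img.shape[1] - r):
            for z in range(r, label_img.shape[2] - r):
                patches.append(label_img[x-r:x+r+1, y-r:y+r+1, z-r:z+r+1].ravel())
                centers.append((x, y, z))
    return np.array(patches), centers

# "Atlas": a label volume and its co-registered MR intensities (random stand-ins here).
rng = np.random.default_rng(0)
atlas_labels = rng.integers(0, 4, size=(12, 12, 12))
atlas_mri = atlas_labels * 50.0 + rng.normal(0, 5, size=atlas_labels.shape)

X, centers = extract_patches(atlas_labels)
y = np.array([atlas_mri[c] for c in centers])
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Simulate an MR image for a new subject's label volume.
new_labels = rng.integers(0, 4, size=(12, 12, 12))
Xn, centers_n = extract_patches(new_labels)
simulated = np.zeros_like(new_labels, dtype=float)
for value, c in zip(model.predict(Xn), centers_n):
    simulated[c] = value
print("mean simulated intensity:", round(simulated.mean(), 2))
```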

  11. Feasibility Study of a Generalized Framework for Developing Computer-Aided Detection Systems-a New Paradigm.

    Science.gov (United States)

    Nemoto, Mitsutaka; Hayashi, Naoto; Hanaoka, Shouhei; Nomura, Yukihiro; Miki, Soichiro; Yoshikawa, Takeharu

    2017-10-01

    We propose a generalized framework for developing computer-aided detection (CADe) systems whose characteristics depend only on those of the training dataset. The purpose of this study is to show the feasibility of the framework. Two different CADe systems were experimentally developed with a prototype of the framework, but with different training datasets. The CADe systems include four components: preprocessing, candidate area extraction, candidate detection, and candidate classification. Four pretrained algorithms with dedicated optimization/setting methods corresponding to the respective components were prepared in advance. The pretrained algorithms were sequentially trained in the order of processing of the components. In this study, two different datasets, brain MRA with cerebral aneurysms and chest CT with lung nodules, were collected to develop two different types of CADe systems in the framework. The performances of the developed CADe systems were evaluated by threefold cross-validation. The CADe systems for detecting cerebral aneurysms in brain MRAs and for detecting lung nodules in chest CTs were successfully developed using the respective datasets. The framework was shown to be feasible by the successful development of the two different types of CADe systems. The feasibility of this framework shows promise for a new paradigm in the development of CADe systems: development of CADe systems without any lesion-specific algorithm design.

  12. On long-only information-based portfolio diversification framework

    Science.gov (United States)

    Santos, Raphael A.; Takada, Hellinton H.

    2014-12-01

    Using concepts from information theory, it is possible to improve the traditional frameworks for long-only asset allocation. In modern portfolio theory, the investor has two basic procedures: the choice of a portfolio that maximizes its risk-adjusted excess return or the mixed allocation between the maximum Sharpe portfolio and the risk-free asset. In the literature, the first procedure has already been addressed using information theory. One contribution of this paper is the consideration of the second procedure in the information theory context. The performance of these approaches was compared with three traditional asset allocation methodologies: Markowitz's mean-variance, the resampled mean-variance and the equally weighted portfolio. Using simulated and real data, the information theory-based methodologies were verified to be more robust when dealing with estimation errors.
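
    The second procedure mentioned above (mixing the maximum-Sharpe portfolio with the risk-free asset) can be sketched in a few lines. This is the textbook mean-variance version, not the information-theoretic variant developed in the paper, and the expected returns, covariance matrix, and risk-free rate are illustrative.

```python
# Textbook sketch of the tangency (maximum-Sharpe) portfolio mixed with a risk-free asset.
# Not the information-theoretic method of the paper; inputs are illustrative.
import numpy as np

mu = np.array([0.08, 0.10, 0.12])          # expected asset returns
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.05, 0.01],
                [0.00, 0.01, 0.06]])        # covariance matrix
rf = 0.02                                   # risk-free rate

# Tangency portfolio weights are proportional to inv(cov) @ (mu - rf).
excess = mu - rf
raw = np.linalg.solve(cov, excess)
w_tangency = raw / raw.sum()                # long-only here only if all entries are positive

sharpe = (w_tangency @ mu - rf) / np.sqrt(w_tangency @ cov @ w_tangency)

# Mixed allocation: put a fraction `alpha` in the tangency portfolio, the rest in cash.
alpha = 0.6
w_mixed = alpha * w_tangency
print("tangency weights:", np.round(w_tangency, 3), "Sharpe:", round(sharpe, 3))
print("mixed allocation (remainder in risk-free):", np.round(w_mixed, 3))
```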

  13. LHCb: A CMake-based build and configuration framework

    CERN Multimedia

    Clemencic, M; Mato, P

    2011-01-01

    The LHCb experiment has been using the CMT build and configuration tool for its software since the first versions, mainly because of its multi-platform build support and its powerful configuration management functionality. Still, CMT has some limitations in terms of build performance and the increased complexity added to the tool to cope with new use cases added over time. Therefore, we have been looking for a viable alternative to it and we have investigated the possibility of adopting the CMake tool, which does a very good job for building and is getting very popular in the HEP community. The result of this study is a CMake-based framework which provides most of the special configuration features available natively only in CMT, with the advantages of better performance, flexibility and portability.

  14. Internet of Things Based Combustible Ice Safety Monitoring System Framework

    Science.gov (United States)

    Sun, Enji

    2017-05-01

    As human society develops, more energy is required to meet the needs of daily life. New energy sources play a significant role in solving the problems of serious environmental pollution and resource exhaustion in the present world. Combustible ice is essentially frozen natural gas, which can literally be lit on fire, bringing a whole new meaning to fire and ice with fewer pollutants. This paper analysed the advantages and risks of the use of combustible ice. By comparison with other kinds of alternative energy, the advantages of using combustible ice were summarized. The basic physical characteristics and safety risks of combustible ice were analysed. Finally, the development challenges and key applications of combustible ice were forecast. A real-time safety monitoring system framework based on the internet of things (IOT) was designed for application in future mining, providing a brand new way to monitor the safety of combustible ice mining.

  15. Energy Sharing Framework for Microgrid-Powered Cellular Base Stations

    KAUST Repository

    Farooq, Muhammad Junaid

    2017-02-07

    Cellular base stations (BSs) are increasingly becoming equipped with renewable energy generators to reduce operational expenditures and carbon footprint of wireless communications. Moreover, advancements in the traditional electricity grid allow two-way power flow and metering that enable the integration of distributed renewable energy generators at BS sites into a microgrid. In this paper, we develop an optimized energy management framework for microgrid-connected cellular BSs that are equipped with renewable energy generators and finite battery storage to minimize energy cost. The BSs share excess renewable energy with others to reduce the dependency on the conventional electricity grid. Three cases are investigated where the renewable energy generation is unknown, perfectly known, and partially known ahead of time. For the partially known case where only the statistics of renewable energy generation are available, stochastic programming is used to achieve a conservative solution. Results show the time varying energy management behaviour of the BSs and the effect of energy sharing between them.
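
    A deliberately simplified, single-time-slot version of the energy-sharing idea can be posed as a linear program. The demands, renewable output, prices, and transfer efficiency below are invented for illustration and do not reproduce the paper's multi-period stochastic formulation.

```python
# Single-slot toy LP for renewable energy sharing between two base stations.
# Illustrative numbers; the paper's multi-period stochastic model is not reproduced here.
import numpy as np
from scipy.optimize import linprog

demand = np.array([5.0, 3.0])        # kWh needed by BS1, BS2 in this slot
renewable = np.array([2.0, 6.0])     # kWh generated locally
grid_price = np.array([0.20, 0.20])  # cost per kWh bought from the grid
transfer_eff = 0.9                   # fraction of shared energy that survives transfer

# Decision variables: x = [grid1, grid2, share_1to2, share_2to1], all >= 0.
c = np.concatenate([grid_price, [0.0, 0.0]])   # only grid purchases cost money

# Demand must be met: renewable + grid + eff*inflow - outflow >= demand
# (rows are multiplied by -1 to obtain the A_ub @ x <= b_ub form expected by linprog).
A_ub = np.array([
    [-1.0,  0.0,  1.0, -transfer_eff],   # BS1 balance
    [ 0.0, -1.0, -transfer_eff,  1.0],   # BS2 balance
])
b_ub = renewable - demand

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
grid1, grid2, s12, s21 = res.x
print(f"grid purchases: BS1={grid1:.2f} kWh, BS2={grid2:.2f} kWh")
print(f"energy shared: BS1->BS2={s12:.2f} kWh, BS2->BS1={s21:.2f} kWh, cost={res.fun:.2f}")
```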

  16. A CMake-based build and configuration framework

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The LHCb experiment has been using the CMT build and configuration tool for its software since the first versions, mainly because of its multi-platform build support and its powerful configuration management functionality. Still, CMT has some limitations in terms of build performance and the increased complexity added to the tool to cope with new use cases added over time. Therefore, we have been looking for a viable alternative to it and we have investigated the possibility of adopting the CMake tool, which does a very good job for building and is getting very popular in the HEP community. The result of this study is a CMake-based framework which provides most of the special configuration features available natively only in CMT, with the advantages of better performance, flexibility and portability.

  17. An efficient heuristic versus a robust hybrid meta-heuristic for general framework of serial-parallel redundancy problem

    International Nuclear Information System (INIS)

    Sadjadi, Seyed Jafar; Soltani, R.

    2009-01-01

    We present a heuristic approach to solve a general framework of serial-parallel redundancy problem where the reliability of the system is maximized subject to some general linear constraints. The complexity of the redundancy problem is generally considered to be NP-Hard and the optimal solution is not normally available. Therefore, to evaluate the performance of the proposed method, a hybrid genetic algorithm is also implemented whose parameters are calibrated via Taguchi's robust design method. Then, various test problems are solved and the computational results indicate that the proposed heuristic approach could provide us some promising reliabilities, which are fairly close to optimal solutions in a reasonable amount of time.
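
    A much simpler greedy heuristic than the one in the paper conveys the flavor of the problem: maximize series-parallel reliability by repeatedly adding the redundant unit with the best marginal reliability gain per unit cost until the budget is exhausted. The component reliabilities, costs, and budget below are made up.

```python
# Greedy sketch for a series-parallel redundancy allocation problem (illustrative only).
# System reliability = product over subsystems of (1 - (1 - r_i) ** n_i),
# subject to a single linear cost constraint sum(c_i * n_i) <= budget.

r = [0.80, 0.90, 0.85]      # unit reliability of each subsystem's component
c = [3.0, 5.0, 4.0]         # cost of one redundant unit per subsystem
budget = 40.0
n = [1, 1, 1]               # start with one unit per subsystem

def system_reliability(n):
    rel = 1.0
    for ri, ni in zip(r, n):
        rel *= 1.0 - (1.0 - ri) ** ni
    return rel

spent = sum(ci * ni for ci, ni in zip(c, n))
while True:
    best_gain, best_i = 0.0, None
    base = system_reliability(n)
    for i, ci in enumerate(c):
        if spent + ci > budget:
            continue
        n[i] += 1
        gain = (system_reliability(n) - base) / ci   # marginal gain per unit cost
        n[i] -= 1
        if gain > best_gain:
            best_gain, best_i = gain, i
    if best_i is None:
        break
    n[best_i] += 1
    spent += c[best_i]

print("allocation:", n, "reliability:", round(system_reliability(n), 6), "cost:", spent)
```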

  18. General Base-General Acid Catalysis in Human Histone Deacetylase 8.

    Science.gov (United States)

    Gantt, Sister M Lucy; Decroos, Christophe; Lee, Matthew S; Gullett, Laura E; Bowman, Christine M; Christianson, David W; Fierke, Carol A

    2016-02-09

    Histone deacetylases (HDACs) regulate cellular processes such as differentiation and apoptosis and are targeted by anticancer therapeutics in development and in the clinic. HDAC8 is a metal-dependent class I HDAC and is proposed to use a general acid-base catalytic pair in the mechanism of amide bond hydrolysis. Here, we report site-directed mutagenesis and enzymological measurements to elucidate the catalytic mechanism of HDAC8. Specifically, we focus on the catalytic function of Y306 and the histidine-aspartate dyads H142-D176 and H143-D183. Additionally, we report X-ray crystal structures of four representative HDAC8 mutants: D176N, D176N/Y306F, D176A/Y306F, and H142A/Y306F. These structures provide a useful framework for understanding enzymological measurements. The pH dependence of kcat/KM for wild-type Co(II)-HDAC8 is bell-shaped with two pKa values of 7.4 and 10.0. The upper pKa reflects the ionization of the metal-bound water molecule and shifts to 9.1 in Zn(II)-HDAC8. The H142A mutant has activity 230-fold lower than that of wild-type HDAC8, but the pKa1 value is not altered. Y306F HDAC8 is 150-fold less active than the wild-type enzyme; crystal structures show that Y306 hydrogen bonds with the zinc-bound substrate carbonyl, poised for transition state stabilization. The H143A and H142A/H143A mutants exhibit activity that is >80000-fold lower than that of wild-type HDAC8; the buried D176N and D176A mutants have significant catalytic effects, with more subtle effects caused by D183N and D183A. These enzymological and structural studies strongly suggest that H143 functions as a single general base-general acid catalyst, while H142 remains positively charged and serves as an electrostatic catalyst for transition state stabilization.
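
    The bell-shaped pH dependence described above is conventionally fit with a double-ionization model; the expression below is the standard textbook form, stated here for orientation rather than taken from the paper, with the two pKa values quoted from the abstract.

```latex
% Standard double-ionization (bell-shaped) pH-rate profile commonly used to fit kcat/KM data.
\[
\frac{k_{\mathrm{cat}}}{K_{\mathrm M}}(\mathrm{pH})
  \;=\; \frac{\left(k_{\mathrm{cat}}/K_{\mathrm M}\right)_{\max}}
             {1 \;+\; 10^{\,\mathrm{p}K_{a1}-\mathrm{pH}} \;+\; 10^{\,\mathrm{pH}-\mathrm{p}K_{a2}}},
\qquad
\mathrm{p}K_{a1} = 7.4,\quad \mathrm{p}K_{a2} = 10.0 \;\; \text{(wild-type Co(II)-HDAC8, from the abstract)}.
\]
```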

  19. RSSI-Based Distance Estimation Framework Using a Kalman Filter for Sustainable Indoor Computing Environments

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-11-01

    Full Text Available Given that location information is the key to providing a variety of services in sustainable indoor computing environments, it is required to obtain accurate locations. Locations can be estimated by three distances from three fixed points. Therefore, if the distance between two points can be measured or estimated accurately, the location in indoor environments can be estimated. To increase the accuracy of the measured distance, noise filtering, signal revision, and distance estimation processes are generally performed. This paper proposes a novel framework for estimating the distance between a beacon and an access point (AP) in a sustainable indoor computing environment. Diverse types of received signal strength indications (RSSIs) are used for WiFi, Bluetooth, and radio signals, and the proposed distance estimation framework is unique in that it is independent of the specific wireless signal involved, being based on the Bluetooth signal of the beacon. Generally, RSSI measurement, noise filtering, and revision are required for distance estimation using RSSIs. The employed RSSIs are first measured from an AP, with multiple APs sometimes used to increase the accuracy of the distance estimation. Owing to the inevitable presence of noise in the measured RSSIs, the application of noise filtering is essential, and further revision is used to address the inaccuracy and instability that characterizes RSSIs measured in an indoor environment. The revised RSSIs are then used to estimate the distance. The proposed distance estimation framework uses one AP to measure the RSSIs, a Kalman filter to eliminate noise, and a log-distance path loss model to revise the measured RSSIs. In the experimental implementation of the framework, both an RSSI filter and a Kalman filter were used for noise elimination, to comparatively evaluate the performance of the latter for the specific application. The Kalman filter was found to reduce the accumulated errors by 8
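
    The measure-filter-revise-estimate pipeline can be sketched as follows: a scalar Kalman filter smooths the raw RSSI stream, and the log-distance path-loss model converts the filtered RSSI to a distance. The path-loss exponent, reference RSSI, and noise variances are hypothetical, not the values calibrated in the paper.

```python
# Sketch of RSSI smoothing with a scalar Kalman filter followed by
# log-distance path-loss conversion to distance (parameters are hypothetical).
import numpy as np

def kalman_filter_1d(measurements, process_var=1e-3, meas_var=4.0):
    """Constant-signal Kalman filter for a stream of RSSI values (dBm)."""
    x, p = measurements[0], 1.0          # initial state estimate and variance
    filtered = []
    for z in measurements:
        p += process_var                 # predict
        k = p / (p + meas_var)           # Kalman gain
        x += k * (z - x)                 # update with measurement z
        p *= (1.0 - k)
        filtered.append(x)
    return np.array(filtered)

def rssi_to_distance(rssi, rssi_at_1m=-59.0, path_loss_exponent=2.0):
    """Log-distance path-loss model: RSSI = RSSI(1 m) - 10 n log10(d)."""
    return 10.0 ** ((rssi_at_1m - rssi) / (10.0 * path_loss_exponent))

rng = np.random.default_rng(1)
true_distance = 4.0
true_rssi = -59.0 - 10.0 * 2.0 * np.log10(true_distance)
raw = true_rssi + rng.normal(0.0, 2.0, size=100)   # noisy beacon RSSI samples

smoothed = kalman_filter_1d(raw)
print("raw estimate   :", round(rssi_to_distance(raw.mean()), 2), "m")
print("kalman estimate:", round(rssi_to_distance(smoothed[-1]), 2), "m")
```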

  20. NAPR: a Cloud-Based Framework for Neuroanatomical Age Prediction.

    Science.gov (United States)

    Pardoe, Heath R; Kuzniecky, Ruben

    2018-01-01

    The availability of cloud computing services has enabled the widespread adoption of the "software as a service" (SaaS) approach for software distribution, which utilizes network-based access to applications running on centralized servers. In this paper we apply the SaaS approach to neuroimaging-based age prediction. Our system, named "NAPR" (Neuroanatomical Age Prediction using R), provides access to predictive modeling software running on a persistent cloud-based Amazon Web Services (AWS) compute instance. The NAPR framework allows external users to estimate the age of individual subjects using cortical thickness maps derived from their own locally processed T1-weighted whole brain MRI scans. As a demonstration of the NAPR approach, we have developed two age prediction models that were trained using healthy control data from the ABIDE, CoRR, DLBS and NKI Rockland neuroimaging datasets (total N = 2367, age range 6-89 years). The provided age prediction models were trained using (i) relevance vector machines and (ii) Gaussian processes machine learning methods applied to cortical thickness surfaces obtained using Freesurfer v5.3. We believe that this transparent approach to out-of-sample evaluation and comparison of neuroimaging age prediction models will facilitate the development of improved age prediction models and allow for robust evaluation of the clinical utility of these methods.
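
    As a rough analogue of the second of the two models (Gaussian-process regression on cortical thickness features), the hedged sketch below trains and applies a Gaussian-process age predictor on synthetic feature vectors; it is not the NAPR code, and the data are random stand-ins rather than Freesurfer-derived surfaces.

```python
# Hedged sketch of Gaussian-process age prediction from cortical thickness features.
# Synthetic stand-in data; not the NAPR implementation or its training sets.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n_subjects, n_vertices = 300, 50          # real models use far more vertices per subject
age = rng.uniform(6, 89, size=n_subjects)

# Fake "cortical thickness" features that thin slightly with age, plus noise.
thickness = 3.0 - 0.01 * age[:, None] + rng.normal(0, 0.1, size=(n_subjects, n_vertices))

kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(thickness[:250], age[:250])

pred, std = gpr.predict(thickness[250:], return_std=True)
mae = np.mean(np.abs(pred - age[250:]))
print(f"held-out mean absolute error: {mae:.1f} years "
      f"(first prediction {pred[0]:.1f} +/- {std[0]:.1f})")
```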

  1. Understanding integrated care: a comprehensive conceptual framework based on the integrative functions of primary care.

    Science.gov (United States)

    Valentijn, Pim P; Schepman, Sanneke M; Opheij, Wilfrid; Bruijnzeels, Marc A

    2013-01-01

    Primary care has a central role in integrating care within a health system. However, conceptual ambiguity regarding integrated care hampers a systematic understanding. This paper proposes a conceptual framework that combines the concepts of primary care and integrated care, in order to understand the complexity of integrated care. The search method involved a combination of electronic database searches, hand searches of reference lists (snowball method) and contacting researchers in the field. The process of synthesizing the literature was iterative, to relate the concepts of primary care and integrated care. First, we identified the general principles of primary care and integrated care. Second, we connected the dimensions of integrated care and the principles of primary care. Finally, to improve content validity we held several meetings with researchers in the field to develop and refine our conceptual framework. The conceptual framework combines the functions of primary care with the dimensions of integrated care. Person-focused and population-based care serve as guiding principles for achieving integration across the care continuum. Integration plays complementary roles on the micro (clinical integration), meso (professional and organisational integration) and macro (system integration) level. Functional and normative integration ensure connectivity between the levels. The presented conceptual framework is a first step to achieve a better understanding of the inter-relationships among the dimensions of integrated care from a primary care perspective.

  2. Governmental, Legal and Regulatory Framework for Safety. General Safety Requirements. Part 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-09-15

    The objective of this publication is to establish requirements in respect of the governmental, legal and regulatory framework for safety. It covers the essential aspects of the framework for establishing a regulatory body and taking other actions necessary to ensure the effective regulatory control of facilities and activities utilized for peaceful purposes. Other responsibilities and functions, such as liaison within the global safety regime and on support services for safety (including radiation protection), emergency preparedness and response, nuclear security, and the State system of accounting for and control of nuclear material, are also covered.

  3. Governmental, Legal and Regulatory Framework for Safety. General Safety Requirements. Part 1 (Arabic Edition)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-09-15

    The objective of this publication is to establish requirements in respect of the governmental, legal and regulatory framework for safety. It covers the essential aspects of the framework for establishing a regulatory body and taking other actions necessary to ensure the effective regulatory control of facilities and activities utilized for peaceful purposes. Other responsibilities and functions, such as liaison within the global safety regime and on support services for safety (including radiation protection), emergency preparedness and response, nuclear security, and the State system of accounting for and control of nuclear material, are also covered.

  4. Governmental, Legal and Regulatory Framework for Safety. General Safety Requirements. Part 1 (Spanish Edition)

    International Nuclear Information System (INIS)

    2010-01-01

    The objective of this publication is to establish requirements in respect of the governmental, legal and regulatory framework for safety. It covers the essential aspects of the framework for establishing a regulatory body and taking other actions necessary to ensure the effective regulatory control of facilities and activities utilized for peaceful purposes. Other responsibilities and functions, such as liaison within the global safety regime and on support services for safety (including radiation protection), emergency preparedness and response, nuclear security, and the State system of accounting for and control of nuclear material, are also covered

  5. Governmental, Legal and Regulatory Framework for Safety. General Safety Requirements. Part 1 (French Edition)

    International Nuclear Information System (INIS)

    2010-01-01

    The objective of this publication is to establish requirements in respect of the governmental, legal and regulatory framework for safety. It covers the essential aspects of the framework for establishing a regulatory body and taking other actions necessary to ensure the effective regulatory control of facilities and activities utilized for peaceful purposes. Other responsibilities and functions, such as liaison within the global safety regime and on support services for safety (including radiation protection), emergency preparedness and response, nuclear security, and the State system of accounting for and control of nuclear material, are also covered

  6. Governmental, Legal and Regulatory Framework for Safety. General Safety Requirements. Part 1 (Chinese Edition)

    International Nuclear Information System (INIS)

    2010-01-01

    The objective of this publication is to establish requirements in respect of the governmental, legal and regulatory framework for safety. It covers the essential aspects of the framework for establishing a regulatory body and taking other actions necessary to ensure the effective regulatory control of facilities and activities utilized for peaceful purposes. Other responsibilities and functions, such as liaison within the global safety regime and on support services for safety (including radiation protection), emergency preparedness and response, nuclear security, and the State system of accounting for and control of nuclear material, are also covered

  7. A Decision Support Framework For Science-Based, Multi-Stakeholder Deliberation: A Coral Reef Example

    Science.gov (United States)

    We present a decision support framework for science-based assessment and multi-stakeholder deliberation. The framework consists of two parts: a DPSIR (Drivers-Pressures-States-Impacts-Responses) analysis to identify the important causal relationships among anthropogenic environ...

  8. A Framework-Based Environment for Object-Oriented Scientific Codes

    Directory of Open Access Journals (Sweden)

    Robert A. Ballance

    1993-01-01

    Full Text Available Frameworks are reusable object-oriented designs for domain-specific programs. In our estimation, frameworks are the key to productivity and reuse. However, frameworks require increased support from the programming environment. A framework-based environment must include design aides and project browsers that can mediate between the user and the framework. A framework-based approach also places new requirements on conventional tools such as compilers. This article explores the impact of object-oriented frameworks upon a programming environment, in the context of object-oriented finite element and finite difference codes. The role of tools such as design aides and project browsers is discussed, and the impact of a framework-based approach upon compilers is examined. Examples are drawn from our prototype C++ based environment.

  9. A Rule-Based Local Search Algorithm for General Shift Design Problems in Airport Ground Handling

    DEFF Research Database (Denmark)

    Clausen, Tommy

    We consider a generalized version of the shift design problem where shifts are created to cover a multiskilled demand and fit the parameters of the workforce. We present a collection of constraints and objectives for the generalized shift design problem. A local search solution framework with multiple neighborhoods and a loosely coupled rule engine based on simulated annealing is presented. Computational experiments on real-life data from various airport ground handling organizations show the performance and flexibility of the proposed algorithm.
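
    A stripped-down skeleton of a simulated-annealing local search with multiple neighborhoods is sketched below. The shift encoding, demand data, and cost function are invented placeholders, not the rule-engine model or the airport ground handling data used in the study.

```python
# Skeleton of simulated annealing over multiple neighborhoods for a toy staffing problem.
# Demand, solution encoding, and cost are placeholders, not the paper's rule-engine model.
import math
import random

random.seed(0)
demand = [2, 3, 4, 4, 3, 2, 1, 1]             # staff needed per period
# A solution here simply assigns a staffing level per period; real shift design builds shifts.

def cost(solution):
    under = sum(max(d - s, 0) for d, s in zip(demand, solution))
    over = sum(max(s - d, 0) for d, s in zip(demand, solution))
    return 10 * under + 1 * over              # undercoverage penalised more than overcoverage

def neighbor(solution):
    sol = list(solution)
    move = random.choice(["bump", "transfer"])
    if move == "bump":                        # neighborhood 1: change one period by +/-1
        i = random.randrange(len(sol))
        sol[i] = max(0, sol[i] + random.choice([-1, 1]))
    else:                                     # neighborhood 2: move one unit between periods
        i, j = random.sample(range(len(sol)), 2)
        if sol[i] > 0:
            sol[i] -= 1
            sol[j] += 1
    return sol

current = [1] * len(demand)
best, temperature = list(current), 10.0
while temperature > 0.01:
    candidate = neighbor(current)
    delta = cost(candidate) - cost(current)
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        current = candidate
        if cost(current) < cost(best):
            best = list(current)
    temperature *= 0.995                      # geometric cooling schedule

print("best staffing per period:", best, "cost:", cost(best))
```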

  10. A Framework for a Multi-Faceted, Educational, Knowledge-Based Recommender System

    Directory of Open Access Journals (Sweden)

    John W. Coffey

    2016-08-01

    Full Text Available The literature on intelligent or adaptive tutoring systems generally has a focus on how to determine what resources to present to students as they make their way through a course of study. The idea of multi-faceted student modeling is that a variety of measures, both academic and non-academic, might be represented in student models in service of a broader educational context. This paper contains a framework for a multi-faceted, educational, knowledge-based recommender system, including a basic set of descriptors that the model contains, and a taxonomy of inferences that might be made over such models.

  11. Covariant framework for a mass monopole as a field structure in general relativity

    International Nuclear Information System (INIS)

    Schleifer, N.

    1980-01-01

    We present a covariant framework for what is usually referred to as a mass monopole, by utilizing certain scalar invariants that are functions of the eigenvalues of the Riemann tensor. We thus bridge one of the theoretical gaps in the Einstein-Infeld-Hoffmann (EIH) derivation of the equations of motion of particles from the field equations: the lack of a covariant characterization of those aspects of a particle's structure which influence its motion. We have succeeded in giving a covariant framework for a mass monopole, which is the particle type assumed by EIH in their derivation. This is accomplished by using only the field outside the mass (singularity) to describe its characteristics, thereby conforming to a pure field description of nature. The utility of the framework has been verified by applying it to two physically relevant situations. The first is that of a Kerr particle, and the second is that of one spherically symmetric mass orbiting another. Our framework does indeed correspond to the intuitively expected results. In addition, our novel use of eigenvalues of the Riemann tensor appears to be a possible avenue of approach to the covariant characterization of other particle structure

  12. TP-model transformation-based-control design frameworks

    CERN Document Server

    Baranyi, Péter

    2016-01-01

    This book covers new aspects and frameworks of control, design, and optimization based on the TP model transformation and its various extensions. The author outlines the three main steps of polytopic and LMI based control design: 1) development of the qLPV state-space model, 2) generation of the polytopic model; and 3) application of LMI to derive controller and observer. He goes on to describe why literature has extensively studied LMI design, but has not focused much on the second step, in part because the generation and manipulation of the polytopic form was not tractable in many cases. The author then shows how the TP model transformation facilitates this second step and hence reveals new directions, leading to powerful design procedures and the formulation of new questions. The chapters of this book, and the complex dynamical control tasks which they cover, are organized so as to present and analyze the beneficial aspect of the family of approaches (control, design, and optimization). Additionally, the b...

  13. An evidence-based conceptual framework of healthy cooking

    Directory of Open Access Journals (Sweden)

    Margaret Raber

    2016-12-01

    Full Text Available Eating out of the home has been positively associated with body weight, obesity, and poor diet quality. While cooking at home has declined steadily over the last several decades, the benefits of home cooking have gained attention in recent years and many healthy cooking projects have emerged around the United States. The purpose of this study was to develop an evidence-based conceptual framework of healthy cooking behavior in relation to chronic disease prevention. A systematic review of the literature was undertaken using broad search terms. Studies analyzing the impact of cooking behaviors across a range of disciplines were included. Experts in the field reviewed the resulting constructs in a small focus group. The model was developed from the extant literature on the subject with 59 studies informing 5 individual constructs (frequency, techniques and methods, minimal usage, flavoring, and ingredient additions/replacements), further defined by a series of individual behaviors. Face validity of these constructs was supported by the focus group. A validated conceptual model is a significant step toward better understanding the relationship between cooking, disease and disease prevention and may serve as a base for future assessment tools and curricula.

  14. Multiple kernel boosting framework based on information measure for classification

    International Nuclear Information System (INIS)

    Qi, Chengming; Wang, Yuping; Tian, Wenjie; Wang, Qun

    2016-01-01

    The performance of kernel-based methods, such as the support vector machine (SVM), is greatly affected by the choice of kernel function. Multiple kernel learning (MKL) is a promising family of machine learning algorithms and has attracted much attention in recent years. MKL combines multiple sub-kernels to seek better results compared to single kernel learning. In order to improve the efficiency of SVM and MKL, in this paper, the Kullback–Leibler kernel function is derived to develop SVM. The proposed method employs an improved ensemble learning framework, named KLMKB, which applies Adaboost to learning multiple kernel-based classifiers. In the experiment for hyperspectral remote sensing image classification, we employ features selected through the Optional Index Factor (OIF) to classify the satellite image. We extensively examine the performance of our approach in comparison to some relevant and state-of-the-art algorithms on a number of benchmark classification data sets and a hyperspectral remote sensing image data set. Experimental results show that our method has a stable behavior and a noticeable accuracy for different data sets.
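
    An illustrative miniature of the boosting-over-kernels idea is given below: each boosting round fits a weighted SVM for every candidate kernel and keeps the one with the lowest weighted error. This is not the KLMKB algorithm itself, standard kernels stand in for the Kullback–Leibler kernel, and the data are synthetic.

```python
# Toy multiple-kernel boosting: at each round, fit one SVM per candidate kernel on the
# current sample weights and keep the best. Illustrative only; not the KLMKB algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
y = 2 * y - 1                                   # labels in {-1, +1}
weights = np.full(len(y), 1.0 / len(y))
kernels = ["linear", "rbf", "poly"]
ensemble = []                                   # list of (alpha, fitted_classifier)

for round_ in range(10):
    best = None
    for kernel in kernels:
        clf = SVC(kernel=kernel, C=1.0).fit(X, y, sample_weight=weights)
        err = np.sum(weights * (clf.predict(X) != y))
        if best is None or err < best[0]:
            best = (err, clf)
    err, clf = best
    err = np.clip(err, 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)       # AdaBoost weight for this round's learner
    ensemble.append((alpha, clf))
    weights *= np.exp(-alpha * y * clf.predict(X))
    weights /= weights.sum()

def predict(X):
    score = sum(alpha * clf.predict(X) for alpha, clf in ensemble)
    return np.sign(score)

print("training accuracy:", np.mean(predict(X) == y))
```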

  15. A Security Monitoring Framework For Virtualization Based HEP Infrastructures

    Science.gov (United States)

    Gomez Ramirez, A.; Martinez Pedreira, M.; Grigoras, C.; Betev, L.; Lara, C.; Kebschull, U.; ALICE Collaboration

    2017-10-01

    High Energy Physics (HEP) distributed computing infrastructures require automatic tools to monitor, analyze and react to potential security incidents. These tools should collect and inspect data such as resource consumption, logs and sequence of system calls for detecting anomalies that indicate the presence of a malicious agent. They should also be able to perform automated reactions to attacks without administrator intervention. We describe a novel framework that accomplishes these requirements, with a proof of concept implementation for the ALICE experiment at CERN. We show how we achieve a fully virtualized environment that improves the security by isolating services and Jobs without a significant performance impact. We also describe a collected dataset for Machine Learning based Intrusion Prevention and Detection Systems on Grid computing. This dataset is composed of resource consumption measurements (such as CPU, RAM and network traffic), logfiles from operating system services, and system call data collected from production Jobs running in an ALICE Grid test site and a big set of malware samples. This malware set was collected from security research sites. Based on this dataset, we will proceed to develop Machine Learning algorithms able to detect malicious Jobs.

  16. Titan TTCN-3 Based Test Framework for Resource Constrained Systems

    Directory of Open Access Journals (Sweden)

    Yushev Artem

    2016-01-01

    Full Text Available Wireless communication systems are increasingly becoming part of our daily lives. Especially with the Internet of Things (IoT), overall connectivity is increasing rapidly, since everyday objects become part of the global network. For this purpose several new wireless protocols have arisen, of which 6LoWPAN (IPv6 over Low power Wireless Personal Area Networks) can be seen as one of the most important protocols within this sector. Originally designed on top of the IEEE802.15.4 standard, it is subject to various adaptations that allow 6LoWPAN to be used over different technologies, e.g. DECT Ultra Low Energy (ULE). Although this high connectivity offers a lot of new possibilities, there are several requirements and pitfalls that come along with such new systems. With an increasing number of connected devices, interoperability between different providers is one of the biggest challenges, which makes it necessary to verify the functionality and stability of the devices and the network. Therefore testing becomes one of the key components that decides on the success or failure of such a system. Although several protocol implementations are commonly available, e.g. for IoT-based systems, there is still a lack of corresponding tools and environments for functional and conformance testing. This article describes the architecture and functioning of the proposed test framework based on Testing and Test Control Notation Version 3 (TTCN-3) for 6LoWPAN over ULE networks.

  17. Effect of Topography on Learning Military Tactics - Integration of Generalized Intelligent Framework for Tutoring (GIFT) and Augmented REality Sandtable (ARES)

    Science.gov (United States)

    2016-09-01

    Technical report by Michael W Boyce, Ramsamooj J Reyes, Deeja E Cruz, et al., on integrating the Generalized Intelligent Framework for Tutoring (GIFT) with the Augmented REality Sandtable (ARES). Only stray reference fragments remain of the original abstract, including: Dunleavy M, Dede C. Augmented reality teaching and learning. Handbook of Research on Educational Communications and Technology. New York (NY): Springer; a taxonomy of mixed reality visual displays, IEICE Transactions on Information and Systems, 1994;77(12):1321-1329; and Noordzij ML, Scholten P, Laroy-Noordzij.

  18. Pattern-based framework for data acquisition systems

    International Nuclear Information System (INIS)

    Padmini, S.; Diwakar, M.P.; Nair, Preetha; Mathew, R.

    2004-01-01

    The data acquisition framework implements a reusable abstract architectural design for use in the development of data acquisition systems. The framework is being used to build the Flux Mapping System (FMS) for TAPS III-IV and the RRS Data Acquisition System for the Dhruva reactor.

  19. Object Persistence: A Framework Based On Design Patterns

    OpenAIRE

    Kienzle, Jörg; Romanovsky, Alexander

    2000-01-01

    The poster presents a framework for providing object persistence in object-oriented programming languages without modifying the run-time system or the language itself. The framework does not rely on any kind of special programming language features. It only uses basic object-oriented programming techniques, and is therefore implementable in any object-oriented programming language.

  20. On the Use of Generalized Volume Scattering Models for the Improvement of General Polarimetric Model-Based Decomposition

    Directory of Open Access Journals (Sweden)

    Qinghua Xie

    2017-01-01

    Full Text Available Recently, a general polarimetric model-based decomposition framework was proposed by Chen et al., which addresses several well-known limitations in previous decomposition methods and implements a simultaneous full-parameter inversion by using complete polarimetric information. However, it only employs four typical models to characterize the volume scattering component, which limits the parameter inversion performance. To overcome this issue, this paper presents two general polarimetric model-based decomposition methods by incorporating the generalized volume scattering model (GVSM) or the simplified adaptive volume scattering model (SAVSM), proposed by Antropov et al. and Huang et al., respectively, into the general decomposition framework proposed by Chen et al. By doing so, the final volume coherency matrix structure is selected from a wide range of volume scattering models within a continuous interval according to the data itself without adding unknowns. Moreover, the new approaches rely on one nonlinear optimization stage instead of four as in the previous method proposed by Chen et al. In addition, the parameter inversion procedure adopts the modified algorithm proposed by Xie et al. which leads to higher accuracy and more physically reliable output parameters. A number of Monte Carlo simulations of polarimetric synthetic aperture radar (PolSAR) data are carried out and show that the proposed method with GVSM yields an overall improvement in the final accuracy of estimated parameters and outperforms both the version using SAVSM and the original approach. In addition, C-band Radarsat-2 and L-band AIRSAR fully polarimetric images over the San Francisco region are also used for testing purposes. A detailed comparison and analysis of decomposition results over different land-cover types are conducted. According to this study, the use of general decomposition models leads to a more accurate quantitative retrieval of target parameters. However, there

  1. IT governance: An architectural framework based on consolidated best practices

    Directory of Open Access Journals (Sweden)

    Thami Batyashe

    2016-03-01

    Full Text Available Due to the continually increasing significance of information technology (IT), the need to provide governance in the deployment, use and management of IT artefacts has become equally essential. However, even though different frameworks have been employed, the implementation of IT governance has never been easy for many organisations. This is attributed to many factors, such as people, processes and technological artefacts. IT governance frameworks differ from one another in one area or another, making their selection challenging for organisations. As a result, some organisations have more than one IT governance framework. On the one hand, this sometimes results in duplication of the frameworks' functionalities, thereby adding to the complexity of the environment. On the other hand, some IT governance frameworks lack functions required for the organisation's objectives. These challenges are attributed to the lack of an architectural framework of consolidated best practices.

  2. RosettaAntibodyDesign (RAbD): A general framework for computational antibody design.

    Science.gov (United States)

    Adolf-Bryfogle, Jared; Kalyuzhniy, Oleks; Kubitz, Michael; Weitzner, Brian D; Hu, Xiaozhen; Adachi, Yumiko; Schief, William R; Dunbrack, Roland L

    2018-04-01

    A structural-bioinformatics-based computational methodology and framework have been developed for the design of antibodies to targets of interest. RosettaAntibodyDesign (RAbD) samples the diverse sequence, structure, and binding space of an antibody to an antigen in highly customizable protocols for the design of antibodies in a broad range of applications. The program samples antibody sequences and structures by grafting structures from a widely accepted set of the canonical clusters of CDRs (North et al., J. Mol. Biol., 406:228-256, 2011). It then performs sequence design according to amino acid sequence profiles of each cluster, and samples CDR backbones using a flexible-backbone design protocol incorporating cluster-based CDR constraints. Starting from an existing experimental or computationally modeled antigen-antibody structure, RAbD can be used to redesign a single CDR or multiple CDRs with loops of different length, conformation, and sequence. We rigorously benchmarked RAbD on a set of 60 diverse antibody-antigen complexes, using two design strategies-optimizing total Rosetta energy and optimizing interface energy alone. We utilized two novel metrics for measuring success in computational protein design. The design risk ratio (DRR) is equal to the frequency of recovery of native CDR lengths and clusters divided by the frequency of sampling of those features during the Monte Carlo design procedure. Ratios greater than 1.0 indicate that the design process is picking out the native more frequently than expected from their sampled rate. We achieved DRRs for the non-H3 CDRs of between 2.4 and 4.0. The antigen risk ratio (ARR) is the ratio of frequencies of the native amino acid types, CDR lengths, and clusters in the output decoys for simulations performed in the presence and absence of the antigen. For CDRs, we achieved cluster ARRs as high as 2.5 for L1 and 1.5 for H2. For sequence design simulations without CDR grafting, the overall recovery for the native
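
    The two success metrics defined in the abstract can be written compactly as ratios of frequencies; the notation below is ours, introduced only to restate the verbal definitions.

```latex
% Restatement of the two success metrics defined in the abstract (notation ours).
\[
\mathrm{DRR} \;=\;
  \frac{f_{\mathrm{recovered}}\!\left(\text{native CDR length/cluster in output designs}\right)}
       {f_{\mathrm{sampled}}\!\left(\text{native CDR length/cluster during Monte Carlo design}\right)},
\qquad
\mathrm{ARR} \;=\;
  \frac{f_{\mathrm{with\ antigen}}\!\left(\text{native feature in output decoys}\right)}
       {f_{\mathrm{without\ antigen}}\!\left(\text{native feature in output decoys}\right)} .
\]
```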

  3. RosettaAntibodyDesign (RAbD): A general framework for computational antibody design

    Science.gov (United States)

    Adolf-Bryfogle, Jared; Kalyuzhniy, Oleks; Kubitz, Michael; Hu, Xiaozhen; Adachi, Yumiko; Schief, William R.

    2018-01-01

    A structural-bioinformatics-based computational methodology and framework have been developed for the design of antibodies to targets of interest. RosettaAntibodyDesign (RAbD) samples the diverse sequence, structure, and binding space of an antibody to an antigen in highly customizable protocols for the design of antibodies in a broad range of applications. The program samples antibody sequences and structures by grafting structures from a widely accepted set of the canonical clusters of CDRs (North et al., J. Mol. Biol., 406:228–256, 2011). It then performs sequence design according to amino acid sequence profiles of each cluster, and samples CDR backbones using a flexible-backbone design protocol incorporating cluster-based CDR constraints. Starting from an existing experimental or computationally modeled antigen-antibody structure, RAbD can be used to redesign a single CDR or multiple CDRs with loops of different length, conformation, and sequence. We rigorously benchmarked RAbD on a set of 60 diverse antibody–antigen complexes, using two design strategies—optimizing total Rosetta energy and optimizing interface energy alone. We utilized two novel metrics for measuring success in computational protein design. The design risk ratio (DRR) is equal to the frequency of recovery of native CDR lengths and clusters divided by the frequency of sampling of those features during the Monte Carlo design procedure. Ratios greater than 1.0 indicate that the design process is picking out the native more frequently than expected from their sampled rate. We achieved DRRs for the non-H3 CDRs of between 2.4 and 4.0. The antigen risk ratio (ARR) is the ratio of frequencies of the native amino acid types, CDR lengths, and clusters in the output decoys for simulations performed in the presence and absence of the antigen. For CDRs, we achieved cluster ARRs as high as 2.5 for L1 and 1.5 for H2. For sequence design simulations without CDR grafting, the overall recovery for the

  4. A general framework for the regression analysis of pooled biomarker assessments.

    Science.gov (United States)

    Liu, Yan; McMahan, Christopher; Gallagher, Colin

    2017-07-10

    As a cost-efficient data collection mechanism, the process of assaying pooled biospecimens is becoming increasingly common in epidemiological research; for example, pooling has been proposed for the purpose of evaluating the diagnostic efficacy of biological markers (biomarkers). To this end, several authors have proposed techniques that allow for the analysis of continuous pooled biomarker assessments. Regretfully, most of these techniques proceed under restrictive assumptions, are unable to account for the effects of measurement error, and fail to control for confounding variables. These limitations are understandably attributable to the complex structure that is inherent to measurements taken on pooled specimens. Consequently, in order to provide practitioners with the tools necessary to accurately and efficiently analyze pooled biomarker assessments, herein, a general Monte Carlo maximum likelihood-based procedure is presented. The proposed approach allows for the regression analysis of pooled data under practically all parametric models and can be used to directly account for the effects of measurement error. Through simulation, it is shown that the proposed approach can accurately and efficiently estimate all unknown parameters and is more computationally efficient than existing techniques. This new methodology is further illustrated using monocyte chemotactic protein-1 data collected by the Collaborative Perinatal Project in an effort to assess the relationship between this chemokine and the risk of miscarriage. Copyright © 2017 John Wiley & Sons, Ltd.
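    A rough sketch of the Monte Carlo likelihood idea is given below. This is not the authors' implementation: the log-normal biomarker model, the pooling-by-averaging assumption, and all parameter values are assumptions made only for illustration.

```python
# Minimal sketch (assumed model, not the paper's code): Monte Carlo approximation
# of the likelihood contribution of one pooled measurement. Individual biomarker
# levels follow a log-normal regression model, and the observed pool value is the
# average of the pool members plus assay measurement error.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def pool_loglik(y_pool, X_pool, beta, sigma, tau, n_mc=5000):
    """log f(y_pool | X_pool), approximated by integrating over the unobserved
    individual biomarker values with n_mc Monte Carlo draws.
    y_pool : observed assay value for the pooled specimen
    X_pool : (pool_size, p) covariates of the pooled individuals
    beta   : regression coefficients; sigma: biomarker SD; tau: assay error SD
    """
    mu = X_pool @ beta                                    # mean log-biomarker per subject
    draws = rng.normal(mu, sigma, size=(n_mc, X_pool.shape[0]))
    pooled_means = np.exp(draws).mean(axis=1)             # biological pooling by averaging
    dens = norm.pdf(y_pool, loc=pooled_means, scale=tau)  # assay measurement error
    return np.log(dens.mean())

# Toy usage: one pool of 3 subjects, two covariates (intercept + exposure)
X = np.array([[1, 0.2], [1, 1.1], [1, 0.5]])
print(pool_loglik(y_pool=1.8, X_pool=X, beta=np.array([0.1, 0.5]), sigma=0.4, tau=0.1))
```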

  5. KMCLib: A general framework for lattice kinetic Monte Carlo (KMC) simulations

    Science.gov (United States)

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2014-09-01

    KMCLib is a general framework for lattice kinetic Monte Carlo (KMC) simulations. The program can handle simulations of the diffusion and reaction of millions of particles in one, two, or three dimensions, and is designed to be easily extended and customized by the user to allow for the development of complex custom KMC models for specific systems without having to modify the core functionality of the program. Analysis modules and on-the-fly elementary step diffusion rate calculations can be implemented as plugins following a well-defined API. The plugin modules are loosely coupled to the core KMCLib program via the Python scripting language. KMCLib is written as a Python module with a backend C++ library. After initial compilation of the backend library KMCLib is used as a Python module; input to the program is given as a Python script executed using a standard Python interpreter. We give a detailed description of the features and implementation of the code and demonstrate its scaling behavior and parallel performance with a simple one-dimensional A-B-C lattice KMC model and a more complex three-dimensional lattice KMC model of oxygen-vacancy diffusion in a fluorite structured metal oxide. KMCLib can keep track of individual particle movements and includes tools for mean square displacement analysis, and is therefore particularly well suited for studying diffusion processes at surfaces and in solids. Catalogue identifier: AESZ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AESZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 49 064 No. of bytes in distributed program, including test data, etc.: 1 575 172 Distribution format: tar.gz Programming language: Python and C++. Computer: Any computer that can run a C++ compiler and a Python interpreter. Operating system: Tested on Ubuntu 12
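    For readers unfamiliar with the lattice KMC algorithm itself, the following minimal sketch (a generic illustration, not the KMCLib API) shows the core loop: enumerate the possible events, pick one with probability proportional to its rate, execute it, and advance time by an exponentially distributed waiting time.

```python
# Generic illustration of the lattice KMC idea (not KMCLib code): hopping
# particles on a 1D periodic lattice with a single hop rate.
import math
import random

random.seed(0)
L = 50
lattice = [0] * L                      # 0 = empty site, 1 = particle
for site in random.sample(range(L), 10):
    lattice[site] = 1
hop_rate = 1.0
t = 0.0

for _ in range(10_000):
    # enumerate all possible hops (a particle next to an empty site)
    events = []
    for i in range(L):
        if lattice[i] == 1:
            for j in (i - 1, (i + 1) % L):      # negative index wraps (periodic)
                if lattice[j] == 0:
                    events.append((i, j, hop_rate))
    total_rate = sum(r for _, _, r in events)
    # pick one event with probability proportional to its rate
    pick, acc = random.random() * total_rate, 0.0
    for i, j, r in events:
        acc += r
        if acc >= pick:
            lattice[i], lattice[j] = 0, 1
            break
    # advance time by an exponentially distributed waiting time
    t += -math.log(random.random()) / total_rate

print(f"simulated time after 10000 KMC steps: {t:.2f}")
```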

  6. [Generalized neonatal screening based on laboratory tests].

    Science.gov (United States)

    Ardaillou, Raymond; Le Gall, Jean-Yves

    2006-11-01

    Implementation of a generalized screening program for neonatal diseases must obey precise rules. The disease must be severe, recognizable at an early stage, amenable to an effective treatment, detectable with a non expensive and widely applicable test; it must also be a significant public health problem. Subjects with positive results must be offered immediate treatment or prevention. All screening programs must be regularly evaluated. In France, since 1978, a national screening program has been organized by a private association ("Association française pour le dépistage et la prévention des handicaps de l'enfant") and supervised by the "Caisse nationale d'assurance maladie" and "Direction Générale de la Sante". Five diseases are now included in the screening program: phenylketonuria, hypothyroidism, congenital adrenal hyperplasia, cystic fibrosis and sickle cell disease (the latter only in at-risk newborns). Toxoplasmosis is a particular problem because only the children of mothers who were not tested during the pregnancy or who seroconverted are screened. Neonatal screening for phenylketonuria and hypothyrodism is unanimously recommended. Screening for congenital adrenal hyperplasia is approved in most countries. Cases of sickle cell disease and cystic fibrosis are more complex because--not all children who carry the mutations develop severe forms;--there is no curative treatment;--parents may become anxious, even though the phenotype is sometimes mild or even asymptomatic. Supporters of screening stress the benefits of early diagnosis (which extends the life expectancy of these children, particularly in the case of sickle cell disease), the fact that it opens up the possibility of prenatal screening of future pregnancies, and the utility of informing heterozygous carriers identified by familial screening. Neonatal screening for other diseases is under discussion. Indeed, technical advances such as tandem mass spectrometry make it possible to detect about 50

  7. Optimizing the structure of the natural gas market using an agent-based modeling framework

    Energy Technology Data Exchange (ETDEWEB)

    Van Benthem, M.

    2010-01-14

    The overall research question guiding this study is as follows: what is the optimal structure of the natural gas market, considering both the degrees of affordability and supply security resulting from this structure? The sub-questions are: How can the concepts of supply security and affordability be usefully defined? (Chapter 2); What should a modeling framework for analyzing the natural gas market with regard to these concepts look like? (Chapters 3 and 4); What general conclusions can be drawn on the basis of this framework? (Chapter 5); What is the effect of liberalization on the Dutch natural gas market? (Chapter 6); What are the possible effects of current trends unfolding in the Dutch natural gas market? (Chapter 7). The framework constructed in this study implicitly contains the necessary elements for deriving a sustainability indicator too. However, to limit the scope of the study, sustainability will not be analyzed explicitly. Chapter 2 provides an introductory description of the natural gas market. Starting from a description of the natural gas value chain, the process of liberalization is described as a change in the organization of the value chain. In addition, the concepts of affordability and supply security are discussed and appropriate quantitative indicators for both objectives are identified. In Chapter 3, a survey of existing gas market models is performed. On the basis of this survey, a classification system for natural gas market models is developed. Furthermore, the characteristics of a modeling framework fit for the purpose of this study are derived. In Chapter 4, a general, quantitative framework for natural gas market modeling is developed on the basis of agent-based computational economics. The model's structure, its dynamics, output and data requirements are described. Furthermore, the properties of each agent are explored, and the possibilities for model verification and validation are outlined. Chapter 5 provides a number of

  8. Optimizing the structure of the natural gas market using an agent-based modeling framework

    International Nuclear Information System (INIS)

    Van Benthem, M.

    2010-01-01

    The overall research question guiding this study is as follows: what is the optimal structure of the natural gas market, considering both the degrees of affordability and supply security resulting from this structure? The sub-questions are: How can the concepts of supply security and affordability be usefully defined? (Chapter 2); What should a modeling framework for analyzing the natural gas market with regard to these concepts look like? (Chapters 3 and 4); What general conclusions can be drawn on the basis of this framework? (Chapter 5); What is the effect of liberalization on the Dutch natural gas market? (Chapter 6); What are the possible effects of current trends unfolding in the Dutch natural gas market? (Chapter 7). The framework constructed in this study implicitly contains the necessary elements for deriving a sustainability indicator too. However, to limit the scope of the study, sustainability will not be analyzed explicitly. Chapter 2 provides an introductory description of the natural gas market. Starting from a description of the natural gas value chain, the process of liberalization is described as a change in the organization of the value chain. In addition, the concepts of affordability and supply security are discussed and appropriate quantitative indicators for both objectives are identified. In Chapter 3, a survey of existing gas market models is performed. On the basis of this survey, a classification system for natural gas market models is developed. Furthermore, the characteristics of a modeling framework fit for the purpose of this study are derived. In Chapter 4, a general, quantitative framework for natural gas market modeling is developed on the basis of agent-based computational economics. The model's structure, its dynamics, output and data requirements are described. Furthermore, the properties of each agent are explored, and the possibilities for model verification and validation are outlined. Chapter 5 provides a number of

  9. A General Framework for Portfolio Theory. Part I: theory and various models

    OpenAIRE

    Maier-Paape, Stanislaus; Zhu, Qiji Jim

    2017-01-01

    Utility and risk are two often competing measurements of investment success. We show that the efficient trade-off between these two measurements for investment portfolios happens, in general, on a convex curve in the two-dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz [H. Markowitz, Portfolio Selection, 1959] and its natural generalization, the capital market pricing model, [W. F. Sharpe, Mutual fund performance, 1966] are spe...
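    The convex utility-risk trade-off can be visualized with a toy numerical experiment. In the sketch below, the expected returns and covariance matrix are invented, and the frontier is approximated by keeping the best return found among random long-only portfolios in each risk bin.

```python
# Toy illustration of the utility-risk trade-off (all numbers invented): random
# long-only portfolios of three assets, with the best attainable expected return
# recorded in each risk bin to trace an approximate efficient frontier.
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.05, 0.08, 0.12])                 # assumed expected returns
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])              # assumed covariance matrix

w = rng.dirichlet(np.ones(3), size=20000)         # fully invested, no short sales
ret = w @ mu
risk = np.sqrt(np.einsum("ij,jk,ik->i", w, cov, w))

bins = np.linspace(risk.min(), risk.max(), 25)
idx = np.digitize(risk, bins)
for k in range(1, len(bins) + 1, 5):              # print a few frontier points
    mask = idx == k
    if mask.any():
        print(f"risk ~{bins[k - 1]:.3f}: best expected return {ret[mask].max():.3f}")
```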

  10. Governmental, Legal and Regulatory Framework for Safety. General Safety Requirements. Part 1, Revision 1 (Chinese Edition)

    International Nuclear Information System (INIS)

    2016-01-01

    This publication establishes requirements in respect of the governmental, legal and regulatory framework for safety. It covers the essential aspects of the framework for establishing a regulatory body and taking other actions necessary to ensure the effective regulatory control of facilities and activities utilized for peaceful purposes. Other responsibilities and functions, such as liaison within the global safety regime and on support services for safety (including radiation protection), emergency preparedness and response, nuclear security, and the State system of accounting for and control of nuclear material, are also covered. A review of Safety Requirements publications was commenced in 2011 following the accident in the Fukushima Daiichi nuclear power plant in Japan. The review revealed no significant areas of weakness and resulted in just a small set of amendments to strengthen the requirements and facilitate their implementation, which are contained in the present publication.

  11. A Web GIS Framework for Participatory Sensing Service: An Open Source-Based Implementation

    Directory of Open Access Journals (Sweden)

    Yu Nakayama

    2017-04-01

    Full Text Available Participatory sensing is the process in which individuals or communities collect and analyze systematic data using mobile phones and cloud services. To efficiently develop participatory sensing services, some server-side technologies have been proposed. Although they provide a good platform for participatory sensing, they are not optimized for spatial data management and processing. For the purpose of spatial data collection and management, many web GIS approaches have been studied. However, they still have not focused on the optimal framework for participatory sensing services. This paper presents a web GIS framework for participatory sensing service (FPSS). The proposed FPSS enables an integrated deployment of spatial data capture, storage, and data management functions. In various types of participatory sensing experiments, users can collect and manage spatial data in a unified manner. This feature is realized by the optimized system architecture and use case based on the general requirements for participatory sensing. We developed an open source GIS-based implementation of the proposed framework, which can overcome financial difficulties that are one of the major problems of deploying sensing experiments. We confirmed with the prototype that participatory sensing experiments can be performed efficiently with the proposed FPSS.

  12. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    International Nuclear Information System (INIS)

    Agostini, M; Pandola, L; Zavarise, P; Volynets, O

    2011-01-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.

  13. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    Science.gov (United States)

    Agostini, M.; Pandola, L.; Zavarise, P.; Volynets, O.

    2011-08-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.

  14. Cluster imaging of multi-brain networks (CIMBN): a general framework for hyperscanning and modeling a group of interacting brains

    Directory of Open Access Journals (Sweden)

    Lian Duan

    2015-07-01

    Full Text Available Studying the neural basis of human social interactions is a key topic in the field of social neuroscience. Brain imaging studies in this field usually focus on the neural correlates of the social interactions between two participants. However, as the participant number further increases, even by a small amount, great difficulties arise. One challenge is how to concurrently scan all the interacting brains with high ecological validity, especially for a large number of participants. The other challenge is how to effectively model the complex group interaction behaviors emerging from the intricate neural information exchange among a group of socially organized people. Confronting these challenges, we propose a new approach called Cluster Imaging of Multi-brain Networks (CIMBN). CIMBN consists of two parts. The first part is a cluster imaging technique with high ecological validity based on multiple functional near-infrared spectroscopy (fNIRS) systems. Using this technique, we can easily extend the simultaneous imaging capacity of social neuroscience studies up to dozens of participants. The second part of CIMBN is a multi-brain network (MBN) modeling method based on graph theory. By taking each brain as a network node and the relationship between any two brains as a network edge, one can construct a network model for a group of interacting brains. The emergent group social behaviors can then be studied using the network’s properties, such as its topological structure and information exchange efficiency. Although there is still much work to do, as a general framework for hyperscanning and modeling a group of interacting brains, CIMBN can provide new insights into the neural correlates of group social interactions, and advance social neuroscience and social psychology.
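    The graph-based MBN idea can be sketched in a few lines of code. The synthetic signals, the correlation threshold, and the summary measures below are illustrative assumptions, not the CIMBN pipeline itself.

```python
# Illustrative sketch of multi-brain network (MBN) modeling: each participant's
# brain is a node, the correlation between participants' time courses serves as
# the edge weight, and the group is summarized with simple graph measures.
# The signals here are synthetic; in practice they would come from hyperscanning.
import numpy as np

rng = np.random.default_rng(42)
n_brains, n_samples = 8, 600
shared = rng.standard_normal(n_samples)                    # common group rhythm
signals = 0.6 * shared + rng.standard_normal((n_brains, n_samples))

corr = np.corrcoef(signals)                                # inter-brain coupling
adj = (np.abs(corr) > 0.3).astype(int)                     # threshold into edges
np.fill_diagonal(adj, 0)

degree = adj.sum(axis=1)
density = adj.sum() / (n_brains * (n_brains - 1))
print("node degrees:", degree)
print(f"network density: {density:.2f}")
```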

  15. A Viola-Jones based hybrid face detection framework

    Science.gov (United States)

    Murphy, Thomas M.; Broussard, Randy; Schultz, Robert; Rakvic, Ryan; Ngo, Hau

    2013-12-01

    Improvements in face detection performance would benefit many applications. The OpenCV library implements a standard solution, the Viola-Jones detector, with a statistically boosted rejection cascade of binary classifiers. Empirical evidence has shown that Viola-Jones underdetects in some instances. This research shows that a truncated cascade augmented by a neural network could recover these undetected faces. A hybrid framework is constructed, with a truncated Viola-Jones cascade followed by an artificial neural network, used to refine the face decision. Optimally, a truncation stage that captured all faces and allowed the neural network to remove the false alarms is selected. A feedforward backpropagation network with one hidden layer is trained to discriminate faces based upon the thresholding (detection) values of intermediate stages of the full rejection cascade. A clustering algorithm is used as a precursor to the neural network, to group significant overlappings. Evaluated on the CMU/VASC Image Database, comparison with an unmodified OpenCV approach shows: (1) a 37% increase in detection rates if constrained by the requirement of no increase in false alarms, (2) a 48% increase in detection rates if some additional false alarms are tolerated, and (3) an 82% reduction in false alarms with no reduction in detection rates. These results demonstrate improved face detection and could address the need for such improvement in various applications.

  16. Engineering general intelligence

    CERN Document Server

    Goertzel, Ben; Geisweiller, Nil

    2014-01-01

    The work outlines a novel conceptual and theoretical framework for understanding Artificial General Intelligence and, based on this framework, outlines a practical roadmap for the development of AGI with capability at the human level and ultimately beyond.

  17. Geotube: a network-based framework for Geoscience dissemination

    Science.gov (United States)

    Grieco, Giovanni; Porta, Marina; Merlini, Anna Elisabetta; Caironi, Valeria; Reggiori, Donatella

    2016-04-01

    Geotube is a project promoted by the Il Geco cultural association for the dissemination of Geoscience education in schools through open multimedia environments. The approach is based on the following keystones: • a deep and permanent epistemological reflection supported by confrontation within the international scientific community; • a close link with the territory; • a local-to-global inductive approach to basic concepts in Geosciences; • the construction of an open framework to stimulate creativity. The project has been developed as an educational activity for secondary schools (students aged 11 to 18). It provides for the creation of a network of institutions to be involved in order to ensure the required diversified expertise. They can comprise: universities, natural parks, mountain communities, municipalities, schools, private companies working in the sector, and so on. A single project lasts for one school year (October to June) and requires 8-12 work hours at school, one or two half-day or full-day excursions and a final event of presentation of outputs. The possible outputs comprise a pdf or ppt guidebook, a script and a video completely shot and edited by the students. The framework is open in order to adapt to the needs of the single class or workgroup, the level and type of school, the time available and different subjects in Geosciences. In the last two years the two parts of the project have been successfully tested separately, while the full project will be presented at schools in its full form in April 2016, in collaboration with the University of Milan, Campo dei Fiori Natural Park, Piambello Mountain Community and Cunardo Municipality. The production of Geotube outputs has been tested in a high school for three consecutive years. Students produced scripts and videos on the following subjects: geologic hazards, volcanoes and earthquakes, and climate change. The excursions have been tested with two different high schools. Firstly two areas have been

  18. A model-based framework for design of intensified enzyme-based processes

    DEFF Research Database (Denmark)

    Román-Martinez, Alicia

    This thesis presents a generic and systematic model-based framework to design intensified enzyme-based processes. The development of the presented methodology was motivated by the needs of the bio-based industry for a more systematic approach to achieve intensification in its production plants ... in enzyme-based processes which have found significant application in the pharmaceutical, food, and renewable fuels sector. The framework uses model-based strategies for (bio)-chemical process design and optimization, including the use of a superstructure to generate all potential reaction...(s)-separation(s) options according to desired performance criteria and a generic mathematical model represented by the superstructure to derive the specific models corresponding to a specific process option. In principle, three methods of bioprocess intensification are considered in this thesis: 1. enzymatic one

  19. A Function-Based Framework for Stream Assessment & Restoration Projects

    Science.gov (United States)

    This report lays out a framework for approaching stream assessment and restoration projects that focuses on understanding the suite of stream functions at a site in the context of what is happening in the watershed.

  20. A Framework for Developing a Knowledge Base for Indigenous ...

    African Journals Online (AJOL)

    ESARBICA Journal: Journal of the Eastern and Southern Africa Regional Branch of the International Council on Archives. ... framework information scientists can document, preserve and disseminate indigenous ...

  1. Sensor Based Framework for Secure Multimedia Communication in VANET

    Science.gov (United States)

    Rahim, Aneel; Khan, Zeeshan Shafi; Bin Muhaya, Fahad T.; Sher, Muhammad; Kim, Tai-Hoon

    2010-01-01

    Secure multimedia communication enhances the safety of passengers by providing visual pictures of accidents and danger situations. In this paper we propose a framework for secure multimedia communication in Vehicular Ad-Hoc Networks (VANETs). Our proposed framework is mainly divided into four components: redundant information, priority assignment, malicious data verification and malicious node verification. The proposed scheme has been validated with the help of the NS-2 network simulator and the Evalvid tool. PMID:22163462

  2. Sensor Based Framework for Secure Multimedia Communication in VANET

    Directory of Open Access Journals (Sweden)

    Tai-Hoon Kim

    2010-11-01

    Full Text Available Secure multimedia communication enhances the safety of passengers by providing visual pictures of accidents and danger situations. In this paper we propose a framework for secure multimedia communication in Vehicular Ad-Hoc Networks (VANETs). Our proposed framework is mainly divided into four components: redundant information, priority assignment, malicious data verification and malicious node verification. The proposed scheme has been validated with the help of the NS-2 network simulator and the Evalvid tool.

  3. Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search.

    Science.gov (United States)

    Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya

    2018-06-17

    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
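    A minimal sketch of the Sharma-Mittal family in its commonly used two-parameter form (order r, degree t) is given below. The special-case handling follows the standard limits; this is textbook material, not code from the paper.

```python
# Sketch of the two-parameter Sharma-Mittal entropy (standard form, assumed here):
# SM_{r,t}(p) = [ (sum_i p_i^r)^((1-t)/(1-r)) - 1 ] / (1 - t),
# with Renyi (t -> 1), Tsallis (t = r), and Shannon (r, t -> 1) as limiting cases.
import numpy as np

def sharma_mittal(p, r, t):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(r, 1.0) and np.isclose(t, 1.0):          # Shannon limit
        return -np.sum(p * np.log(p))
    if np.isclose(t, 1.0):                                 # Renyi limit
        return np.log(np.sum(p ** r)) / (1.0 - r)
    if np.isclose(r, 1.0):                                 # r -> 1 limit
        return (np.exp((t - 1.0) * np.sum(p * np.log(p))) - 1.0) / (1.0 - t)
    return ((np.sum(p ** r)) ** ((1.0 - t) / (1.0 - r)) - 1.0) / (1.0 - t)

p = [0.7, 0.2, 0.1]
print(sharma_mittal(p, r=1.0, t=1.0))   # Shannon entropy
print(sharma_mittal(p, r=2.0, t=1.0))   # Renyi entropy of order 2
print(sharma_mittal(p, r=2.0, t=2.0))   # Tsallis entropy of order 2 (t = r)
```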

  4. An automated and integrated framework for dust storm detection based on ogc web processing services

    Science.gov (United States)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with the scientific computation advancement, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data together to enable the dust storm detection and tracking processes in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Services (WPS) initiated by Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, dust storm detecting and tracking component, and service chain orchestration engine. The EO data processing component is implemented based on OPeNDAP standard. The dust storm detecting and tracking component combines three earth scientific models, which are SBDART model (for computing aerosol optical depth (AOT) of dust particles), WRF model (for simulating meteorological parameters) and HYSPLIT model (for simulating the dust storm transport processes). The service chain orchestration engine is implemented based on Business Process Execution Language for Web Service (BPEL4WS) using open-source software. The output results, including horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 Apr 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data

  5. Bure's underground research laboratory: general framework, objectives, siting process and schedule of the URL project

    International Nuclear Information System (INIS)

    Gaussen, J.L.

    2001-01-01

    Bure URL project is one of the components of the French research program dedicated to the study of HLLLW (High Level Long Lived Radioactive Waste) disposal in geologic repository within the framework of the 1991 Radioactive Waste Act. Pursuant to the said act, the objective of the URL project is to participate in the "evaluation of options for retrievable or non-retrievable disposal in deep geologic formations". More precisely, the goal of this URL, which is situated 300 km East of Paris, is to gain a better knowledge of a site capable of hosting a geologic repository. (author)

  6. Radical covalent organic frameworks: a general strategy to immobilize open-accessible polyradicals for high-performance capacitive energy storage.

    Science.gov (United States)

    Xu, Fei; Xu, Hong; Chen, Xiong; Wu, Dingcai; Wu, Yang; Liu, Hao; Gu, Cheng; Fu, Ruowen; Jiang, Donglin

    2015-06-01

    Ordered π-columns and open nanochannels found in covalent organic frameworks (COFs) could render them able to store electric energy. However, the synthetic difficulty in achieving redox-active skeletons has thus far restricted their potential for energy storage. A general strategy is presented for converting a conventional COF into an outstanding platform for energy storage through post-synthetic functionalization with organic radicals. The radical frameworks with openly accessible polyradicals immobilized on the pore walls undergo rapid and reversible redox reactions, leading to capacitive energy storage with high capacitance, high-rate kinetics, and robust cycle stability. The results suggest that channel-wall functional engineering with redox-active species will be a facile and versatile strategy to explore COFs for energy storage. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. A web-based e-learning framework for public perception and acceptance on nuclear energy

    International Nuclear Information System (INIS)

    Zhou Yangping; Yoshikawa, Hidekazu; Liu Jingquan; Ouyang, Jun; Lu Daogang

    2005-01-01

    Public acceptance now plays a central role in nuclear energy. Public concerns about the safety and sustainability of nuclear energy have brought nuclear power in many countries and territories to a standstill or even a decline. In this study, an Internet-based e-learning framework is proposed for public education in order to improve public perception of nuclear energy, which in turn affects public acceptance of it. This study aims at investigating public perception and acceptance of nuclear energy in a continuous and accurate manner. In addition, this e-learning framework can promote public understanding of nuclear energy by using teaching material with a graphical hierarchy of knowledge about nuclear energy. This web-based e-learning framework mainly consists of two components: (1) an e-learning support module, which continuously investigates public perception and acceptance of nuclear energy and teaches the public about nuclear energy; and (2) an updating module, which may improve the educational materials by analyzing the effect of the education or by approving the materials submitted by visitors through Wiki pages. Advantages and future work of this study are also generally described. (author)

  8. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study revealed the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine the hidden and valuable knowledge from water quality historical time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
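    A highly simplified sketch of the granulate-then-compare pipeline is shown below. The paper granulates the series into two-dimensional normal clouds; here each window is summarized only by its mean and standard deviation, and the data are synthetic, purely to illustrate the flow from granulation to similarity matrix to anomaly detection.

```python
# Simplified illustration (not the paper's cloud model): granulate a weekly DO
# series into windows, summarize each window, build a similarity matrix, and
# flag the windows least similar to all others as anomalies.
import numpy as np

rng = np.random.default_rng(7)
do_series = 8 + np.sin(np.arange(260) * 2 * np.pi / 52) + rng.normal(0, 0.3, 260)
do_series[130:134] -= 4.0                       # injected pollution-like anomaly

window = 4                                      # granulate into 4-week windows
segments = do_series[: len(do_series) // window * window].reshape(-1, window)
digests = np.column_stack([segments.mean(axis=1), segments.std(axis=1)])

# similarity matrix between granules (Gaussian kernel on digest distance)
d = np.linalg.norm(digests[:, None, :] - digests[None, :, :], axis=-1)
sim = np.exp(-(d ** 2) / (2 * d.std() ** 2))

# anomaly detection: granules with the lowest average similarity to all others
score = sim.mean(axis=1)
print("most anomalous windows:", np.argsort(score)[:3])
```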

  9. A generalized framework for assessment of safety margins in nuclear power plants

    International Nuclear Information System (INIS)

    Gavrilas, M.; Youngblood, B.; Prelewicz, D.; Meyer, Jim

    2004-01-01

    The protection of public health and safety, and the environment from inadvertent releases of radioactive materials from nuclear power plants relies on the implementation of the defense-in-depth strategy. The term defense-in-depth evolved historically, and thus its application has not always been uniform. The use of the term in the context of the U.S. Nuclear Regulatory Commission (NRC) safety philosophy entails the reliance of a nuclear facility on successive compensatory measures in preventing accidents or mitigating damage caused by malfunctions, accidents, or naturally occurring events. The introduction of probabilistic risk analyses with NUREG-74/014 and subsequent evolution in risk assessment techniques are leading to the implementation of risk informed regulation to ensure the safety of the public and the environment. Risk informed regulation minimizes the likelihood of overlooking potentially significant accident sequences while limiting unnecessary burdens imposed on licensees. The proposed framework merges fundamental elements of safety regulation: defense-in-depth, safety margins and probabilistic risk. It formalizes the relationship between probabilistic risk assessment (PRA) methods and data, and deterministic analyses in a manner consistent with NRC's defense-in-depth philosophy. Succinctly put, the likelihood and consequences of accident scenarios are considered simultaneously and quantified by a plant safety metric. The integration of these fundamental elements into a practically applicable safety framework is consistent with the NRC policy statement on use of probabilistic risk assessment methods and the November 2002 Regulatory Guide on risk informed decisions on plant-specific changes to the licensing basis. Safety information resulting from the application of the framework supersedes traditional safety figures of merit. Safety quantifiers, referred to herein as safety indices, expand on the qualifier outcomes that currently accompany fault tree

  10. A General Simulator for Acid-Base Titrations

    Science.gov (United States)

    de Levie, Robert

    1999-07-01

    General formal expressions are provided to facilitate the automatic computer calculation of acid-base titration curves of arbitrary mixtures of acids, bases, and salts, without and with activity corrections based on the Davies equation. Explicit relations are also given for the buffer strength of mixtures of acids, bases, and salts.
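    As a minimal illustration of such a simulator (a toy implementation, not de Levie's published expressions, and without activity corrections), the titration curve of a monoprotic weak acid with a strong base can be computed by solving the charge-balance equation for [H+] at each titrant volume.

```python
# Toy titration simulator (assumed simple case: monoprotic weak acid + strong
# base, no activity corrections): solve the charge balance for [H+].
import numpy as np
from scipy.optimize import brentq

Ka, Kw = 1.8e-5, 1e-14          # acetic acid and water equilibrium constants
Ca, Va = 0.10, 50.0             # analyte: 0.10 M acid, 50 mL
Cb = 0.10                       # titrant: 0.10 M NaOH

def ph_at(Vb):
    ca = Ca * Va / (Va + Vb)    # analytical concentrations after dilution
    cb = Cb * Vb / (Va + Vb)
    def charge_balance(h):
        oh = Kw / h
        a_minus = ca * Ka / (Ka + h)       # dissociated acid fraction
        return h + cb - oh - a_minus       # [H+] + [Na+] = [OH-] + [A-]
    h = brentq(charge_balance, 1e-14, 1.0)
    return -np.log10(h)

for Vb in (0.0, 25.0, 49.0, 50.0, 51.0, 75.0):
    print(f"Vb = {Vb:5.1f} mL  ->  pH = {ph_at(Vb):.2f}")
```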

  11. Information-preserving structures: A general framework for quantum zero-error information

    International Nuclear Information System (INIS)

    Blume-Kohout, Robin; Ng, Hui Khoon; Poulin, David; Viola, Lorenza

    2010-01-01

    Quantum systems carry information. Quantum theory supports at least two distinct kinds of information (classical and quantum), and a variety of different ways to encode and preserve information in physical systems. A system's ability to carry information is constrained and defined by the noise in its dynamics. This paper introduces an operational framework, using information-preserving structures, to classify all the kinds of information that can be perfectly (i.e., with zero error) preserved by quantum dynamics. We prove that every perfectly preserved code has the same structure as a matrix algebra, and that preserved information can always be corrected. We also classify distinct operational criteria for preservation (e.g., 'noiseless','unitarily correctible', etc.) and introduce two natural criteria for measurement-stabilized and unconditionally preserved codes. Finally, for several of these operational criteria, we present efficient (polynomial in the state-space dimension) algorithms to find all of a channel's information-preserving structures.

  12. The interrogation decision-making model: A general theoretical framework for confessions.

    Science.gov (United States)

    Yang, Yueran; Guyll, Max; Madon, Stephanie

    2017-02-01

    This article presents a new model of confessions referred to as the interrogation decision-making model. This model provides a theoretical umbrella with which to understand and analyze suspects' decisions to deny or confess guilt in the context of a custodial interrogation. The model draws upon expected utility theory to propose a mathematical account of the psychological mechanisms that not only underlie suspects' decisions to deny or confess guilt at any specific point during an interrogation, but also how confession decisions can change over time. Findings from the extant literature pertaining to confessions are considered to demonstrate how the model offers a comprehensive and integrative framework for organizing a range of effects within a limited set of model parameters. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
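    The expected-utility core of such a model can be illustrated with a toy calculation; the outcome probabilities and utilities below are invented for illustration only and are not the parameters of the published model.

```python
# Toy expected-utility comparison in the spirit of the model described above.
def expected_utility(probabilities, utilities):
    return sum(p * u for p, u in zip(probabilities, utilities))

# Suspect's subjective view at one point in the interrogation (invented numbers):
# outcomes if denying: interrogation continues vs. being released
eu_deny = expected_utility([0.7, 0.3], [-20, 0])
# outcomes if confessing: perceived leniency vs. conviction
eu_confess = expected_utility([0.5, 0.5], [-10, -40])

print("deny:", eu_deny, " confess:", eu_confess)
print("predicted choice:", "confess" if eu_confess > eu_deny else "deny")
```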

  13. Electromagnetic characterization of bianisotropic metasurfaces on refractive substrates: General theoretical framework

    Energy Technology Data Exchange (ETDEWEB)

    Albooyeh, M.; Tretyakov, Sergei [Department of Radio Science and Engineering, Aalto University, P.O. Box 13000, FI-00076, Aalto (Finland); Simovski, Constantin [Department of Radio Science and Engineering, Aalto University, P.O. Box 13000, FI-00076, Aalto (Finland); Laboratory of Metamaterials, University for Information Technology, Mechanics and Optics (ITMO), 197101, St. Petersburg (Russian Federation)

    2016-10-15

    We present a general methodology for electromagnetic homogenization and characterization of bianisotropic metasurfaces formed by regular or random arrangements of small arbitrary inclusions at interfaces of two different isotropic media. The approach unites and generalizes the earlier theories developed independently by two joint research groups: that of profs. Holloway and Kuester and that of profs. Simovski and Tretyakov. We analyze the features of both formalisms and discuss their peculiarities in several example cases. Our theory can be used in the analysis and synthesis of a wide spectrum of metasurfaces. (copyright 2016 by WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  14. Formal framework for a nonlocal generalization of Einstein's theory of gravitation

    International Nuclear Information System (INIS)

    Hehl, Friedrich W.; Mashhoon, Bahram

    2009-01-01

    The analogy between electrodynamics and the translational gauge theory of gravity is employed in this paper to develop an ansatz for a nonlocal generalization of Einstein's theory of gravitation. Working in the linear approximation, we show that the resulting nonlocal theory is equivalent to general relativity with 'dark matter'. The nature of the predicted dark matter, which is the manifestation of the nonlocal character of gravity in our model, is briefly discussed. It is demonstrated that this approach can provide a basis for the Tohline-Kuhn treatment of the astrophysical evidence for dark matter.

  15. The control software framework of the web base

    International Nuclear Information System (INIS)

    Nakatani, Takeshi; Inamura, Yasuhiro; Ito, Takayoshi; Otomo, Toshiya

    2015-01-01

    Web browsers are one of the most platform-independent user interfaces. In particular, web pages created using responsive web design (RWD) are available for use on desktop and laptop computers, as well as tablet terminals and smart phones. We developed a common software framework, IROHA, for the instrument control system in the Materials and Life Science Experimental Facility at the Japan Proton Accelerator Research Complex to build a flexible and scalable system by adopting XML/HTTP. However, its user interface was platform-dependent, and we wanted it to be more user-friendly. In 2013, we developed the prototype of a new software framework, IROHA2, comprising several device control servers and an instrument management server, retaining the flexibility and scalability of IROHA. We also adopted the Bootstrap framework to create an RWD user interface for these servers. (author)

  16. The Gaia Framework: Version Support In Web Based Open Hypermedia

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kejser, Thomas

    2004-01-01

    The GAIA framework prototype, described herein, explores the possibilities and problems that arise when combining versioning and open hypermedia paradigms. It will be argued that, by adding versioning as a separate service in the hypermedia architecture, it is possible to build consistent ... versioning field, and GAIA is compared with previous attempts at defining hypermedia versioning frameworks. GAIA is capable of multi-level versioning and versioning of structures and supports freezing mechanisms for both documents and hyperstructure. The experiences from GAIA provide an input to new reference...

  17. The Gaia Framework: Version Support In Web Based Open Hypermedia

    DEFF Research Database (Denmark)

    Kejser, Thomas; Grønbæk, Kaj

    2003-01-01

    The GAIA framework prototype, described herein, explores the possibilities and problems that arise when combining versioning and open hypermedia paradigms. It will be argued that, by adding versioning as a separate service in the hypermedia architecture, it is possible to build consistent ... versioning field, and GAIA is compared with previous attempts at defining hypermedia versioning frameworks. GAIA is capable of multi-level versioning and versioning of structures and supports freezing mechanisms for both documents and hyperstructure. The experiences from GAIA provide an input to new reference...

  18. Evaluating the implementation of a quality improvement process in General Practice using a realist evaluation framework.

    Science.gov (United States)

    Moule, Pam; Clompus, Susan; Fieldhouse, Jon; Ellis-Jones, Julie; Barker, Jacqueline

    2018-05-25

    Underuse of anticoagulants in atrial fibrillation is known to increase the risk of stroke and is an international problem. The National Institute for Health and Care Excellence guidance CG180 seeks to reduce atrial fibrillation-related strokes through prescriptions of Non-vitamin K antagonist Oral Anticoagulants. A quality improvement programme was established by the West of England Academic Health Science Network (West of England AHSN) to implement this guidance into General Practice. A realist evaluation identified whether the quality improvement programme worked, determining how and in what circumstances. Six General Practices in one region became the case study sites. Quality improvement team, doctor, and pharmacist meetings within each of the General Practices were recorded at three stages: initial planning, review, and final. Additionally, 15 interviews conducted with the practice leads explored experiences of the quality improvement process. Observation and interview data were analysed and compared against the initial programme theory. The quality improvement resources available were used variably, with the training being valued by all. The initial programme theories were refined. In particular, local workload pressures and individual General Practitioner experiences and pre-conceived ideas were acknowledged. Where key motivators were in place, such as prior experience, the programme achieved optimal outcomes and secured a lasting quality improvement legacy. The employment of a quality improvement programme can deliver practice change and improvement legacy outcomes when particular mechanisms are employed and in contexts where there is a commitment to improve service. © 2018 John Wiley & Sons, Ltd.

  19. Applying the Multilevel Framework of Discourse Comprehension to Evaluate the Text Characteristics of General Chemistry Textbooks

    Science.gov (United States)

    Pyburn, Daniel T.; Pazicni, Samuel

    2014-01-01

    Prior chemistry education research has demonstrated a relationship between student reading skill and general chemistry course performance. In addition to student characteristics, however, the qualities of the learning materials with which students interact also impact student learning. For example, low-knowledge students benefit from texts that…

  20. The integration of DVH-based planning aspects into a convex intensity modulated radiation therapy optimization framework

    Energy Technology Data Exchange (ETDEWEB)

    Kratt, Karin [Faculty of Mathematics, Technical University of Kaiserslautern, Kaiserslautern (Germany); Scherrer, Alexander [Department of Optimization, Fraunhofer Institute for Industrial Mathematics (ITWM), Kaiserslautern (Germany)], E-mail: alexander.scherrer@itwm.fraunhofer.de

    2009-06-21

    The formulation of intensity modulated radiation therapy (IMRT) planning aspects frequently uses the dose-volume histogram (DVH), whereas plan computations often happen in the more desirable convex IMRT optimization framework. Inspired by a recent publication of Zinchenko et al (2008 Phys. Med. Biol. 53 3231-50), this work addresses the integration of DVH-based planning aspects into this framework from a general point of view. It first provides the basic mathematical requirements on the evaluation functions in order to support such an incorporation. Then it introduces the condition number as a description for how precisely DVH-based planning aspects can be reformulated in terms of evaluation functions. Exemplary numerical studies for the generalized equivalent uniform dose and a physical constraint function show the influence of function parameter values and DVH approximation on the condition number. The work concludes by formulating the aspects that should be taken into account for an appropriate integration of DVH-based planning aspects. (note)
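    For reference, the generalized equivalent uniform dose mentioned above is a one-line power mean over the voxel doses, gEUD_a(d) = (mean_i d_i^a)^(1/a). The sketch below uses made-up voxel doses and is only meant to show how the parameter a shifts the emphasis between hot and cold spots.

```python
# Sketch of the generalized equivalent uniform dose (gEUD) evaluation function:
# gEUD_a(d) = (mean_i d_i^a)^(1/a) for a voxel dose vector d (doses invented).
import numpy as np

def geud(dose, a):
    dose = np.asarray(dose, dtype=float)
    return (np.mean(dose ** a)) ** (1.0 / a)

doses = np.array([60.2, 59.8, 61.0, 45.3, 58.7, 60.5])   # Gy, hypothetical voxels
print(geud(doses, a=1))    # a = 1: mean dose
print(geud(doses, a=10))   # large positive a: emphasizes hot spots (OAR-like)
print(geud(doses, a=-10))  # large negative a: emphasizes cold spots (target-like)
```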

  1. The integration of DVH-based planning aspects into a convex intensity modulated radiation therapy optimization framework

    International Nuclear Information System (INIS)

    Kratt, Karin; Scherrer, Alexander

    2009-01-01

    The formulation of intensity modulated radiation therapy (IMRT) planning aspects frequently uses the dose-volume histogram (DVH), whereas plan computations often happen in the more desirable convex IMRT optimization framework. Inspired by a recent publication of Zinchenko et al (2008 Phys. Med. Biol. 53 3231-50), this work addresses the integration of DVH-based planning aspects into this framework from a general point of view. It first provides the basic mathematical requirements on the evaluation functions in order to support such an incorporation. Then it introduces the condition number as a description for how precisely DVH-based planning aspects can be reformulated in terms of evaluation functions. Exemplary numerical studies for the generalized equivalent uniform dose and a physical constraint function show the influence of function parameter values and DVH approximation on the condition number. The work concludes by formulating the aspects that should be taken into account for an appropriate integration of DVH-based planning aspects. (note)

  2. Coulomb excitation of rotational states in the 162Dy nucleus in the framework of the generalized semiclassical approximation

    International Nuclear Information System (INIS)

    Bolotin, Yu.L.; Gonchar, V.Yu.; Chekanov, N.A.

    1985-01-01

    Coulomb excitation of rotational states induced in heavy-ion collisions is treated in the framework of the generalized semiclassical approximation. The Hamiltonian of the system under consideration involves not only Coulomb forces (monopole, quadrupole, and hexadecapole) but also a real nuclear potential in the form of the deformed Woods-Saxon potential. Strong dependence of the excitation probability on the interference between the Coulomb and nuclear interactions is shown. Calculations are carried out for the reaction 40Ar + 162Dy at E=148.6 MeV. The calculated Coulomb excitation probabilities agree satisfactorily with the corresponding experimental values.

  3. Decay constants of pseudoscalar mesons in Bethe–Salpeter framework with generalized structure of hadron-quark vertex

    International Nuclear Information System (INIS)

    Bhatnagar, Shashank; Li, Shiyuan

    2009-01-01

    We employ the framework of Bethe–Salpeter equation under Covariant Instantaneous Ansatz to study the leptonic decays of pseudoscalar mesons. The Dirac structure of hadron-quark vertex function Γ is generalized to include various Dirac covariants besides γ5 from their complete set. The covariants are incorporated in accordance with a power counting rule, order-by-order in powers of the inverse of the meson mass. The decay constants are calculated with the incorporation of leading order covariants. Most of the results are dramatically improved. (author)

  4. Tatool: a Java-based open-source programming framework for psychological studies.

    Science.gov (United States)

    von Bastian, Claudia C; Locher, André; Ruflin, Michael

    2013-03-01

    Tatool (Training and Testing Tool) was developed to assist researchers with programming training software, experiments, and questionnaires. Tatool is Java-based, and thus is a platform-independent and object-oriented framework. The architecture was designed to meet the requirements of experimental designs and provides a large number of predefined functions that are useful in psychological studies. Tatool comprises features crucial for training studies (e.g., configurable training schedules, adaptive training algorithms, and individual training statistics) and allows for running studies online via Java Web Start. The accompanying "Tatool Online" platform provides the possibility to manage studies and participants' data easily with a Web-based interface. Tatool is published open source under the GNU Lesser General Public License, and is available at www.tatool.ch.

  5. Hierarchical Scheduling Framework Based on Compositional Analysis Using Uppaal

    DEFF Research Database (Denmark)

    Boudjadar, Jalil; David, Alexandre; Kim, Jin Hyun

    2014-01-01

    This paper introduces a reconfigurable compositional scheduling framework, in which the hierarchical structure, the scheduling policies, the concrete task behavior and the shared resources can all be reconfigured. The behavior of each periodic preemptive task is given as a list of timed actions, ...

  6. A Graph Based Framework to Model Virus Integration Sites

    Directory of Open Access Journals (Sweden)

    Raffaele Fronza

    2016-01-01

    Here, we addressed the challenge to: (1) define the notion of CIS on graph models, (2) demonstrate that the structure of CIS enters in the category of scale-free networks and (3) show that our network approach analyzes CIS dynamically in an integrated systems biology framework using the Retroviral Transposon Tagged Cancer Gene Database (RTCGD) as a testing dataset.

  7. A Qualitative Simulation Framework in Smalltalk Based on Fuzzy Arithmetic

    Science.gov (United States)

    Richard L. Olson; Daniel L. Schmoldt; David L. Peterson

    1996-01-01

    For many systems, it is not practical to collect and correlate empirical data necessary to formulate a mathematical model. However, it is often sufficient to predict qualitative dynamics effects (as opposed to system quantities), especially for research purposes. In this effort, an object-oriented application framework (AF) was developed for the qualitative modeling of...

  8. An ontology-based collaborative service framework for agricultural information

    Science.gov (United States)

    In recent years, China has developed modern agriculture energetically. An effective information framework is an important way to provide farmers with agricultural information services and to improve their production technology and income. The mountain areas in central China are dominated by agri...

  9. Activity Performance Management Framework Based on Outcome Based Budgeting Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Aisya Raihan Abdul Kadir; Mohd Azmi Sidid Omar; Noriah Jamal

    2015-01-01

    The implementation of Outcome Based Budgeting (OBB) in the planning and implementation of national development and public spending will emphasize the impact and effectiveness of programmes and activities, in line with the policies and objectives of the four pillars of the National Transformation programme: 1Malaysia (People First, Performance Now), the Government Transformation Programme (GTP), the Economic Transformation Programme (ETP) and the Malaysia Five-Year Development Plan. Effective implementation of OBB at the ministry level is carried out by the Ministry OBB Implementation Committee (OIC) and the Programme Performance Management Committee (PPMC); at the agency level it is carried out by the Activity Performance Management Committee (APMC). OBB involves a strategic implementation cycle consisting of four main processes, namely outcome-based planning, budgeting, monitoring and evaluation, and performance reporting. OBB will be fully implemented in 2016 to replace the Modified Budgeting System (MBS). The Activity Performance Management Framework (APMF), which is based on outcome-based planning, has been developed using methodologies such as the ProLL Model (Logic and Linkages Programme), Problem Tree Analysis (PTA), the top-down approach, the SMART principle, the framework approach and rigour testing. By applying these methodologies, an Activity Performance Management Framework (APMF) has been produced which consists of 3 outputs, 6 output KPIs, 3 outcomes and 8 outcome KPIs, in line with the direction and outcomes at the programme and ministry levels. The APMF is planned at the beginning of each year and performance is reported on a quarterly basis through the My Results application. (author)

  10. Unification of Electromagnetism and Gravitation in the Framework of General Geometry

    OpenAIRE

    Shahverdiyev, Shervgi

    2005-01-01

    A new geometry, called General geometry, is constructed. It is proven that its simplest special case is the geometry underlying Electromagnetism. Another special case is Riemannian geometry. The action for the electromagnetic field and the Maxwell equations are derived from the curvature function of the geometry underlying Electromagnetism. It is shown that the equation of motion for a particle interacting with the electromagnetic field coincides exactly with the equation for geodesics of the geometry underlying Electromag...

  11. Process mapping as a framework for performance improvement in emergency general surgery.

    Science.gov (United States)

    DeGirolamo, Kristin; D'Souza, Karan; Hall, William; Joos, Emilie; Garraway, Naisan; Sing, Chad Kim; McLaughlin, Patrick; Hameed, Morad

    2018-02-01

    Emergency general surgery conditions are often thought of as being too acute for the development of standardized approaches to quality improvement. However, process mapping, a concept that has been applied extensively in manufacturing quality improvement, is now being used in health care. The objective of this study was to create process maps for small bowel obstruction in an effort to identify potential areas for quality improvement. We used the American College of Surgeons Emergency General Surgery Quality Improvement Program pilot database to identify patients who received nonoperative or operative management of small bowel obstruction between March 2015 and March 2016. This database, patient charts and electronic health records were used to create process maps from the time of presentation to discharge. Eighty-eight patients with small bowel obstruction (33 operative; 55 nonoperative) were identified. Patients who received surgery had a complication rate of 32%. The processes of care from the time of presentation to the time of follow-up were highly elaborate and variable in terms of duration; however, the sequences of care were found to be consistent. We used data visualization strategies to identify bottlenecks in care, and they showed substantial variability in terms of operating room access. Variability in the operative care of small bowel obstruction is high and represents an important improvement opportunity in general surgery. Process mapping can identify common themes, even in acute care, and suggest specific performance improvement measures.

  12. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures is considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation, are presented....... The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding...

  13. STICK: Spike Time Interval Computational Kernel, a Framework for General Purpose Computation Using Neurons, Precise Timing, Delays, and Synchrony.

    Science.gov (United States)

    Lagorce, Xavier; Benosman, Ryad

    2015-11-01

    There has been significant research over the past two decades in developing new platforms for spiking neural computation. Current neural computers are primarily developed to mimic biology. They use neural networks, which can be trained to perform specific tasks to mainly solve pattern recognition problems. These machines can do more than simulate biology; they allow us to rethink our current paradigm of computation. The ultimate goal is to develop brain-inspired general purpose computation architectures that can breach the current bottleneck introduced by the von Neumann architecture. This work proposes a new framework for such a machine. We show that the use of neuron-like units with precise timing representation, synaptic diversity, and temporal delays allows us to set a complete, scalable compact computation framework. The framework provides both linear and nonlinear operations, allowing us to represent and solve any function. We show usability in solving real use cases from simple differential equations to sets of nonlinear differential equations leading to chaotic attractors.
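
    As a rough illustration of the interval-coding idea described above (not the authors' STICK implementation; the time constants below are invented), a value can be represented by the spacing between two spikes, and simple arithmetic can be carried out by composing delays:

```python
# Minimal sketch: a scalar value is encoded as the interval between two spike
# times, and two values are "added" by chaining their coding delays.

T_MIN, T_COD = 10.0, 100.0   # assumed time constants (ms): offset and coding window

def encode(x):
    """Encode x in [0, 1] as an inter-spike interval (ms)."""
    return T_MIN + x * T_COD

def decode(interval):
    """Recover the value from an inter-spike interval."""
    return (interval - T_MIN) / T_COD

def add(x, y):
    """Add two interval-coded values by accumulating the coding parts of the delays."""
    ix, iy = encode(x), encode(y)
    total = (ix - T_MIN) + (iy - T_MIN) + T_MIN   # chain the two coding delays
    return decode(total)

print(add(0.25, 0.5))   # ~0.75, recovered purely from spike timing intervals
```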

  14. Equilibrium and nonequilibrium many-body perturbation theory: a unified framework based on the Martin-Schwinger hierarchy

    International Nuclear Information System (INIS)

    Van Leeuwen, Robert; Stefanucci, Gianluca

    2013-01-01

    We present a unified framework for equilibrium and nonequilibrium many-body perturbation theory. The most general nonequilibrium many-body theory valid for general initial states is based on a time-contour originally introduced by Konstantinov and Perel'. The various other well-known formalisms of Keldysh, Matsubara and the zero-temperature formalism are then derived as special cases that arise under different assumptions. We further present a single simple proof of Wick's theorem that is at the same time valid in all these flavors of many-body theory. It arises simply as a solution of the equations of the Martin-Schwinger hierarchy for the noninteracting many-particle Green's function with appropriate boundary conditions. We further discuss a generalized Wick theorem for general initial states on the Keldysh contour and derive how the formalisms based on the Keldysh and Konstantinov-Perel'-contours are related for the case of general initial states.
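
    For context, the lowest nontrivial instance of the Wick theorem discussed here is the standard textbook factorization of the noninteracting two-particle Green's function into one-particle propagators (upper sign for bosons, lower for fermions); the paper obtains this and its higher-order analogues from the Martin-Schwinger hierarchy with appropriate boundary conditions:

```latex
G_0(1,2;1',2') \;=\; G_0(1;1')\,G_0(2;2') \;\pm\; G_0(1;2')\,G_0(2;1')
```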

  15. Teachers implementing context-based teaching materials : a framework for case-analysis in chemistry

    NARCIS (Netherlands)

    Vos, M.A.J.; Taconis, R.; Jochems, W.M.G.; Pilot, A.

    2010-01-01

    We present a framework for analysing the interplay between context-based teaching material and teachers, and for evaluating the adequacy of the resulting implementation of context-based pedagogy in chemistry classroom practice. The development of the framework is described, including an account of

  16. ComTrustO: Composite Trust-Based Ontology Framework for Information and Decision Fusion

    Science.gov (United States)

    2015-07-06

    ComTrustO: Composite Trust-based Ontology Framework for Information and Decision Fusion. Alessandro Oltramari, Carnegie Mellon University, Pittsburgh. The report presents an ontology-based framework for information fusion, as a support system for human decision makers; in particular, it builds upon the concept of composite trust. A cited work [11] presents a methodological approach for ontology management allowing development of extensible ontologies and the mapping from ontologies to...

  17. A Physics-Based Modeling Framework for Prognostic Studies

    Science.gov (United States)

    Kulkarni, Chetan S.

    2014-01-01

    Prognostics and Health Management (PHM) methodologies have emerged as one of the key enablers for achieving efficient system level maintenance as part of a busy operations schedule, and lowering overall life cycle costs. PHM is also emerging as a high-priority issue in critical applications, where the focus is on conducting fundamental research in the field of integrated systems health management. The term diagnostics relates to the ability to detect and isolate faults or failures in a system. Prognostics, on the other hand, is the process of predicting health condition and remaining useful life based on current state, previous conditions and future operating conditions. PHM methods combine sensing, data collection, and interpretation of environmental, operational, and performance related parameters to indicate a system's health under its actual application conditions. The development of prognostics methodologies for the electronics field has become more important as more electrical systems are being used to replace traditional systems in several applications in the aeronautics, maritime, and automotive fields. The development of prognostics methods for electronics presents several challenges due to the great variety of components used in a system, a continuous development of new electronics technologies, and a general lack of understanding of how electronics fail. Similarly, with electric unmanned aerial vehicles, electric-hybrid cars, and commercial passenger aircraft, we are witnessing a drastic increase in the usage of batteries to power vehicles. However, for battery-powered vehicles to operate at maximum efficiency and reliability, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. We develop an electrochemistry-based model of Li-ion batteries that captures the significant electrochemical processes, is computationally efficient, captures the effects of aging, and is of suitable

  18. Decision support system based on DPSIR framework for a low flow Mediterranean river basin

    Science.gov (United States)

    Bangash, Rubab Fatima; Kumar, Vikas; Schuhmacher, Marta

    2013-04-01

    The application of decision-making practices is effectively enhanced by adopting a procedural approach setting out a general methodological framework within which specific methods, models and tools can be integrated. Integrated Catchment Management is a process that recognizes the river catchment as a basic organizing unit for understanding and managing ecosystem processes. A decision support system becomes more complex when considering unavoidable human activities within a catchment that are motivated by multiple and often competing criteria and/or constraints. DPSIR is a causal framework for describing the interactions between society and the environment. This framework has been adopted by the European Environment Agency, and the components of this model are: Driving forces, Pressures, States, Impacts and Responses. The proposed decision support system is a two-step framework based on DPSIR. Considering the first three components of DPSIR (Driving forces, Pressures and States), hydrological and ecosystem services models are developed. The last two components, Impacts and Responses, helped to develop a Bayesian Network to integrate the models. This decision support system also takes account of social, economic and environmental aspects. A small river of Catalonia (northeastern Spain), the Francolí River, with a low flow (~2 m3/s), is selected for the integration of catchment assessment models and to improve knowledge transfer from research to the stakeholders, with a view to improving the decision-making process. DHI's MIKE BASIN software is used to evaluate the low-flow Francolí River with respect to the water bodies' characteristics and also to assess the impact of human activities aiming to achieve good water status for all waters to comply with the WFD's River Basin Management Plan. Based on ArcGIS, MIKE BASIN is a versatile decision support tool that provides a simple and powerful framework for managers and stakeholders to address multisectoral allocation and environmental issues in river

  19. Matroidal Structure of Generalized Rough Sets Based on Tolerance Relations

    Directory of Open Access Journals (Sweden)

    Hui Li

    2014-01-01

    of the generalized rough set based on the tolerance relation. The matroid can also induce a new relation. We investigate the connection between the original tolerance relation and the induced relation.
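
    A minimal sketch of the underlying rough-set notions (a toy universe and tolerance relation chosen for illustration, not taken from the paper): the tolerance class of an element plays the role of a granule, and lower/upper approximations are defined from inclusion in, or intersection with, the target set.

```python
# Lower and upper approximations of a set X under a tolerance (reflexive,
# symmetric) relation on a small universe.

U = {1, 2, 3, 4, 5}
pairs = {(1, 2), (2, 3), (4, 5)}                       # assumed tolerance pairs
R = {(a, a) for a in U} | pairs | {(b, a) for (a, b) in pairs}

def tolerance_class(x):
    """All elements related to x (the 'granule' of x)."""
    return {y for y in U if (x, y) in R}

def lower_approx(X):
    return {x for x in U if tolerance_class(x) <= X}

def upper_approx(X):
    return {x for x in U if tolerance_class(x) & X}

X = {1, 2, 3}
print(lower_approx(X))  # elements whose whole tolerance class lies inside X
print(upper_approx(X))  # elements whose tolerance class meets X
```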

  20. A Reconstruction of Development of the Periodic Table Based on History and Philosophy of Science and Its Implications for General Chemistry Textbooks

    Science.gov (United States)

    Brito, Angmary; Rodriguez, Maria A.; Niaz, Mansoor

    2005-01-01

    The objectives of this study are: (a) elaboration of a history and philosophy of science (HPS) framework based on a reconstruction of the development of the periodic table; (b) formulation of seven criteria based on the framework; and (c) evaluation of 57 freshman college-level general chemistry textbooks with respect to the presentation of the…

  1. VIP - A Framework-Based Approach to Robot Vision

    Directory of Open Access Journals (Sweden)

    Gerd Mayer

    2008-11-01

    Full Text Available For robot perception, video cameras are very valuable sensors, but the computer vision methods applied to extract information from camera images are usually computationally expensive. Integrating computer vision methods into a robot control architecture requires balancing the exploitation of camera images with the need to preserve reactivity and robustness. We claim that better software support is needed in order to facilitate and simplify the application of computer vision and image processing methods on autonomous mobile robots. In particular, such support must address a simplified specification of image processing architectures, control and synchronization issues of image processing steps, and the integration of the image processing machinery into the overall robot control architecture. This paper introduces the video image processing (VIP) framework, a software framework for multithreaded control flow modeling in robot vision.

  2. VIP - A Framework-Based Approach to Robot Vision

    Directory of Open Access Journals (Sweden)

    Hans Utz

    2006-03-01

    Full Text Available For robot perception, video cameras are very valuable sensors, but the computer vision methods applied to extract information from camera images are usually computationally expensive. Integrating computer vision methods into a robot control architecture requires balancing the exploitation of camera images with the need to preserve reactivity and robustness. We claim that better software support is needed in order to facilitate and simplify the application of computer vision and image processing methods on autonomous mobile robots. In particular, such support must address a simplified specification of image processing architectures, control and synchronization issues of image processing steps, and the integration of the image processing machinery into the overall robot control architecture. This paper introduces the video image processing (VIP) framework, a software framework for multithreaded control flow modeling in robot vision.

  3. VoVis: A Vocabulary-based Web Visualization Framework

    OpenAIRE

    Sharma, Swati

    2015-01-01

    The amount of data in today's digital world is growing day by day. Visualization is considered the best form of communication because the human brain perceives it much faster than text data comprising thousands of words. Because of the enormous and continuously increasing amount of data, the demand to visualize it is also increasing. Information visualization is a wide research area that covers a broad range of data fields. There are various visualization frameworks, tools and technologi...

  4. A mathematical framework for agent based models of complex biological networks.

    Science.gov (United States)

    Hinkelmann, Franziska; Murrugarra, David; Jarrah, Abdul Salam; Laubenbacher, Reinhard

    2011-07-01

    Agent-based modeling and simulation is a useful method to study biological phenomena in a wide range of fields, from molecular biology to ecology. Since there is currently no agreed-upon standard way to specify such models, it is not always easy to use published models. Also, since model descriptions are not usually given in mathematical terms, it is difficult to bring mathematical analysis tools to bear, so that models are typically studied through simulation. In order to address this issue, Grimm et al. proposed a protocol for model specification, the so-called ODD protocol, which provides a standard way to describe models. This paper proposes an addition to the ODD protocol which allows the description of an agent-based model as a dynamical system, which provides access to computational and theoretical tools for its analysis. The mathematical framework is that of algebraic models, that is, time-discrete dynamical systems with algebraic structure. It is shown by way of several examples how this mathematical specification can help with model analysis. This mathematical framework can also accommodate other model types such as Boolean networks and the more general logical models, as well as Petri nets.
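
    To make the algebraic-model notion concrete, the following toy sketch (an assumed three-node example, not from the paper) writes a Boolean network as a polynomial dynamical system over F2 and iterates it until an attractor is reached.

```python
# A 3-node Boolean network as a polynomial dynamical system over F2:
# AND -> x*y, OR -> x + y + x*y, NOT -> 1 + x (all arithmetic mod 2).

def step(state):
    x1, x2, x3 = state
    f1 = x2 * x3 % 2                   # x1' = x2 AND x3
    f2 = (1 + x1) % 2                  # x2' = NOT x1
    f3 = (x1 + x2 + x1 * x2) % 2       # x3' = x1 OR x2
    return (f1, f2, f3)

# Iterate until the trajectory revisits a state, exposing the attractor.
state, seen = (1, 0, 1), []
while state not in seen:
    seen.append(state)
    state = step(state)
print(seen, "-> cycles back to", state)
```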

  5. Generic-distributed framework for cloud services marketplace based on unified ontology

    Directory of Open Access Journals (Sweden)

    Samer Hasan

    2017-11-01

    Full Text Available Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process, and to remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant and recessive attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors’ knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.

  6. Generic-distributed framework for cloud services marketplace based on unified ontology.

    Science.gov (United States)

    Hasan, Samer; Valli Kumari, V

    2017-11-01

    Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process, and to remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant and recessive attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
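
    The records above do not give the matching algorithms in detail; the following is a purely hypothetical sketch of attribute-based service matching in that spirit, with the split into hard "dominant" constraints and soft "recessive" preferences assumed for illustration only.

```python
# Hypothetical attribute-based matching over a toy service catalogue: services
# must satisfy every dominant attribute and are then ranked by how many
# recessive attributes they also satisfy.

services = [
    {"name": "S1", "region": "EU", "type": "storage", "price": 0.02, "sla": 99.9},
    {"name": "S2", "region": "US", "type": "storage", "price": 0.01, "sla": 99.5},
]

def match(request, dominant, recessive):
    """Keep services satisfying all dominant attributes; rank by recessive ones."""
    candidates = [s for s in services
                  if all(s[a] == request[a] for a in dominant)]
    def score(s):
        return sum(1.0 for a in recessive if s[a] == request[a])
    return sorted(candidates, key=score, reverse=True)

request = {"region": "EU", "type": "storage", "price": 0.02, "sla": 99.9}
print(match(request, dominant=["type", "region"], recessive=["price", "sla"]))
```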

  7. SDN Based User-Centric Framework for Heterogeneous Wireless Networks

    Directory of Open Access Journals (Sweden)

    Zhaoming Lu

    2016-01-01

    Full Text Available Due to the rapid growth of mobile data traffic, more and more base stations and access points (APs) have been densely deployed to provide users with ubiquitous network access, which makes the current wireless network a complex heterogeneous network (HetNet). However, traditional wireless networks are designed with network-centric approaches in which different networks have different quality of service (QoS) strategies and cannot easily cooperate with each other to serve network users. It is an indisputable fact that massive network infrastructure alone cannot assure users' perceived network and service quality. To address this issue, we design a new framework for heterogeneous wireless networks based on the principle of user-centricity, refactoring the network from the users' perspective to satisfy their requirements and preferences. Different from network-centric approaches, the proposed framework takes advantage of Software Defined Networking (SDN) and virtualization technology, which will bring better perceived service quality for wireless network users. In the proposed user-centric framework, the control plane and data plane are decoupled to manage the HetNets in a flexible and coadjutant way, and resource virtualization technology is introduced to abstract the physical resources of HetNets into unified virtualized resources. Hence, ubiquitous and undifferentiated network connectivity and QoE (quality of experience) driven fine-grained resource management can be achieved for wireless network users.

  8. ECONOMETRIC APPROACH OF HETEROSKEDASTICITY ON FINANCIAL TIME SERIES IN A GENERAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    FELICIA RAMONA BIRĂU

    2012-12-01

    Full Text Available The aim of this paper is to provide an overview of the diagnostic tests for detecting heteroskedasticity in financial time series. In financial econometrics, heteroskedasticity is generally associated with cross-sectional data but can also be identified when modeling time series data. The presence of heteroskedasticity in financial time series can be caused by certain specific factors, such as model misspecification, an inadequate data transformation, or the presence of outliers. Heteroskedasticity arises when the homoskedasticity assumption is violated. Testing for the presence of heteroskedasticity in financial time series is performed by applying diagnostic tests, such as the Breusch-Pagan LM test, White's test, the Glesjer LM test, the Harvey-Godfrey LM test, the Park LM test and the Goldfeld-Quandt test.
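
    As an illustration of one of the listed diagnostics, the sketch below computes the Breusch-Pagan LM statistic (n times the R-squared of an auxiliary regression of squared residuals on the regressors) on simulated data; the data-generating process is invented so that the variance grows with the regressor.

```python
# Breusch-Pagan LM test computed from first principles with numpy.
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(1.0, 3.0, size=n)
# Simulated series whose error variance increases with x: heteroskedastic by design.
y = 0.5 * x + rng.normal(scale=np.sqrt(0.2 + 0.5 * x), size=n)

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# Auxiliary regression of squared residuals on the regressors: LM = n * R^2.
u2 = resid**2
gamma = np.linalg.lstsq(X, u2, rcond=None)[0]
fitted = X @ gamma
r2 = 1 - ((u2 - fitted)**2).sum() / ((u2 - u2.mean())**2).sum()
lm_stat = n * r2
print(f"Breusch-Pagan LM statistic: {lm_stat:.1f} (chi2(1) 5% critical value ~ 3.84)")
```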

  9. A generalized linear factor model approach to the hierarchical framework for responses and response times.

    Science.gov (United States)

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-05-01

    We show how the hierarchical model for responses and response times as developed by van der Linden (2007), Fox, Klein Entink, and van der Linden (2007), Klein Entink, Fox, and van der Linden (2009), and Glas and van der Linden (2010) can be simplified to a generalized linear factor model with only the mild restriction that there is no hierarchical model at the item side. This result is valuable as it enables all well-developed modelling tools and extensions that come with these methods. We show that the restriction we impose on the hierarchical model does not influence parameter recovery under realistic circumstances. In addition, we present two illustrative real data analyses to demonstrate the practical benefits of our approach. © 2014 The British Psychological Society.

  10. A framework for general sparse matrix-matrix multiplication on GPUs and heterogeneous processors

    DEFF Research Database (Denmark)

    Liu, Weifeng; Vinter, Brian

    2015-01-01

    General sparse matrix-matrix multiplication (SpGEMM) is a fundamental building block for numerous applications such as algebraic multigrid method (AMG), breadth first search and shortest path problem. Compared to other sparse BLAS routines, an efficient parallel SpGEMM implementation has to handle...... extra irregularity from three aspects: (1) the number of nonzero entries in the resulting sparse matrix is unknown in advance, (2) very expensive parallel insert operations at random positions in the resulting sparse matrix dominate the execution time, and (3) load balancing must account for sparse data...... memory space and efficiently utilizes the very limited on-chip scratchpad memory. Parallel insert operations of the nonzero entries are implemented through the GPU merge path algorithm that is experimentally found to be the fastest GPU merge approach. Load balancing builds on the number of necessary...
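
    For orientation, the plain-Python sketch below shows Gustavson's row-by-row SpGEMM on CSR arrays and why the number of nonzeros of each output row is unknown until that row has actually been accumulated; a GPU implementation, as in the paper, is organized very differently.

```python
# Row-by-row (Gustavson-style) sparse matrix-matrix product on CSR arrays.

def spgemm_csr(a_ptr, a_idx, a_val, b_ptr, b_idx, b_val):
    c_ptr, c_idx, c_val = [0], [], []
    n_rows = len(a_ptr) - 1
    for i in range(n_rows):
        acc = {}                                      # sparse accumulator for row i of C
        for pa in range(a_ptr[i], a_ptr[i + 1]):      # nonzeros of row i of A
            k, av = a_idx[pa], a_val[pa]
            for pb in range(b_ptr[k], b_ptr[k + 1]):  # row k of B scaled by A[i,k]
                j = b_idx[pb]
                acc[j] = acc.get(j, 0.0) + av * b_val[pb]
        for j in sorted(acc):                         # row size only known here
            c_idx.append(j)
            c_val.append(acc[j])
        c_ptr.append(len(c_idx))
    return c_ptr, c_idx, c_val

# 2x2 example: A = [[1, 2], [0, 3]], B = [[0, 4], [5, 0]] -> C = [[10, 4], [15, 0]]
print(spgemm_csr([0, 2, 3], [0, 1, 1], [1.0, 2.0, 3.0],
                 [0, 1, 2], [1, 0], [4.0, 5.0]))
```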

  11. A Framework for Incorporating General Domain Knowledge into Latent Dirichlet Allocation using First-Order Logic

    Energy Technology Data Exchange (ETDEWEB)

    Andrzejewski, D; Zhu, X; Craven, M; Recht, B

    2011-01-18

    Topic models have been used successfully for a variety of problems, often in the form of application-specific extensions of the basic Latent Dirichlet Allocation (LDA) model. Because deriving these new models in order to encode domain knowledge can be difficult and time-consuming, we propose the Fold-all model, which allows the user to specify general domain knowledge in First-Order Logic (FOL). However, combining topic modeling with FOL can result in inference problems beyond the capabilities of existing techniques. We have therefore developed a scalable inference technique using stochastic gradient descent which may also be useful to the Markov Logic Network (MLN) research community. Experiments demonstrate the expressive power of Fold-all, as well as the scalability of our proposed inference method.

  12. Hybrid modelling framework by using mathematics-based and information-based methods

    International Nuclear Information System (INIS)

    Ghaboussi, J; Kim, J; Elnashai, A

    2010-01-01

    Mathematics-based computational mechanics involves idealization in going from the observed behaviour of a system to mathematical equations representing the underlying mechanics of that behaviour. Idealization may lead to mathematical models that exclude certain aspects of the complex behaviour that may be significant. An alternative approach is data-centric modelling, which constitutes a fundamental shift from mathematical equations to data that contain the required information about the underlying mechanics. However, purely data-centric methods often fail for infrequent events and large state changes. In this article, a new hybrid modelling framework is proposed to improve accuracy in simulation of real-world systems. In the hybrid framework, a mathematical model is complemented by information-based components. The role of the informational components is to model aspects which the mathematical model leaves out. The missing aspects are extracted and identified through Autoprogressive Algorithms. The proposed hybrid modelling framework has a wide range of potential applications for natural and engineered systems. The potential of the hybrid methodology is illustrated through modelling the highly pinched hysteretic behaviour of beam-to-column connections in steel frames.
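
    A schematic sketch of the hybrid idea (an invented example, not the authors' Autoprogressive Algorithm): an idealized mathematical model captures the dominant behaviour, while an information-based component is fitted to the residual the equations leave out.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 2.0, 200)
observed = 3.0 * x + 0.8 * np.sin(4 * x) + rng.normal(scale=0.05, size=x.size)

def math_model(x):
    return 3.0 * x          # idealized linear law: misses the oscillatory aspect

# Information-based component: here simply a least-squares fit of the residual
# onto a small basis; in the hybrid framework this is the learned part.
residual = observed - math_model(x)
basis = np.column_stack([np.sin(4 * x), np.cos(4 * x)])
coef = np.linalg.lstsq(basis, residual, rcond=None)[0]

hybrid = math_model(x) + basis @ coef
print("RMS error, math model only :", np.sqrt(np.mean((observed - math_model(x))**2)))
print("RMS error, hybrid model    :", np.sqrt(np.mean((observed - hybrid)**2)))
```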

  13. A General Mathematical Framework for Calculating Systems-Scale Efficiency of Energy Extraction and Conversion: Energy Return on Investment (EROI) and Other Energy Return Ratios

    Directory of Open Access Journals (Sweden)

    Adam R. Brandt

    2011-08-01

    Full Text Available The efficiencies of energy extraction and conversion systems are typically expressed using energy return ratios (ERRs) such as the net energy ratio (NER) or energy return on investment (EROI). A lack of a general mathematical framework prevents inter-comparison of NER/EROI estimates between authors: methods used are not standardized, nor is there a framework for succinctly reporting results in a consistent fashion. In this paper we derive normalized mathematical forms of four ERRs for energy extraction and conversion pathways. A bottom-up (process model) formulation is developed for an n-stage energy harvesting and conversion pathway with various system boundaries. Formulations with the broadest system boundaries use insights from life cycle analysis to suggest a hybrid process-model/economic input-output based framework. These models include indirect energy consumption due to external energy inputs and embodied energy in materials. Illustrative example results are given for simple energy extraction and conversion pathways. Lastly, we discuss the limitations of this approach and the intersection of this methodology with “top-down” economic approaches.
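
    For reference, generic (non-normalized) forms of the quantities involved are shown below; the paper itself derives normalized, boundary-dependent versions of four such ratios for an n-stage pathway.

```latex
\mathrm{EROI} \;=\; \frac{E_{\text{delivered}}}{E_{\text{invested (direct + indirect)}}},
\qquad
E_{\text{net}} \;=\; E_{\text{delivered}} - E_{\text{invested}}
```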

  14. A Heuristic Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) Authoring Tools

    Science.gov (United States)

    2016-03-01

    The report evaluates the authoring tools of the Generalized Intelligent Framework for Tutoring (GIFT), which incorporates the following: models of domain knowledge (e.g., content, feedback, and remediation); pedagogical methods based on best instructional practices; and a description of how these elements relate to each other (i.e., an ontology). Among the usability heuristics applied is User Control and Freedom: users often choose system functions by mistake and will need a clearly marked... One issue identified is a name mismatch between the GIFT Authoring Tool and the browser UI: users may not understand the relationship between the GIFT Authoring Tool and the browser UI.

  15. Model-based visual tracking the OpenTL framework

    CERN Document Server

    Panin, Giorgio

    2011-01-01

    This book has two main goals: to provide a unified and structured overview of this growing field, as well as to propose a corresponding software framework, the OpenTL library, developed by the author and his working group at TUM-Informatik. The main objective of this work is to show how most real-world application scenarios can be naturally cast into a common description vocabulary, and therefore implemented and tested in a fully modular and scalable way, through the definition of a layered, object-oriented software architecture. The resulting architecture covers in a seamless way all processin

  16. A model-based Bayesian framework for ECG beat segmentation

    International Nuclear Information System (INIS)

    Sayadi, O; Shamsollahi, M B

    2009-01-01

    The study of electrocardiogram (ECG) waveform amplitudes, timings and patterns has been the subject of intense research, for it provides a deep insight into the diagnostic features of the heart's functionality. In some recent works, a Bayesian filtering paradigm has been proposed for denoising and compression of ECG signals. In this paper, it is shown that this framework may be effectively used for ECG beat segmentation and extraction of fiducial points. Analytic expressions for the determination of points and intervals are derived and evaluated on various real ECG signals. Simulation results show that the method can contribute to and enhance the clinical ECG beat segmentation performance

  17. Basic framework of urban design based on natural resources

    Science.gov (United States)

    Lubis, Irwar; Nasution, Mahyuddin K. M.; Maulina, Maudy

    2018-03-01

    The establishment of a city always begins with the availability of natural resources that meet the basic needs of its inhabitants, but after that the city relies on the sustainability of those basic needs, which depends primarily on transportation. Transportation becomes the main need of the city. Transportation, however, brings the potential for discomfort in the city through noise and pollution, which mix with the frenetic city life. Therefore, this paper presents a basic framework based on natural resources to reduce the noise and the pollution.

  18. A general population genetic framework for antagonistic selection that accounts for demography and recurrent mutation.

    Science.gov (United States)

    Connallon, Tim; Clark, Andrew G

    2012-04-01

    Antagonistic selection--where alleles at a locus have opposing effects on male and female fitness ("sexual antagonism") or between components of fitness ("antagonistic pleiotropy")--might play an important role in maintaining population genetic variation and in driving phylogenetic and genomic patterns of sexual dimorphism and life-history evolution. While prior theory has thoroughly characterized the conditions necessary for antagonistic balancing selection to operate, we currently know little about the evolutionary interactions between antagonistic selection, recurrent mutation, and genetic drift, which should collectively shape empirical patterns of genetic variation. To fill this void, we developed and analyzed a series of population genetic models that simultaneously incorporate these processes. Our models identify two general properties of antagonistically selected loci. First, antagonistic selection inflates heterozygosity and fitness variance across a broad parameter range--a result that applies to alleles maintained by balancing selection and by recurrent mutation. Second, effective population size and genetic drift profoundly affect the statistical frequency distributions of antagonistically selected alleles. The "efficacy" of antagonistic selection (i.e., its tendency to dominate over genetic drift) is extremely weak relative to classical models, such as directional selection and overdominance. Alleles meeting traditional criteria for strong selection (N(e)s > 1, where N(e) is the effective population size, and s is a selection coefficient for a given sex or fitness component) may nevertheless evolve as if neutral. The effects of mutation and demography may generate population differences in overall levels of antagonistic fitness variation, as well as molecular population genetic signatures of balancing selection.
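
    An illustrative Wright-Fisher sketch of the regime described (all parameter values invented; genic selection is averaged across the sexes as a first-order approximation, not the paper's exact model), showing how drift in a finite population interacts with weak sexually antagonistic selection:

```python
# One biallelic locus whose allele "a" helps males (1 + s_m) but hurts females (1 - s_f).
import numpy as np

rng = np.random.default_rng(2)

def simulate(N=1000, s_m=0.02, s_f=0.02, p0=0.5, generations=2000):
    p = p0                                    # frequency of allele "a"
    for _ in range(generations):
        # Genic selection applied separately in each sex, then averaged, since
        # autosomal alleles spend half their time in each sex.
        p_m = p * (1 + s_m) / (1 + s_m * p)
        p_f = p * (1 - s_f) / (1 - s_f * p)
        p_sel = 0.5 * (p_m + p_f)
        p = rng.binomial(2 * N, p_sel) / (2 * N)   # drift: binomial sampling of 2N gametes
        if p in (0.0, 1.0):
            break
    return p

runs = [simulate() for _ in range(200)]
print("fraction of runs still polymorphic:", np.mean([0.0 < p < 1.0 for p in runs]))
```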

  19. HealthNode: Software Framework for Efficiently Designing and Developing Cloud-Based Healthcare Applications

    Directory of Open Access Journals (Sweden)

    Ho-Kyeong Ra

    2018-01-01

    Full Text Available With the exponential improvement of software technology during the past decade, many efforts have been made to design remote and personalized healthcare applications. Many of these applications are built on mobile devices connected to the cloud. Although appealing, prototyping and validating the feasibility of an application-level idea is still challenging without a solid understanding of the cloud, mobile devices, and the interconnectivity infrastructure. In this paper, we provide a solution to this by proposing a framework called HealthNode, which is a general-purpose framework for developing healthcare applications on cloud platforms using Node.js. To fully exploit the potential of Node.js when developing cloud applications, we focus on the fact that the implementation process should be eased. HealthNode presents an explicit guideline while supporting necessary features to achieve quick and expandable cloud-based healthcare applications. A case study applying HealthNode to various real-world health applications suggests that HealthNode can express architectural structure effectively within an implementation and that the proposed platform can support system understanding and software evolution.

  20. PEDLA: predicting enhancers with a deep learning-based algorithmic framework.

    Science.gov (United States)

    Liu, Feng; Li, Hao; Ren, Chao; Bo, Xiaochen; Shu, Wenjie

    2016-06-22

    Transcriptional enhancers are non-coding segments of DNA that play a central role in the spatiotemporal regulation of gene expression programs. However, systematically and precisely predicting enhancers remains a major challenge. Although existing methods have achieved some success in enhancer prediction, they still suffer from many issues. We developed a deep learning-based algorithmic framework named PEDLA (https://github.com/wenjiegroup/PEDLA), which can directly learn an enhancer predictor from massively heterogeneous data and generalize in ways that are mostly consistent across various cell types/tissues. We first trained PEDLA with 1,114-dimensional heterogeneous features in H1 cells, and demonstrated that the PEDLA framework integrates diverse heterogeneous features and gives state-of-the-art performance relative to five existing methods for enhancer prediction. We further extended PEDLA to iteratively learn from 22 training cell types/tissues. Our results showed that PEDLA manifested superior performance consistency in both training and independent test sets. On average, PEDLA achieved 95.0% accuracy and a 96.8% geometric mean (GM) of sensitivity and specificity across 22 training cell types/tissues, as well as 95.7% accuracy and a 96.8% GM across 20 independent test cell types/tissues. Together, our work illustrates the power of harnessing state-of-the-art deep learning techniques to consistently identify regulatory elements at a genome-wide scale from massively heterogeneous data across diverse cell types/tissues.
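
    The reported summary statistics can be reproduced from a confusion matrix as follows (the counts used here are made up for illustration):

```python
# Accuracy and the geometric mean (GM) of sensitivity and specificity.
from math import sqrt

def summary_metrics(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    gm = sqrt(sensitivity * specificity)
    return accuracy, gm

acc, gm = summary_metrics(tp=930, fp=40, tn=960, fn=70)
print(f"accuracy = {acc:.3f}, GM of sensitivity and specificity = {gm:.3f}")
```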

  1. Generalized perturbation theory based on the method of cyclic characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Assawaroongruengchot, M.; Marleau, G. [Institut de Genie Nucleaire, Departement de Genie Physique, Ecole Polytechnique de Montreal, 2900 Boul. Edouard-Montpetit, Montreal, Que. H3T 1J4 (Canada)

    2006-07-01

    A GPT algorithm for estimation of eigenvalues and reaction-rate ratios is developed for the neutron transport problems in 2D fuel assemblies with isotropic scattering. In our study the GPT formulation is based on the integral transport equations. The mathematical relationship between the generalized flux importance and generalized source importance functions is applied to transform the generalized flux importance transport equations into the integro-differential forms. The resulting adjoint and generalized adjoint transport equations are then solved using the method of cyclic characteristics (MOCC). Because of the presence of negative adjoint sources, a biasing/decontamination scheme is applied to make the generalized adjoint functions positive in such a way that it can be used for the multigroup re-balance technique. To demonstrate the efficiency of the algorithms, perturbative calculations are performed on a 17 x 17 PWR lattice. (authors)

  2. Generalized perturbation theory based on the method of cyclic characteristics

    International Nuclear Information System (INIS)

    Assawaroongruengchot, M.; Marleau, G.

    2006-01-01

    A GPT algorithm for estimation of eigenvalues and reaction-rate ratios is developed for the neutron transport problems in 2D fuel assemblies with isotropic scattering. In our study the GPT formulation is based on the integral transport equations. The mathematical relationship between the generalized flux importance and generalized source importance functions is applied to transform the generalized flux importance transport equations into the integro-differential forms. The resulting adjoint and generalized adjoint transport equations are then solved using the method of cyclic characteristics (MOCC). Because of the presence of negative adjoint sources, a biasing/decontamination scheme is applied to make the generalized adjoint functions positive in such a way that it can be used for the multigroup re-balance technique. To demonstrate the efficiency of the algorithms, perturbative calculations are performed on a 17 x 17 PWR lattice. (authors)

  3. Framework for the Development of OER-Based Learning Materials in ODL Environment

    Science.gov (United States)

    Teng, Khor Ean; Hung, Chung Sheng

    2013-01-01

    This paper describes the framework for the development of the OER-based learning materials "TCC121/05 Programming Fundamentals with Java" for ODL learners in Wawasan Open University (WOU) through three main development phases, namely the creation, evaluation and production phases. The proposed framework has further been tested on ODL learners to…

  4. A Network Airline Revenue Management Framework Based on Decomposition by Origins and Destinations

    NARCIS (Netherlands)

    Birbil, S.I.; Frenk, J.B.G.; Gromicho Dos Santos, J.A.; Zhang, Shuzhong

    2014-01-01

    We propose a framework for solving airline revenue management problems on large networks, where the main concern is to allocate the flight leg capacities to customer requests under fixed class fares. This framework is based on a mathematical programming model that decomposes the network into

  5. Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework

    Science.gov (United States)

    Ranjan, Jayanthi; Bhatnagar, Vishal

    2011-01-01

    Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all the three to each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…

  6. Data-based decision making : conclusions and a data use framework

    NARCIS (Netherlands)

    Schildkamp, Kim; Lai, M.K.; Schildkamp, K.; Lai, M.K.; Earl, L.

    2013-01-01

    In this chapter, the results of all the studies presented in this book are summarized. What are the lessons learned? Based on the lessons learned, we developed a data use framework. In this framework, data use is influenced by several enablers and barriers (e.g., the school organization context,

  7. A computer-aided framework for development, identification and management of physiologically-based pharmacokinetic models

    DEFF Research Database (Denmark)

    Heitzig, Martina; Linninger, Andreas; Sin, Gürkan

    2014-01-01

    The objective of this work is the development of a generic computer-aided modelling framework to support the development of physiologically-based pharmacokinetic models thereby increasing the efficiency and quality of the modelling process. In particular, the framework systematizes the modelling...

  8. A framework for performance evaluation of model-based optical trackers

    NARCIS (Netherlands)

    Smit, F.A.; Liere, van R.

    2008-01-01

    We describe a software framework to evaluate the performance of model-based optical trackers in virtual environments. The framework can be used to evaluate and compare the performance of different trackers under various conditions, to study the effects of varying intrinsic and extrinsic camera

  9. An Experience-Based Learning Framework: Activities for the Initial Development of Sustainability Competencies

    Science.gov (United States)

    Caniglia, Guido; John, Beatrice; Kohler, Martin; Bellina, Leonie; Wiek, Arnim; Rojas, Christopher; Laubichler, Manfred D.; Lang, Daniel

    2016-01-01

    Purpose: This paper aims to present an experience-based learning framework that provides a bottom-up, student-centered entrance point for the development of systems thinking, normative and collaborative competencies in sustainability. Design/methodology/approach: The framework combines mental mapping with exploratory walking. It interweaves…

  10. CASCADE: An Agent Based Framework For Modeling The Dynamics Of Smart Electricity Systems

    OpenAIRE

    Rylatt, R. M.; Gammon, Rupert; Boait, Peter John; Varga, L.; Allen, P.; Savill, M.; Snape, J. Richard; Lemon, Mark; Ardestani, B. M.; Pakka, V. H.; Fletcher, G.; Smith, S.; Fan, D.; Strathern, M.

    2013-01-01

    Collaborative project with Cranfield University. The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different socia...

  11. Incorporating probabilistic seasonal climate forecasts into river management using a risk-based framework

    Science.gov (United States)

    Sojda, Richard S.; Towler, Erin; Roberts, Mike; Rajagopalan, Balaji

    2013-01-01

    Despite the influence of hydroclimate on river ecosystems, most efforts to date have focused on using climate information to predict streamflow for water supply. However, as water demands intensify and river systems are increasingly stressed, research is needed to explicitly integrate climate into streamflow forecasts that are relevant to river ecosystem management. To this end, we present a five-step risk-based framework: (1) define risk tolerance, (2) develop a streamflow forecast model, (3) generate climate forecast ensembles, (4) estimate streamflow ensembles and associated risk, and (5) manage for climate risk. The framework is successfully demonstrated for an unregulated watershed in southwest Montana, where the combination of recent drought and water withdrawals has made it challenging to maintain flows needed for healthy fisheries. We put forth a generalized linear modeling (GLM) approach to develop a suite of tools that skillfully model decision-relevant low-flow characteristics in terms of climate predictors. Probabilistic precipitation forecasts are used in conjunction with the GLMs, resulting in season-ahead prediction ensembles that provide the full risk profile. These tools are embedded in an end-to-end risk management framework that directly supports proactive fish conservation efforts. Results show that the use of forecasts can be beneficial to planning, especially in wet years, but historical precipitation forecasts are quite conservative (i.e., not very “sharp”). Synthetic forecasts show that a modest “sharpening” can strongly impact risk and improve skill. We emphasize that use in management depends on defining relevant environmental flows and risk tolerance, requiring local stakeholder involvement.
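
    A compact sketch of steps 2-4 of the framework under strong simplifying assumptions (a log-linear regression stands in for the paper's GLM suite; all numbers are synthetic): fit low flow against seasonal precipitation, push a probabilistic precipitation forecast ensemble through the model, and read off the risk of violating a flow threshold.

```python
import numpy as np

rng = np.random.default_rng(3)

# Step 2: forecast model from a (synthetic) historical record.
precip = rng.gamma(shape=4.0, scale=50.0, size=40)                  # seasonal precip (mm)
low_flow = np.exp(0.8 + 0.004 * precip + rng.normal(0, 0.15, 40))   # observed low flow (m3/s)
A = np.column_stack([np.ones_like(precip), precip])
b0, b1 = np.linalg.lstsq(A, np.log(low_flow), rcond=None)[0]
sigma = np.std(np.log(low_flow) - (b0 + b1 * precip))

# Step 3: climate forecast ensemble (here: a dry-leaning resampling of history).
forecast_precip = rng.choice(precip, size=1000) * 0.85

# Step 4: streamflow ensemble and risk of falling below an assumed environmental flow threshold.
flow_ensemble = np.exp(b0 + b1 * forecast_precip + rng.normal(0, sigma, 1000))
threshold = 2.0                                                      # assumed critical flow (m3/s)
print("P(low flow <", threshold, "m3/s) =", np.mean(flow_ensemble < threshold))
```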

  12. Maximally Localized States and Quantum Corrections of Black Hole Thermodynamics in the Framework of a New Generalized Uncertainty Principle

    International Nuclear Information System (INIS)

    Zhang, Shao-Jun; Miao, Yan-Gang; Zhao, Ying-Jie

    2015-01-01

    As a generalized uncertainty principle (GUP) leads to the effects of the minimal length of the order of the Planck scale and UV/IR mixing, some significant physical concepts and quantities are modified or corrected correspondingly. On the one hand, we derive the maximally localized states—the physical states displaying the minimal length uncertainty associated with a new GUP proposed in our previous work. On the other hand, in the framework of this new GUP we calculate quantum corrections to the thermodynamic quantities of the Schwarzschild black hole, such as the Hawking temperature, the entropy, and the heat capacity, and give a remnant mass of the black hole at the end of the evaporation process. Moreover, we compare our results with those obtained in the frameworks of several other GUPs. In particular, we observe a significant difference between the situations with and without the consideration of the UV/IR mixing effect in the quantum corrections to the evaporation rate and the decay time. That is, the decay time can be greatly prolonged in the former case, which implies that the quantum correction from the UV/IR mixing effect may give rise to a radical rather than a tiny influence on the Hawking radiation.

  13. A Dynamic Multi-Projection-Contour Approximating Framework for the 3D Reconstruction of Buildings by Super-Generalized Optical Stereo-Pairs.

    Science.gov (United States)

    Yan, Yiming; Su, Nan; Zhao, Chunhui; Wang, Liguo

    2017-09-19

    In this paper, a novel framework for the 3D reconstruction of buildings is proposed, focusing on remote sensing super-generalized stereo-pairs (SGSPs). 3D reconstruction cannot be performed well using nonstandard stereo pairs: reliable stereo matching cannot be achieved when the image pairs are collected from widely differing views, so dense 3D points cannot be obtained for building regions and further 3D shape reconstruction fails. We define SGSPs as two or more optical images collected from less constrained views but covering the same buildings. It is even more difficult to reconstruct the 3D shape of a building from SGSPs using traditional frameworks. As a result, a dynamic multi-projection-contour approximating (DMPCA) framework was introduced for SGSP-based 3D reconstruction. The key idea is to optimize a group of parameters of a simulated 3D model, together with a binary feature-image, so as to minimize the total difference between the projection-contours of the building in the SGSPs and those of the simulated 3D model. The simulated 3D model, defined by this group of parameters, then approximates the actual 3D shape of the building. Certain parameterized 3D basic-unit models of typical buildings were designed, and a simulated projection system was established to obtain simulated projection-contours in different views. Moreover, the artificial bee colony algorithm was employed to solve the optimization. With SGSPs collected by satellite and by our unmanned aerial vehicle, the DMPCA framework was verified in a group of experiments, which demonstrated the reliability and advantages of this work.

  14. A Bayesian Framework for Generalized Linear Mixed Modeling Identifies New Candidate Loci for Late-Onset Alzheimer's Disease.

    Science.gov (United States)

    Wang, Xulong; Philip, Vivek M; Ananda, Guruprasad; White, Charles C; Malhotra, Ankit; Michalski, Paul J; Karuturi, Krishna R Murthy; Chintalapudi, Sumana R; Acklin, Casey; Sasner, Michael; Bennett, David A; De Jager, Philip L; Howell, Gareth R; Carter, Gregory W

    2018-03-05

    Recent technical and methodological advances have greatly enhanced genome-wide association studies (GWAS). The advent of low-cost whole-genome sequencing facilitates high-resolution variant identification, and the development of linear mixed models (LMM) allows improved identification of putatively causal variants. While essential for correcting false positive associations due to sample relatedness and population stratification, LMMs have commonly been restricted to quantitative variables. However, phenotypic traits in association studies are often categorical, coded as binary case-control or ordered variables describing disease stages. To address these issues, we have devised a method for genomic association studies that implements a generalized linear mixed model (GLMM) in a Bayesian framework, called Bayes-GLMM. Bayes-GLMM has four major features: (1) support of categorical, binary and quantitative variables; (2) cohesive integration of previous GWAS results for related traits; (3) correction for sample relatedness by mixed modeling; and (4) model estimation by both Markov chain Monte Carlo (MCMC) sampling and maximal likelihood estimation. We applied Bayes-GLMM to the whole-genome sequencing cohort of the Alzheimer's Disease Sequencing Project (ADSP). This study contains 570 individuals from 111 families, each with Alzheimer's disease diagnosed at one of four confidence levels. With Bayes-GLMM we identified four variants in three loci significantly associated with Alzheimer's disease. Two variants, rs140233081 and rs149372995, lie between PRKAR1B and PDGFA. The coded proteins are localized to the glial-vascular unit, and PDGFA transcript levels are associated with AD-related neuropathology. In summary, this work provides implementation of a flexible, generalized mixed model approach in a Bayesian framework for association studies. Copyright © 2018, Genetics.

  15. Component Based System Framework for Dynamic B2B Interaction

    NARCIS (Netherlands)

    Hu jinmin, Jinmin; Grefen, P.W.P.J.

    Business-to-Business (B2B) collaboration is becoming a pivotal way to bring today's enterprises to success in the dynamically changing e-business environment. Though many business-to-business protocols are developed to support B2B interaction, none are generally accepted. A B2B system should support

  16. Metal–organic frameworks based membranes for liquid separation

    KAUST Repository

    Li, Xin; Liu, Yuxin; Wang, Jing; Gascon, Jorge; Li, Jiansheng; Van der Bruggen, Bart

    2017-01-01

    The field of MOF-based membranes for liquid separation is highlighted in this review. The criteria for the judicious selection of MOFs in fabricating MOF-based membranes are given. Special attention is paid to rational design strategies for MOF-based membranes

  17. Competency-Based Education: A Framework for Measuring Quality Courses

    Science.gov (United States)

    Krause, Jackie; Dias, Laura Portolese; Schedler, Chris

    2015-01-01

    The growth of competency-based education in an online environment requires the development and measurement of quality competency-based courses. While quality measures for online courses have been developed and standardized, they do not directly align with emerging best practices and principles in the design of quality competency-based online…

  18. Comparison of Physics Frameworks for WebGL-Based Game Engine

    Directory of Open Access Journals (Sweden)

    Yogya Resa

    2014-03-01

    Full Text Available Recently, a new technology called WebGL shows a lot of potential for developing games. However, since this technology is still new, there are still many potentials in the game development area that are not explored yet. This paper tries to uncover the potential of integrating physics frameworks with WebGL technology in a game engine for developing 2D or 3D games. Specifically, we integrated three open source physics frameworks: Bullet, Cannon, and JigLib into a WebGL-based game engine. Using experiments, we assessed these frameworks in terms of their correctness or accuracy, performance, completeness and compatibility. The results show that it is possible to integrate open source physics frameworks into a WebGL-based game engine, and Bullet is the best physics framework to be integrated into the WebGL-based game engine.

  19. Statistical Learning Framework with Adaptive Retraining for Condition-Based Maintenance

    International Nuclear Information System (INIS)

    An, Sang Ha; Chang, Soon Heung; Heo, Gyun Young; Seo, Ho Joon; Kim, Su Young

    2009-01-01

    As systems become more complex and more critical in our daily lives, the need for maintenance based on reliable monitoring and diagnosis has become more apparent. However, in reality, the general opinion has been that 'maintenance is a necessary evil' or 'nothing can be done to improve maintenance costs'. Perhaps these were true statements twenty years ago, when many of the diagnostic technologies were not fully developed. The development of microprocessor- or computer-based instrumentation that can be used to monitor the operating condition of plant equipment, machinery and systems has provided the means to manage the maintenance operation. It has provided the means to reduce or eliminate unnecessary repairs, prevent catastrophic machine failures and reduce the negative impact of the maintenance operation on the profitability of manufacturing and production plants. Condition-based maintenance (CBM) techniques help determine the condition of in-service equipment in order to predict when maintenance should be performed. Most statistical learning techniques are only valid as long as the physics of a system does not change. If any significant change, such as the replacement of a component or piece of equipment, occurs in the system, the statistical learning model should be re-trained or re-developed to adapt to the new system. In this research, the authors propose a statistical learning framework which is applicable to various CBM applications, and the concept of the adaptive retraining technique is described to support the execution of the framework, so that the monitoring system does not need to be re-developed or re-trained even when there are significant changes in the system or its components.
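
    As a loose illustration of the adaptive-retraining idea described above (not the authors' framework), the Python sketch below fits a simple least-squares model of equipment behaviour, watches the recent prediction error, and refits the model on the newest data once the error drifts past a threshold, e.g. after a simulated component replacement changes the underlying relationship. The data, window size and threshold are assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        def fit(X, y):
            # ordinary least squares with an intercept term
            A = np.column_stack([np.ones(len(X)), X])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coef

        def predict(coef, X):
            return coef[0] + X @ coef[1:]

        # phase 1: original behaviour; phase 2: behaviour after a (simulated) component replacement
        X = rng.normal(size=(400, 2))
        y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0.0, 0.1, 400)
        y[200:] = 0.5 * X[200:, 0] + 1.5 * X[200:, 1] + rng.normal(0.0, 0.1, 200)

        window, threshold = 30, 0.5
        coef = fit(X[:100], y[:100])                         # initial model from early operating data
        seen_X, seen_y = list(X[:100]), list(y[:100])
        for t in range(100, 400):
            seen_X.append(X[t]); seen_y.append(y[t])
            recent_X = np.array(seen_X[-window:]); recent_y = np.array(seen_y[-window:])
            recent_err = np.mean(np.abs(predict(coef, recent_X) - recent_y))
            if recent_err > threshold:                       # drift detected -> adaptive retraining
                coef = fit(recent_X, recent_y)
                print(f"t={t}: model retrained (recent error {recent_err:.2f})")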

  20. Light Hydrocarbons Adsorption Mechanisms in Two Calcium-based Microporous Metal Organic Frameworks

    KAUST Repository

    Plonka, Anna M.; Chen, Xianyin; Wang, Hao; Krishna, Rajamani; Dong, Xinglong; Banerjee, Debasis; Woerner, William R.; Han, Yu; Li, Jing; Parise, John B.

    2016-01-01

    measurements and gas adsorption isotherm measurements. Two calcium-based MOFs, designated as SBMOF-1 and SBMOF-2 (SB: Stony Brook), form three-dimensional frameworks with one-dimensional open channels. As determined from single crystal diffraction experiments

  1. Titanium-based zeolitic imidazolate framework for chemical fixation of carbon dioxide

    Data.gov (United States)

    U.S. Environmental Protection Agency — A titanium-based zeolitic imidazolate framework (Ti-ZIF) with high surface area and porous morphology has been synthesized and its application as a recyclable...

  2. A discrete element based simulation framework to investigate particulate spray deposition processes

    KAUST Repository

    Mukherjee, Debanjan; Zohdi, Tarek I.

    2015-01-01

    © 2015 Elsevier Inc. This work presents a computer simulation framework based on discrete element method to analyze manufacturing processes that comprise a loosely flowing stream of particles in a carrier fluid being deposited on a target surface

  3. FILTWAM - A Framework for Online Game-based Communication Skills Training

    NARCIS (Netherlands)

    Bahreini, Kiavash; Nadolski, Rob; Qi, Wen; Westera, Wim

    2013-01-01

    Bahreini, K., Nadolski, R., Qi, W., & Westera, W. (2012, October). FILTWAM - A Framework for Online Game-based Communication Skills Training. Poster presented at the research day in the Pretoria building at the Open University of the Netherlands, Heerlen, The Netherlands.

  4. Titanium-based zeolitic imidazolate framework for chemical fixation of carbon dioxide

    Science.gov (United States)

    A titanium-based zeolitic imidazolate framework (Ti-ZIF) with high surface area and porous morphology was synthesized and its efficacy was demonstrated in the synthesis of cyclic carbonates from epoxides and carbon dioxide.

  5. Uniframe: A Unified Framework for Developing Service-Oriented, Component-Based Distributed Software Systems

    National Research Council Canada - National Science Library

    Raje, Rajeev R; Olson, Andrew M; Bryant, Barrett R; Burt, Carol C; Auguston, Makhail

    2005-01-01

    .... It describes how this approach employs a unifying framework for specifying such systems to unite the concepts of service-oriented architectures, a component-based software engineering methodology...

  6. Evaluating assessment quality in competence-based education: A qualitative comparison of two frameworks

    NARCIS (Netherlands)

    Baartman, Liesbeth; Bastiaens, Theo; Kirschner, Paul A.; Van der Vleuten, Cees

    2009-01-01

    Baartman, L. K. J., Bastiaens, T. J., Kirschner, P. A., & Van der Vleuten, C. P. M. (2007). Evaluating assessment quality in competence-based education: A qualitative comparison of two frameworks. Educational Research Review, 2, 114-129.

  7. Fixation of carbon dioxide into dimethyl carbonate over titanium-based zeolitic thiophene-benzimidazolate framework

    Data.gov (United States)

    U.S. Environmental Protection Agency — A titanium-based zeolitic thiophene-benzimidazolate framework has been designed for the direct synthesis of dimethyl carbonate (DMC) from methanol and carbon...

  8. Using a data fusion-based activity recognition framework to determine surveillance system requirements

    CSIR Research Space (South Africa)

    Le Roux, WH

    2007-07-01

    Full Text Available A technique is proposed to extract system requirements for a maritime area surveillance system, based on an activity recognition framework originally intended for the characterisation, prediction and recognition of intentional actions for threat...

  9. Movement-based interaction in camera spaces: a conceptual framework

    DEFF Research Database (Denmark)

    Eriksson, Eva; Hansen, Thomas Riisgaard; Lykke-Olesen, Andreas

    2007-01-01

    In this paper we present three concepts that address movement-based interaction using camera tracking. Based on our work with several movement-based projects we present four selected applications, and use these applications to leverage our discussion, and to describe our three main concepts space,...

  10. Evidence-Based Practice: A Framework for Making Effective Decisions

    Science.gov (United States)

    Spencer, Trina D.; Detrich, Ronnie; Slocum, Timothy A.

    2012-01-01

    The research to practice gap in education has been a long-standing concern. The enactment of No Child Left Behind brought increased emphasis on the value of using scientifically based instructional practices to improve educational outcomes. It also brought education into the broader evidence-based practice movement that started in medicine and has…

  11. Frameworks: A Community-Based Approach to Preventing Youth Suicide

    Science.gov (United States)

    Baber, Kristine; Bean, Gretchen

    2009-01-01

    Few youth suicide prevention programs are theory based and systematically evaluated. This study evaluated the pilot implementation of a community-based youth suicide prevention project guided by an ecological perspective. One hundred fifty-seven adults representing various constituencies from educators to health care providers and 131 ninth-grade…

  12. Generalized hydrogeologic framework and groundwater budget for a groundwater availability study for the glacial aquifer system of the United States

    Science.gov (United States)

    Reeves, Howard W.; Bayless, E. Randall; Dudley, Robert W.; Feinstein, Daniel T.; Fienen, Michael N.; Hoard, Christopher J.; Hodgkins, Glenn A.; Qi, Sharon L.; Roth, Jason L.; Trost, Jared J.

    2017-12-14

    The glacial aquifer system groundwater availability study seeks to quantify (1) the status of groundwater resources in the glacial aquifer system, (2) how these resources have changed over time, and (3) likely system response to future changes in anthropogenic and environmental conditions. The glacial aquifer system extends from Maine to Alaska, although the focus of this report is the part of the system in the conterminous United States east of the Rocky Mountains. The glacial sand and gravel principal aquifer is the largest source of public and self-supplied industrial supply of any principal aquifer and is also an important source for irrigation supply. Despite its importance for water supply, water levels in the glacial aquifer system are generally stable, varying with climate and affected only locally by pumping. The hydrogeologic framework developed for this study includes information from water-well records and the classification of material types from surficial geologic maps into likely aquifers dominated by sand and gravel deposits. Generalized groundwater budgets across the study area highlight the variation in recharge and discharge, which is primarily driven by climate.

  13. Which health technologies should be funded? A prioritization framework based explicitly on value for money

    Directory of Open Access Journals (Sweden)

    Golan Ofra

    2012-11-01

    Full Text Available Abstract. Background: Deciding which health technologies to fund involves confronting some of the most difficult choices in medicine. As for other countries, the Israeli health system is faced each year with having to make these difficult decisions. The Public National Advisory Committee, known as ‘the Basket Committee’, selects new technologies for the basic list of health care that all Israelis are entitled to access, known as the ‘health basket’. We introduce a framework for health technology prioritization based explicitly on value for money that enables the main variables considered by decision-makers to be explicitly included. Although the framework’s exposition is in terms of the Basket Committee selecting new technologies for Israel’s health basket, we believe that the framework would also work well for other countries. Methods: Our proposed prioritization framework involves comparing four main variables for each technology: 1. Incremental benefits, including ‘equity benefits’, to Israel’s population; 2. Incremental total cost to Israel’s health system; 3. Quality of evidence; and 4. Any additional ‘X-factors’ not elsewhere included, such as strategic or legal factors, etc. Applying methodology from multi-criteria decision analysis, the multiple dimensions comprising the first variable are aggregated via a points system. Results: The four variables are combined for each technology and compared across the technologies in the ‘Value for Money’ (VfM) Chart. The VfM Chart can be used to identify technologies that are good value for money, and, given a budget constraint, to select technologies that should be funded. This is demonstrated using 18 illustrative technologies. Conclusions: The VfM Chart is an intuitively appealing decision-support tool for helping decision-makers to focus on the inherent tradeoffs involved in health technology prioritization. Such deliberations can be performed in a systematic and transparent
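
    The points-based aggregation and the reading of a value-for-money chart can be pictured with a small sketch. The criteria, weights, costs and budget below are invented for illustration and are not the instrument used by the Basket Committee; the snippet simply scores each technology, then funds technologies in order of benefit per unit cost until an assumed budget runs out.

        technologies = {
            # name: (criterion points, incremental total cost in $M) -- all values invented
            "tech_A": ({"health_gain": 70, "equity": 40, "evidence_quality": 80}, 12.0),
            "tech_B": ({"health_gain": 55, "equity": 90, "evidence_quality": 60}, 5.0),
            "tech_C": ({"health_gain": 90, "equity": 30, "evidence_quality": 50}, 30.0),
        }
        weights = {"health_gain": 0.6, "equity": 0.25, "evidence_quality": 0.15}   # assumed weights
        budget = 20.0

        def benefit(points):
            # aggregate the multiple benefit dimensions into a single score via a points system
            return sum(weights[c] * p for c, p in points.items())

        ranked = sorted(technologies.items(),
                        key=lambda item: benefit(item[1][0]) / item[1][1], reverse=True)
        funded, spent = [], 0.0
        for name, (points, cost) in ranked:
            if spent + cost <= budget:                # fund best value for money first, within budget
                funded.append(name); spent += cost
        print("funded within budget:", funded, "- total cost:", spent)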

  14. MANDIBULAR ASYMMETRY CHARACTERIZATION USING GENERALIZED TENSOR-BASED MORPHOMETRY.

    Science.gov (United States)

    Paniagua, Beatriz; Alhadidi, Abeer; Cevidanes, Lucia; Styner, Martin; Oguz, Ipek

    2011-12-31

    Quantitative assessment of facial asymmetry is crucial for successful planning of corrective surgery. We propose a tensor-based morphometry (TBM) framework to locate and quantify asymmetry using 3D CBCT images. To this end, we compute a rigid transformation between the mandible segmentation and its mirror image, which yields global rotation and translation with respect to the cranial base to guide the surgery's first stage. Next, we nonrigidly register the rigidly aligned images and use TBM methods to locally analyze the deformation field. This yields data on the location, amount and direction of "growth" (or "shrinkage") between the left and right sides. We visualize this data in a volumetric manner and via scalar and vector maps on the mandibular surface to provide the surgeon with optimal understanding of the patient's anatomy. We illustrate the feasibility and strength of our technique on 3 representative patients with a wide range of facial asymmetries.
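
    A sketch of the local "growth/shrinkage" quantification mentioned above: once a nonrigid registration between the rigidly aligned image and its mirror has produced a dense displacement field, the log-determinant of the Jacobian of x -> x + u(x) is positive where one side is locally larger than the other and negative where it is smaller. The displacement field below is random toy data, not a real registration result.

        import numpy as np

        rng = np.random.default_rng(0)
        disp = 0.05 * rng.normal(size=(3, 32, 32, 32))       # toy displacement field u (z, y, x components)

        def log_jacobian_determinant(disp):
            grads = np.empty((3, 3) + disp.shape[1:])
            for i in range(3):                                # d u_i / d x_j by central differences
                for j, g in enumerate(np.gradient(disp[i])):
                    grads[i, j] = g
            jac = grads + np.eye(3).reshape(3, 3, 1, 1, 1)    # J = I + grad(u)
            det = np.linalg.det(np.moveaxis(jac, (0, 1), (-2, -1)))
            return np.log(det)

        logJ = log_jacobian_determinant(disp)
        print("voxels locally expanding:", int((logJ > 0).sum()),
              "shrinking:", int((logJ < 0).sum()))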

  15. Problem-Based Learning in a General Psychology Course.

    Science.gov (United States)

    Willis, Sandra A.

    2002-01-01

    Describes the adoption of problem-based learning (PBL) techniques in a general psychology course. States that the instructor used a combination of techniques, including think-pair-share, lecture/discussion, and PBL. Notes means and standard deviations for graded components of PBL format versus lecture/discussion format. (Contains 18 references.)…

  16. A General Polygon-based Deformable Model for Object Recognition

    DEFF Research Database (Denmark)

    Jensen, Rune Fisker; Carstensen, Jens Michael

    1999-01-01

    We propose a general scheme for object localization and recognition based on a deformable model. The model combines shape and image properties by warping an arbitrary prototype intensity template according to the deformation in shape. The shape deformations are constrained by a probabilistic distr...

  17. Values based practice: a framework for thinking with.

    Science.gov (United States)

    Mohanna, Kay

    2017-07-01

    Values are those principles that govern behaviours, and values-based practice has been described as a theory and skills base for effective healthcare decision-making where different (and hence potentially conflicting) values are in play. The emphasis is on good process rather than pre-set right outcomes, aiming to achieve balanced decision-making. In this article we will consider the utility of this model by looking at leadership development, a current area of much interest and investment in healthcare. Copeland points out that 'values based leadership behaviors are styles with a moral, authentic and ethical dimension', important qualities in healthcare decision-making.

  18. The implementation of the Common Assessment Framework in the assessment of administrative and efficiency of nursing service in General Hospital of island region.

    Directory of Open Access Journals (Sweden)

    Apostolos Efkarpidis

    2017-10-01

    Full Text Available Introduction: The Common Assessment Framework (CAF) is an easy-to-use tool of Total Quality Management (TQM) available to public sector organizations in Europe. The CAF is applied every two years, providing organizations with a self-assessment framework that is conceptually similar to the European Foundation for Quality Management model, taking their differences into account. It is common to all public organizations, ensuring the comparability of results between similar services. Objective: To measure the existing administrative and operational level of the Nursing Service in a General Hospital. Methodology: The study was authorized by the General Manager of the Hospital. We followed the steps of the procedure provided in the Implementation Guide of the CAF and used the questionnaire of the Guide. The sample consisted of 32 employees (the Self-Assessment Group) of all categories, out of a total of 101 employees. The analysis was performed with the Excel program. Results: The evaluation results fell into two rating scales around the average (50 points): (a) grades from 31 to 50 (below average), indicating relative satisfaction with an activity and a moderate level of efficiency and effectiveness, and (b) grades from 51 to 70 (above average), indicating satisfaction with an activity and a good level of efficiency and effectiveness. The ratings on the nine criteria of the CAF were: (1) Leadership (57.36), (2) Strategy – Planning (44.1), (3) Human Resource Management (50.07), (4) Partnerships – Resources (41.18), (5) Process Management – Changes (30.04), (6) Results for the Citizen (49.17), (7) Results for the Human Resources (42.13), (8) Results in Society (53.97), (9) Main Results (53.78). Conclusions: The Nursing Service of the General Hospital was evaluated for its administration as well as its organizational performance by staff and citizens, based on the CAF. Seventy-two weak points were spotted, needing

  19. Energy Sharing Framework for Microgrid-Powered Cellular Base Stations

    KAUST Repository

    Farooq, Muhammad Junaid; Ghazzai, Hakim; Kadri, Abdullah; Elsawy, Hesham; Alouini, Mohamed-Slim

    2017-01-01

    Cellular base stations (BSs) are increasingly becoming equipped with renewable energy generators to reduce operational expenditures and carbon footprint of wireless communications. Moreover, advancements in the traditional electricity grid allow two

  20. Cat swarm optimization based evolutionary framework for multi document summarization

    Science.gov (United States)

    Rautray, Rasmita; Balabantaray, Rakesh Chandra

    2017-07-01

    Today, the World Wide Web has brought us an enormous quantity of on-line information. As a result, extracting relevant information from massive data has become a challenging issue. In the recent past, text summarization has been recognized as one of the solutions for extracting useful information from a vast number of documents. Based on the number of documents considered for summarization, it is categorized as single-document or multi-document summarization. Compared with the single-document case, multi-document summarization is more challenging for researchers seeking an accurate summary from multiple documents. Hence, in this study, a novel Cat Swarm Optimization (CSO) based multi-document summarizer is proposed to address the problem of multi-document summarization. The proposed CSO-based model is also compared with two other nature-inspired summarizers, a Harmony Search (HS) based summarizer and a Particle Swarm Optimization (PSO) based summarizer. With respect to the benchmark Document Understanding Conference (DUC) datasets, the performance of all algorithms is compared in terms of different evaluation metrics such as ROUGE score, F score, sensitivity, positive predictive value, summary accuracy, inter-sentence similarity and a readability metric, to validate the non-redundancy, cohesiveness and readability of the summary, respectively. The experimental analysis clearly reveals that the proposed approach outperforms the other summarizers included in the study.
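
    The optimisation target behind such summarizers can be illustrated without the swarm itself: a candidate summary is rewarded for covering the document collection and penalised for redundancy between its sentences, and a search procedure maximises that fitness. In the sketch below a greedy loop stands in for the Cat Swarm Optimization search of the paper, and the corpus, weighting and sentence limit are toy assumptions.

        import itertools, math, re
        from collections import Counter

        docs = [
            "The flooding damaged roads and bridges across the region.",
            "Emergency crews repaired bridges damaged by the flooding.",
            "Officials announced funding to rebuild roads after the flood damage.",
        ]
        sentences = [s for d in docs for s in re.split(r"(?<=[.!?])\s+", d) if s]

        def bow(text):
            return Counter(re.findall(r"[a-z]+", text.lower()))   # bag-of-words term counts

        def cosine(a, b):
            num = sum(a[w] * b[w] for w in set(a) & set(b))
            den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
            return num / den if den else 0.0

        collection = bow(" ".join(docs))
        vecs = [bow(s) for s in sentences]

        def fitness(selected):
            if not selected:
                return 0.0
            coverage = cosine(sum((vecs[i] for i in selected), Counter()), collection)
            redundancy = max((cosine(vecs[i], vecs[j])
                              for i, j in itertools.combinations(selected, 2)), default=0.0)
            return coverage - 0.5 * redundancy                    # assumed coverage/redundancy trade-off

        summary, remaining = [], set(range(len(sentences)))
        for _ in range(2):                                        # build a two-sentence summary greedily
            best = max(remaining, key=lambda i: fitness(summary + [i]))
            summary.append(best); remaining.remove(best)
        print([sentences[i] for i in sorted(summary)])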

  1. A Robust, Water-Based, Functional Binder Framework for High-Energy Lithium-Sulfur Batteries.

    Science.gov (United States)

    Lacey, Matthew J; Österlund, Viking; Bergfelt, Andreas; Jeschull, Fabian; Bowden, Tim; Brandell, Daniel

    2017-07-10

    We report here a water-based functional binder framework for the lithium-sulfur battery systems, based on the general combination of a polyether and an amide-containing polymer. These binders are applied to positive electrodes optimised towards high-energy electrochemical performance based only on commercially available materials. Electrodes with up to 4 mAh cm⁻² capacity and 97-98 % coulombic efficiency are achievable in electrodes with a 65 % total sulfur content and a poly(ethylene oxide):poly(vinylpyrrolidone) (PEO:PVP) binder system. Exchange of either binder component for a different polymer with similar functionality preserves the high capacity and coulombic efficiency. The improvement in coulombic efficiency from the inclusion of the coordinating amide group was also observed in electrodes where pyrrolidone moieties were covalently grafted to the carbon black, indicating the role of this functionality in facilitating polysulfide adsorption to the electrode surface. The mechanical properties of the electrodes appear not to significantly influence sulfur utilisation or coulombic efficiency in the short term but rather determine retention of these properties over extended cycling. These results demonstrate the robustness of this very straightforward approach, as well as the considerable scope for designing binder materials with targeted properties. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. A new framework for assessing hospital crisis management based on resilience engineering approach.

    Science.gov (United States)

    Shirali, Gh A; Azadian, Sh; Saki, A

    2016-06-14

    In recent years, an increasing number of natural and man-made disasters have exposed many people and properties to harm. This has resulted in approximately 75,000 deaths worldwide every year due to disasters. Crisis management is becoming increasingly important to cope effectively with the magnitude and potential damage resulting from disasters. Hospitals, as the final point in the rescue chain, have a key role in crisis management and need to be resilient against disasters. The purpose of this paper is to present a new framework for assessing crisis management, based on resilience principles, in the hospital infrastructure of a developing country. A questionnaire was developed and completed by 310 staff (nurses and managers) of eight hospitals in Iran. The findings indicate that the eight hospitals included in the study have moderate conditions in general, while hospitals X3, X4, and X7 have poor conditions in crisis management. Consequently, it seems that, in general, the crisis management system was not resilient in these hospitals. Using resilience engineering in assessing crisis management can improve and develop the ability of the hospitals' management to cope with any type of disaster.

  3. Cooperation dynamics of generalized reciprocity in state-based social dilemmas

    Science.gov (United States)

    Stojkoski, Viktor; Utkovski, Zoran; Basnarkov, Lasko; Kocarev, Ljupco

    2018-05-01

    We introduce a framework for studying social dilemmas in networked societies where individuals follow a simple state-based behavioral mechanism based on generalized reciprocity, which is rooted in the principle "help anyone if helped by someone." Within this general framework, which applies to a wide range of social dilemmas including, among others, public goods, donation, and snowdrift games, we study the cooperation dynamics on a variety of complex network examples. By interpreting the studied model through the lenses of nonlinear dynamical systems, we show that cooperation through generalized reciprocity always emerges as the unique attractor in which the overall level of cooperation is maximized, while simultaneously exploitation of the participating individuals is prevented. The analysis elucidates the role of the network structure, here captured by a local centrality measure which uniquely quantifies the propensity of the network structure to cooperation by dictating the degree of cooperation displayed both at the microscopic and macroscopic level. We demonstrate the applicability of the analysis on a practical example by considering an interaction structure that couples a donation process with a public goods game.
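
    A toy simulation can convey the "help anyone if helped by someone" mechanism, although the paper's exact state-update rule, payoff structure and analysis are not reproduced here; the donation-game payoffs, random network and update step below are illustrative assumptions only.

        import numpy as np

        rng = np.random.default_rng(2)
        n, benefit, cost, rounds = 50, 2.0, 1.0, 200
        adj = rng.random((n, n)) < 0.1
        adj = np.triu(adj, 1); adj = adj | adj.T            # random undirected interaction network
        state = np.full(n, 0.5)                             # internal cooperative state in [0, 1]

        for _ in range(rounds):
            received = np.zeros(n)
            paid = np.zeros(n)
            for i in range(n):
                neighbours = np.flatnonzero(adj[i])
                if neighbours.size and rng.random() < state[i]:   # donate with current propensity
                    j = rng.choice(neighbours)
                    received[j] += benefit
                    paid[i] += cost
            # generalized reciprocity: being helped by someone raises the propensity to help anyone
            state = np.clip(state + 0.05 * np.tanh(received - paid), 0.0, 1.0)

        print("mean cooperation propensity after the run:", round(float(state.mean()), 2))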

  4. Generalized SMO algorithm for SVM-based multitask learning.

    Science.gov (United States)

    Cai, Feng; Cherkassky, Vladimir

    2012-06-01

    Exploiting additional information to improve traditional inductive learning is an active research area in machine learning. In many supervised-learning applications, training data can be naturally separated into several groups, and incorporating this group information into learning may improve generalization. Recently, Vapnik proposed a general approach to formalizing such problems, known as "learning with structured data" and its support vector machine (SVM) based optimization formulation called SVM+. Liang and Cherkassky showed the connection between SVM+ and multitask learning (MTL) approaches in machine learning, and proposed an SVM-based formulation for MTL called SVM+MTL for classification. Training the SVM+MTL classifier requires the solution of a large quadratic programming optimization problem which scales as O(n³) with sample size n. So there is a need to develop computationally efficient algorithms for implementing SVM+MTL. This brief generalizes Platt's sequential minimal optimization (SMO) algorithm to the SVM+MTL setting. Empirical results show that, for typical SVM+MTL problems, the proposed generalized SMO achieves over 100 times speed-up, in comparison with general-purpose optimization routines.

  5. COMDES-II: A Component-Based Framework for Generative Development of Distributed Real-Time Control Systems

    DEFF Research Database (Denmark)

    Ke, Xu; Sierszecki, Krzysztof; Angelov, Christo K.

    2007-01-01

    The paper presents a generative development methodology and component models of COMDES-II, a component-based software framework for distributed embedded control systems with real-time constraints. The adopted methodology allows for rapid modeling and validation of control software at a higher level. The paper presents the development methodology for COMDES-II from a general perspective, describes the component models in detail and demonstrates their application through a DC-Motor control system case study.

  6. Design and construction of porous metal-organic frameworks based on flexible BPH pillars

    Science.gov (United States)

    Hao, Xiang-Rong; Yang, Guang-sheng; Shao, Kui-Zhan; Su, Zhong-Min; Yuan, Gang; Wang, Xin-Long

    2013-02-01

    Three metal-organic frameworks (MOFs), [Co2(BPDC)2(4-BPH)·3DMF]n (1), [Cd2(BPDC)2(4-BPH)2·2DMF]n (2) and [Ni2(BDC)2(3-BPH)2 (H2O)·4DMF]n (3) (H2BPDC=biphenyl-4,4'-dicarboxylic acid, H2BDC=terephthalic acid, BPH=bis(pyridinylethylidene)hydrazine and DMF=N,N'-dimethylformamide), have been solvothermally synthesized based on the insertion of heterogeneous BPH pillars. Framework 1 has "single-pillared" MOF-5-like motif with inner cage diameters of up to 18.6 Å. Framework 2 has "double pillared" MOF-5-like motif with cage diameters of 19.2 Å while 3 has "double pillared" 8-connected framework with channel diameters of 11.0 Å. Powder X-ray diffraction (PXRD) shows that 3 is a dynamic porous framework.

  7. Edge detection methods based on generalized type-2 fuzzy logic

    CERN Document Server

    Gonzalez, Claudia I; Castro, Juan R; Castillo, Oscar

    2017-01-01

    In this book four new methods are proposed. In the first method the generalized type-2 fuzzy logic is combined with the morphological gradient technique. The second method combines the general type-2 fuzzy systems (GT2 FSs) and the Sobel operator; in the third approach the methodology based on the Sobel operator and GT2 FSs is improved to be applied on color images. In the fourth approach, we propose a novel edge detection method where a digital image is converted to a generalized type-2 fuzzy image. This book also includes a comparative study of type-1, interval type-2 and generalized type-2 fuzzy systems as tools to enhance edge detection in digital images when used in conjunction with the morphological gradient and the Sobel operator. The proposed generalized type-2 fuzzy edge detection methods were tested with benchmark images and synthetic images, in grayscale and color format. Another contribution of this book is that the generalized type-2 fuzzy edge detector method is applied in the preproc...

  8. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Full Text Available Agents can play a key role in bringing suitable cloud services to the customer based on their requirements. In agent-based cloud computing, the agent does negotiation, coordination, cooperation and collaboration on behalf of the customer to make decisions in an efficient manner. However, agent-based cloud computing has some security issues, such as (a) the addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service by creating flooding attacks on other involved agents; and (c) misuse of some of the exceptions in the agent interaction protocol, such as Not-Understood and Cancel_Meta, which may lead to terminating the connection of all the other agents participating in the negotiating services. This paper also proposes algorithms to solve these issues, to ensure that there will be no intervention of any malicious activities during the agent interaction.

  9. An FPGA- Based General-Purpose Data Acquisition Controller

    Science.gov (United States)

    Robson, C. C. W.; Bousselham, A.; Bohm

    2006-08-01

    System development in advanced FPGAs allows considerable flexibility, both during development and in production use. A mixed firmware/software solution allows the developer to choose what shall be done in firmware or software, and to make that decision late in the process. However, this flexibility comes at the cost of increased complexity. We have designed a modular development framework to help overcome this increased complexity. This framework comprises a generic controller that can be adapted for different systems by simply changing the software or firmware parts. The controller can use both soft and hard processors, with or without an RTOS, based on the demands of the system to be developed. The resulting system uses the Internet for both control and data acquisition. In our studies we developed the embedded system in a Xilinx Virtex-II Pro FPGA, where we used both PowerPC and MicroBlaze cores, HTTP, Java, and LabVIEW for control and communication, together with the MicroC/OS-II and OSE operating systems.

  10. Design and construction of porous metal–organic frameworks based on flexible BPH pillars

    International Nuclear Information System (INIS)

    Hao, Xiang-Rong; Yang, Guang-sheng; Shao, Kui-Zhan; Su, Zhong-Min; Yuan, Gang; Wang, Xin-Long

    2013-01-01

    Three metal–organic frameworks (MOFs), [Co2(BPDC)2(4-BPH)·3DMF]n (1), [Cd2(BPDC)2(4-BPH)2·2DMF]n (2) and [Ni2(BDC)2(3-BPH)2(H2O)·4DMF]n (3) (H2BPDC=biphenyl-4,4′-dicarboxylic acid, H2BDC=terephthalic acid, BPH=bis(pyridinylethylidene)hydrazine and DMF=N,N′-dimethylformamide), have been solvothermally synthesized based on the insertion of heterogeneous BPH pillars. Framework 1 has “single-pillared” MOF-5-like motif with inner cage diameters of up to 18.6 Å. Framework 2 has “double pillared” MOF-5-like motif with cage diameters of 19.2 Å while 3 has “double pillared” 8-connected framework with channel diameters of 11.0 Å. Powder X-ray diffraction (PXRD) shows that 3 is a dynamic porous framework. - Graphical abstract: By insertion of flexible BPH pillars based on “pillaring” strategy, three metal–organic frameworks are obtained showing that the porous frameworks can be constructed in a much greater variety. Highlights: ► Frameworks 1 and 2 have MOF-5 like motif. ► The cube-like cages in 1 and 2 are quite large, comparable to the IRMOF-10. ► Framework 1 is “single-pillared” mode while 2 is “double-pillared” mode. ► PXRD and gas adsorption analysis show that 3 is a dynamic porous framework.

  11. Design and construction of porous metal-organic frameworks based on flexible BPH pillars

    Energy Technology Data Exchange (ETDEWEB)

    Hao, Xiang-Rong; Yang, Guang-sheng; Shao, Kui-Zhan [Institute of Functional Material Chemistry, Faculty of Chemistry, Northeast Normal University, Changchun 130024, Jilin (China); Su, Zhong-Min, E-mail: zmsu@nenu.edu.cn [Institute of Functional Material Chemistry, Faculty of Chemistry, Northeast Normal University, Changchun 130024, Jilin (China); Yuan, Gang; Wang, Xin-Long [Institute of Functional Material Chemistry, Faculty of Chemistry, Northeast Normal University, Changchun 130024, Jilin (China)

    2013-02-15

    Three metal-organic frameworks (MOFs), [Co2(BPDC)2(4-BPH)·3DMF]n (1), [Cd2(BPDC)2(4-BPH)2·2DMF]n (2) and [Ni2(BDC)2(3-BPH)2(H2O)·4DMF]n (3) (H2BPDC=biphenyl-4,4′-dicarboxylic acid, H2BDC=terephthalic acid, BPH=bis(pyridinylethylidene)hydrazine and DMF=N,N′-dimethylformamide), have been solvothermally synthesized based on the insertion of heterogeneous BPH pillars. Framework 1 has 'single-pillared' MOF-5-like motif with inner cage diameters of up to 18.6 Å. Framework 2 has 'double pillared' MOF-5-like motif with cage diameters of 19.2 Å while 3 has 'double pillared' 8-connected framework with channel diameters of 11.0 Å. Powder X-ray diffraction (PXRD) shows that 3 is a dynamic porous framework. - Graphical abstract: By insertion of flexible BPH pillars based on 'pillaring' strategy, three metal-organic frameworks are obtained showing that the porous frameworks can be constructed in a much greater variety. Highlights: ► Frameworks 1 and 2 have MOF-5 like motif. ► The cube-like cages in 1 and 2 are quite large, comparable to the IRMOF-10. ► Framework 1 is 'single-pillared' mode while 2 is 'double-pillared' mode. ► PXRD and gas adsorption analysis show that 3 is a dynamic porous framework.

  12. A Knowledge-based Recommendation Framework using SVN Numbers

    Directory of Open Access Journals (Sweden)

    Roddy Cabezas Padilla

    2017-06-01

    Full Text Available Current knowledge-based recommender systems, despite being proven useful and having a high impact, still have some shortcomings. Among their limitations are the lack of more flexible models and the lack of support for the indeterminacy of the factors involved in computing a global similarity.

  13. Framework for benchmarking FA-based string recognizers

    CSIR Research Space (South Africa)

    Ngassam, EK

    2010-10-01

    Full Text Available of suggested algorithms by domain-specific FA-implementers requires prior knowledge of the behaviour (performance-wise) of each algorithm in order to make an informed choice. The authors propose a benchmarking framework for FA-based string recognizers such that FA-implementers could capture...

  14. A Framework For Agent-Based Educational Guidance And ...

    African Journals Online (AJOL)

    This work applies principles of artificial intelligence and agent development to educational guidance and counselling. An agent-based expert system is developed. The system supports the storage and intelligent interactive processing of the knowledge acquired by study and experience of the human expert in the domain ...

  15. A Variational Framework for Exemplar-Based Image Inpainting

    Science.gov (United States)

    2010-04-01

    use of other patch error functions based on the comparison of structure tensors, which could provide a more robust estimation of the morphological

  16. A Framework for Constraint-Programming based Configuration

    DEFF Research Database (Denmark)

    Queva, Matthieu Stéphane Benoit

    Product configuration systems play an important role in the development of Mass Customisation, allowing the companies to reduce their costs while offering highly customised products. Such systems are often based on a configuration model, representing the product knowledge necessary to perform...

  17. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2009-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such ways so that one can correlate them and

  18. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2008-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such ways so that one can correlate them and

  19. Assessing excellence in translational cancer research: a consensus based framework

    NARCIS (Netherlands)

    Rajan, A.; Caldas, C.; van Luenen, H.; Saghatchian, M.; van Harten, Willem H.

    2013-01-01

    Background: It takes several years on average to translate basic research findings into clinical research and eventually deliver patient benefits. An expert-based excellence assessment can help improve this process by: identifying high performing Comprehensive Cancer Centres; best practices in

  20. Comparison of hand-craft feature based SVM and CNN based deep learning framework for automatic polyp classification.

    Science.gov (United States)

    Younghak Shin; Balasingham, Ilangko

    2017-07-01

    Colonoscopy is a standard method for screening polyps by highly trained physicians. Polyps missed during colonoscopy are a potential risk factor for colorectal cancer. In this study, we investigate an automatic polyp classification framework. We aim to compare two different approaches: a hand-crafted feature method and a convolutional neural network (CNN) based deep learning method. Combined shape and color features are used for hand-crafted feature extraction, and a support vector machine (SVM) is adopted for classification. For the CNN approach, a deep learning framework with three convolution and pooling stages is used for classification. The proposed framework is evaluated using three public polyp databases. The experimental results show that the CNN-based deep learning framework achieves better classification performance than the hand-crafted feature based method. It achieves over 90% classification accuracy, sensitivity, specificity and precision.
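
    For readers unfamiliar with the CNN side of the comparison, the sketch below (assuming PyTorch is installed) defines a small network in the same spirit: three convolution-plus-pooling stages followed by a two-class head. The layer widths and the 64x64 patch size are illustrative and are not the configuration used in the paper.

        import torch
        import torch.nn as nn

        class SmallPolypCNN(nn.Module):
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.classifier = nn.Linear(64 * 8 * 8, 2)   # two classes: polyp / non-polyp

            def forward(self, x):                             # x: (batch, 3, 64, 64) image patches
                h = self.features(x)
                return self.classifier(h.flatten(1))

        model = SmallPolypCNN()
        logits = model(torch.randn(4, 3, 64, 64))             # four dummy 64x64 colour patches
        print(logits.shape)                                    # -> torch.Size([4, 2])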

  1. A General Mathematical Framework for Calculating Systems-Scale Efficiency of Energy Extraction and Conversion: Energy Return on Investment (EROI) and Other Energy Return Ratios

    OpenAIRE

    Adam R. Brandt; Michael Dale

    2011-01-01

    The efficiencies of energy extraction and conversion systems are typically expressed using energy return ratios (ERRs) such as the net energy ratio (NER) or energy return on investment (EROI). A lack of a general mathematical framework prevents inter-comparison of NER/EROI estimates between authors: methods used are not standardized, nor is there a framework for succinctly reporting results in a consistent fashion. In this paper we derive normalized mathematical forms of four ERRs for energy ...
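
    As a reminder of the quantities being standardised, the snippet below computes two commonly used ratios under one common textbook convention (gross energy delivered divided by energy invested for EROI, and net energy gained divided by energy invested for NER). Definitions and boundaries vary between authors, which is precisely the inconsistency the paper's normalized forms are meant to resolve, and the 50 PJ / 10 PJ figures are arbitrary.

        def eroi(energy_delivered, energy_invested):
            # energy return on investment: gross energy delivered per unit of energy invested
            return energy_delivered / energy_invested

        def net_energy_ratio(energy_delivered, energy_invested):
            # net energy ratio under one convention: net energy gained per unit of energy invested
            return (energy_delivered - energy_invested) / energy_invested

        print(eroi(50.0, 10.0))               # 5.0
        print(net_energy_ratio(50.0, 10.0))   # 4.0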

  2. A generalized disjunctive programming framework for the optimal synthesis and analysis of processes for ethanol production from corn stover.

    Science.gov (United States)

    Scott, Felipe; Aroca, Germán; Caballero, José Antonio; Conejeros, Raúl

    2017-07-01

    The aim of this study is to analyze the techno-economic performance of process configurations for ethanol production involving solid-liquid separators and reactors in the saccharification and fermentation stage, a family of process configurations for which few alternatives have been proposed. Since including these process alternatives creates a large number of possible process configurations, a framework for process synthesis and optimization is proposed. This approach is supported by kinetic models fed with experimental data and a plant-wide techno-economic model. Among 150 process configurations, 40 show an improved MESP (minimum ethanol selling price) compared to a well-documented base case (BC); almost all of these include solid separators, and some show energy retrieved in products 32% higher than in the BC. Moreover, 16 of them also show a lower capital investment per unit of ethanol produced per year. Several of the process configurations found in this work have not been reported in the literature. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Electrochemical properties of copper-based compounds with polyanion frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Mizuno, Yoshifumi; Hata, Shoma; Suzuki, Kota; Hirayama, Masaaki; Kanno, Ryoji, E-mail: kanno@echem.titech.ac.jp

    2016-03-15

    The copper-based polyanion compounds Li6CuB4O10 and Li2CuP2O7 were synthesized using a conventional solid-state reaction, and their electrochemical properties were determined. Li6CuB4O10 showed a reversible capacity of 340 mAh g⁻¹ in the first discharge–charge process, while Li2CuP2O7 showed a large irreversible capacity and thus a low charge capacity. Ex situ X-ray diffraction (XRD) and X-ray absorption near edge structure (XANES) measurements revealed that the electrochemical Li⁺ intercalation/deintercalation reaction in Li6CuB4O10 occurred via a reversible Cu²⁺/Cu⁺ reduction/oxidation reaction. These differences in their discharge/charge mechanisms are discussed based on the strength of the Cu–O covalency via their inductive effects. - Graphical abstract: Electrochemical properties of Cu-based polyanion compounds were investigated. The electrochemical reaction mechanisms are strongly affected by their Cu–O covalency. - Highlights: • Electrochemical properties of Cu-based polyanion compounds were investigated. • The Li⁺ intercalation/deintercalation reaction progressed in Li6CuB4O10. • The electrochemical displacement reaction progressed in Li2CuP2O7. • The strength of the Cu–O covalency affects the reaction mechanism.

  4. The Climate Change Education Evidence Base: Lessons Learned from NOAA's Monitoring and Evaluation Framework Implementation

    Science.gov (United States)

    Baek, J.

    2012-12-01

    effort has provided some shared understanding and general guidance, there is still a lack of guidance to make decisions at any level of the community. A recent memorandum from the Office of Management and Budget provides more specific guidance around the generation and utilization of evidence. For example, the amount of funding awarded through grants should be weighted by the level of the evidence supporting a proposed project. As the field of climate change education establishes an evidence base, study designs should address a greater number of internal validity threats through comparison groups and reliable common measures. In addition, OMB invites agencies to develop systematic measurement of costs and costs per outcome. A growing evidence base, one that includes data that includes costs and even monetizes benefits, can inform decisions based on the strongest returns on investments within a portfolio. This paper will provide examples from NOAA's Monitoring and Evaluation Framework Implementation project that illustrate how NOAA is facing these challenges. This is intended to inform climate change educators, evaluators, and researchers in ways to integrate evaluation into the management of their programs while providing insight across the portfolio.

  5. Using framework-based synthesis for conducting reviews of qualitative studies.

    Science.gov (United States)

    Dixon-Woods, Mary

    2011-04-14

    Framework analysis is a technique used for data analysis in primary qualitative research. Recent years have seen it adapted to conduct syntheses of qualitative studies. Framework-based synthesis shows considerable promise in addressing applied policy questions. An innovation in the approach, known as 'best fit' framework synthesis, has been published in BMC Medical Research Methodology this month. It involves reviewers in choosing a conceptual model likely to be suitable for the question of the review, and using it as the basis of their initial coding framework. This framework is then modified in response to the evidence reported in the studies in the review, so that the final product is a revised framework that may include both modified factors and new factors that were not anticipated in the original model. 'Best fit' framework-based synthesis may be especially suitable in addressing urgent policy questions where the need for a more fully developed synthesis is balanced by the need for a quick answer. Please see related article: http://www.biomedcentral.com/1471-2288/11/29.

  6. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    Science.gov (United States)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

    Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.

  7. SenseSeer, mobile-cloud-based lifelogging framework

    OpenAIRE

    Albatal, Rami; Gurrin, Cathal; Zhou, Jiang; Yang, Yang; Carthy, Denise; LI, Na

    2013-01-01

    Smart-phones are becoming our constant companions; they are with us all of the time, being used for calling, web surfing, apps, music listening, TV viewing, social networking, buying, gaming, and a myriad of other uses. Smart-phones are a technology that knows us much better than most of us could imagine. Based on our usage and the fact that we are never far away from our smart phones, they know where we go, who we interact with, what information we consume, and with a little clever software,...

  8. A Framework For Enhancing Privacy In Location Based Services Using K-Anonymity Model

    Directory of Open Access Journals (Sweden)

    Jane Mugi

    2015-08-01

    Full Text Available Abstract This paper presents a framework for enhancing privacy in Location Based Services using the K-anonymity model. Users of location based services have to reveal their location information in order to use these services; however, this has threatened user privacy. The K-anonymity approach has been studied extensively in various forms. However, it is only effective when the user location is fixed. When a user moves and continuously sends their location information, the location service provider can approximate the user's trajectory, which poses a threat to the trajectory privacy of the user. This framework will ensure that user privacy is enhanced for both snapshot and continuous queries. The efficiency and effectiveness of the proposed framework were evaluated; the results indicate that the proposed framework has a high success rate and good run-time performance.
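
    The snapshot case can be sketched as a spatial-cloaking step: instead of the exact position, the location-based service receives a rectangle that is grown until it contains at least k users, so the querying user is indistinguishable from at least k-1 others. The user data and expansion rule below are toy assumptions, and the harder continuous/trajectory case discussed above is not shown.

        import random

        random.seed(0)
        users = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(100)]   # known user positions

        def cloak(position, k, step=0.1):
            x, y = position
            half = step
            while True:
                box = (x - half, y - half, x + half, y + half)
                inside = sum(1 for ux, uy in users
                             if box[0] <= ux <= box[2] and box[1] <= uy <= box[3])
                if inside >= k:
                    return box                 # report this rectangle to the LBS, not the exact point
                half += step                   # keep enlarging the region until k users are covered

        print(cloak(users[0], k=5))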

  9. Complexity analysis based on generalized deviation for financial markets

    Science.gov (United States)

    Li, Chao; Shang, Pengjian

    2018-03-01

    In this paper, a new modified method, the complexity analysis based on generalized deviation, is proposed as a measure to investigate the correlation between past price and future volatility for financial time series. In comparison with the former retarded volatility model, the new approach is both simple and computationally efficient. The method based on the generalized deviation function presents an exhaustive way of quantifying the rules of the financial market. The robustness of this method is verified by numerical experiments with both artificial and financial time series. Results show that the generalized deviation complexity analysis method not only identifies the volatility of financial time series, but also provides a comprehensive way of distinguishing the different characteristics of stock indices and individual stocks. Exponential functions can be used to successfully fit the volatility curves and quantify the changes of complexity for stock market data. We then study the influence of the negative domain of the deviation coefficient and the differences between volatile periods and calm periods. After the data analysis of the experimental model, we found that the generalized deviation model has definite advantages in exploring the relationship between historical returns and future volatility.
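
    The abstract does not define the generalized deviation function itself, so the sketch below only illustrates the underlying question, how past returns relate to future volatility, by correlating a past daily return (and its magnitude) with the realised volatility over the following w days on a synthetic heavy-tailed return series; the window length and the return distribution are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        returns = rng.standard_t(df=4, size=5000) * 0.01     # toy heavy-tailed daily returns
        w = 10                                               # look-ahead window for realised volatility

        past = returns[:-w]
        future_vol = np.array([returns[t + 1:t + 1 + w].std() for t in range(len(returns) - w)])

        # leverage-style question: how does the (signed) past return relate to future volatility?
        print("corr(past return, future volatility):", np.corrcoef(past, future_vol)[0, 1].round(3))
        print("corr(|past return|, future volatility):", np.corrcoef(np.abs(past), future_vol)[0, 1].round(3))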

  10. Development of a testing methodology for computerized procedure system based on JUnit framework and MFM

    International Nuclear Information System (INIS)

    Qin, Wei

    2004-02-01

    Paper-Based Procedures (PBP) and Computerized Procedure Systems (CPS) are studied to demonstrate that it is necessary to develop a CPS in Nuclear Power Plant (NPP) Instrumentation and Control (I and C) systems. A computerized procedure system is actually a software system. All the desired and undesired properties of a software system can be described and evaluated as software qualities. Generally, software qualities can be categorized into product quality and process quality. In order to achieve product quality, the process quality of a software system should also be considered and achieved. Characteristics of CPS will be described to analyse the product and process of an example CPS: ImPRO. At the same time, several main product and process issues will be analysed from a Verification and Validation (V and V) point of view. It is concluded and suggested that V and V activities can also be regarded as a software development process; this point of view is then applied to the V and V activities of ImPRO as a systematic approach to the testing of ImPRO. To support and realize this approach, suitable testing technologies and testing strategies are suggested based on the JUnit framework and Multi-level Flow Modeling (MFM)

  11. A Generic Framework for Location-Based Services (LBS Provisioning

    Directory of Open Access Journals (Sweden)

    Ioannis Priggouris

    2006-01-01

    Full Text Available Location Based Services can be considered as one of the most rapidly expanding fields of the mobile communications sector, with an impressively large application range. The proliferation of mobile/wireless Internet and mobile computing, and the constantly increasing use of handheld, mobile devices and position tracking technologies prepared the grounds for the introduction of this new type of service. The combination of position-fixing mechanisms with location-dependent geographical information can offer truly customized personal communication services through the mobile phone or other types of devices. Prompted by the avalanche of technology advances in the aforementioned areas, in this paper we present an integrated platform for delivering Location Based Services (LBS). The platform covers the full life cycle of an LBS, starting from the specification of the service, covering issues like the deployment and maintenance of services, the service invocation and the final delivery of the produced results to the invoking user. A prototype implementation of the discussed platform was developed and used to perform a series of trial services, with the purpose of demonstrating the pursued functionality.

  12. Managing corneal foreign bodies in office-based general practice.

    Science.gov (United States)

    Fraenkel, Alison; Lee, Lawrence R; Lee, Graham A

    2017-03-01

    Patients with a corneal foreign body may first present to their general practitioner (GP). Safe and efficacious management of these presentations avoids sight-threatening and eye-threatening complications. Removal of a simple, superficial foreign body without a slit lamp is within The Royal Australian College of General Practitioners' (RACGP's) curriculum and scope of practice. Knowing the relevant procedural skills and indications for referral is equally important. The objective of this article is to provide an evidence-based and expert-based guide to the management of corneal foreign bodies in the GP's office. History is key to identifying patient characteristics and mechanisms of ocular injury that are red flags for referral. Examination techniques and methods of superficial foreign body removal without a slit lamp are outlined, as well as the procedural threshold for referral to an ophthalmologist.

  13. A cognitive framework to inform the design of professional development supporting teachers' classroom assessment of inquiry-based science

    Science.gov (United States)

    Matese, Gabrielle

    Inquiry-based science places new demands on teachers for assessing students' growth, both of deep conceptual understanding as well as developing inquiry skills. In addition, new ideas about classroom assessment, such as the importance of formative assessment, are gaining currency. While we have ideas about what classroom assessment consistent with inquiry-based pedagogy might look like, and why it is necessary, we have little understanding of what it takes to implement it. That teachers face a challenge in doing so is well-documented. Researchers have noted that teachers attempting changes in classroom assessment often bring with them incompatible beliefs, knowledge, and practices. However, noting general incompatibility is insufficient to support addressing these issues through professional development. In response to this need, I initiated a research project to identify and describe in more detail the categories of beliefs, knowledge and skills that play an important role in inquiry-based science assessment practices. I created an assessment framework outlining specific categories of beliefs, knowledge, and skills affecting particular classroom assessment practices. I then used the framework to examine teachers' classroom assessment practices and to create comparative cases between three middle-school science teachers, highlighting how the different cognitive factors affect four particular assessment practices. The comparative cases demonstrate the framework's utility for analyzing and explicating teacher assessment practices. As a tool for analyzing and understanding teacher practice, the framework supports the design of professional development. To demonstrate the value of the framework, I draw on the comparative cases to identify implications for the design of professional development to support teachers' classroom assessment of inquiry-based science. In this dissertation I provide a brief overview of the framework and its rationale, present an example of the

  14. Invariant object recognition based on the generalized discrete radon transform

    Science.gov (United States)

    Easley, Glenn R.; Colonna, Flavia

    2004-04-01

    We introduce a method for classifying objects based on special cases of the generalized discrete Radon transform. We adjust the transform and the corresponding ridgelet transform by means of circular shifting and a singular value decomposition (SVD) to obtain a translation, rotation and scaling invariant set of feature vectors. We then use a back-propagation neural network to classify the input feature vectors. We conclude with experimental results and compare these with other invariant recognition methods.
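
    A rough sketch of the invariance idea (assuming NumPy and SciPy are available; the paper's generalized discrete Radon transform, ridgelet step and neural-network classifier are not reproduced): projections are collected over a set of angles, an object rotation then acts approximately as a circular shift along the angle axis, and the singular values of the projection matrix are kept because they are unchanged by such row shifts.

        import numpy as np
        from scipy.ndimage import rotate

        def radon_like(image, n_angles=36):
            # one projection (column sums) per rotation angle
            return np.stack([rotate(image, angle, reshape=False, order=1).sum(axis=0)
                             for angle in np.linspace(0, 180, n_angles, endpoint=False)])

        def invariant_features(image, k=8):
            sinogram = radon_like(image)
            # row permutations (circular shifts of the angle axis) leave singular values untouched
            return np.linalg.svd(sinogram, compute_uv=False)[:k]

        img = np.zeros((64, 64)); img[20:44, 28:36] = 1.0          # a simple bar-shaped test object
        rotated = rotate(img, 30, reshape=False, order=1)
        diff = np.abs(invariant_features(img) - invariant_features(rotated)).max()
        print("largest change in the leading singular values after a 30-degree rotation:", round(float(diff), 2))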

  15. Multiple access chaotic digital communication based on generalized synchronization

    International Nuclear Information System (INIS)

    Lu Junguo

    2005-01-01

    A novel method for multiple access chaotic digital communication based on the concept of chaos generalized synchronization and the on-line least square method is proposed. This method can be used for transmitting multiple digital information signals concurrently. We illustrate the method using a Lorenz system driving a Chua's circuit and then examine the robustness of the proposed method with respect to noise in communication channel

  16. Design of MPPT Controller Monitoring Software Based on QT Framework

    Science.gov (United States)

    Meng, X. Z.; Lu, P. G.

    2017-10-01

    The MPPT controller is a hardware device for tracking the maximum power point of a solar photovoltaic array. Multiple controllers can operate in a networked mode using a specific communication protocol. In this article, based on C++ GUI programming with the Qt framework, we designed a desktop application for monitoring and analyzing the operational parameters of MPPT controllers. The communication protocol used to build the network was the Modbus protocol in Remote Terminal Unit (RTU) mode, and the desktop application on the host computer was connected with all the controllers in the network through RS485 communication or ZigBee wireless communication. Using this application, users can monitor controller parameters wherever they are via the internet.

  17. Web-based networking within the framework of ANENT

    International Nuclear Information System (INIS)

    Han, K.W.; Lee, E.J.; Kim, Y.T.; Nam, Y.M.; Kim, H.K.

    2004-01-01

    The Korea Atomic Energy Research Institute (KAERI) is actively participating in the Asian Network for Education in Nuclear Technology (ANENT), which is an IAEA activity to promote nuclear knowledge management. This has led KAERI to conduct web-based networking for nuclear education and training in Asia. The networking encompasses the establishment of a relevant website and a system for the sustainable operation of the website. The established ANENT website functions as a database providing the collected information, as a link facilitating systematic worldwide access to relevant websites, and as a platform for implementing the individual tasks of ANENT. The required information is being collected and loaded onto the database, and the website will be improved step by step. Consequently, the networking is expected to play an important role by cooperating with other networks, thus contributing to a future global network for the sustainable development of nuclear technology. (author)

  18. A framework of quality improvement interventions to implement evidence-based practices for pressure ulcer prevention.

    Science.gov (United States)

    Padula, William V; Mishra, Manish K; Makic, Mary Beth F; Valuck, Robert J

    2014-06-01

    To enhance the learner's competence with knowledge about a framework of quality improvement (QI) interventions to implement evidence-based practices for pressure ulcer (PrU) prevention. This continuing education activity is intended for physicians and nurses with an interest in skin and wound care. After participating in this educational activity, the participant should be better able to: 1. Summarize the process of creating and initiating the best-practice framework of QI for PrU prevention. 2. Identify the domains and QI interventions for the best-practice framework of QI for PrU prevention. Pressure ulcer (PrU) prevention is a priority issue in US hospitals. The National Pressure Ulcer Advisory Panel endorses an evidence-based practice (EBP) protocol to help prevent PrUs. Effective implementation of EBPs requires systematic change of existing care units. Quality improvement interventions offer a mechanism of change to existing structures in order to effectively implement EBPs for PrU prevention. The best-practice framework developed by Nelson et al is a useful model of quality improvement interventions that targets process improvement in 4 domains: leadership, staff, information and information technology, and performance and improvement. At 2 academic medical centers, the best-practice framework was shown to physicians, nurses, and health services researchers. Their insight was used to modify the best-practice framework as a reference tool for quality improvement interventions in PrU prevention. The revised framework includes 25 elements across 4 domains. Many of these elements support EBPs for PrU prevention, such as updates in PrU staging and risk assessment. The best-practice framework offers a reference point to initiating a bundle of quality improvement interventions in support of EBPs. Hospitals and clinicians tasked with quality improvement efforts can use this framework to problem-solve PrU prevention and other critical issues.

  19. Shear force bond analysis between acrylic resin bases and retention framework (open- and mesh-type)

    Science.gov (United States)

    Royhan, A.; Indrasari, M.; Masulili, C.

    2017-08-01

    Occlusions between teeth and the activity of the muscles around an artificial tooth during mastication create a force on dentures. This force causes friction between acrylic resin bases and retention frameworks that can lead to the complete loss of the acrylic resin base from the framework. The purpose of this study was to analyze the design of retention frameworks and determine which ones have a better resistance to shear forces in order to prevent the loss of heat cured acrylic resin base (HCARB). Six samples each of open- and mesh-type retention frameworks, both types made of Co-Cr material, and HCARB, were shear tested by means of a universal testing machine. The average shear force required to release the HCARB for mesh-type retention frameworks was 28.84 kgf, and the average for the open-type was 26.52 kgf. There was no significant difference between the shear forces required to remove HCARB from open- and mesh-type retention frameworks.

  20. The HOT-Fit Framework Approach in Generalized Structural Component Analysis of the Management Information System for Regional Government Assets (Sistem Informasi Manajemen Barang Milik Daerah): A Test of the Reciprocal Effect

    Directory of Open Access Journals (Sweden)

    Shofana Erimalata

    2016-04-01

    Full Text Available This study aims to examine the determinants of the quality of fixed-asset information on the accrual-based balance sheet, using the HOT-Fit Framework approach with the Generalized Structural Component Analysis (GeSCA) method. The study used a questionnaire administered to 90 respondents representing all the local government agencies of the Mataram City Government. Data analysis employs a structural equation model (SEM). The study reveals a reciprocal relation between organizational control and the quality of fixed-asset information. The study also indicates that the software quality of Sistem Informasi Manajemen Barang Milik Daerah/Management Information System for Local Government’s Goods (SIMDA BMD) affects user satisfaction and organizational control. The implications of these results can be used as considerations in adjusting the Mataram City Government's accounting policy on fixed-asset administration in order to produce quality information on fixed assets for the local government's accrual-based balance sheet. In addition, users of the information system need to be trained to increase their competence in administering fixed assets on an accrual basis, so that they can contribute to improving the quality of the fixed-asset information presented on the accrual-based balance sheet.

  1. Web-based networking within the framework of ANENT

    International Nuclear Information System (INIS)

    Han, K.W.; Lee, E.J.; Kim, Y.T.; Nam, Y.M.

    2004-01-01

    Recognizing the importance of nuclear knowledge management, KAERI has been actively involved in the establishment of the IAEA Asian Network for Higher Education in Nuclear Technology (ANENT). The institute, on behalf of the Korean government, initiated discussions with the IAEA on the concept of ANENT and hosted an IAEA Consultant Meeting in July 2003, which was intended to prepare a draft report for the establishment of ANENT. From the preparatory stage, the institute volunteered to establish a website to support the ANENT activities. This led the ANENT Coordination Committee, at its first meeting in April 2004, to designate KAERI as the coordinating organization for a work package on the 'Web-based Exchange of Information and Material for Nuclear Education and Training'. The committee also identified four more work packages and the respective coordinators at the same meeting. To implement the task of the web-based exchange, a website (www.anent-tepm.org) was designed with three functional objectives. The first function was to provide the ANENT member websites with a comprehensive connection with each other as well as to other sites relevant to nuclear education and training. The second one was to provide the collected information and materials. The last one was to provide a systematic and sustainable means to add, revise, and share the information and materials of high quality. As a result, the web site has been structured to deal with the overall information about ANENT, group activities (e.g. Coordination Committee meetings and work packages), inter-organization (or network) link, thematic information/materials database (or link), and the management of human resources. The ANENT website has been temporarily operated and is being revised to fulfil the objectives and reach a consensus among the ANENT members. In parallel, a set of information about education and training courses and teaching materials available from the network members is being collected, which

  2. Turbulent Simulations of Divertor Detachment Based On BOUT++ Framework

    Science.gov (United States)

    Chen, Bin; Xu, Xueqiao; Xia, Tianyang; Ye, Minyou

    2015-11-01

    China Fusion Engineering Testing Reactor is under conceptual design, acting as a bridge between ITER and DEMO. The detached divertor operation offers great promise for a reduction of heat flux onto divertor target plates for acceptable erosion. Therefore, a density scan is performed via an increase of D2 gas puffing rates in the range of 0.0~5.0 × 10²³ s⁻¹ by using the B2-Eirene/SOLPS 5.0 code package to study the heat flux control and impurity screening property. As the density increases, the divertor operation status changes gradually, from the low-recycling regime to the high-recycling regime and finally to detachment. Significant radiation loss inside the confined plasma in the divertor region during detachment leads to strong parallel density and temperature gradients. Based on the SOLPS simulations, BOUT++ simulations will be presented to investigate the stability and turbulent transport under divertor plasma detachment, particularly the strong parallel gradient driven instabilities and enhanced plasma turbulence to spread heat flux over larger surface areas. The correlation between outer mid-plane and divertor turbulence and the related transport will be analyzed. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-675075.

  3. Fish Ontology framework for taxonomy-based fish recognition

    Science.gov (United States)

    Ali, Najib M.; Khan, Haris A.; Then, Amy Y-Hui; Ving Ching, Chong; Gaur, Manas

    2017-01-01

    Life science ontologies play an important role in Semantic Web. Given the diversity in fish species and the associated wealth of information, it is imperative to develop an ontology capable of linking and integrating this information in an automated fashion. As such, we introduce the Fish Ontology (FO), an automated classification architecture of existing fish taxa which provides taxonomic information on unknown fish based on metadata restrictions. It is designed to support knowledge discovery, provide semantic annotation of fish and fisheries resources, data integration, and information retrieval. Automated classification for unknown specimens is a unique feature that currently does not appear to exist in other known ontologies. Examples of automated classification for major groups of fish are demonstrated, showing the inferred information by introducing several restrictions at the species or specimen level. The current version of FO has 1,830 classes, includes widely used fisheries terminology, and models major aspects of fish taxonomy, grouping, and character. With more than 30,000 known fish species globally, the FO will be an indispensable tool for fish scientists and other interested users. PMID:28929028
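
    The FO itself is published elsewhere; the sketch below only illustrates, with rdflib, how taxonomic information might be queried from such an OWL ontology. The file name, the family label "Scombridae" and the direct species-to-family subclass link are hypothetical placeholders, and the FO's automated classification additionally relies on OWL restrictions evaluated by a reasoner, which a plain SPARQL lookup like this does not reproduce.

```python
import rdflib

# Hypothetical local copy of the Fish Ontology in RDF/XML; the real file name,
# IRIs and class labels are not given in the record, so everything below is a
# placeholder.  A direct species-to-family subclass link is assumed purely for
# illustration.
g = rdflib.Graph()
g.parse("fish_ontology.owl", format="xml")

query = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?species ?label WHERE {
    ?species rdfs:subClassOf ?family .
    ?family  rdfs:label "Scombridae" .
    ?species rdfs:label ?label .
}
"""
for row in g.query(query):
    print(row.species, row.label)
```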

  4. Fish Ontology framework for taxonomy-based fish recognition

    Directory of Open Access Journals (Sweden)

    Najib M. Ali

    2017-09-01

    Full Text Available Life science ontologies play an important role in Semantic Web. Given the diversity in fish species and the associated wealth of information, it is imperative to develop an ontology capable of linking and integrating this information in an automated fashion. As such, we introduce the Fish Ontology (FO), an automated classification architecture of existing fish taxa which provides taxonomic information on unknown fish based on metadata restrictions. It is designed to support knowledge discovery, provide semantic annotation of fish and fisheries resources, data integration, and information retrieval. Automated classification for unknown specimens is a unique feature that currently does not appear to exist in other known ontologies. Examples of automated classification for major groups of fish are demonstrated, showing the inferred information by introducing several restrictions at the species or specimen level. The current version of FO has 1,830 classes, includes widely used fisheries terminology, and models major aspects of fish taxonomy, grouping, and character. With more than 30,000 known fish species globally, the FO will be an indispensable tool for fish scientists and other interested users.

  5. Section 3. General issues in management : Heuristics or experience-based techniques for making accounting judgments and learning

    OpenAIRE

    Schiller, Stefan

    2013-01-01

    The purpose of this paper is to further the development of initial accounting for internally generated intangible assets, relevant to both academics and practitioners, examining what happens when accountants are given principles-based discretion. This paper draws on existing insights into heuristics or experience-based techniques for making accounting judgments. Knowledge about judgment under uncertainty, and the general framework offered by the heuristics and biases program in particular, fo...

  6. Theoretical frameworks informing family-based child and adolescent obesity interventions

    DEFF Research Database (Denmark)

    Alulis, Sarah; Grabowski, Dan

    2017-01-01

    into focus. However, the use of theoretical frameworks to strengthen these interventions is rare and very uneven. OBJECTIVE AND METHOD: To conduct a qualitative meta-synthesis of family-based interventions for child and adolescent obesity to identify the theoretical frameworks applied, thus understanding how...... inconsistencies and a significant void between research results and health care practice. Based on the analysis, this article proposes three themes to be used as focus points when designing future interventions and when selecting theories for the development of solid, theory-based frameworks for application...... cognitive, self-efficacy and Family Systems Theory appeared most frequently. The remaining 24 were classified as theory-related as theoretical elements of self-monitoring; stimulus control, reinforcement and modelling were used. CONCLUSION: The designs of family-based interventions reveal numerous...

  7. A Framework for Effective User Interface Design for Web-Based Electronic Commerce Applications

    Directory of Open Access Journals (Sweden)

    Justyna Burns

    2001-01-01

    Full Text Available Efficient delivery of relevant product information is increasingly becoming the central basis of competition between firms. The interface design represents the central component for successful information delivery to consumers. However, interface design for web-based information systems is probably more an art than a science at this point in time. Much research is needed to understand properties of an effective interface for electronic commerce. This paper develops a framework identifying the relationship between user factors, the role of the user interface and overall system success for web-based electronic commerce. The paper argues that web-based systems for electronic commerce have some similar properties to decision support systems (DSS) and adapts an established DSS framework to the electronic commerce domain. Based on a limited amount of research studying web browser interface design, the framework identifies areas of research needed and outlines possible relationships between consumer characteristics, interface design attributes and measures of overall system success.

  8. A non-voxel-based broad-beam (NVBB) framework for IMRT treatment planning.

    Science.gov (United States)

    Lu, Weiguo

    2010-12-07

    We present a novel framework that enables very large scale intensity-modulated radiation therapy (IMRT) planning in limited computation resources with improvements in cost, plan quality and planning throughput. Current IMRT optimization uses a voxel-based beamlet superposition (VBS) framework that requires pre-calculation and storage of a large amount of beamlet data, resulting in large temporal and spatial complexity. We developed a non-voxel-based broad-beam (NVBB) framework for IMRT capable of direct treatment parameter optimization (DTPO). In this framework, both objective function and derivative are evaluated based on the continuous viewpoint, abandoning 'voxel' and 'beamlet' representations. Thus pre-calculation and storage of beamlets are no longer needed. The NVBB framework has linear complexities (O(N³)) in both space and time. The low memory, full computation and data parallelization nature of the framework render its efficient implementation on the graphic processing unit (GPU). We implemented the NVBB framework and incorporated it with the TomoTherapy treatment planning system (TPS). The new TPS runs on a single workstation with one GPU card (NVBB-GPU). Extensive verification/validation tests were performed in house and via third parties. Benchmarks on dose accuracy, plan quality and throughput were compared with the commercial TomoTherapy TPS that is based on the VBS framework and uses a computer cluster with 14 nodes (VBS-cluster). For all tests, the dose accuracy of these two TPSs is comparable (within 1%). Plan qualities were comparable with no clinically significant difference for most cases except that superior target uniformity was seen in the NVBB-GPU for some cases. However, the planning time using the NVBB-GPU was reduced many folds over the VBS-cluster. In conclusion, we developed a novel NVBB framework for IMRT optimization. The continuous viewpoint and DTPO nature of the algorithm eliminate the need for beamlets and lead to better plan

  9. A non-voxel-based broad-beam (NVBB) framework for IMRT treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Lu Weiguo, E-mail: wlu@tomotherapy.co [TomoTherapy Inc., 1240 Deming Way, Madison, WI 53717 (United States)

    2010-12-07

    We present a novel framework that enables very large scale intensity-modulated radiation therapy (IMRT) planning in limited computation resources with improvements in cost, plan quality and planning throughput. Current IMRT optimization uses a voxel-based beamlet superposition (VBS) framework that requires pre-calculation and storage of a large amount of beamlet data, resulting in large temporal and spatial complexity. We developed a non-voxel-based broad-beam (NVBB) framework for IMRT capable of direct treatment parameter optimization (DTPO). In this framework, both objective function and derivative are evaluated based on the continuous viewpoint, abandoning 'voxel' and 'beamlet' representations. Thus pre-calculation and storage of beamlets are no longer needed. The NVBB framework has linear complexities (O(N³)) in both space and time. The low memory, full computation and data parallelization nature of the framework render its efficient implementation on the graphic processing unit (GPU). We implemented the NVBB framework and incorporated it with the TomoTherapy treatment planning system (TPS). The new TPS runs on a single workstation with one GPU card (NVBB-GPU). Extensive verification/validation tests were performed in house and via third parties. Benchmarks on dose accuracy, plan quality and throughput were compared with the commercial TomoTherapy TPS that is based on the VBS framework and uses a computer cluster with 14 nodes (VBS-cluster). For all tests, the dose accuracy of these two TPSs is comparable (within 1%). Plan qualities were comparable with no clinically significant difference for most cases except that superior target uniformity was seen in the NVBB-GPU for some cases. However, the planning time using the NVBB-GPU was reduced many folds over the VBS-cluster. In conclusion, we developed a novel NVBB framework for IMRT optimization. The continuous viewpoint and DTPO nature of the algorithm eliminate the need for beamlets

  10. Agent Based Framework Architecture for Supporting Content Adaptation for Mobile Government

    Directory of Open Access Journals (Sweden)

    Hasan Omar Al-Sakran

    2013-01-01

    Full Text Available The rapid spread of smart mobile technology that supports internet access is transforming the way governments provide services to their citizens. Mobile devices have different capabilities based on their manufacturers and models. This paper proposes a new framework for adapting the content of M-government services using mobile agent technology. The framework is based on a mediation architecture that uses multiple mobile agents and XML as a semi-structured mediation language. The flexibility of the mediation and of XML provides an adaptive environment for streaming data based on the capabilities of the device sending the query to the system.

  11. A model-based framework for incremental scale-up of wastewater treatment processes

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Sin, Gürkan

    Scale-up is traditionally done following specific ratios or rules of thumb which do not lead to optimal results. We present a generic framework to assist in the scale-up of wastewater treatment processes based on multiscale modelling, multiobjective optimisation and a validation of the model at the new...... large scale. The framework is illustrated by the scale-up of a complete autotrophic nitrogen removal process. The model-based multiobjective scale-up offers a promising improvement compared to the rules-of-thumb-based empirical scale-up rules...

  12. A framework for the quantitative assessment of performance-based system resilience

    International Nuclear Information System (INIS)

    Tran, Huy T.; Balchanos, Michael; Domerçant, Jean Charles; Mavris, Dimitri N.

    2017-01-01

    Increasing system complexity and threat uncertainty require the consideration of resilience in the design and analysis of engineered systems. While the resilience engineering community has begun to converge on a definition and set of characteristics for resilience, methods for quantifying the concept are still limited in their applicability to system designers. This paper proposes a framework for assessing resilience that focuses on the ability of a system to absorb disruptions, recover from them, and adapt over time. The framework extends current approaches by explicitly considering temporal aspects of system responses to disruptions, volatility in system performance data, and the possibility of multiple disruption events. Notional system performance data is generated using the logistic function, providing an experimental platform for a parametric comparison of the proposed resilience metric with an integration-based metric. An information exchange network model is used to demonstrate the applicability of the framework towards system design tradeoff studies using stochastic simulations. The presented framework is domain-agnostic and flexible, such that it can be applied to a variety of systems and adjusted to focus on specific aspects of resilience. - Highlights: • We propose a quantitative framework and metrics for assessing system resilience. • Metrics focus on absorption, recovery, and adaptation to disruptions. • The framework accepts volatile data and is easily automated for simulation studies. • The framework is applied to a model of adaptive information exchange networks. • Results show benefits of network adaptation against random and targeted threats.
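
    The record mentions notional performance data generated with the logistic function and a comparison against an integration-based metric. The Python sketch below is a minimal, assumption-laden illustration of that setup: a logistic drop and recovery around a disruption, and an integration-based resilience value computed as the performance actually delivered relative to nominal over the observation window. The paper's own metric and parameter choices are not reproduced here.

```python
import numpy as np

def logistic(t, t0, k):
    return 1.0 / (1.0 + np.exp(-k * (t - t0)))

def notional_performance(t, t_disrupt=20.0, t_recover=40.0, depth=0.6, k=0.8):
    """Notional performance profile: nominal level 1.0, a logistic drop after a
    disruption and a logistic recovery afterwards (illustrative values only)."""
    drop = depth * logistic(t, t_disrupt, k)     # absorption phase
    regain = depth * logistic(t, t_recover, k)   # recovery phase
    return 1.0 - drop + regain

t = np.linspace(0.0, 100.0, 2001)
p = notional_performance(t)

# Integration-based resilience: performance actually delivered relative to the
# nominal (undisrupted) performance over the observation window.
resilience = np.trapz(p, t) / np.trapz(np.ones_like(t), t)
print(f"integration-based resilience ~ {resilience:.3f}")
```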

  13. Moments Based Framework for Performance Analysis of One-Way/Two-Way CSI-Assisted AF Relaying

    KAUST Repository

    Xia, Minghua

    2012-09-01

    When analyzing system performance of conventional one-way relaying or advanced two-way relaying, these two techniques are always dealt with separately and, thus, their performance cannot be compared efficiently. Moreover, for ease of mathematical tractability, channels considered in such studies are generally assumed to be subject to Rayleigh fading or to be Nakagami-m channels with integer fading parameters, which is impractical in typical urban environments. In this paper, we propose a unified moments-based framework for general performance analysis of channel-state-information (CSI) assisted amplify-and-forward (AF) relaying systems. The framework is applicable to both one-way and two-way relaying over arbitrary Nakagami-m fading channels, and it includes previously reported results as special cases. Specifically, the mathematical framework is firstly developed under the umbrella of the weighted harmonic mean of two Gamma-distributed variables in conjunction with the theory of Padé approximants. Then, general expressions for the received signal-to-noise ratios of the users in one-way/two-way relaying systems and the corresponding moments, moment generation function, and cumulative density function are established. Subsequently, the mathematical framework is applied to analyze, compare, and gain insights into system performance of one-way and two-way relaying techniques, in terms of outage probability, average symbol error probability, and achievable data rate. All analytical results are corroborated by simulation results as well as previously reported results whenever available, and they are shown to be efficient tools to evaluate and compare system performance of one-way and two-way relaying.
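
    As a rough numerical counterpart to the analysis summarized above, the following Python sketch estimates the outage probability of a dual-hop CSI-assisted AF link by Monte Carlo simulation. It assumes the standard end-to-end SNR expression γ1γ2/(γ1 + γ2 + 1) and Gamma-distributed per-hop SNRs for Nakagami-m fading; the moment/Padé-approximant machinery of the paper itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def outage_probability_af(m1, m2, snr1_db, snr2_db, threshold_db, n=1_000_000):
    """Monte Carlo outage probability of a dual-hop CSI-assisted AF link.

    Per-hop instantaneous SNRs under Nakagami-m fading are Gamma distributed
    (shape m, scale avg_snr / m); the end-to-end SNR uses the standard
    g1 * g2 / (g1 + g2 + 1) expression.  This brute-force estimate stands in
    for the closed-form moment/Pade-approximant analysis of the paper."""
    g1 = rng.gamma(m1, 10 ** (snr1_db / 10) / m1, n)
    g2 = rng.gamma(m2, 10 ** (snr2_db / 10) / m2, n)
    g_end = g1 * g2 / (g1 + g2 + 1.0)
    return np.mean(g_end < 10 ** (threshold_db / 10))

# Non-integer fading parameters, which the unified framework explicitly allows.
print(outage_probability_af(m1=1.7, m2=2.3, snr1_db=10, snr2_db=12, threshold_db=3))
```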

  14. A framework for model-based optimization of bioprocesses under uncertainty: Identifying critical parameters and operating variables

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development and application of a systematic model-based framework for bioprocess optimization, evaluated on a cellulosic ethanol production case study. The implementation of the framework involves the use of dynamic simulations, sophisticated uncertainty analysis (Monte...

  15. CHIME : service-oriented framework for adaptive web-based systems

    NARCIS (Netherlands)

    Chepegin, V.; Aroyo, L.M.; De Bra, P.M.E.; Houben, G.J.P.M.; De Bra, P.M.E.

    2003-01-01

    In this paper we present our view on how the current development of knowledge engineering in the context of Semantic Web can contribute to the better applicability, reusability and sharability of adaptive web-based systems. We propose a service-oriented framework for adaptive web-based systems,

  16. Towards a framework for a professional development programme: empowering teachers for context-based chemistry education.

    NARCIS (Netherlands)

    Stolk, M.; Bulte, A.M.W.; de Jong, O.; Pilot, A.

    2009-01-01

    The aim of this study is to develop a framework for professional development programmes that empowers chemistry teachers to teach and design context-based chemistry curricula. Firstly, teachers' involvement, their concerns and their professional development in several context-based curriculum

  17. A life-cycle based decision-making framework for electricity generation system planning

    Energy Technology Data Exchange (ETDEWEB)

    Norrie, S.J.; Fang, L. [Ryerson Polytechnic Univ., Toronto, ON (Canada). Environmental Applied Science and Management Graduate Program

    2006-07-01

    This paper proposed a framework for the consideration of multiple objectives in the long-term planning of electricity generation systems. The framework was comprised of 3 components: (1) information based on life-cycle inventories of electricity generation technologies; (2) a set of alternative scenarios to be evaluated and ranked using the framework; and (3) stakeholder values for decision objectives. Scenarios were developed to represent a set of future conditions, and values were derived through the use of questionnaires. Planning for electricity generation in Ontario was selected as a test case for the DM framework. Three scenarios were presented: (1) a business as usual scenario characterized by large, central power plants; (2) a mix of central power plants, distributed generation, and advanced conventional fuel technologies; and (3) small-scale distributed and renewable energy sources and aggressive demand-side management. The life-cycle based information from the scenario evaluation was used to estimate the performance of each scenario on the established decision criteria. Results showed that scenario 3 was the closest to achieving the fundamental objectives according to the decision criteria. It was concluded that the DM framework showed that the use of holistic environmental information and preferential information for multiple objectives can be integrated into a framework that openly and consistently evaluates a set of alternative scenarios. 31 refs., 7 tabs., 4 figs.

  18. An MGF-based unified framework to determine the joint statistics of partial sums of ordered i.n.d. random variables

    KAUST Repository

    Nam, Sungsik; Yang, Hongchuan; Alouini, Mohamed-Slim; Kim, Dongin

    2014-01-01

    framework to determine the joint statistics of partial sums of ordered i.n.d. RVs. Our mathematical formalism is illustrated with an application on the exact performance analysis of the capture probability of generalized selection combining (GSC)-based RAKE

  19. A new web-based framework development for fuzzy multi-criteria group decision-making.

    Science.gov (United States)

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    The fuzzy multi-criteria group decision-making (FMCGDM) process is usually used when a group of decision-makers faces imprecise data or linguistic variables in solving a problem. However, this process encompasses many methods that require many time-consuming calculations, depending on the number of criteria, alternatives and decision-makers, in order to reach the optimal solution. In this study, a web-based FMCGDM framework that offers decision-makers a fast and reliable response service is proposed. The proposed framework includes commonly used tools for multi-criteria decision-making problems such as the fuzzy Delphi, fuzzy AHP and fuzzy TOPSIS methods. The integration of these methods makes it possible to exploit each method's strengths and to compensate for its weaknesses. Finally, a case study of location selection for landfill waste in Morocco is performed to demonstrate how this framework can facilitate the decision-making process. The results demonstrate that the proposed framework can successfully accomplish the goal of this study.
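
    Of the tools listed above, fuzzy TOPSIS is the easiest to illustrate compactly. The sketch below is a minimal implementation for benefit criteria with triangular fuzzy numbers, following a common convention in the fuzzy TOPSIS literature; the ratings, weights and the landfill-site example are hypothetical and do not come from the Moroccan case study.

```python
import numpy as np

def fuzzy_topsis(ratings, weights):
    """Minimal fuzzy TOPSIS for benefit criteria with triangular fuzzy numbers.

    ratings: (n_alternatives, n_criteria, 3) array of (l, m, u) ratings.
    weights: (n_criteria, 3) array of (l, m, u) importance weights.
    Returns the closeness coefficient of each alternative (higher is better)."""
    ratings = np.asarray(ratings, dtype=float)
    weights = np.asarray(weights, dtype=float)

    # Linear-scale normalisation by the largest upper bound of each criterion.
    u_star = ratings[:, :, 2].max(axis=0)
    norm = ratings / u_star[None, :, None]

    # Weighted normalised fuzzy decision matrix (element-wise TFN product).
    v = norm * weights[None, :, :]

    # Fuzzy positive/negative ideal solutions, a common convention in the
    # fuzzy TOPSIS literature.
    fpis, fnis = np.ones(3), np.zeros(3)

    def dist(a, b):  # vertex distance between triangular fuzzy numbers
        return np.sqrt(((a - b) ** 2).sum(axis=-1) / 3.0)

    d_pos = dist(v, fpis).sum(axis=1)
    d_neg = dist(v, fnis).sum(axis=1)
    return d_neg / (d_pos + d_neg)

# Three hypothetical landfill sites rated on two benefit criteria (0-10 scale).
ratings = [[[5, 7, 9], [3, 5, 7]],
           [[7, 9, 10], [1, 3, 5]],
           [[3, 5, 7], [7, 9, 10]]]
weights = [[0.5, 0.7, 0.9], [0.3, 0.5, 0.7]]
print(fuzzy_topsis(ratings, weights))
```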

  20. A security framework for nationwide health information exchange based on telehealth strategy.

    Science.gov (United States)

    Zaidan, B B; Haiqi, Ahmed; Zaidan, A A; Abdulnabi, Mohamed; Kiah, M L Mat; Muzamel, Hussaen

    2015-05-01

    This study focuses on the situation of health information exchange (HIE) in the context of a nationwide network. It aims to create a security framework that can be implemented to ensure the safe transmission of health information across the boundaries of care providers in Malaysia and other countries. First, a critique of the major elements of nationwide health information networks is presented from the perspective of security, along with such topics as the importance of HIE, issues, and main approaches. Second, a systematic evaluation is conducted on the security solutions that can be utilized in the proposed nationwide network. Finally, a secure framework for health information transmission is proposed within a central cloud-based model, which is compatible with the Malaysian telehealth strategy. The outcome of this analysis indicates that a complete security framework for a global structure of HIE is yet to be defined and implemented. Our proposed framework represents such an endeavor and suggests specific techniques to achieve this goal.

  1. Systematic Optimization-Based Integrated Chemical Product–Process Design Framework

    DEFF Research Database (Denmark)

    Cignitti, Stefano; Mansouri, Seyed Soheil; Woodley, John M.

    2018-01-01

    An integrated optimization-based framework for product and process design is proposed. The framework uses a set of methods and tools to obtain the optimal product–process design solution given a set of economic and environmental sustainability targets. The methods and tools required are property...... of the framework is demonstrated through three case studies: (i) refrigeration cycle unit for R134a replacement, (ii) a mixed working fluid design problem for R134a replacement, and (iii) pure solvent design for water-acetic acid LLE extraction. Through the application of the framework it is demonstrated that all...... prediction through group contributions, unless supported with a database, computer-aided molecular and mixture/blend design for generation of novel as well as existing products and mathematical programming for formulating and solving multiscale integrated process–product design problems. The application...

  2. A Framework-Based Approach for Fault-Tolerant Service Robots

    Directory of Open Access Journals (Sweden)

    Heejune Ahn

    2012-11-01

    Full Text Available Recently the component-based approach has become a major trend in intelligent service robot development due to its reusability and productivity. The framework in a component-based system should provide essential services for application components. However, to our knowledge the existing robot frameworks do not yet support fault tolerance service. Moreover, it is often believed that faults can be handled only at the application level. In this paper, by extending the robot framework with the fault tolerance function, we argue that the framework-based fault tolerance approach is feasible and even has many benefits, including that: (1) the system integrators can build fault tolerance applications from non-fault-aware components; (2) the constraints of the components and the operating environment can be considered at the time of integration, which cannot be anticipated easily at the time of component development; (3) consistency in system reliability can be obtained even in spite of diverse application component sources. In the proposed construction, we build XML rule files defining the rules for probing and determining the fault conditions of each component, contamination cases from a faulty component, and the possible recovery and safety methods. The rule files are established by a system integrator and the fault manager in the framework controls the fault tolerance process according to the rules. We demonstrate that the fault-tolerant framework can incorporate widely accepted fault tolerance techniques. The effectiveness and real-time performance of the framework-based approach and its techniques are examined by testing an autonomous mobile robot in typical fault scenarios.

  3. Towards a Transferable UAV-Based Framework for River Hydromorphological Characterization.

    Science.gov (United States)

    Rivas Casado, Mónica; González, Rocío Ballesteros; Ortega, José Fernando; Leinster, Paul; Wright, Ros

    2017-09-26

    The multiple protocols that have been developed to characterize river hydromorphology, partly in response to legislative drivers such as the European Union Water Framework Directive (EU WFD), make the comparison of results obtained in different countries challenging. Recent studies have analyzed the comparability of existing methods, with remote sensing based approaches being proposed as a potential means of harmonizing hydromorphological characterization protocols. However, the resolution achieved by remote sensing products may not be sufficient to assess some of the key hydromorphological features that are required to allow an accurate characterization. Methodologies based on high resolution aerial photography taken from Unmanned Aerial Vehicles (UAVs) have been proposed by several authors as potential approaches to overcome these limitations. Here, we explore the applicability of an existing UAV based framework for hydromorphological characterization to three different fluvial settings representing some of the distinct ecoregions defined by the WFD geographical intercalibration groups (GIGs). The framework is based on the automated recognition of hydromorphological features via tested and validated Artificial Neural Networks (ANNs). Results show that the framework is transferable to the Central-Baltic and Mediterranean GIGs with accuracies in feature identification above 70%. Accuracies of 50% are achieved when the framework is implemented in the Very Large Rivers GIG. The framework successfully identified vegetation, deep water, shallow water, riffles, side bars and shadows for the majority of the reaches. However, further algorithm development is required to ensure a wider range of features (e.g., chutes, structures and erosion) are accurately identified. This study also highlights the need to develop an objective and fit for purpose hydromorphological characterization framework to be adopted within all EU member states to facilitate comparison of results.
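
    The trained networks and UAV imagery of the study are not part of this record, so the following scikit-learn sketch only illustrates the kind of ANN-based feature classification the framework relies on. The feature vectors are random stand-ins for imagery-derived features, and the class list simply mirrors the features named above.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)

# Stand-in data: in the real framework each sample would be a spectral/texture
# feature vector extracted from UAV orthoimagery, labelled by an expert with one
# of the hydromorphological classes named in the record above.
classes = ["vegetation", "deep water", "shallow water", "riffle", "side bar", "shadow"]
X = rng.normal(size=(3000, 8))
y = rng.integers(0, len(classes), size=3000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)

# With random stand-in features the accuracy stays at chance level; on real
# imagery-derived features the study reports accuracies above 70%.
print("held-out accuracy:", clf.score(X_te, y_te))
```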

  4. A General Accelerated Degradation Model Based on the Wiener Process.

    Science.gov (United States)

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
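
    A minimal Python sketch of the kind of model the record describes: a Wiener process with a nonlinear time scale Λ(t) = t^b (an assumed, illustrative form), simulated for several units and then fitted with the closed-form maximum-likelihood estimators of drift and diffusion. Unit-to-unit variation and the acceleration-variable link functions of the full model are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_path(t, mu, sigma, b):
    """One Wiener-process degradation path X(t) = mu * L(t) + sigma * B(L(t))
    with a nonlinear time scale L(t) = t**b (an assumed, illustrative choice)."""
    lam = t ** b
    d_lam = np.diff(lam, prepend=0.0)
    increments = rng.normal(mu * d_lam, sigma * np.sqrt(d_lam))
    return np.cumsum(increments)

t = np.linspace(0.5, 50.0, 100)
paths = np.array([simulate_path(t, mu=0.2, sigma=0.3, b=0.7) for _ in range(20)])

# Closed-form maximum-likelihood estimates of drift and diffusion, given the
# (here, known) time-scale transformation.
lam = t ** 0.7
d_lam = np.diff(lam, prepend=0.0)
d_x = np.diff(paths, axis=1, prepend=0.0)
mu_hat = d_x.sum() / (d_lam.sum() * len(paths))
sigma2_hat = np.mean((d_x - mu_hat * d_lam) ** 2 / d_lam)
print(f"mu_hat = {mu_hat:.3f}, sigma_hat = {np.sqrt(sigma2_hat):.3f}")
```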

  5. Learning in neural networks based on a generalized fluctuation theorem

    Science.gov (United States)

    Hayakawa, Takashi; Aoyagi, Toshio

    2015-11-01

    Information maximization has been investigated as a possible mechanism of learning governing the self-organization that occurs within the neural systems of animals. Within the general context of models of neural systems bidirectionally interacting with environments, however, the role of information maximization remains to be elucidated. For bidirectionally interacting physical systems, universal laws describing the fluctuation they exhibit and the information they possess have recently been discovered. These laws are termed fluctuation theorems. In the present study, we formulate a theory of learning in neural networks bidirectionally interacting with environments based on the principle of information maximization. Our formulation begins with the introduction of a generalized fluctuation theorem, employing an interpretation appropriate for the present application, which differs from the original thermodynamic interpretation. We analytically and numerically demonstrate that the learning mechanism presented in our theory allows neural networks to efficiently explore their environments and optimally encode information about them.

  6. The inherent dangers of using computable general equilibrium models as a single integrated modelling framework for sustainability impact assessment. A critical note on Boehringer and Loeschel (2006)

    International Nuclear Information System (INIS)

    Scrieciu, S. Serban

    2007-01-01

    The search for methods of assessment that best evaluate and integrate the trade-offs and interactions between the economic, environmental and social components of development has been receiving a new impetus due to the requirement that sustainability concerns be incorporated into the policy formulation process. A paper forthcoming in Ecological Economics (Boehringer, C., Loeschel, A., in press. Computable general equilibrium models for sustainability impact assessment: status quo and prospects, Ecological Economics.) claims that Computable General Equilibrium (CGE) models may potentially represent the much needed 'back-bone' tool to carry out reliable integrated quantitative Sustainability Impact Assessments (SIAs). While acknowledging the usefulness of CGE models for some dimensions of SIA, this commentary questions the legitimacy of employing this particular economic modelling tool as a single integrating modelling framework for a comprehensive evaluation of the multi-dimensional, dynamic and complex interactions between policy and sustainability. It discusses several inherent dangers associated with the advocated prospects for the CGE modelling approach to contribute to comprehensive and reliable sustainability impact assessments. The paper warns that this reductionist viewpoint may seriously infringe upon the basic values underpinning the SIA process, namely a transparent, heterogeneous, balanced, inter-disciplinary, consultative and participatory take to policy evaluation and building of the evidence-base. (author)

  7. Optimal design under uncertainty of a passive defense structure against snow avalanches: from a general Bayesian framework to a simple analytical model

    Directory of Open Access Journals (Sweden)

    N. Eckert

    2008-10-01

    Full Text Available For snow avalanches, passive defense structures are generally designed by considering high return period events. In this paper, taking inspiration from other natural hazards, an alternative method based on the maximization of the economic benefit of the defense structure is proposed. A general Bayesian framework is described first. Special attention is given to the problem of taking the poor local information into account in the decision-making process. Therefore, simplifying assumptions are made. The avalanche hazard is represented by a Peak Over Threshold (POT model. The influence of the dam is quantified in terms of runout distance reduction with a simple relation derived from small-scale experiments using granular media. The costs corresponding to dam construction and the damage to the element at risk are roughly evaluated for each dam height-hazard value pair, with damage evaluation corresponding to the maximal expected loss. Both the classical and the Bayesian risk functions can then be computed analytically. The results are illustrated with a case study from the French avalanche database. A sensitivity analysis is performed and modelling assumptions are discussed in addition to possible further developments.
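
    The decision-theoretic structure described above (construction cost plus discounted expected damage, minimized over dam height, with runout exceedances modeled by a generalized Pareto distribution as in peaks-over-threshold analysis) can be sketched as follows. All numerical values and the linear runout-reduction rule are hypothetical illustrations, not the paper's calibrated model.

```python
import numpy as np
from scipy.stats import genpareto

# Illustrative-only numbers: the structure (construction cost plus discounted
# expected damage, minimised over dam height) follows the record above, but
# every parameter below is hypothetical.
rate = 0.8              # avalanches per winter exceeding the POT threshold
xi, scale = 0.2, 120.0  # GPD shape/scale for runout beyond the threshold (m)
d_building = 300.0      # element at risk, metres beyond the runout threshold
loss = 2.0e6            # maximal expected loss if the building is reached
c_per_metre = 1.5e5     # construction cost per metre of dam height
discount = 0.04         # actualisation rate applied to the expected annual damage

def total_cost(h, runout_reduction_per_metre=25.0):
    """Construction cost plus discounted expected damage for dam height h.
    The dam is assumed to shorten the runout by a fixed amount per metre of
    height, a crude stand-in for the granular-experiment relation of the paper."""
    effective_distance = d_building + runout_reduction_per_metre * h
    annual_damage = rate * genpareto.sf(effective_distance, xi, scale=scale) * loss
    return c_per_metre * h + annual_damage / discount

heights = np.linspace(0.0, 15.0, 151)
costs = np.array([total_cost(h) for h in heights])
print(f"optimal dam height ~ {heights[np.argmin(costs)]:.1f} m")
```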

  8. A rule-based verification and control framework in ATLAS Trigger-DAQ

    CERN Document Server

    Kazarov, A; Lehmann-Miotto, G; Sloper, J E; Ryabov, Yu; Computing In High Energy and Nuclear Physics

    2007-01-01

    In order to meet the requirements of ATLAS data taking, the ATLAS Trigger-DAQ system is composed of O(1000) applications running on more than 2600 computers in a network. With such a system size, software and hardware failures occur quite often. To minimize system downtime, the Trigger-DAQ control system shall include advanced verification and diagnostics facilities. The operator should use tests and the expertise of the TDAQ and detector developers in order to diagnose and recover from errors, if possible automatically. The TDAQ control system is built as a distributed tree of controllers, where the behavior of each controller is defined in a rule-based language allowing easy customization. The control system also includes a verification framework which allows users to develop and configure tests for any component in the system with different levels of complexity. It can be used as a stand-alone test facility for a small detector installation, as part of the general TDAQ initialization procedure, and for diagnosing the problems ...

  9. The SNEDAX Data Base, General Description and Users' Instructions

    International Nuclear Information System (INIS)

    Helm, F.

    1996-09-01

    The SNEDAX Data Base contains information on assemblies built and experiments performed in the fast neutron critical facilities SNEAK (FZK Karlsruhe), MASURCA (CEA Cadarache), ZEBRA (AEA Winfrith) and RRR (Rossendorf Ringzonenreaktor). This report describes the general scope of SNEDAX, the transfer of information from the experimental facilities, and the capabilities to produce graphics and input files for frequently used computer codes. The first part contains general information about the contents and the capabilities of the data base. The second part gives the instructions for persons who actually work with it. The contents of both parts are arranged in a similar way, using, as far as practical, an analogous decimal coding of the sections. In the figures and the annex, examples are given of the way in which the data are stored and how they are presented as graphics. The data base is described as it exists in the middle of 1996. It is recognized that there are still many improvements desirable, in particular with respect to a consistent description of the experiments, giving a reasonable but not excessive amount of information. Since work in the field of fast critical experiments will be discontinued at FZK, it is planned that the further administration and improvement will be taken over by CEA Cadarache with the support of IPPE Obninsk.

  10. Using a New Event-Based Simulation Framework for Investigating Resource Provisioning in Clouds

    Directory of Open Access Journals (Sweden)

    Simon Ostermann

    2011-01-01

    Full Text Available Today, Cloud computing proposes an attractive alternative to building large-scale distributed computing environments by which resources are no longer hosted by the scientists' computational facilities, but leased from specialised data centres only when and for how long they are needed. This new class of Cloud resources raises new interesting research questions in the fields of resource management, scheduling, fault tolerance, or quality of service, requiring hundreds to thousands of experiments for finding valid solutions. To enable such research, a scalable simulation framework is typically required for early prototyping, extensive testing and validation of results before the real deployment is performed. The scope of this paper is twofold. In the first part we present GroudSim, a Grid and Cloud simulation toolkit for scientific computing based on a scalable simulation-independent discrete-event engine. GroudSim provides a comprehensive set of features for complex simulation scenarios from simple job executions on leased computing resources to file transfers, calculation of costs and background load on resources. Simulations can be parameterised and are easily extendable by probability distribution packages for failures which normally occur in complex distributed environments. Experimental results demonstrate the improved scalability of GroudSim compared to a related process-based simulation approach. In the second part, we show the use of the GroudSim simulator to analyse the problem of dynamic provisioning of Cloud resources to scientific workflows that do not benefit from sufficient Grid resources as required by their computational demands. We propose and study four strategies for provisioning and releasing Cloud resources that take into account the general leasing model encountered in today's commercial Cloud environments based on resource bulks, fuzzy descriptions and hourly payment intervals. We study the impact of our techniques to the
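
    GroudSim's own API is not shown in the record, so the sketch below only illustrates the underlying idea of a scalable discrete-event engine: a time-ordered event queue driving simulated job executions on leased Cloud resources with hourly billing. Class and function names are hypothetical.

```python
import heapq
import itertools
import math

class EventSimulator:
    """Tiny discrete-event engine: not GroudSim's actual API, just the idea of a
    time-ordered event queue driving simulated job executions."""

    def __init__(self):
        self.now = 0.0
        self.cost = 0.0
        self._queue = []
        self._ids = itertools.count()   # tie-breaker for events at equal times

    def schedule(self, delay, action, *args):
        heapq.heappush(self._queue, (self.now + delay, next(self._ids), action, args))

    def run(self):
        while self._queue:
            self.now, _, action, args = heapq.heappop(self._queue)
            action(*args)

def submit_job(sim, job_id, runtime_hours, price_per_hour=0.10):
    # Hourly payment interval: every started hour of the lease is billed.
    sim.cost += math.ceil(runtime_hours) * price_per_hour
    sim.schedule(runtime_hours, finish_job, sim, job_id)

def finish_job(sim, job_id):
    print(f"t={sim.now:5.2f} h  job {job_id} finished")

sim = EventSimulator()
for i, runtime in enumerate([1.3, 0.4, 2.7]):
    sim.schedule(0.5 * i, submit_job, sim, i, runtime)
sim.run()
print(f"total lease cost: ${sim.cost:.2f}")
```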

  11. Gradient-based adaptation of general gaussian kernels.

    Science.gov (United States)

    Glasmachers, Tobias; Igel, Christian

    2005-10-01

    Gradient-based optimizing of gaussian kernel functions is considered. The gradient for the adaptation of scaling and rotation of the input space is computed to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant trace subspace, the kernel size can be controlled. This is, for example, useful to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard margin support vector machines on toy data.
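
    A minimal sketch of the parameterization idea described above: a general Gaussian kernel k(x, z) = exp(-(x - z)^T M (x - z)) with M = expm(S) for a symmetric matrix S, so that scaling and rotation of the input space can be adapted by unconstrained gradient steps on S. Kernel-target alignment is used here only as a stand-in objective for the radius-margin measure discussed in the record, and a finite-difference gradient replaces the analytical gradient of the paper.

```python
import numpy as np
from scipy.linalg import expm

def general_gaussian_kernel(X, S):
    """Gram matrix of k(x, z) = exp(-(x - z)^T M (x - z)) with M = expm(S).

    Parameterising M through the matrix exponential of a symmetric matrix S
    keeps M symmetric positive definite for any unconstrained S, which is the
    idea behind adapting scaling and rotation of the input space."""
    M = expm((S + S.T) / 2.0)                    # symmetrise, then exponentiate
    diff = X[:, None, :] - X[None, :, :]         # pairwise differences
    quad = np.einsum("ijk,kl,ijl->ij", diff, M, diff)
    return np.exp(-quad)

def alignment(K, y):
    """Kernel-target alignment, used here only as a simple stand-in objective
    for the radius-margin criterion discussed in the record."""
    Y = np.outer(y, y)
    return (K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))

# Toy data and one finite-difference gradient ascent step on the entries of S.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(3.0 * X[:, 0] + X[:, 1])
S = np.zeros((2, 2))
eps, lr = 1e-5, 0.5
grad = np.zeros_like(S)
for i in range(2):
    for j in range(2):
        E = np.zeros_like(S)
        E[i, j] = eps
        grad[i, j] = (alignment(general_gaussian_kernel(X, S + E), y)
                      - alignment(general_gaussian_kernel(X, S - E), y)) / (2 * eps)
print("alignment before:", alignment(general_gaussian_kernel(X, S), y))
print("alignment after :", alignment(general_gaussian_kernel(X, S + lr * grad), y))
```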

  12. Generalized logistic map and its application in chaos based cryptography

    Science.gov (United States)

    Lawnik, M.

    2017-12-01

    The logistic map is commonly used in, for example, chaos-based cryptography. However, its properties do not permit a safe construction of encryption algorithms. Thus, the scope of the paper is a proposal for a generalization of the logistic map by means of a well-recognized family of chaotic maps. In the next step, the Lyapunov exponent and the distribution of the iterated variable are analyzed. The obtained results confirm that the analyzed model can safely and effectively replace a classic logistic map for applications involving chaotic cryptography.
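
    The paper's specific generalization is not reproduced here; the sketch below only shows, for the classic logistic map, the two analyses the record mentions: a numerical Lyapunov-exponent estimate and the distribution of the iterated variable.

```python
import numpy as np

def lyapunov_logistic(r, x0=0.1, n_iter=20000, n_discard=1000):
    """Estimate the Lyapunov exponent of the classic logistic map
    x_{n+1} = r * x_n * (1 - x_n) as the average of ln|r * (1 - 2 * x_n)|
    along the orbit, after discarding a transient."""
    x = x0
    acc = 0.0
    for i in range(n_iter):
        x = r * x * (1.0 - x)
        if i >= n_discard:
            acc += np.log(abs(r * (1.0 - 2.0 * x)))
    return acc / (n_iter - n_discard)

print(f"lambda(r=4.0) ~ {lyapunov_logistic(4.0):.3f}   # chaotic, close to ln 2 ~ 0.693")
print(f"lambda(r=3.2) ~ {lyapunov_logistic(3.2):.3f}   # periodic orbit, negative")

# Distribution of the iterated variable at r = 4 (known to concentrate
# near 0 and 1, following an arcsine-type density).
xs = np.empty(20000)
xs[0] = 0.1
for i in range(1, xs.size):
    xs[i] = 4.0 * xs[i - 1] * (1.0 - xs[i - 1])
hist, _ = np.histogram(xs[1000:], bins=20, density=True)
print(np.round(hist, 2))
```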

  13. Generalized model for Memristor-based Wien family oscillators

    KAUST Repository

    Talukdar, Abdul Hafiz Ibne

    2012-07-23

    In this paper, we report the unconventional characteristics of Memristor in Wien oscillators. Generalized mathematical models are developed to analyze four members of the Wien family using Memristors. Sustained oscillation is reported for all types though oscillating resistance and time dependent poles are present. We have also proposed an analytical model to estimate the desired amplitude of oscillation before the oscillation starts. These Memristor-based oscillation results, presented for the first time, are in good agreement with simulation results. © 2011 Elsevier Ltd.

  14. Governmental, Legal and Regulatory Framework for Safety. General Safety Requirements. Part 1 (French Edition); Cadre gouvernemental, legislatif et reglementaire de la surete. Prescriptions generales de surete. Partie 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-11-15

    The objective of this publication is to establish requirements in respect of the governmental, legal and regulatory framework for safety. It covers the essential aspects of the framework for establishing a regulatory body and taking other actions necessary to ensure the effective regulatory control of facilities and activities utilized for peaceful purposes. Other responsibilities and functions, such as liaison within the global safety regime and on support services for safety (including radiation protection), emergency preparedness and response, nuclear security, and the State system of accounting for and control of nuclear material, are also covered.

  15. Governmental, Legal and Regulatory Framework for Safety. General Safety Requirements. Part 1 (Spanish Edition); Marco gubernamental, juridico y regulador para la seguridad. Requisitos de Seguridad Generales. Parte 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-11-15

    The objective of this publication is to establish requirements in respect of the governmental, legal and regulatory framework for safety. It covers the essential aspects of the framework for establishing a regulatory body and taking other actions necessary to ensure the effective regulatory control of facilities and activities utilized for peaceful purposes. Other responsibilities and functions, such as liaison within the global safety regime and on support services for safety (including radiation protection), emergency preparedness and response, nuclear security, and the State system of accounting for and control of nuclear material, are also covered.

  16. Framework for ethical decision-making based on mission, vision and values of the institution.

    Science.gov (United States)

    Kotalik, Jaro; Covino, Cathy; Doucette, Nadine; Henderson, Steve; Langlois, Michelle; McDaid, Karen; Pedri, Louisa M

    2014-06-01

    The authors led the development of a framework for ethical decision-making for an Academic Health Sciences Centre. They understood the existing mission, vision, and values statement (MVVs) of the centre as a foundational assertion that embodies an ethical commitment of the institution. Reflecting the Patient and Family Centred Model of Care that the institution lives by, the MVVs is a suitable base on which to construct an ethics framework. The resultant framework consists of a set of questions for each of the MVVs. Users of the framework are expected to identify two or more possible decisions to address the issue at hand and then, by applying the provided sequence of questions to each, examine these options and determine the overall ethically preferable decision. The construction of such a framework requires the creative involvement of the institution's staff. Thus the development of the framework can represent a training process in ethical decision-making as well as advance the ethical atmosphere of the institution. This novel approach has the advantage of placing the MVVs on active duty, at the centre of ethical decision-making, and lifts it from its otherwise relative obscurity in most institutions.

  17. The role of advanced nursing in lung cancer: A framework based development.

    Science.gov (United States)

    Serena, A; Castellani, P; Fucina, N; Griesser, A-C; Jeanmonod, J; Peters, S; Eicher, M

    2015-12-01

    Advanced Practice Lung Cancer Nurses (APLCN) are well-established in several countries but their role has yet to be established in Switzerland. Developing an innovative nursing role requires a structured approach to guide successful implementation and to meet the overarching goal of improved nursing-sensitive patient outcomes. The "Participatory, Evidence-based, Patient-focused process, for guiding the development, implementation, and evaluation of advanced practice nursing" (PEPPA framework) is one approach that was developed in the context of the Canadian health system. The purpose of this article is to describe the development of an APLCN model at a Swiss Academic Medical Center as part of a specialized Thoracic Cancer Center and to evaluate the applicability of the PEPPA framework in this process. In order to develop and implement the APLCN role, we applied the first seven phases of the PEPPA framework. This article demonstrates the applicability of the PEPPA framework for APLCN development. This framework allowed us to i) identify key components of an APLCN model responsive to lung cancer patients' health needs, ii) identify role facilitators and barriers, iii) implement the APLCN role and iv) design a feasibility study of this new role. The PEPPA framework provides a structured process for implementing novel Advanced Practice Nursing roles in a local context, particularly where such roles are in their infancy. Two key points in the process include assessing patients' health needs and involving key stakeholders. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. A full life cycle nuclear knowledge management framework based on digital system

    International Nuclear Information System (INIS)

    Wang, Minglu; Zheng, Mingguang; Tian, Lin; Qiu, Zhongming; Li, Xiaoyan

    2017-01-01

    Highlights: • A full life cycle nuclear power plant knowledge management framework is introduced. • This framework benefits the safe design, construction, operation and maintenance. • This framework enhances the safety, economy and reliability of the nuclear power plant. - Abstract: The nuclear power plant is a highly knowledge-intensive facility. With the rapid advent and development of modern information and communication technology, knowledge management in the nuclear industry has been provided with new approaches and possibilities. This paper introduces a full life cycle nuclear power plant knowledge management framework based on a digital system and tries to find solutions for knowledge creation, sharing, transfer, application and further innovation in the nuclear industry. This framework utilizes information and digital technology to build a top-tier object-driven work environment, an automatic design and analysis integration platform, a digital dynamic performance Verification & Validation (V&V) platform, a collaborative manufacturing procedure, a digital construction platform, and online monitoring and configuration management, which benefit knowledge management across the full NPP life cycle. The suggested framework will strengthen the design basis of the nuclear power plants (NPPs) and will ensure the safety of the NPP design throughout the whole lifetime of the plant.

  19. Morphology Dependent Flow Stress in Nickel-Based Superalloys in the Multi-Scale Crystal Plasticity Framework

    Directory of Open Access Journals (Sweden)

    Shahriyar Keshavarz

    2017-11-01

    Full Text Available This paper develops a framework to obtain the flow stress of nickel-based superalloys as a function of γ-γ’ morphology. The yield strength is a major factor in the design of these alloys. This work adds the effects of γ’ morphology to the design scope adopted for the model previously developed by the authors. In general, the two-phase γ-γ’ morphology in nickel-based superalloys can be described by three variables: γ’ shape, γ’ volume fraction and γ’ size in the sub-grain microstructure. In order to obtain the flow stress, non-Schmid crystal plasticity constitutive models at two length scales are employed and bridged through a homogenized multi-scale framework. The multi-scale framework includes the sub-grain and homogenized grain scales. For the sub-grain scale, a size-dependent, dislocation-density-based finite element model (FEM) of the representative volume element (RVE) with explicit depiction of the γ-γ’ morphology is developed as a building block for the homogenization. For the next scale, an activation-energy-based crystal plasticity model is developed for the homogenized single crystal of Ni-based superalloys. The constitutive models address the thermo-mechanical behavior of nickel-based superalloys over a large temperature range and include orientation dependencies and tension-compression asymmetry. The homogenized model is used to obtain the morphology dependence of the flow stress in nickel-based superalloys and can significantly expedite crystal plasticity FE simulations in polycrystalline microstructures, as well as higher-scale FE models used to cast and design superalloys.

  20. Ethical Frameworks in Public Health Decision-Making: Defending a Value-Based and Pluralist Approach.

    Science.gov (United States)

    Grill, Kalle; Dawson, Angus

    2017-12-01

    A number of ethical frameworks have been proposed to support decision-making in public health and the evaluation of public health policy and practice. This is encouraging, since ethical considerations are of paramount importance in health policy. However, these frameworks have various deficiencies, in part because they incorporate substantial ethical positions. In this article, we discuss and criticise a framework developed by James Childress and Ruth Bernheim, which we consider to be the state of the art in the field. Their framework distinguishes aims, such as the promotion of public health, from constraints on the pursuit of those aims, such as the requirement to avoid limitations to liberty, or the requirement to be impartial. We show how this structure creates both theoretical and practical problems. We then go on to present and defend a more practical framework, one that is neutral in avoiding precommitment to particular values and how they ought to be weighted. We believe ethics is at the very heart of such weightings and our framework is developed to reflect this belief. It is therefore both pluralist and value-based. We compare our new framework to Childress and Bernheim's and outline its advantages. It is justified by its impetus to consider a wide range of alternatives and its tendency to direct decisions towards the best alternatives, as well as by the information provided by the ranking of alternatives and transparent explication of the judgements that motivate this ranking. The new framework presented should be useful to decision-makers in public health, as well as being a means to stimulate further reflection on the role of ethics in public health.

  1. Ultrawide Bandwidth Receiver Based on a Multivariate Generalized Gaussian Distribution

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2015-04-01

    The multivariate generalized Gaussian density (MGGD) is used to approximate the multiple access interference (MAI) and additive white Gaussian noise in a pulse-based ultrawide bandwidth (UWB) system. The MGGD probability density function (pdf) is shown to be a better approximation for a UWB system than the multivariate Gaussian, multivariate Laplacian and multivariate Gaussian-Laplacian mixture (GLM) densities. The similarity between the simulated and the approximated pdf is measured with the help of a modified Kullback-Leibler distance (KLD). It is also shown that the MGGD has the smallest KLD compared with the Gaussian, Laplacian and GLM densities. A receiver based on the principle of minimum bit error rate is designed for the MGGD pdf. As this requirement is stringent, an adaptive implementation of the receiver is also carried out in this paper. The training sequence of the desired user is the only requirement when implementing the detector adaptively. © 2002-2012 IEEE.
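
    A minimal, univariate sketch of the density-comparison idea in this record: simulated interference-plus-noise samples are histogrammed and compared against Gaussian, Laplacian and generalized Gaussian fits via a discrete Kullback-Leibler distance. The synthetic samples, the univariate simplification and the candidate parameters are illustrative assumptions, not the paper's multivariate formulation or its UWB simulation.

```python
import numpy as np
from scipy.special import gamma as gamma_fn
from scipy.stats import norm, laplace

def ggd_pdf(x, mu=0.0, alpha=1.0, beta=0.8):
    """Univariate generalized Gaussian density; beta=2 gives Gaussian, beta=1 gives Laplacian."""
    coeff = beta / (2.0 * alpha * gamma_fn(1.0 / beta))
    return coeff * np.exp(-(np.abs(x - mu) / alpha) ** beta)

def discrete_kld(p, q):
    """Kullback-Leibler distance between two densities discretised on the same grid."""
    p, q = p / p.sum(), q / q.sum()
    mask = (p > 0) & (q > 0)
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

rng = np.random.default_rng(0)
# Heavy-tailed surrogate for interference-plus-noise (an assumption, not a UWB simulation).
samples = np.concatenate([rng.normal(0, 1, 8000), rng.laplace(0, 2, 2000)])

hist, edges = np.histogram(samples, bins=200, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

candidates = {
    "Gaussian":  norm.pdf(centers, loc=samples.mean(), scale=samples.std()),
    "Laplacian": laplace.pdf(centers, loc=np.median(samples),
                             scale=np.mean(np.abs(samples - np.median(samples)))),
    "GGD":       ggd_pdf(centers, mu=samples.mean(), alpha=1.2, beta=0.8),
}
for name, pdf in candidates.items():
    print(f"{name:9s} KLD = {discrete_kld(hist, pdf):.4f}")
```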

  2. Reference Information Based Remote Sensing Image Reconstruction with Generalized Nonconvex Low-Rank Approximation

    Directory of Open Access Journals (Sweden)

    Hongyang Lu

    2016-06-01

    Full Text Available Because of the contradiction between the spatial and temporal resolution of remote sensing images (RSI) and quality loss in the process of acquisition, it is of great significance to reconstruct RSI in remote sensing applications. Recent studies have demonstrated that reference image-based reconstruction methods have great potential for higher reconstruction performance, but still fall short in accuracy and reconstruction quality. For this application, a new compressed sensing objective function incorporating a reference image as prior information is developed. We resort to the reference prior information inherent in interior and exterior data simultaneously to build a new generalized nonconvex low-rank approximation framework for RSI reconstruction. Specifically, the innovation of this paper consists of the following three respects: (1) we propose a nonconvex low-rank approximation for reconstructing RSI; (2) we inject reference prior information to overcome over-smoothed edges and losses of texture detail; (3) on this basis, we combine conjugate gradient algorithms and singular value thresholding (SVT) to solve the proposed algorithm. The performance of the algorithm is evaluated both qualitatively and quantitatively. Experimental results demonstrate that the proposed algorithm improves peak signal to noise ratio (PSNR) by several dB and preserves image details significantly better compared with most current approaches that do not use reference images as priors. In addition, the generalized nonconvex low-rank approximation of our approach is naturally robust to noise, and therefore the proposed algorithm can handle low resolution with noisy inputs in a more unified framework.
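
    The abstract names singular value thresholding as one building block of the solver. A minimal sketch of that step alone is given below; the test matrix and threshold are arbitrary, and the reference-image prior and conjugate-gradient updates of the full algorithm are not reproduced.

```python
import numpy as np

def svt(matrix, tau):
    """Soft-threshold the singular values of `matrix` by `tau` (the basic low-rank step)."""
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (u * s_shrunk) @ vt

rng = np.random.default_rng(1)
low_rank = rng.standard_normal((64, 5)) @ rng.standard_normal((5, 64))   # rank-5 ground truth
noisy = low_rank + 0.1 * rng.standard_normal((64, 64))

estimate = svt(noisy, tau=1.0)
print("rank before:", np.linalg.matrix_rank(noisy))
print("rank after :", np.linalg.matrix_rank(estimate, tol=1e-6))
```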

  3. Consensus-based training and assessment model for general surgery.

    Science.gov (United States)

    Szasz, P; Louridas, M; de Montbrun, S; Harris, K A; Grantcharov, T P

    2016-05-01

    Surgical education is becoming competency-based with the implementation of in-training milestones. Training guidelines should reflect these changes and determine the specific procedures for such milestone assessments. This study aimed to develop a consensus view regarding operative procedures and tasks considered appropriate for junior and senior trainees, and the procedures that can be used as technical milestone assessments for trainee progression in general surgery. A Delphi process was followed where questionnaires were distributed to all 17 Canadian general surgery programme directors. Items were ranked on a 5-point Likert scale, with consensus defined as Cronbach's α of at least 0·70. Items rated 4 or above on the 5-point Likert scale by 80 per cent of the programme directors were included in the models. Two Delphi rounds were completed, with 14 programme directors taking part in round one and 11 in round two. The overall consensus was high (Cronbach's α = 0·98). The training model included 101 unique procedures and tasks, 24 specific to junior trainees, 68 specific to senior trainees, and nine appropriate to all. The assessment model included four procedures. A system of operative procedures and tasks for junior- and senior-level trainees has been developed along with an assessment model for trainee progression. These can be used as milestones in competency-based assessments. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
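
    The two consensus statistics in this record, Cronbach's α across the programme directors and the 80 per cent rule for item inclusion, can be computed as sketched below; the ratings matrix is fabricated for illustration and the treatment of raters as scale items is an assumption.

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha; rows = rated items (procedures), columns = raters treated as scale items."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                                  # number of raters
    item_variances = ratings.var(axis=0, ddof=1).sum()
    total_variance = ratings.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

def consensus_items(ratings, cutoff=4, proportion=0.80):
    """Indices of items rated >= cutoff on the 5-point scale by at least `proportion` of raters."""
    share = (np.asarray(ratings) >= cutoff).mean(axis=1)
    return np.flatnonzero(share >= proportion)

rng = np.random.default_rng(2)
ratings = rng.integers(3, 6, size=(20, 11))   # 20 candidate procedures, 11 directors (round two)
print("Cronbach's alpha:", round(cronbach_alpha(ratings), 3))
print("procedures reaching consensus:", consensus_items(ratings))
```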

  4. A unified MGF-based capacity analysis of diversity combiners over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-03-01

    Unified exact ergodic capacity results for L-branch coherent diversity combiners including equal-gain combining (EGC) and maximal-ratio combining (MRC) are not known. This paper develops a novel generic framework for the capacity analysis of L-branch EGC/MRC over generalized fading channels. The framework is used to derive new results for the gamma-shadowed generalized Nakagami-m fading model which can be a suitable model for the fading environments encountered by high frequency (60 GHz and above) communications. The mathematical formalism is illustrated with some selected numerical and simulation results confirming the correctness of our newly proposed framework. © 2012 IEEE.
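
    A quick Monte Carlo cross-check of the quantity such a framework computes analytically, the ergodic capacity of L-branch MRC over i.i.d. Nakagami-m fading, is sketched below; the shadowing layer and the MGF machinery of the paper are omitted, and the branch counts, fading parameter and average SNR are arbitrary.

```python
import numpy as np

def mrc_ergodic_capacity(n_branches, m, avg_snr_db, n_samples=200_000, seed=0):
    """Monte Carlo ergodic capacity (bit/s/Hz) of L-branch MRC over i.i.d. Nakagami-m fading."""
    rng = np.random.default_rng(seed)
    avg_snr = 10.0 ** (avg_snr_db / 10.0)
    # Under Nakagami-m fading, per-branch instantaneous SNR is Gamma(shape=m, scale=avg_snr/m).
    branch_snr = rng.gamma(shape=m, scale=avg_snr / m, size=(n_samples, n_branches))
    combined_snr = branch_snr.sum(axis=1)            # MRC output SNR is the sum over branches
    return np.log2(1.0 + combined_snr).mean()

for L in (1, 2, 4):
    print(f"L={L}: C ≈ {mrc_ergodic_capacity(L, m=2.0, avg_snr_db=10.0):.3f} bit/s/Hz")
```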

  5. Teaching and Learning Numerical Analysis and Optimization: A Didactic Framework and Applications of Inquiry-Based Learning

    Science.gov (United States)

    Lappas, Pantelis Z.; Kritikos, Manolis N.

    2018-01-01

    The main objective of this paper is to propose a didactic framework for teaching Applied Mathematics in higher education. After describing the structure of the framework, several applications of inquiry-based learning in teaching numerical analysis and optimization are provided to illustrate the potential of the proposed framework. The framework…

  6. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments.

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F; Bell, Scott

    2016-03-01

    This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the need for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change was assessed based on a suite of indicators that were subsequently rolled up into a single, multi-dimensional index that is easy to understand and communicate, in order to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.
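
    A bare-bones sketch of the roll-up step described here, where several landscape indicators are normalised and combined into one index per planning unit, is given below; the indicator values, normalisation and weights are placeholders, not the study's indicator suite.

```python
import numpy as np

# Rows = planning units, columns = landscape indicators (e.g. wetland area loss,
# edge density change, impervious-surface growth); the values are illustrative only.
indicators = np.array([
    [0.12, 0.40, 0.25],
    [0.55, 0.10, 0.60],
    [0.30, 0.35, 0.20],
])
weights = np.array([0.5, 0.2, 0.3])        # assumed relative importance of each indicator

# Min-max normalise each indicator so that larger always means greater pressure on wetlands.
spread = indicators.max(axis=0) - indicators.min(axis=0)
normalised = (indicators - indicators.min(axis=0)) / (spread + 1e-12)
composite = normalised @ weights            # one multi-dimensional index per planning unit

for unit, value in enumerate(composite, start=1):
    print(f"planning unit {unit}: wetland-pressure index = {value:.2f}")
```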

  7. A framework for optimal kernel-based manifold embedding of medical image data.

    Science.gov (United States)

    Zimmer, Veronika A; Lekadir, Karim; Hoogendoorn, Corné; Frangi, Alejandro F; Piella, Gemma

    2015-04-01

    Kernel-based dimensionality reduction is a widely used technique in medical image analysis. To fully unravel the underlying nonlinear manifold, the selection of an adequate kernel function and of its free parameters is critical. In practice, however, the kernel function is generally chosen as Gaussian or polynomial, and such standard kernels might not always be optimal for a given image dataset or application. In this paper, we present a study on the effect of the kernel functions in nonlinear manifold embedding of medical image data. To this end, we first carry out a literature review of existing advanced kernels developed in the statistics, machine learning, and signal processing communities. In addition, we implement kernel-based formulations of well-known nonlinear dimensionality reduction techniques such as Isomap and Locally Linear Embedding, thus obtaining a unified framework for manifold embedding using kernels. Subsequently, we present a method to automatically choose a kernel function and its associated parameters from a pool of kernel candidates, with the aim of generating the most suitable manifold embeddings. Furthermore, we show how the calculated selection measures can be extended to take into account the spatial relationships in images, or used to combine several kernels to further improve the embedding results. Experiments are then carried out on various synthetic and phantom datasets for numerical assessment of the methods. Furthermore, the workflow is applied to real data that include brain manifolds and multispectral images to demonstrate the importance of kernel selection in the analysis of high-dimensional medical images. Copyright © 2014 Elsevier Ltd. All rights reserved.
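
    The kernel-selection idea can be illustrated with off-the-shelf tools: the sketch below scores a few candidate kernels for a KernelPCA embedding of a toy S-curve using scikit-learn's trustworthiness measure. The dataset, candidate pool and selection measure are assumptions for illustration and do not reproduce the paper's selection criteria or medical image data.

```python
from sklearn.datasets import make_s_curve
from sklearn.decomposition import KernelPCA
from sklearn.manifold import trustworthiness

X, _ = make_s_curve(n_samples=800, random_state=0)

candidates = {
    "rbf (gamma=0.5)":    KernelPCA(n_components=2, kernel="rbf", gamma=0.5),
    "rbf (gamma=5.0)":    KernelPCA(n_components=2, kernel="rbf", gamma=5.0),
    "polynomial (deg 3)": KernelPCA(n_components=2, kernel="poly", degree=3),
    "cosine":             KernelPCA(n_components=2, kernel="cosine"),
}

scores = {}
for name, kpca in candidates.items():
    embedding = kpca.fit_transform(X)                       # 2-D manifold embedding
    scores[name] = trustworthiness(X, embedding, n_neighbors=10)

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:20s} trustworthiness = {score:.3f}")
print("selected kernel:", max(scores, key=scores.get))
```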

  8. A machine learning framework involving EEG-based functional connectivity to diagnose major depressive disorder (MDD).

    Science.gov (United States)

    Mumtaz, Wajid; Ali, Syed Saad Azhar; Yasin, Mohd Azhar Mohd; Malik, Aamir Saeed

    2018-02-01

    Major depressive disorder (MDD), a debilitating mental illness, can cause functional disabilities and become a social problem, and its accurate and early diagnosis can be challenging. This paper proposed a machine learning framework involving EEG-derived synchronization likelihood (SL) features as input data for automatic diagnosis of MDD. It was hypothesized that EEG-based SL features could discriminate between MDD patients and healthy controls with an accuracy better than that of measures such as interhemispheric coherence and mutual information. In this work, classification models such as support vector machine (SVM), logistic regression (LR) and Naïve Bayes (NB) were employed to model the relationship between the EEG features and the study groups (MDD patients and healthy controls) and ultimately to discriminate between study participants. The results indicated that the classification rates were better than chance. More specifically, the study resulted in SVM classification accuracy = 98%, sensitivity = 99.9%, specificity = 95% and f-measure = 0.97; LR classification accuracy = 91.7%, sensitivity = 86.66%, specificity = 96.6% and f-measure = 0.90; NB classification accuracy = 93.6%, sensitivity = 100%, specificity = 87.9% and f-measure = 0.95. In conclusion, SL could be a promising method for diagnosing depression. The findings could be generalized to develop a robust CAD-based tool that may help for clinical purposes.
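
    The classification stage described here (SVM, LR and NB with accuracy, sensitivity, specificity and f-measure) follows a standard pattern, sketched below with scikit-learn; the feature matrix is random stand-in data rather than EEG synchronization-likelihood features, and the cross-validation setup is an assumption.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix, accuracy_score, f1_score

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 40))          # stand-in for SL connectivity features
y = np.repeat([0, 1], 30)                  # 0 = healthy control, 1 = MDD patient
X[y == 1] += 0.8                           # inject separability so the demo is non-trivial

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "LR":  make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "NB":  make_pipeline(StandardScaler(), GaussianNB()),
}
for name, model in models.items():
    pred = cross_val_predict(model, X, y, cv=10)
    tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
    print(f"{name}: accuracy={accuracy_score(y, pred):.2f} "
          f"sensitivity={tp / (tp + fn):.2f} specificity={tn / (tn + fp):.2f} "
          f"f-measure={f1_score(y, pred):.2f}")
```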

  9. Clinical TVA-based studies: a general review

    Directory of Open Access Journals (Sweden)

    Thomas eHabekost

    2015-03-01

    Full Text Available In combination with whole report and partial report tasks, the Theory of Visual Attention (TVA) can be used to estimate individual differences in five basic attentional parameters: the visual processing speed, the storage capacity of visual short-term memory, the perceptual threshold, the efficiency of top-down selectivity, and the spatial bias of attentional weighting. TVA-based assessment has been used in about 30 studies to investigate attentional deficits in a range of neurological and psychiatric conditions: (a) neglect and simultanagnosia, (b) reading disturbances, (c) aging and neurodegenerative diseases, and most recently (d) neurodevelopmental disorders. The article introduces TVA-based assessment, discusses its methodology and psychometric properties, and reviews the progress made in each of the four research fields. The empirical results demonstrate the general usefulness of TVA-based assessment for many types of clinical neuropsychological research. The method’s most important qualities are cognitive specificity and theoretical grounding, but it is also characterized by good reliability and sensitivity to minor deficits. The review concludes by pointing to promising new areas for clinical TVA-based research.

  10. A Streams-Based Framework for Defining Location-Based Queries

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Xuegang, Huang

    2007-01-01

    An infrastructure is emerging that supports the delivery of on-line, location-enabled services to mobile users. Such services involve novel database queries, and the database research community is quite active in proposing techniques for the efficient processing of such queries. In parallel to this, the management of data streams has become an active area of research. While most research in mobile services concerns performance issues, this paper aims to establish a formal framework for defining the semantics of queries encountered in mobile services, most notably the so-called continuous queries that are particularly relevant in this context. Rather than inventing an entirely new framework, the paper proposes a framework that builds on concepts from data streams and temporal databases. Definitions of example queries demonstrate how the framework enables clear formulation of query semantics and the comparison…
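
    A toy illustration of the kind of continuous query such a framework formalises, a standing range query re-evaluated over a stream of timestamped position updates, is sketched below; the data model, query and example objects are invented for illustration and do not follow the paper's formal definitions.

```python
import math
from dataclasses import dataclass
from typing import Iterable, Tuple

@dataclass
class Update:
    t: int            # timestamp
    obj: str          # moving-object identifier
    x: float
    y: float

def continuous_range_query(stream: Iterable[Update], centre: Tuple[float, float], radius: float):
    """Re-evaluate a standing range query after every update; yield (timestamp, answer set)."""
    latest = {}                                    # most recent known position per object
    for u in stream:
        latest[u.obj] = (u.x, u.y)
        answer = {o for o, (x, y) in latest.items()
                  if math.hypot(x - centre[0], y - centre[1]) <= radius}
        yield u.t, answer

stream = [
    Update(1, "taxi-1", 0.0, 0.0),
    Update(2, "taxi-2", 5.0, 5.0),
    Update(3, "taxi-1", 4.0, 4.0),
    Update(4, "taxi-2", 1.0, 1.0),
]
for t, answer in continuous_range_query(stream, centre=(0.0, 0.0), radius=3.0):
    print(f"t={t}: objects within range = {sorted(answer)}")
```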

  11. Towards A Streams-Based Framework for Defining Location-Based Queries

    DEFF Research Database (Denmark)

    Huang, Xuegang; Jensen, Christian S.

    2004-01-01

    An infrastructure is emerging that supports the delivery of on-line, location-enabled services to mobile users. Such services involve novel database queries, and the database research community is quite active in proposing techniques for the efficient processing of such queries. In parallel to this, the management of data streams has become an active area of research. While most research in mobile services concerns performance issues, this paper aims to establish a formal framework for defining the semantics of queries encountered in mobile services, most notably the so-called continuous queries that are particularly relevant in this context. Rather than inventing an entirely new framework, the paper proposes a framework that builds on concepts from data streams and temporal databases. Definitions of example queries demonstrate how the framework enables clear formulation of query…

  12. Small drinking water systems under spatiotemporal water quality variability: a risk-based performance benchmarking framework.

    Science.gov (United States)

    Bereskie, Ty; Haider, Husnain; Rodriguez, Manuel J; Sadiq, Rehan

    2017-08-23

    Traditional approaches for benchmarking drinking water systems are binary, based solely on the compliance and/or non-compliance of one or more water quality performance indicators against defined regulatory guidelines/standards. The consequence of water quality failure depends on the location within a water supply system as well as the time of year (i.e., season), with its varying levels of water consumption. Conventional approaches used for water quality comparison fail to incorporate spatiotemporal variability and degrees of compliance and/or non-compliance. This can lead to misleading or inaccurate performance assessment data being used in the performance benchmarking process. In this research, a hierarchical risk-based water quality performance benchmarking framework is proposed to evaluate small drinking water systems (SDWSs) through cross-comparison amongst similar systems. The proposed framework (R WQI framework) is designed to quantify the consequence associated with seasonal and location-specific water quality issues in a given drinking water supply system, to facilitate more efficient decision-making for SDWSs striving for continuous performance improvement. Fuzzy rule-based modelling is used to address the imprecision associated with measuring performance against singular water quality guidelines/standards and the uncertainties present in SDWS operations and monitoring. The proposed R WQI framework has been demonstrated using data collected from 16 SDWSs in Newfoundland and Labrador and Quebec, Canada, and compared to the Canadian Council of Ministers of the Environment WQI, a traditional, guidelines/standards-based approach. The study found that the R WQI framework provides an in-depth picture of the state of water quality and benchmarks SDWSs more rationally, based on the frequency of occurrence and consequence of failure events.
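
    A simplified sketch of the risk-style aggregation described here, combining the frequency of guideline exceedance with location- and season-specific consequence weights into one score, is shown below; the parameters, limits, weights and scoring rule are illustrative assumptions, not the published R WQI formulation.

```python
import numpy as np

# Illustrative monitoring results for one location and season:
# rows = samples, columns = parameters (turbidity NTU, E. coli cfu/100 mL, lead mg/L).
measurements = np.array([
    [0.8,  0.0, 0.004],
    [1.6, 12.0, 0.006],
    [0.5,  0.0, 0.003],
    [2.4,  3.0, 0.011],
])
guidelines    = np.array([1.0, 0.0, 0.010])     # assumed guideline limits per parameter
consequence   = np.array([0.3, 1.0, 0.8])       # assumed health-consequence weights (0-1)
season_weight = 0.9                             # e.g. a high-consumption summer period

exceedance_frequency = (measurements > guidelines).mean(axis=0)   # frequency of failure
risk  = season_weight * exceedance_frequency * consequence        # frequency x consequence
score = 100.0 * (1.0 - risk.mean())                               # 100 = no observed risk

print("per-parameter risk:", np.round(risk, 3))
print("risk-based performance score:", round(score, 1))
```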

  13. A Design Based Research Framework for Implementing a Transnational Mobile and Blended Learning Solution

    Science.gov (United States)

    Palalas, Agnieszka; Berezin, Nicole; Gunawardena, Charlotte; Kramer, Gretchen

    2015-01-01

    The article proposes a modified Design-Based Research (DBR) framework which accommodates the various socio-cultural factors that emerged in the longitudinal PA-HELP research study at Central University College (CUC) in Ghana, Africa. A transnational team of stakeholders from Ghana, Canada, and the USA collaborated on the development,…

  14. A three-level framework for performance-based railway timetabling

    DEFF Research Database (Denmark)

    Goverde, Rob M P; Bešinović, Nikola; Binder, Anne

    2016-01-01

    This paper presents a performance-based railway timetabling framework integrating timetable construction and evaluation on three levels: microscopic, macroscopic, and a corridor fine-tuning level, where each performance indicator is optimized or evaluated at the appropriate level. A modular implementation...

  15. The Person over Standardisation: A Humanistic Framework for Teacher Learning in Diverse School-Based Contexts

    Science.gov (United States)

    Kazanjian, Christopher J.; Choi, Su-Jin

    2016-01-01

    This paper argues that the purpose of education is to help students realise their unique potentials and pursue inner directions. With this assumption, we critique the inadequacy of the current emphasis on standardisation and provide a theoretical framework for teacher education based on humanistic psychology. Three tenets of humanistic psychology,…

  16. Simulation-Based Business Case for PSS: A System Dynamics Framework

    DEFF Research Database (Denmark)

    Rodrigues, Vinicius Picanco; Pigosso, Daniela Cristina Antelmi; McAloone, Tim C.

    2017-01-01

    …of a business case for PSS implementation and management, based on a System Dynamics simulation framework. With a maturity-oriented theoretical perspective and the associated capability concepts, the study provides insights into how the development of PSS capabilities can potentially affect corporate performance...
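
    In the spirit of such a System Dynamics business case, the snippet below integrates a toy stock-and-flow model in which an investment flow builds a PSS capability stock that in turn lifts revenue; every stock, flow and coefficient is invented for illustration and none is taken from the study.

```python
# Euler integration of a toy stock-and-flow model with an annual time step.
years = 10
capability, cash = 0.2, 0.0            # stocks: PSS capability level (0-1), cumulative cash flow
invest_rate, decay = 0.15, 0.05        # flows: yearly capability build-up and erosion
base_revenue, uplift, invest_cost = 100.0, 60.0, 12.0

for year in range(1, years + 1):
    capability += invest_rate * (1.0 - capability) - decay * capability
    revenue = base_revenue + uplift * capability
    cash += revenue - base_revenue - invest_cost       # incremental business-case cash flow
    print(f"year {year:2d}: capability={capability:.2f}  incremental cash={cash:7.1f}")
```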

  17. Design and evaluation of a data-driven scenario generation framework for game-based training

    NARCIS (Netherlands)

    Luo, L.; Yin, H.; Cai, W.; Zhong, J.; Lees, M.

    Generating suitable game scenarios that can cater for individual players has become an emerging challenge in procedural content generation. In this paper, we propose a data-driven scenario generation framework for game-based training. An evolutionary scenario generation process is designed with a…
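
    A toy version of the evolutionary loop alluded to here, which evolves scenario parameter vectors toward a target difficulty for a trainee, is sketched below; the encoding, fitness function and GA settings are placeholders rather than the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
TARGET_DIFFICULTY = 0.7          # desired challenge level for the current trainee (assumed)

def difficulty(scenario):
    """Placeholder difficulty model: a weighted mix of three scenario parameters in [0, 1]."""
    weights = np.array([0.5, 0.3, 0.2])         # e.g. enemy count, time pressure, visibility
    return float(scenario @ weights)

def fitness(scenario):
    return -abs(difficulty(scenario) - TARGET_DIFFICULTY)    # closer to the target is better

population = rng.random((30, 3))
for generation in range(50):
    scores = np.array([fitness(s) for s in population])
    parents = population[np.argsort(scores)[-10:]]            # truncation selection
    children = parents[rng.integers(0, 10, size=30)] + rng.normal(0, 0.05, size=(30, 3))
    population = np.clip(children, 0.0, 1.0)                  # mutation and bound handling

best = max(population, key=fitness)
print("evolved scenario parameters:", np.round(best, 2),
      "difficulty:", round(difficulty(best), 2))
```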

  18. Meeting International Society for Technology in Education Competencies with a Problem-Based Learning Video Framework

    Science.gov (United States)

    Skoretz, Yvonne M.; Cottle, Amy E.

    2011-01-01

    Meeting International Society for Technology in Education competencies creates a challenge for teachers. The authors provide a problem-based video framework that guides teachers in enhancing 21st century skills to meet those competencies. To keep the focus on the content, the authors suggest teaching the technology skills only at the point the…

  19. Developing a Competency-Based Pan-European Accreditation Framework for Health Promotion

    Science.gov (United States)

    Battel-Kirk, Barbara; Van der Zanden, Gerard; Schipperen, Marielle; Contu, Paolo; Gallardo, Carmen; Martinez, Ana; Garcia de Sola, Silvia; Sotgiu, Alessandra; Zaagsma, Miriam; Barry, Margaret M.

    2012-01-01

    Background: The CompHP Pan-European Accreditation Framework for Health Promotion was developed as part of the CompHP Project that aimed to develop competency-based standards and an accreditation system for health promotion practice, education, and training in Europe. Method: A phased, multiple-method approach was employed to facilitate consensus…

  20. Mobotware – A Plug-in Based Framework For Mobile Robots

    DEFF Research Database (Denmark)

    Beck, Anders Billesø; Andersen, Nils Axel; Andersen, Jens Christian

    2010-01-01

    …combined with inter-module communication based on TCP/IP sockets and a human-readable XML protocol makes it easy to use the system on a wide range of hardware platforms, configurations and computer platform distributions. The framework has until now been interfaced to 7 different hardware platforms and has…
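
    An illustrative snippet of the communication style described, a client sending one human-readable XML command to a module over a TCP socket, is given below; the host, port, element names and message format are invented for illustration and are not the actual Mobotware protocol.

```python
import socket
import xml.etree.ElementTree as ET

def send_command(host: str, port: int, module: str, command: str) -> str:
    """Send one XML-encoded command to a robot module over TCP and return the raw reply."""
    request = ET.tostring(ET.Element(module, attrib={"cmd": command}), encoding="unicode")
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall((request + "\n").encode("utf-8"))
        return sock.recv(4096).decode("utf-8")

# Hypothetical usage against a locally running module server (port number is made up):
# print(send_command("localhost", 24920, "drivepos", "goto x=1.0 y=0.5"))
```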