WorldWideScience

Sample records for sophisticated computational framework

  1. Sophisticated Players and Sophisticated Agents

    NARCIS (Netherlands)

    Rustichini, A.

    1998-01-01

    A sophisticated player is an individual who takes the actions of the opponents in a strategic situation as determined by the decisions of rational opponents, and acts accordingly. A sophisticated agent is rational in the choice of his action, but ignores the fact that he is part of a strategic

  2. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species.

    Directory of Open Access Journals (Sweden)

    Marie Devaine

    2017-11-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities.

  3. SMEs and new ventures need business model sophistication

    DEFF Research Database (Denmark)

    Kesting, Peter; Günzel-Jensen, Franziska

    2015-01-01

    , and Spreadshirt, this article develops a framework that introduces five business model sophistication strategies: (1) uncover additional functions of your product, (2) identify strategic benefits for third parties, (3) take advantage of economies of scope, (4) utilize cross-selling opportunities, and (5) involve...

  4. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    Science.gov (United States)

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with both the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245
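
    The decision logic behind offloading is easy to illustrate. The sketch below is a generic cost-based offloading heuristic, not the framework evaluated in this record: a component is sent to the cloud only when the estimated remote turnaround time and device-side energy beat local execution. All device, network and cloud parameters are hypothetical.

      # Generic offloading-decision sketch (illustrative parameters, not the paper's framework).
      from dataclasses import dataclass

      @dataclass
      class Component:
          name: str
          cycles: float        # CPU cycles required
          data_bytes: float    # state to transfer if offloaded

      def local_cost(c, cpu_hz=1.0e9, power_w=0.9):
          t = c.cycles / cpu_hz                     # seconds on the device
          return t, t * power_w                     # (time, energy in joules)

      def remote_cost(c, cloud_hz=8.0e9, bw_bps=2.0e6, tx_power_w=1.3):
          t_tx = c.data_bytes * 8 / bw_bps          # upload time
          t_exec = c.cycles / cloud_hz              # execution in the cloud
          return t_tx + t_exec, t_tx * tx_power_w   # the device only pays for the radio

      def should_offload(c):
          t_loc, e_loc = local_cost(c)
          t_rem, e_rem = remote_cost(c)
          return t_rem < t_loc and e_rem < e_loc

      if __name__ == "__main__":
          heavy = Component("face_recognition", cycles=12e9, data_bytes=200e3)
          light = Component("ui_update", cycles=5e6, data_bytes=50e3)
          for comp in (heavy, light):
              print(comp.name, "-> offload" if should_offload(comp) else "-> run locally")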

  5. A lightweight distributed framework for computational offloading in mobile cloud computing.

    Directory of Open Access Journals (Sweden)

    Muhammad Shiraz

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with both the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  6. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Nowadays there exist several frameworks for utilizing the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best known are either low-level and need a lot of controlling code, or are bound only to special graphics cards. Furthermore, there exist more specialized frameworks, mainly aimed at the mathematical field. The framework described here is adjusted for use in multi-agent simulations. It provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.

  7. A Software Rejuvenation Framework for Distributed Computing

    Science.gov (United States)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.

  8. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    Models are playing important roles in design and analysis of chemicals based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working...... development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines....... In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse...

  9. Computer information systems framework

    International Nuclear Information System (INIS)

    Shahabuddin, S.

    1989-01-01

    Management information systems (MIS) is a commonly used term in the computer profession. The new information technology has caused management to expect more from computers. The process of supplying information follows a well defined procedure. An MIS should be capable of providing usable information to the various areas and levels of an organization. MIS is different from data processing. MIS and the business hierarchy provide a good framework for many organizations which are using computers. (A.B.)

  10. Pension fund sophistication and investment policy

    NARCIS (Netherlands)

    de Dreu, J.|info:eu-repo/dai/nl/364537906; Bikker, J.A.|info:eu-repo/dai/nl/06912261X

    This paper assesses the sophistication of pension funds’ investment policies using data on 748 Dutch pension funds during the 1999–2006 period. We develop three indicators of sophistication: gross rounding of investment choices, investments in alternative sophisticated asset classes and ‘home bias’.

  11. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    Science.gov (United States)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability of Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and in comparing these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.
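
    The kind of cost comparison such a valuation framework supports can be sketched in a few lines. The toy model below contrasts peak-provisioned on-premise capacity with pay-per-use cloud pricing for a bursty workload; the prices, capacities and the simple linear cost model are illustrative assumptions, not components of the authors' framework.

      # Toy on-premise vs. pay-per-use cost comparison (all figures are hypothetical).
      import math

      def on_premise_cost(peak_req_per_s, server_capacity=100, server_cost_per_month=900.0):
          # On-premise capacity must be provisioned for the peak load.
          servers = math.ceil(peak_req_per_s / server_capacity)
          return servers * server_cost_per_month

      def cloud_cost(avg_req_per_s, price_per_million_req=0.35):
          requests_per_month = avg_req_per_s * 3600 * 24 * 30
          return requests_per_month / 1e6 * price_per_million_req

      if __name__ == "__main__":
          peak, avg = 800, 90          # bursty workload: high peak, low average
          print(f"on-premise: ${on_premise_cost(peak):.2f}/month")
          print(f"cloud     : ${cloud_cost(avg):.2f}/month")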

  12. In Praise of the Sophists.

    Science.gov (United States)

    Gibson, Walker

    1993-01-01

    Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)

  13. HCI^2 Framework: A software framework for multimodal human-computer interaction systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2013-01-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a

  14. A Computational Framework for Bioimaging Simulation

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508
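
    The core idea, comparing models and images in photon-counting units after folding in systematic effects, can be sketched with a generic camera model. The Poisson shot noise, Gaussian read noise and gain used below are standard fluorescence-imaging assumptions, not the exact pipeline of the cited framework.

      # Minimal sketch: render expected photon counts, then apply a generic camera model.
      import numpy as np

      rng = np.random.default_rng(0)

      def render_ideal(shape=(64, 64), n_molecules=20, photons_per_molecule=500.0, psf_sigma=2.0):
          """Expected photon counts from point emitters blurred by a Gaussian PSF."""
          img = np.zeros(shape)
          ys, xs = np.indices(shape)
          for _ in range(n_molecules):
              y0, x0 = rng.uniform(0, shape[0]), rng.uniform(0, shape[1])
              img += photons_per_molecule * np.exp(-((ys - y0) ** 2 + (xs - x0) ** 2) / (2 * psf_sigma ** 2))
          return img / (2 * np.pi * psf_sigma ** 2)

      def apply_camera(expected_photons, qe=0.9, gain=2.0, read_noise=1.5, offset=100.0):
          """Convert expected photons to noisy digital counts (ADU)."""
          photoelectrons = rng.poisson(qe * expected_photons)              # shot noise
          adu = gain * photoelectrons + rng.normal(0, read_noise, expected_photons.shape) + offset
          return adu

      if __name__ == "__main__":
          ideal = render_ideal()
          frame = apply_camera(ideal)
          # Back-convert to photoelectron estimates for a photon-count-level comparison.
          estimated_electrons = (frame - 100.0) / 2.0
          print(f"mean expected photons      : {ideal.mean():.2f}")
          print(f"mean estimated photoelectrons: {estimated_electrons.mean():.2f}")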

  15. A Computational Framework for Bioimaging Simulation.

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  16. A Computational Framework for Bioimaging Simulation.

    Directory of Open Access Journals (Sweden)

    Masaki Watabe

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  17. Computer Aided Solvent Selection and Design Framework

    DEFF Research Database (Denmark)

    Mitrofanov, Igor; Conte, Elisa; Abildskov, Jens

    and computer-aided tools and methods for property prediction and computer-aided molecular design (CAMD) principles. This framework is applicable for solvent selection and design in product design as well as process design. The first module of the framework is dedicated to the solvent selection and design...... in terms of: physical and chemical properties (solvent-pure properties); Environment, Health and Safety (EHS) characteristic (solvent-EHS properties); operational properties (solvent–solute properties). 3. Performing the search. The search step consists of two stages. The first is a generation and property...... identification of solvent candidates using special software ProCAMD and ProPred, which are the implementations of computer-aided molecular techniques. The second consists of assigning the RS-indices following the reaction–solvent and then consulting the known solvent database and identifying the set of solvents...

  18. FAST: framework for heterogeneous medical image computing and visualization.

    Science.gov (United States)

    Smistad, Erik; Bozorgi, Mohammadmehdi; Lindseth, Frank

    2015-11-01

    Computer systems are becoming increasingly heterogeneous in the sense that they consist of different processors, such as multi-core CPUs and graphic processing units. As the amount of medical image data increases, it is crucial to exploit the computational power of these processors. However, this is currently difficult due to several factors, such as driver errors, processor differences, and the need for low-level memory handling. This paper presents a novel FrAmework for heterogeneouS medical image compuTing and visualization (FAST). The framework aims to make it easier to simultaneously process and visualize medical images efficiently on heterogeneous systems. FAST uses common image processing programming paradigms and hides the details of memory handling from the user, while enabling the use of all processors and cores on a system. The framework is open-source, cross-platform and available online. Code examples and performance measurements are presented to show the simplicity and efficiency of FAST. The results are compared to the insight toolkit (ITK) and the visualization toolkit (VTK) and show that the presented framework is faster with up to 20 times speedup on several common medical imaging algorithms. FAST enables efficient medical image computing and visualization on heterogeneous systems. Code examples and performance evaluations have demonstrated that the toolkit is both easy to use and performs better than existing frameworks, such as ITK and VTK.

  19. A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.

    Science.gov (United States)

    Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao

    2018-05-23

    The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.

  20. BlueSky Cloud Framework: An E-Learning Framework Embracing Cloud Computing

    Science.gov (United States)

    Dong, Bo; Zheng, Qinghua; Qiao, Mu; Shu, Jian; Yang, Jie

    Currently, E-Learning has grown into a widely accepted way of learning. With the huge growth of users, services, education contents and resources, E-Learning systems are facing challenges of optimizing resource allocations, dealing with dynamic concurrency demands, handling rapid storage growth requirements and controlling costs. In this paper, an E-Learning framework based on cloud computing is presented, namely the BlueSky cloud framework. Particularly, the architecture and core components of the BlueSky cloud framework are introduced. In the BlueSky cloud framework, physical machines are virtualized and allocated on demand for E-Learning systems. Moreover, the BlueSky cloud framework integrates traditional middleware functions (such as load balancing and data caching) to serve E-Learning systems as a general architecture. It delivers reliable, scalable and cost-efficient services to E-Learning systems, and E-Learning organizations can establish systems through these services in a simple way. The BlueSky cloud framework solves the challenges faced by E-Learning, and improves the performance, availability and scalability of E-Learning systems.

  1. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their spec.... In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances.

  2. JACOB: An Enterprise Framework for Computational Chemistry

    Science.gov (United States)

    Waller, Mark P; Dresselhaus, Thomas; Yang, Jack

    2013-01-01

    Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: http://www.wallerlab.org/jacob. © 2013 Wiley Periodicals, Inc. PMID:23553271

  3. JACOB: an enterprise framework for computational chemistry.

    Science.gov (United States)

    Waller, Mark P; Dresselhaus, Thomas; Yang, Jack

    2013-06-15

    Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.

  4. Review of Cloud Computing and existing Frameworks for Cloud adoption

    OpenAIRE

    Chang, Victor; Walters, Robert John; Wills, Gary

    2014-01-01

    This paper presents a selected review for Cloud Computing and explains the benefits and risks of adopting Cloud Computing in a business environment. Although all the risks identified may be associated with two major Cloud adoption challenges, a framework is required to support organisations as they begin to use Cloud and minimise risks of Cloud adoption. Eleven Cloud Computing frameworks are investigated and a comparison of their strengths and limitations is made; the result of the comparison...

  5. Learning openCV computer vision with the openCV library

    CERN Document Server

    Bradski, Gary

    2008-01-01

    Learning OpenCV puts you right in the middle of the rapidly expanding field of computer vision. Written by the creators of OpenCV, the widely used free open-source library, this book introduces you to computer vision and demonstrates how you can quickly build applications that enable computers to "see" and make decisions based on the data. With this book, any developer or hobbyist can get up and running with the framework quickly, whether it's to build simple or sophisticated vision applications.

  6. Development and Application of a Numerical Framework for Improving Building Foundation Heat Transfer Calculations

    Science.gov (United States)

    Kruis, Nathanael J. F.

    Heat transfer from building foundations varies significantly in all three spatial dimensions and has important dynamic effects at all timescales, from one hour to several years. With the additional consideration of moisture transport, ground freezing, evapotranspiration, and other physical phenomena, the estimation of foundation heat transfer becomes increasingly sophisticated and computationally intensive to the point where accuracy must be compromised for reasonable computation time. The tools currently available to calculate foundation heat transfer are often either too limited in their capabilities to draw meaningful conclusions or too sophisticated to use in common practices. This work presents Kiva, a new foundation heat transfer computational framework. Kiva provides a flexible environment for testing different numerical schemes, initialization methods, spatial and temporal discretizations, and geometric approximations. Comparisons within this framework provide insight into the balance of computation speed and accuracy relative to highly detailed reference solutions. The accuracy and computational performance of six finite difference numerical schemes are verified against established IEA BESTEST test cases for slab-on-grade heat conduction. Of the schemes tested, the Alternating Direction Implicit (ADI) scheme demonstrates the best balance between accuracy, performance, and numerical stability. Kiva features four approaches of initializing soil temperatures for an annual simulation. A new accelerated initialization approach is shown to significantly reduce the required years of presimulation. Methods of approximating three-dimensional heat transfer within a representative two-dimensional context further improve computational performance. A new approximation called the boundary layer adjustment method is shown to improve accuracy over other established methods with a negligible increase in computation time. This method accounts for the reduced heat transfer
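
    The class of computation being benchmarked here is transient heat conduction in the ground. The sketch below advances a 2D temperature field with a simple explicit finite-difference step under made-up soil and boundary values; it is only a generic illustration of the underlying numerics. The abstract reports that the implicit ADI scheme gave the best balance in Kiva, and ADI avoids the explicit stability limit respected below.

      # Generic 2-D transient heat-conduction step (explicit FTCS); not Kiva's ADI scheme.
      import numpy as np

      def step_ftcs(T, alpha, dx, dt):
          """Advance the temperature field T by one time step (interior nodes only)."""
          lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
                 np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
          T_new = T + alpha * dt * lap
          # Re-impose fixed (Dirichlet) boundary temperatures.
          T_new[0, :], T_new[-1, :] = T[0, :], T[-1, :]
          T_new[:, 0], T_new[:, -1] = T[:, 0], T[:, -1]
          return T_new

      if __name__ == "__main__":
          alpha, dx = 1e-6, 0.1                 # soil diffusivity (m^2/s), grid spacing (m)
          dt = 0.2 * dx**2 / alpha              # respects the explicit stability limit
          T = np.full((50, 50), 283.0)          # ground initially at 10 C
          T[0, :] = 293.0                       # slab edge held at 20 C
          for _ in range(1000):
              T = step_ftcs(T, alpha, dx, dt)
          print("temperature below the heated edge:", round(float(T[5, 25]), 2), "K")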

  7. Cloud computing and ROI a new framework for it strategy

    CERN Document Server

    Mohapatra, Sanjay

    2014-01-01

    This book develops an IT strategy for cloud computing that helps businesses evaluate their readiness for cloud services and calculate the ROI. The framework provided helps reduce risks involved in transitioning from traditional "on site" IT strategy to virtual "cloud computing." Since the advent of cloud computing, many organizations have made substantial gains implementing this innovation. Cloud computing allows companies to focus more on their core competencies, as IT enablement is taken care of through cloud services. Cloud Computing and ROI includes case studies covering retail, automobile and food processing industries. Each of these case studies has successfully implemented the cloud computing framework and their strategies are explained. As cloud computing may not be ideal for all businesses, criteria are also offered to help determine if this strategy should be adopted.

  8. Sophisticated Approval Voting, Ignorance Priors, and Plurality Heuristics: A Behavioral Social Choice Analysis in a Thurstonian Framework

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R.; Tsetlin, Ilia

    2007-01-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two…

  9. Framework for computer-aided systems design

    International Nuclear Information System (INIS)

    Esselman, W.H.

    1992-01-01

    Advanced computer technology, analytical methods, graphics capabilities, and expert systems contribute to significant changes in the design process. Continued progress is expected. Achieving the ultimate benefits of these computer-based design tools depends on successful research and development on a number of key issues. A fundamental understanding of the design process is a prerequisite to developing these computer-based tools. In this paper a hierarchical systems design approach is described, and methods by which computers can assist the designer are examined. A framework is presented for developing computer-based design tools for power plant design. These tools include expert experience bases, tutorials, aids in decision making, and tools to develop the requirements, constraints, and interactions among subsystems and components. Early consideration of the functional tasks is encouraged. Acquiring an expert's experience base is a fundamental research problem. Computer-based guidance should be provided in a manner that supports the creativity, heuristic approaches, decision making, and meticulousness of a good designer

  10. Computer-aided Framework for Design of Pure, Mixed and Blended Products

    DEFF Research Database (Denmark)

    Cignitti, Stefano; Zhang, Lei; Gani, Rafiqul

    2015-01-01

    This paper presents a framework for computer-aided design of pure, mixed and blended chemical based products. The framework is a systematic approach to convert a Computer-aided Molecular, Mixture and Blend Design (CAMbD) formulation, based on needs and target properties, into a mixed integer non...

  11. A Computational Framework for Efficient Low Temperature Plasma Simulations

    Science.gov (United States)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

    Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Some salient introductory features are the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested based on numerical results assessing the accuracy and efficiency of benchmarks for problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

  12. A Software Framework for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2009-01-01

    This paper describes a software framework we designed and implemented for the development and research in the area of multimodal human-computer interface. The proposed framework is based on publish / subscribe architecture, which allows developers and researchers to conveniently configure, test and

  13. An Adaptive Sensor Mining Framework for Pervasive Computing Applications

    Science.gov (United States)

    Rashidi, Parisa; Cook, Diane J.

    Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.
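
    A minimal stand-in for the pattern-discovery step can be written as a sliding-window counter over the sensor event stream. The snippet below only counts frequent fixed-length sequences; the FPPM and PAM algorithms described in the abstract additionally handle periodicity, startup triggers, temporal information and adaptation to change.

      # Bare-bones frequent-sequence counting over a sensor event stream
      # (illustrative stand-in, not the FPPM/PAM algorithms themselves).
      from collections import Counter

      def frequent_sequences(events, length=3, min_support=2):
          counts = Counter(tuple(events[i:i + length]) for i in range(len(events) - length + 1))
          return {seq: n for seq, n in counts.items() if n >= min_support}

      if __name__ == "__main__":
          stream = ["door", "kitchen_light", "coffee_maker",
                    "door", "kitchen_light", "coffee_maker",
                    "bedroom_light", "door", "kitchen_light", "coffee_maker"]
          for seq, n in frequent_sequences(stream).items():
              print(n, "x", " -> ".join(seq))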

  14. Sophistication and Performance of Italian Agri‐food Exports

    Directory of Open Access Journals (Sweden)

    Anna Carbone

    2012-06-01

    Nonprice competition is increasingly important in world food markets. Recently, the expression ‘export sophistication’ has been introduced in the economic literature to refer to a wide set of attributes that increase product value. An index has been proposed to measure sophistication in an indirect way through the per capita GDP of exporting countries (Lall et al., 2006; Hausmann et al., 2007). The paper applies the sophistication measure to the Italian food export sector, moving from an analysis of trends and performance of Italian food exports. An original way to disentangle different components in the temporal variation of the sophistication index is also proposed. Results show that the sophistication index offers original insights on recent trends in world food exports and with respect to Italian core food exports.
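
    For reference, the index alluded to here is, in the Hausmann et al. (2007) formulation, a GDP-weighted average over exporters and products; the paper may apply a variant, so the following is only the standard construction:

      PRODY_k = \sum_c \frac{x_{ck}/X_c}{\sum_{c'} x_{c'k}/X_{c'}} \, Y_c ,
      \qquad
      EXPY_c = \sum_k \frac{x_{ck}}{X_c} \, PRODY_k ,

    where x_{ck} is country c's export value of product k, X_c is its total exports, and Y_c is its per capita GDP; PRODY_k is then the income level associated with product k, and EXPY_c measures the sophistication of country c's export basket.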

  15. Toward a theoretical framework for trustworthy cyber sensing

    Science.gov (United States)

    Xu, Shouhuai

    2010-04-01

    Cyberspace is an indispensable part of the economy and society, but has been "polluted" with many compromised computers that can be abused to launch further attacks against the others. Since it is likely that there always are compromised computers, it is important to be aware of the (dynamic) cyber security-related situation, which is however challenging because cyberspace is an extremely large-scale complex system. Our project aims to investigate a theoretical framework for trustworthy cyber sensing. With the perspective of treating cyberspace as a large-scale complex system, the core question we aim to address is: What would be a competent theoretical (mathematical and algorithmic) framework for designing, analyzing, deploying, managing, and adapting cyber sensor systems so as to provide trustworthy information or input to the higher layer of cyber situation-awareness management, even in the presence of sophisticated malicious attacks against the cyber sensor systems?

  16. Framework for Computer-Aided Evolution of Object-Oriented Designs

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Aksit, Mehmet

    2008-01-01

    In this paper, we describe a framework for the computer-aided evolution of the designs of object-oriented software systems. Evolution mechanisms are software structures that prepare software for certain types of evolution. The framework uses a database which holds the evolution mechanisms, modeled

  17. Computational models in physics teaching: a framework

    Directory of Open Access Journals (Sweden)

    Marco Antonio Moreira

    2012-08-01

    The purpose of this paper is to present a theoretical framework to promote and assist meaningful physics learning through computational models. Our proposal is based on the use of a tool, the AVM diagram, to design educational activities involving modeling and computer simulations. The idea is to provide a starting point for the construction and implementation of didactical approaches grounded in a coherent epistemological view about scientific modeling.

  18. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  19. A Computational Framework for Flood Risk Assessment in The Netherlands

    Directory of Open Access Journals (Sweden)

    A.A. Markus

    2010-01-01

    The safety of dikes in The Netherlands, located in the delta of the rivers Rhine, Meuse and Scheldt, has been the subject of debate for more than ten years. The safety (or flood risk) of a particular area may depend on the safety of other areas. This is referred to as effects of river system behaviour on flood risk (quantified as the estimated number of casualties and economic damage). A computational framework was developed to assess these effects. It consists of several components that are loosely coupled via data files and Tcl scripts to manage the individual programs and keep track of the state of the computations. The computations involved are lengthy (days or even weeks on a Linux cluster), which makes the framework currently more suitable for planning and design than for real-time operation. While the framework was constructed ad hoc, it can also be viewed more formally as a tuplespace. Realising this makes it possible to adopt the philosophy for other similar frameworks.
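
    The tuplespace reading of the framework is worth making concrete. The toy class below implements Linda-style put/read/take with wildcard matching, which is the coordination abstraction the authors identify; it illustrates the concept only and is not their Tcl-and-data-files implementation, and the scenario tuples are made up.

      # Minimal in-process tuplespace (illustrative; not the authors' implementation).
      class TupleSpace:
          def __init__(self):
              self._tuples = []

          def put(self, *tup):
              self._tuples.append(tuple(tup))

          def _match(self, tup, pattern):
              # None acts as a wildcard in the pattern.
              return len(tup) == len(pattern) and all(p is None or p == v for v, p in zip(tup, pattern))

          def read(self, *pattern):
              return next((t for t in self._tuples if self._match(t, pattern)), None)

          def take(self, *pattern):
              t = self.read(*pattern)
              if t is not None:
                  self._tuples.remove(t)
              return t

      if __name__ == "__main__":
          ts = TupleSpace()
          ts.put("breach-scenario", 12, "pending")            # hypothetical work item
          ts.put("breach-scenario", 13, "pending")
          job = ts.take("breach-scenario", None, "pending")   # a worker claims any pending scenario
          ts.put(job[0], job[1], "running")
          print("claimed:", job, "| still pending:", ts.read("breach-scenario", None, "pending"))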

  20. Reuse, Recycle, Reweigh: Combating Influenza through Efficient Sequential Bayesian Computation for Massive Data

    OpenAIRE

    Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.

    2010-01-01

    Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, of...
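
    The "reweigh" step can be illustrated with a generic sequential importance-reweighting scheme: posterior draws obtained from an earlier data batch are reused and reweighted by the likelihood of a newly arrived batch instead of refitting on the full dataset. The normal-mean model and all numbers below are illustrative and are not the authors' influenza application.

      # Generic sequential Bayesian reweighting sketch (toy normal-mean model).
      import numpy as np

      rng = np.random.default_rng(1)

      # Stage 1: posterior draws for a normal mean given the first batch
      # (flat prior, known sigma = 1), generated directly for simplicity.
      batch1 = rng.normal(0.7, 1.0, size=200)
      draws = rng.normal(batch1.mean(), 1.0 / np.sqrt(len(batch1)), size=5000)
      log_w = np.zeros_like(draws)

      # Stage 2: a new batch arrives; update the importance weights with its likelihood.
      batch2 = rng.normal(0.7, 1.0, size=300)
      for x in batch2:
          log_w += -0.5 * (x - draws) ** 2          # log N(x | theta, 1) up to a constant
      w = np.exp(log_w - log_w.max())
      w /= w.sum()

      post_mean = np.sum(w * draws)
      ess = 1.0 / np.sum(w ** 2)                    # effective sample size; resample if it collapses
      print(f"updated posterior mean ~ {post_mean:.3f}, ESS = {ess:.0f} of {draws.size}")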

  1. Sophistication of computational science and fundamental physics simulations

    International Nuclear Information System (INIS)

    Ishiguro, Seiji; Ito, Atsushi; Usami, Shunsuke; Ohtani, Hiroaki; Sakagami, Hitoshi; Toida, Mieko; Hasegawa, Hiroki; Horiuchi, Ritoku; Miura, Hideaki

    2016-01-01

    Numerical experimental reactor research project is composed of the following studies: (1) nuclear fusion simulation research with a focus on specific physical phenomena of specific equipment, (2) research on advanced simulation method to increase predictability or expand its application range based on simulation, (3) visualization as the foundation of simulation research, (4) research for advanced computational science such as parallel computing technology, and (5) research aiming at elucidation of fundamental physical phenomena not limited to specific devices. Specifically, a wide range of researches with medium- to long-term perspectives are being developed: (1) virtual reality visualization, (2) upgrading of computational science such as multilayer simulation method, (3) kinetic behavior of plasma blob, (4) extended MHD theory and simulation, (5) basic plasma process such as particle acceleration due to interaction of wave and particle, and (6) research related to laser plasma fusion. This paper reviews the following items: (1) simultaneous visualization in virtual reality space, (2) multilayer simulation of collisionless magnetic reconnection, (3) simulation of microscopic dynamics of plasma coherent structure, (4) Hall MHD simulation of LHD, (5) numerical analysis for extension of MHD equilibrium and stability theory, (6) extended MHD simulation of 2D RT instability, (7) simulation of laser plasma, (8) simulation of shock wave and particle acceleration, and (9) study on simulation of homogeneous isotropic MHD turbulent flow. (A.O.)

  2. A Framework for Federated Two-Factor Authentication Enabling Cost-Effective Secure Access to Distributed Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Ezell, Matthew A [ORNL; Rogers, Gary L [University of Tennessee, Knoxville (UTK); Peterson, Gregory D. [University of Tennessee, Knoxville (UTK)

    2012-01-01

    As cyber attacks become increasingly sophisticated, the security measures used to mitigate the risks must also increase in sophistication. One time password (OTP) systems provide strong authentication because security credentials are not reusable, thus thwarting credential replay attacks. The credential changes regularly, making brute-force attacks significantly more difficult. In high performance computing, end users may require access to resources housed at several different service provider locations. The ability to share a strong token between multiple computing resources reduces cost and complexity. The National Science Foundation (NSF) Extreme Science and Engineering Discovery Environment (XSEDE) provides access to digital resources, including supercomputers, data resources, and software tools. XSEDE will offer centralized strong authentication for services amongst service providers that leverage their own user databases and security profiles. This work implements a scalable framework built on standards to provide federated secure access to distributed cyberinfrastructure.
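
    The OTP mechanics referred to here follow well-known standards. The sketch below derives an HMAC-based one-time password as specified in RFC 4226 (HOTP), using the RFC's published test secret; it illustrates generic OTP derivation only and says nothing about XSEDE's actual deployment.

      # HOTP derivation per RFC 4226 (generic illustration of one-time passwords).
      import hashlib
      import hmac
      import struct

      def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
          msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
          digest = hmac.new(secret, msg, hashlib.sha1).digest()
          offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226, 5.3)
          code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
          return str(code % 10 ** digits).zfill(digits)

      if __name__ == "__main__":
          secret = b"12345678901234567890"                  # RFC 4226 test secret
          for c in range(3):
              print(c, hotp(secret, c))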

  3. A Secure Framework for Location Verification in Pervasive Computing

    Science.gov (United States)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The way people use computing devices has been changed in some way by the relatively new pervasive computing paradigm. For example, a person can use a mobile device to obtain its location information at any time and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and they may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework VerPer for secure location verification in a pervasive computing environment. Real world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  4. How Our Cognition Shapes and Is Shaped by Technology: A Common Framework for Understanding Human Tool-Use Interactions in the Past, Present, and Future

    Directory of Open Access Journals (Sweden)

    François Osiurak

    2018-03-01

    Over the course of evolution, humans have constantly developed and improved their technologies. This evolution began with the use of physical tools, those tools that increase our sensorimotor abilities (e.g., first stone tools, modern knives, hammers, pencils). Although we still use some of these tools, we also employ in daily life more sophisticated tools for which we do not systematically understand the underlying physical principles (e.g., computers, cars). Current research is also turned toward the development of brain–computer interfaces directly linking our brain activity to machines (i.e., symbiotic tools). The ultimate goal of research on this topic is to identify the key cognitive processes involved in these different modes of interaction. As a primary step to fulfill this goal, we offer a first attempt at a common framework, based on the idea that humans shape technologies, which also shape us in return. The framework proposed is organized into three levels, describing how we interact when using physical (Past), sophisticated (Present), and symbiotic (Future) technologies. Here we emphasize the role played by technical reasoning and practical reasoning, two key cognitive processes that could nevertheless be progressively suppressed by the proficient use of sophisticated and symbiotic tools. We hope that this framework will provide a common ground for researchers interested in the cognitive basis of human tool-use interactions, from paleoanthropology to neuroergonomics.

  5. The First Sophists and the Uses of History.

    Science.gov (United States)

    Jarratt, Susan C.

    1987-01-01

    Reviews the history of intellectual views on the Greek sophists in three phases: (1) their disparagement by Plato and Aristotle as the morally disgraceful "other"; (2) nineteenth century British positivists' reappraisal of these relativists as ethically and scientifically superior; and (3) twentieth century versions of the sophists as…

  6. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters and connected together in a multi-level hierarchy and then coordinated over the internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to realize the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing multi-scale structural analysis simulations.

  7. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  8. PMDP: A Framework for Preserving Multiparty Data Privacy in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ji Li

    2017-01-01

    The amount of Internet data is significantly increasing due to the development of network technology, inducing the appearance of big data. Experiments have shown that deep mining and analysis on large datasets would introduce great benefits. Although cloud computing supports data analysis in an outsourced and cost-effective way, it brings serious privacy issues when sending the original data to cloud servers. Meanwhile, the returned analysis result suffers from malicious inference attacks and also discloses user privacy. In this paper, to conquer the above privacy issues, we propose a general framework for Preserving Multiparty Data Privacy (PMDP for short) in cloud computing. The PMDP framework can protect numeric data computing and publishing with the assistance of untrusted cloud servers and achieve delegation of storage simultaneously. Our framework is built upon several cryptography primitives (e.g., secure multiparty computation and the differential privacy mechanism), which guarantees its security against semihonest participants without collusion. We further instantiate PMDP with specific algorithms and demonstrate its security, efficiency, and advantages by presenting security analysis and performance discussion. Moreover, we propose a security enhanced framework sPMDP to resist malicious inside participants and outside adversaries. We illustrate that both PMDP and sPMDP are reliable and scale well and thus are desirable for practical applications.
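
    The two primitives named in the abstract can be illustrated independently. The sketch below combines an additive secret-sharing "secure sum" across untrusted servers with Laplace noise on the published aggregate for differential privacy; it is a toy composition of standard building blocks with made-up data, not the PMDP protocol.

      # Toy composition: additive secret sharing for a secure sum + Laplace noise for DP.
      import random

      PRIME = 2**61 - 1

      def share(value, n_servers=3):
          """Split an integer into n additive shares modulo a large prime."""
          shares = [random.randrange(PRIME) for _ in range(n_servers - 1)]
          shares.append((value - sum(shares)) % PRIME)
          return shares

      def secure_sum(values, n_servers=3):
          """Each party sends one share per server; servers only ever see sums of shares."""
          totals = [0] * n_servers
          for v in values:
              for i, s in enumerate(share(v, n_servers)):
                  totals[i] = (totals[i] + s) % PRIME
          return sum(totals) % PRIME            # recombining the server totals reveals only the sum

      def laplace_noise(scale):
          # The difference of two i.i.d. exponentials is Laplace-distributed with this scale.
          return random.expovariate(1 / scale) - random.expovariate(1 / scale)

      if __name__ == "__main__":
          salaries = [52000, 61000, 58000, 47000]   # hypothetical private inputs
          exact_total = secure_sum(salaries)
          epsilon, sensitivity = 1.0, 100000        # assumed bound on any single contribution
          noisy = exact_total + laplace_noise(sensitivity / epsilon)
          print("exact total:", exact_total, "| published noisy total:", round(noisy))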

  9. Does Investors' Sophistication Affect Persistence and Pricing of Discretionary Accruals?

    OpenAIRE

    Lanfeng Kao

    2007-01-01

    This paper examines whether the sophistication of market investors influences management's strategy on discretionary accounting choices, and thus changes the persistence of discretionary accruals. The results show that the persistence of discretionary accruals for firms faced with naive investors is lower than that for firms faced with sophisticated investors. The results also demonstrate that sophisticated investors indeed incorporate the implications of current earnings components into future ...

  10. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.

    Science.gov (United States)

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.

  11. Amelie: A recombinant computing framework for ambient awareness

    NARCIS (Netherlands)

    Metaxas, G.; Markopoulos, P.; Aarts, E.H.L.; Tschlegi, M.

    2009-01-01

    This paper presents Amelie, a service oriented framework that supports the implementation of awareness systems. Amelie adopts the tenets of Recombinant computing to address an important non-functional requirement for Ambient Intelligence software, namely the heterogeneous combination of services and

  12. A computational fluid dynamics simulation framework for ventricular catheter design optimization.

    Science.gov (United States)

    Weisenberg, Sofy H; TerMaath, Stephanie C; Barbier, Charlotte N; Hill, Judith C; Killeffer, James A

    2017-11-10

    OBJECTIVE Cerebrospinal fluid (CSF) shunts are the primary treatment for patients suffering from hydrocephalus. While proven effective in symptom relief, these shunt systems are plagued by high failure rates and often require repeated revision surgeries to replace malfunctioning components. One of the leading causes of CSF shunt failure is obstruction of the ventricular catheter by aggregations of cells, proteins, blood clots, or fronds of choroid plexus that occlude the catheter's small inlet holes or even the full internal catheter lumen. Such obstructions can disrupt CSF diversion out of the ventricular system or impede it entirely. Previous studies have suggested that altering the catheter's fluid dynamics may help to reduce the likelihood of complete ventricular catheter failure caused by obstruction. However, systematic correlation between a ventricular catheter's design parameters and its performance, specifically its likelihood to become occluded, still remains unknown. Therefore, an automated, open-source computational fluid dynamics (CFD) simulation framework was developed for use in the medical community to determine optimized ventricular catheter designs and to rapidly explore parameter influence for a given flow objective. METHODS The computational framework was developed by coupling a 3D CFD solver and an iterative optimization algorithm and was implemented in a high-performance computing environment. The capabilities of the framework were demonstrated by computing an optimized ventricular catheter design that provides uniform flow rates through the catheter's inlet holes, a common design objective in the literature. The baseline computational model was validated using 3D nuclear imaging to provide flow velocities at the inlet holes and through the catheter. RESULTS The optimized catheter design achieved through use of the automated simulation framework improved significantly on previous attempts to reach a uniform inlet flow rate distribution using
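
    The optimization pattern described, an iterative optimizer wrapped around a flow solve, can be sketched with a surrogate in place of the 3D CFD. Below, a Poiseuille resistance network models flow entering the catheter's inlet holes, and a derivative-free optimizer adjusts hole diameters toward equal flow shares; the geometry, fluid properties and network model are illustrative assumptions, not the paper's validated setup.

      # Optimize-around-a-solver sketch with a toy resistance-network surrogate for the CFD.
      import numpy as np
      from scipy.optimize import minimize

      MU = 0.7e-3          # approximate CSF viscosity (Pa*s)
      P_VENT = 100.0       # illustrative driving pressure (Pa)
      N_HOLES = 8

      def hole_flows(diams_mm, lumen_d_mm=1.2, seg_len_mm=2.0, wall_mm=0.25):
          """Solve the resistance network: flow entering each inlet hole (m^3/s)."""
          d = np.asarray(diams_mm) * 1e-3
          r_hole = 128 * MU * (wall_mm * 1e-3) / (np.pi * d**4)
          r_seg = 128 * MU * (seg_len_mm * 1e-3) / (np.pi * (lumen_d_mm * 1e-3)**4)
          n = len(d)
          A = np.zeros((n, n)); b = np.zeros(n)
          for i in range(n):
              A[i, i] += 1 / r_hole[i]; b[i] += P_VENT / r_hole[i]    # inflow through hole i
              if i > 0:
                  A[i, i] += 1 / r_seg; A[i, i - 1] -= 1 / r_seg      # lumen link to previous node
              if i < n - 1:
                  A[i, i] += 1 / r_seg; A[i, i + 1] -= 1 / r_seg      # lumen link to next node
              else:
                  A[i, i] += 1 / r_seg                                # last node drains to the outlet (0 Pa)
          p = np.linalg.solve(A, b)
          return (P_VENT - p) / r_hole

      def non_uniformity(diams_mm):
          q = hole_flows(diams_mm)
          return float(np.std(q / q.sum()))        # spread of the flow shares

      if __name__ == "__main__":
          x0 = np.full(N_HOLES, 0.4)               # start from equal 0.4 mm holes
          res = minimize(non_uniformity, x0, method="Nelder-Mead",
                         options={"xatol": 1e-4, "fatol": 1e-10, "maxiter": 5000})
          q = hole_flows(res.x)
          print("optimised diameters (mm):", np.round(res.x, 3))
          print("flow shares            :", np.round(q / q.sum(), 3))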

  13. The conceptualization and measurement of cognitive health sophistication.

    Science.gov (United States)

    Bodie, Graham D; Collins, William B; Jensen, Jakob D; Davis, Lashara A; Guntzviller, Lisa M; King, Andy J

    2013-01-01

    This article develops a conceptualization and measure of cognitive health sophistication--the complexity of an individual's conceptual knowledge about health. Study 1 provides initial validity evidence for the measure--the Healthy-Unhealthy Other Instrument--by showing its association with other cognitive health constructs indicative of higher health sophistication. Study 2 presents data from a sample of low-income adults to provide evidence that the measure does not depend heavily on health-related vocabulary or ethnicity. Results from both studies suggest that the Healthy-Unhealthy Other Instrument can be used to capture variability in the sophistication or complexity of an individual's health-related schematic structures on the basis of responses to two simple open-ended questions. Methodological advantages of the Healthy-Unhealthy Other Instrument and suggestions for future research are highlighted in the discussion.

  14. ProFUSO: Business process and ontology-based framework to develop ubiquitous computing support systems for chronic patients' management.

    Science.gov (United States)

    Jimenez-Molina, Angel; Gaete-Villegas, Jorge; Fuentes, Javier

    2018-06-01

    New advances in telemedicine, ubiquitous computing, and artificial intelligence have supported the emergence of more advanced applications and support systems for chronic patients. This trend addresses the important problem of chronic illnesses, highlighted by multiple international organizations as a core issue in future healthcare. Despite the myriad of exciting new developments, each application and system is designed and implemented for specific purposes and lacks the flexibility to support different healthcare concerns. Some of the known problems of such developments are integration issues between applications and existing healthcare systems, the reusability of technical knowledge in the creation of new and more sophisticated systems, and the usage of data gathered from multiple sources in the generation of new knowledge. This paper proposes a framework for the development of chronic disease support systems and applications as an answer to these shortcomings. Through this framework, we aim to create a common-ground methodology upon which new developments can be built and easily integrated to provide better support to chronic patients, medical staff, and other relevant participants. General requirements for any support system are inferred from the primary care process of chronic patients using the Business Process Model and Notation (BPMN). Numerous technical approaches are proposed to design a general architecture that considers the medical organizational requirements in the treatment of a patient. A framework is presented for any application in support of chronic patients and evaluated by a case study to test the applicability and pertinence of the solution. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. STICK: Spike Time Interval Computational Kernel, a Framework for General Purpose Computation Using Neurons, Precise Timing, Delays, and Synchrony.

    Science.gov (United States)

    Lagorce, Xavier; Benosman, Ryad

    2015-11-01

    There has been significant research over the past two decades in developing new platforms for spiking neural computation. Current neural computers are primarily developed to mimic biology. They use neural networks, which can be trained to perform specific tasks, mainly to solve pattern recognition problems. These machines can do more than simulate biology; they allow us to rethink our current paradigm of computation. The ultimate goal is to develop brain-inspired general purpose computation architectures that can breach the current bottleneck introduced by the von Neumann architecture. This work proposes a new framework for such a machine. We show that the use of neuron-like units with precise timing representation, synaptic diversity, and temporal delays allows us to build a complete, scalable, and compact computation framework. The framework provides both linear and nonlinear operations, allowing us to represent and solve any function. We demonstrate its usability by solving real use cases, from simple differential equations to sets of nonlinear differential equations leading to chaotic attractors.
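
    The core encoding idea, a value represented by the time interval between two spikes and preserved under synaptic delays, can be illustrated in a few lines. The timing constants below are assumptions made for illustration; this is not the STICK kernel itself.

      T_MIN, T_COD = 10.0, 100.0   # assumed timing constants (ms)

      def encode(x, t0=0.0):
          """Encode a value x in [0, 1] as a pair of spike times."""
          return (t0, t0 + T_MIN + x * T_COD)

      def decode(spikes):
          t1, t2 = spikes
          return ((t2 - t1) - T_MIN) / T_COD

      def delay(spikes, d):
          """A delay line: both spikes arrive d ms later, so the encoded value is unchanged."""
          t1, t2 = spikes
          return (t1 + d, t2 + d)

      s = encode(0.42)
      print(decode(s))              # 0.42
      print(decode(delay(s, 7.5)))  # still 0.42 after a 7.5 ms synaptic delay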

  16. Obfuscation, Learning, and the Evolution of Investor Sophistication

    OpenAIRE

    Bruce Ian Carlin; Gustavo Manso

    2011-01-01

    Investor sophistication has lagged behind the growing complexity of retail financial markets. To explore this, we develop a dynamic model to study the interaction between obfuscation and investor sophistication in mutual fund markets. Taking into account different learning mechanisms within the investor population, we characterize the optimal timing of obfuscation for financial institutions who offer retail products. We show that educational initiatives that are directed to facilitate learnin...

  17. Amelie: A Recombinant Computing Framework for Ambient Awareness

    Science.gov (United States)

    Metaxas, Georgios; Markopoulos, Panos; Aarts, Emile

    This paper presents Amelie, a service-oriented framework that supports the implementation of awareness systems. Amelie adopts the tenets of Recombinant computing to address an important non-functional requirement for Ambient Intelligence software, namely the heterogeneous combination of services and components. Amelie is founded upon FN-AAR, an abstract model of Awareness Systems, which enables the immediate expression and implementation of socially salient requirements, such as symmetry and social translucence. We discuss the framework and show how system behaviours can be specified using the Awareness Mark-up Language (AML).

  18. Probabilistic Sophistication, Second Order Stochastic Dominance, and Uncertainty Aversion

    OpenAIRE

    Simone Cerreia-Vioglio; Fabio Maccheroni; Massimo Marinacci; Luigi Montrucchio

    2010-01-01

    We study the interplay of probabilistic sophistication, second order stochastic dominance, and uncertainty aversion, three fundamental notions in choice under uncertainty. In particular, our main result, Theorem 2, characterizes uncertainty averse preferences that satisfy second order stochastic dominance, as well as uncertainty averse preferences that are probabilistically sophisticated.

  19. Cloud computing strategic framework (FY13 - FY15).

    Energy Technology Data Exchange (ETDEWEB)

    Arellano, Lawrence R.; Arroyo, Steven C.; Giese, Gerald J.; Cox, Philip M.; Rogers, G. Kelly

    2012-11-01

    This document presents an architectural framework (plan) and roadmap for the implementation of a robust Cloud Computing capability at Sandia National Laboratories. It is intended to be a living document and serve as the basis for detailed implementation plans, project proposals and strategic investment requests.

  20. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided...... methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task....... The methodology has been implemented into a computer-aided modeling framework, which combines expert skills, tools, and database connections that are required for the different steps of the model development work-flow with the goal to increase the efficiency of the modeling process. The framework has two main...

  1. Software engineering frameworks for the cloud computing paradigm

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents the latest research on Software Engineering Frameworks for the Cloud Computing Paradigm, drawn from an international selection of researchers and practitioners. The book offers both a discussion of relevant software engineering approaches and practical guidance on enterprise-wide software deployment in the cloud environment, together with real-world case studies. Features: presents the state of the art in software engineering approaches for developing cloud-suitable applications; discusses the impact of the cloud computing paradigm on software engineering; offers guidance an

  2. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.
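
    The steering pattern itself, a running simulation that polls for updated control parameters at each step, can be sketched generically. The file-based channel below is only a stand-in for the library's communication layer, and all names and parameters are assumptions.

      import json, os, time

      PARAMS_FILE = "steer_params.json"        # hypothetical channel written by a steering client
      params = {"dt": 0.01, "viscosity": 1.0e-3}

      def poll_steering():
          """Pick up any parameter changes a steering client has written since the last step."""
          if os.path.exists(PARAMS_FILE):
              with open(PARAMS_FILE) as f:
                  params.update(json.load(f))

      for step in range(1000):
          poll_steering()                      # re-read control parameters at run-time
          # ... advance the simulation one step using params["dt"], params["viscosity"] ...
          if step % 100 == 0:
              print(step, params)
          time.sleep(0.001)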

  3. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  4. MOOSE: A parallel computational framework for coupled systems of nonlinear equations

    International Nuclear Information System (INIS)

    Gaston, Derek; Newman, Chris; Hansen, Glen; Lebrun-Grandie, Damien

    2009-01-01

    Systems of coupled, nonlinear partial differential equations (PDEs) often arise in simulation of nuclear processes. MOOSE: Multiphysics Object Oriented Simulation Environment, a parallel computational framework targeted at the solution of such systems, is presented. As opposed to traditional data-flow oriented computational frameworks, MOOSE is instead founded on the mathematical principle of Jacobian-free Newton-Krylov (JFNK). Utilizing the mathematical structure present in JFNK, physics expressions are modularized into 'Kernels,' allowing for rapid production of new simulation tools. In addition, systems are solved implicitly and fully coupled, employing physics-based preconditioning, which provides great flexibility even with large variance in time scales. A summary of the mathematics, an overview of the structure of MOOSE, and several representative solutions from applications built on the framework are presented.
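
    The Jacobian-free Newton-Krylov idea at the heart of this approach can be illustrated with a toy system: the Jacobian-vector product is approximated by a finite difference of the residual, so no Jacobian matrix is ever assembled. The sketch below is a generic Python/SciPy illustration with an assumed residual and tolerances, not MOOSE code.

      import numpy as np
      from scipy.sparse.linalg import LinearOperator, gmres

      def residual(u):
          # toy coupled nonlinear system F(u) = 0, standing in for the assembled "Kernels"
          return np.array([u[0] ** 2 + u[1] - 3.0,
                           u[0] + u[1] ** 2 - 5.0])

      def jfnk_solve(F, u0, tol=1e-10, max_newton=20, eps=1e-7):
          u = u0.astype(float)
          for _ in range(max_newton):
              Fu = F(u)
              if np.linalg.norm(Fu) < tol:
                  break
              # matrix-free Jacobian action: J*v ~ (F(u + eps*v) - F(u)) / eps
              J = LinearOperator((u.size, u.size),
                                 matvec=lambda v: (F(u + eps * v) - Fu) / eps)
              du, _ = gmres(J, -Fu)            # Krylov (GMRES) solve of each Newton step
              u = u + du
          return u

      print(jfnk_solve(residual, np.array([1.0, 1.0])))   # converges to (1, 2)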

  5. Benchmarking high performance computing architectures with CMS’ skeleton framework

    Science.gov (United States)

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    2017-10-01

    In 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel's Threading Building Blocks library, based on the measured overheads in both memory and CPU on the different technologies benchmarked. In 2016 CMS will get access to high performance computing resources that use new many-core architectures; machines such as Cori Phase 1&2, Theta, and Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.

  6. The role of sophisticated accounting system in strategy management

    OpenAIRE

    Naranjo Gil, David

    2004-01-01

    Organizations are designing more sophisticated accounting information systems to meet the strategic goals and enhance their performance. This study examines the effect of accounting information system design on the performance of organizations pursuing different strategic priorities. The alignment between sophisticated accounting information systems and organizational strategy is analyzed. The enabling effect of the accounting information system on performance is also examined. Relationships ...

  7. Financial Literacy and Financial Sophistication in the Older Population

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S.; Curto, Vilsa

    2017-01-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications. PMID:28553191

  8. Financial Literacy and Financial Sophistication in the Older Population.

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S; Curto, Vilsa

    2014-10-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications.

  9. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    Science.gov (United States)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis together in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparse amount of recorded clinical observations, the high dimensionality of movement, and high variations in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ_i w_i φ_i(x_i), and estimating each attribute normalization function φ_i(·) by integrating distributions of idealized movement and deviated movement. The weights w_i are derived from a therapist's pair-wise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results: the evaluation results are highly correlated to the therapist's observations.
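
    A minimal sketch of the composite score y = Σ_i w_i φ_i(x_i), with assumed normalization functions and weights standing in for those derived from therapist pair-wise comparisons:

      import math

      def phi_speed(x):       # hypothetical normalization of a speed attribute (1 = ideal)
          return math.exp(-((x - 0.8) ** 2) / 0.1)

      def phi_smoothness(x):  # hypothetical normalization of a smoothness attribute
          return 1.0 / (1.0 + math.exp(-10 * (x - 0.5)))

      weights = {"speed": 0.6, "smoothness": 0.4}         # assumed RankSVM-derived weights
      attributes = {"speed": 0.75, "smoothness": 0.62}    # example measured attributes

      y = (weights["speed"] * phi_speed(attributes["speed"])
           + weights["smoothness"] * phi_smoothness(attributes["smoothness"]))
      print(f"composite movement score: {y:.3f}")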

  10. [Computer aided design for fixed partial denture framework based on reverse engineering technology].

    Science.gov (United States)

    Sun, Yu-chun; Lü, Pei-jun; Wang, Yong

    2006-03-01

    To explore a computer-aided design (CAD) route for the framework of a domestic fixed partial denture (FPD) and to identify a suitable method for 3-D CAD. The working area of a dentition model was scanned with a 3-D mechanical scanner. Using reverse engineering (RE) software, margin and border curves were extracted, and several reference curves were created to determine the dimensions and location of the pontic framework, which was taken from the standard database. The shoulder parts of the retainers were created after the axial surfaces were constructed. The connecting areas, axial line, and curved surface of the framework connector were created last. The framework of a three-unit FPD was designed with RE technology, showing smooth surfaces and continuous contours. The design route is practical. The results of this study are significant in theory and practice and provide a reference for establishing a computer-aided design/computer-aided manufacturing (CAD/CAM) system for domestic FPDs.

  11. Commentary on: "Toward Computer-Based Support of Metacognitive Skills: A Computational Framework to Coach Self Explanation"

    Science.gov (United States)

    Conati, Cristina

    2016-01-01

    This paper is a commentary on "Toward Computer-Based Support of Meta-Cognitive Skills: a Computational Framework to Coach Self-Explanation", by Cristina Conati and Kurt Vanlehn, published in the "IJAED" in 2000 (Conati and VanLehn 2010). This work was one of the first examples of Intelligent Learning Environments (ILE) that…

  12. A K-6 Computational Thinking Curriculum Framework : Implications for Teacher Knowledge

    NARCIS (Netherlands)

    Angeli, C.; Voogt, J.; Fluck, A.; Webb, M.; Cox, M.; Malyn-Smith, J.; Zagami, J.

    2016-01-01

    Adding computer science as a separate school subject to the core K-6 curriculum is a complex issue with educational challenges. The authors herein address two of these challenges: (1) the design of the curriculum based on a generic computational thinking framework, and (2) the knowledge teachers

  13. ProjectQ: An Open Source Software Framework for Quantum Computing

    OpenAIRE

    Steiger, Damian S.; Häner, Thomas; Troyer, Matthias

    2016-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through...
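
    A flavour of the Python-embedded domain-specific language can be conveyed with the kind of quickstart example used in the project's documentation; the gate and engine names below follow ProjectQ's published API, though details may differ between releases.

      from projectq import MainEngine
      from projectq.ops import H, Measure

      eng = MainEngine()              # default compiler engine backed by the simulator
      qubit = eng.allocate_qubit()    # allocate one qubit
      H | qubit                       # apply a Hadamard gate
      Measure | qubit                 # measure the qubit
      eng.flush()                     # send all gates to the backend and execute
      print(int(qubit))               # prints 0 or 1 with equal probability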

  14. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  15. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

    2012-12-01

    The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry, and software vendors to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety, and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.

  16. The Impact of Financial Sophistication on Adjustable Rate Mortgage Ownership

    Science.gov (United States)

    Smith, Hyrum; Finke, Michael S.; Huston, Sandra J.

    2011-01-01

    The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…

  17. Temporal locality optimizations for stencil operations for parallel object-oriented scientific frameworks on cache-based architectures

    Energy Technology Data Exchange (ETDEWEB)

    Bassetti, F.; Davis, K.; Quinlan, D.

    1998-12-01

    High-performance scientific computing relies increasingly on high-level large-scale object-oriented software frameworks to manage both algorithmic complexity and the complexities of parallelism: distributed data management, process management, inter-process communication, and load balancing. This encapsulation of data management, together with the prescribed semantics of a typical fundamental component of such object-oriented frameworks--a parallel or serial array-class library--provides an opportunity for increasingly sophisticated compile-time optimization techniques. This paper describes a technique for introducing cache blocking suitable for certain classes of numerical algorithms, demonstrates and analyzes the resulting performance gains, and indicates how this optimization transformation is being automated.
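
    The cache-blocking transformation can be illustrated with a tiled traversal of a 2-D five-point stencil. The frameworks discussed here apply it inside compiled array-class code, so the plain-Python sketch below (with an assumed tile size) only shows the traversal order the optimization produces.

      import numpy as np

      def stencil_blocked(src, block=64):
          n, m = src.shape
          dst = src.copy()
          for bi in range(1, n - 1, block):        # visit the grid tile by tile so each
              for bj in range(1, m - 1, block):    # tile's data stays resident in cache
                  for i in range(bi, min(bi + block, n - 1)):
                      for j in range(bj, min(bj + block, m - 1)):
                          dst[i, j] = 0.25 * (src[i - 1, j] + src[i + 1, j]
                                              + src[i, j - 1] + src[i, j + 1])
          return dst

      a = np.random.rand(256, 256)
      print(stencil_blocked(a).shape)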

  18. Framework of Resource Management for Intercloud Computing

    Directory of Open Access Journals (Sweden)

    Mohammad Aazam

    2014-01-01

    Full Text Available There has been a very rapid increase in digital media content, due to which the media cloud is gaining importance. The cloud computing paradigm provides management of resources and helps create an extended portfolio of services. Through cloud computing, not only are services managed more efficiently, but service discovery is also made possible. To handle the rapid increase in content, the media cloud plays a vital role, but it is not possible for standalone clouds to handle everything as user demands increase. For scalability and better service provisioning, clouds at times have to communicate with other clouds and share their resources. This scenario is called Intercloud computing or cloud federation. Research on Intercloud computing is still in its early stages, and resource management is one of its key concerns. Existing studies address this issue only in a trivial and simplistic way. In this study, we present a resource management model that takes into account different types of services, different customer types, customer characteristics, pricing, and refunding. The presented framework was implemented using Java and NetBeans 8.0 and evaluated using the CloudSim 3.0.3 toolkit. The presented results and their discussion validate our model and its efficiency.

  19. A computer-aided molecular design framework for crystallization solvent design

    DEFF Research Database (Denmark)

    Karunanithi, Arunprakash T.; Achenie, Luke E.K.; Gani, Rafiqul

    2006-01-01

    One of the key decisions in designing solution crystallization processes is the selection of solvents. In this paper, we present a computer-aided molecular design (CAMD) framework for the design and selection of solvents and/or anti-solvents for solution crystallization. The CAMD problem is formulated … solvent molecules. Solvent design and selection for two types of solution crystallization processes, namely cooling crystallization and drowning-out crystallization, are presented. In the first case study, the design of a single-compound solvent for crystallization of ibuprofen, which is an important...

  20. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns…

  1. CernVM Co-Pilot: an Extensible Framework for Building Scalable Cloud Computing Infrastructures

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    CernVM Co-Pilot is a framework for instantiating an ad-hoc computing infrastructure on top of distributed computing resources. Such resources include commercial computing clouds (e.g. Amazon EC2), scientific computing clouds (e.g. CERN lxcloud), as well as the machines of users participating in volunteer computing projects (e.g. BOINC). The framework consists of components that communicate using the Extensible Messaging and Presence protocol (XMPP), allowing for new components to be developed in virtually any programming language and interfaced to existing Grid and batch computing infrastructures exploited by the High Energy Physics community. Co-Pilot has been used to execute jobs for both the ALICE and ATLAS experiments at CERN. CernVM Co-Pilot is also one of the enabling technologies behind the LHC@home 2.0 volunteer computing project, which is the first such project that exploits virtual machine technology. The use of virtual machines eliminates the necessity of modifying existing applications and adapt...

  2. CernVM Co-Pilot: an Extensible Framework for Building Scalable Computing Infrastructures on the Cloud

    Science.gov (United States)

    Harutyunyan, A.; Blomer, J.; Buncic, P.; Charalampidis, I.; Grey, F.; Karneyeu, A.; Larsen, D.; Lombraña González, D.; Lisec, J.; Segal, B.; Skands, P.

    2012-12-01

    CernVM Co-Pilot is a framework for instantiating an ad-hoc computing infrastructure on top of managed or unmanaged computing resources. Co-Pilot can either be used to create a stand-alone computing infrastructure, or to integrate new computing resources into existing infrastructures (such as Grid or batch). Unlike traditional middleware systems, Co-Pilot components communicate using the Extensible Messaging and Presence protocol (XMPP). This allows the system to be easily scaled in case of a high load, and it also simplifies the development of new components. In this contribution we present the latest developments and the current status of the framework, discuss how it can be extended to suit the needs of a particular community, as well as describe the operational experience of using the framework in the LHC@home 2.0 volunteer computing project.

  3. CernVM Co-Pilot: an Extensible Framework for Building Scalable Computing Infrastructures on the Cloud

    International Nuclear Information System (INIS)

    Harutyunyan, A; Blomer, J; Buncic, P; Charalampidis, I; Grey, F; Karneyeu, A; Larsen, D; Lombraña González, D; Lisec, J; Segal, B; Skands, P

    2012-01-01

    CernVM Co-Pilot is a framework for instantiating an ad-hoc computing infrastructure on top of managed or unmanaged computing resources. Co-Pilot can either be used to create a stand-alone computing infrastructure, or to integrate new computing resources into existing infrastructures (such as Grid or batch). Unlike traditional middleware systems, Co-Pilot components communicate using the Extensible Messaging and Presence protocol (XMPP). This allows the system to be easily scaled in case of a high load, and it also simplifies the development of new components. In this contribution we present the latest developments and the current status of the framework, discuss how it can be extended to suit the needs of a particular community, as well as describe the operational experience of using the framework in the LHC@home 2.0 volunteer computing project.

  4. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    Science.gov (United States)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework, for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models, that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  5. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    International Nuclear Information System (INIS)

    Hadjidoukas, P.E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-01-01

    We present Π4U, an extensible framework, for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models, that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  6. Polyphony: A Workflow Orchestration Framework for Cloud Computing

    Science.gov (United States)

    Shams, Khawaja S.; Powell, Mark W.; Crockett, Tom M.; Norris, Jeffrey S.; Rossi, Ryan; Soderstrom, Tom

    2010-01-01

    Cloud Computing has delivered unprecedented compute capacity to NASA missions at affordable rates. Missions like the Mars Exploration Rovers (MER) and Mars Science Lab (MSL) are enjoying the elasticity that enables them to leverage hundreds, if not thousands, of machines for short durations without making any hardware procurements. In this paper, we describe Polyphony, a resilient, scalable, and modular framework that efficiently leverages a large set of computing resources to perform parallel computations. Polyphony can employ resources on the cloud, excess capacity on local machines, as well as spare resources at the supercomputing center, and it enables these resources to work in concert to accomplish a common goal. Polyphony is resilient to node failures, even if they occur in the middle of a transaction. We will conclude with an evaluation of a production-ready application built on top of Polyphony to perform image-processing operations on images from around the solar system, including Mars, Saturn, and Titan.

  7. A computational framework for automation of point defect calculations

    International Nuclear Information System (INIS)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei

    2017-01-01

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
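
    The bookkeeping such a framework automates can be illustrated with the standard formation-energy expression, E_f = E_defect - E_host - Σ_i n_i μ_i + q(E_VBM + E_F) + E_corr, where E_corr bundles the finite-size corrections listed above. The numbers in the sketch below are placeholders, not output of the package.

      def formation_energy(E_defect, E_host, dn_mu, charge, E_fermi, E_vbm, E_corr):
          """E_f = E_defect - E_host - sum_i n_i*mu_i + q*(E_VBM + E_F) + E_corr (eV)."""
          return (E_defect - E_host
                  - sum(n * mu for n, mu in dn_mu)
                  + charge * (E_vbm + E_fermi)
                  + E_corr)

      # hypothetical numbers for an oxygen vacancy with charge q = +2, with the Fermi
      # level referenced to the valence-band maximum (E_vbm = 0)
      Ef = formation_energy(E_defect=-812.34, E_host=-818.90,
                            dn_mu=[(-1, -4.95)],   # one O atom removed at chemical potential mu_O
                            charge=+2, E_fermi=0.8, E_vbm=0.0,
                            E_corr=0.35)
      print(f"defect formation energy: {Ef:.2f} eV")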

  8. A Framework for Debugging Geoscience Projects in a High Performance Computing Environment

    Science.gov (United States)

    Baxter, C.; Matott, L.

    2012-12-01

    High performance computing (HPC) infrastructure has become ubiquitous in today's world with the emergence of commercial cloud computing and academic supercomputing centers. Teams of geoscientists, hydrologists and engineers can take advantage of this infrastructure to undertake large research projects - for example, linking one or more site-specific environmental models with soft computing algorithms, such as heuristic global search procedures, to perform parameter estimation and predictive uncertainty analysis, and/or design least-cost remediation systems. However, the size, complexity and distributed nature of these projects can make identifying failures in the associated numerical experiments using conventional ad-hoc approaches both time-consuming and ineffective. To address these problems, a multi-tiered debugging framework has been developed. The framework allows for quickly isolating and remedying a number of potential experimental failures, including: failures in the HPC scheduler; bugs in the soft computing code; bugs in the modeling code; and permissions and access control errors. The utility of the framework is demonstrated via application to a series of over 200,000 numerical experiments involving a suite of 5 heuristic global search algorithms and 15 mathematical test functions serving as cheap analogues for the simulation-based optimization of pump-and-treat subsurface remediation systems.

  9. Optimizing transformations of stencil operations for parallel object-oriented scientific frameworks on cache-based architectures

    Energy Technology Data Exchange (ETDEWEB)

    Bassetti, F.; Davis, K.; Quinlan, D.

    1998-12-31

    High-performance scientific computing relies increasingly on high-level large-scale object-oriented software frameworks to manage both algorithmic complexity and the complexities of parallelism: distributed data management, process management, inter-process communication, and load balancing. This encapsulation of data management, together with the prescribed semantics of a typical fundamental component of such object-oriented frameworks--a parallel or serial array-class library--provides an opportunity for increasingly sophisticated compile-time optimization techniques. This paper describes two optimizing transformations suitable for certain classes of numerical algorithms, one for reducing the cost of inter-processor communication, and one for improving cache utilization; demonstrates and analyzes the resulting performance gains; and indicates how these transformations are being automated.

  10. Cognitive Load and Strategic Sophistication

    OpenAIRE

    Allred, Sarah; Duffy, Sean; Smith, John

    2013-01-01

    We study the relationship between the cognitive load manipulation and strategic sophistication. The cognitive load manipulation is designed to reduce the subject's cognitive resources that are available for deliberation on a choice. In our experiment, subjects are placed under a large cognitive load (given a difficult number to remember) or a low cognitive load (given a number which is not difficult to remember). Subsequently, the subjects play a one-shot game then they are asked to recall...

  11. A computer-aided framework for development, identification and management of physiologically-based pharmacokinetic models

    DEFF Research Database (Denmark)

    Heitzig, Martina; Linninger, Andreas; Sin, Gürkan

    2014-01-01

    The objective of this work is the development of a generic computer-aided modelling framework to support the development of physiologically-based pharmacokinetic models thereby increasing the efficiency and quality of the modelling process. In particular, the framework systematizes the modelling...

  12. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    Science.gov (United States)

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.
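
    For illustration only, ranking potentials can be inferred from a matrix of dyadic conflict outcomes with a Bradley-Terry-style minorization-maximization update; the sketch below uses hypothetical data for four individuals and is not the authors' three-step procedure.

      import numpy as np

      wins = np.array([[0, 6, 8, 9],      # wins[i, j] = conflicts i won against j
                       [2, 0, 5, 7],      # (hypothetical data)
                       [1, 3, 0, 6],
                       [0, 1, 2, 0]], dtype=float)

      n_dyad = wins + wins.T              # total conflicts per dyad
      p = np.ones(len(wins))              # ranking potentials, initialized equal

      for _ in range(200):                # minorization-maximization iterations
          total_wins = wins.sum(axis=1)
          denom = np.array([sum(n_dyad[i, j] / (p[i] + p[j])
                                for j in range(len(p)) if j != i)
                            for i in range(len(p))])
          p = total_wins / denom
          p /= p.sum()                    # normalize for identifiability

      print("estimated rank order (highest first):", np.argsort(-p))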

  13. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    Directory of Open Access Journals (Sweden)

    Hsieh Fushing

    2011-03-01

    Full Text Available We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.

  14. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    Science.gov (United States)

    Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv

    2018-02-01

    New concepts and techniques are replacing traditional methods of water quality parameter measurement systems. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrated with five different water quality parameter sensor nodes and a soft computing framework for computational modelling. The soft computing framework uses Python for the user interface and fuzzy logic for decision making. The introduction of multiple sensors in a water distribution network generates a large number of data matrices, which are sometimes highly complex, difficult to understand, and convoluted for effective decision making. Therefore, the proposed system framework also aims to simplify the complexity of the obtained sensor data matrices and to support decision making for water engineers through a soft computing framework. The goal of this research is to provide a simple and efficient method to detect the presence of contamination in a water distribution network using a CPS.
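
    A minimal sketch of the fuzzy decision step, with assumed membership functions, thresholds, and rules rather than the model described in the paper:

      def tri(x, a, b, c):
          """Triangular membership function rising from a to b and falling to c."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def contamination_level(ph, turbidity_ntu, chlorine_mg_l):
          ph_bad        = max(tri(ph, 0.0, 4.0, 6.0), tri(ph, 8.5, 10.5, 14.0))
          turbidity_bad = tri(turbidity_ntu, 1.0, 5.0, 50.0) if turbidity_ntu < 5.0 else 1.0
          chlorine_low  = tri(chlorine_mg_l, -0.1, 0.0, 0.2)
          # simple rule aggregation: the alert is as strong as the strongest "bad" evidence
          return max(ph_bad, turbidity_bad, chlorine_low)

      print(contamination_level(ph=6.3, turbidity_ntu=3.2, chlorine_mg_l=0.15))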

  15. Toward Confirming a Framework for Securing the Virtual Machine Image in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Raid Khalid Hussein

    2017-04-01

    Full Text Available The concept of cloud computing has arisen thanks to academic work in the fields of utility computing, distributed computing, virtualisation, and web services. By using cloud computing, which can be accessed from anywhere, newly-launched businesses can minimise their start-up costs. Among the most important notions when it comes to the construction of cloud computing is virtualisation. While this concept brings its own security risks, these risks are not necessarily related to the cloud. The main disadvantage of using cloud computing is linked to safety and security. This is because anybody who chooses to employ cloud computing will use someone else's hard disk and CPU in order to sort and store data. In cloud environments, a great deal of importance is placed on guaranteeing that the virtual machine image is safe and secure. Indeed, a previous study has put forth a framework with which to protect the virtual machine image in cloud computing. As such, the present study is primarily concerned with confirming this theoretical framework so as to ultimately secure the virtual machine image in cloud computing. This will be achieved by carrying out interviews with experts in the field of cloud security.

  16. Moral foundations and political attitudes: The moderating role of political sophistication.

    Science.gov (United States)

    Milesi, Patrizia

    2016-08-01

    Political attitudes can be associated with moral concerns. This research investigated whether people's level of political sophistication moderates this association. Based on the Moral Foundations Theory, this article examined whether political sophistication moderates the extent to which reliance on moral foundations, as categories of moral concerns, predicts judgements about policy positions. With this aim, two studies examined four policy positions shown by previous research to be best predicted by the endorsement of Sanctity, that is, the category of moral concerns focused on the preservation of physical and spiritual purity. The results showed that reliance on Sanctity predicted political sophisticates' judgements, as opposed to those of unsophisticates, on policy positions dealing with equal rights for same-sex and unmarried couples and with euthanasia. Political sophistication also interacted with Fairness endorsement, which includes moral concerns for equal treatment of everybody and reciprocity, in predicting judgements about equal rights for unmarried couples, and interacted with reliance on Authority, which includes moral concerns for obedience and respect for traditional authorities, in predicting opposition to stem cell research. Those findings suggest that, at least for these particular issues, endorsement of moral foundations can be associated with political attitudes more strongly among sophisticates than unsophisticates. © 2015 International Union of Psychological Science.

  17. FILTWAM - A Framework for Online Affective Computing in Serious Games

    NARCIS (Netherlands)

    Bahreini, Kiavash; Westera, Wim; Nadolski, Rob

    2012-01-01

    Bahreini, K., Westera, W., & Nadolski, R. (2012, 29-31 October). FILTWAM - A Framework for Online Affective Computing in Serious Games. Presentation at the 4th International Conference on Games and Virtual Worlds for Serious Applications, VS-GAMES’12, Genoa, Italy.

  18. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    Energy Technology Data Exchange (ETDEWEB)

    Hadjidoukas, P.E.; Angelikopoulos, P. [Computational Science and Engineering Laboratory, ETH Zürich, CH-8092 (Switzerland); Papadimitriou, C. [Department of Mechanical Engineering, University of Thessaly, GR-38334 Volos (Greece); Koumoutsakos, P., E-mail: petros@ethz.ch [Computational Science and Engineering Laboratory, ETH Zürich, CH-8092 (Switzerland)

    2015-03-01

    We present Π4U, an extensible framework, for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models, that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  19. Anomalous Diffusion within the Transcriptome as a Bio-Inspired Computing Framework for Resilience

    Directory of Open Access Journals (Sweden)

    William Seffens

    2017-07-01

    Full Text Available Much of biology-inspired computer science is based on the Central Dogma, as implemented with genetic algorithms or evolutionary computation. That 60-year-old biological principle based on the genome, transcriptome and proteasome is becoming overshadowed by a new paradigm of complex ordered associations and connections between layers of biological entities, such as interactomes, metabolomics, etc. We define a new hierarchical concept as the "Connectosome", and propose new avenues of computational data structures based on a conceptual framework called the "Grand Ensemble", which contains the Central Dogma as a subset. Connectedness and communication within and between living or biology-inspired systems comprise ensembles from which a physical computing system can be conceived. In this framework the delivery of messages is filtered by size and a simple and rapid semantic analysis of their content. This work aims to initiate discussion on the Grand Ensemble in network biology as a representation of a Persistent Turing Machine. This framework, which adds interaction and persistency to the classic Turing-machine model, uses metrics based on resilience that have application to dynamic optimization problem solving in Genetic Programming.

  20. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper
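
    The parallel-evaluation idea, many independent model runs farmed out to worker processes, can be sketched with Python's multiprocessing module. The stand-in simulation function, sample count, and worker count below are assumptions, not the PAPIRUS interface.

      from multiprocessing import Pool
      import random

      def run_simulation(params):
          """Stand-in for one run of the engineering simulation code; returns a scalar response."""
          a, b = params
          return a * 2.0 + b ** 2 + random.gauss(0.0, 0.01)

      if __name__ == "__main__":
          samples = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(64)]
          with Pool(processes=4) as pool:            # the clients: 4 worker processes
              responses = pool.map(run_simulation, samples)
          print(f"mean response over {len(responses)} runs:",
                sum(responses) / len(responses))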

  1. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  2. DEEP: a general computational framework for predicting enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2014-11-05

    Transcription regulation in multicellular eukaryotes is orchestrated by a number of DNA functional elements located at gene regulatory regions. Some regulatory regions (e.g. enhancers) are located far away from the gene they affect. Identification of distal regulatory elements is a challenge for bioinformatics research. Although existing methodologies have increased the number of computationally predicted enhancers, performance inconsistency of computational models across different cell lines, class imbalance within the learning sets, and ad hoc rules for selecting enhancer candidates for supervised learning are some key questions that require further examination. In this study we developed DEEP, a novel ensemble prediction framework. DEEP integrates three components with diverse characteristics that streamline the analysis of enhancer properties in a great variety of cellular conditions. In our method we train many individual classification models that we combine to classify DNA regions as enhancers or non-enhancers. DEEP uses features derived from histone modification marks or attributes coming from sequence characteristics. Experimental results indicate that DEEP performs better than four state-of-the-art methods on the ENCODE data. We report the first computational enhancer prediction results on FANTOM5 data, where DEEP achieves 90.2% accuracy and 90% geometric mean (GM) of specificity and sensitivity across 36 different tissues. We further present results derived using in vivo-derived enhancer data from the VISTA database. DEEP-VISTA, when tested on an independent test set, achieved GM of 80.1% and accuracy of 89.64%. The DEEP framework is publicly available at http://cbrc.kaust.edu.sa/deep/.
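
    A hedged illustration of the ensemble idea described above, using scikit-learn on a hypothetical matrix `X` of histone-mark features and labels `y`; DEEP's actual models and training protocol are described in the paper and are not reproduced here:

```python
# Sketch: combine several individual classifiers into a single enhancer/non-enhancer
# predictor via soft voting. X (regions x histone-mark features) and y (1 = enhancer)
# are hypothetical placeholders for data prepared as described in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))          # placeholder feature matrix
y = rng.integers(0, 2, size=500)        # placeholder labels

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
        ("svm", SVC(probability=True)),
    ],
    voting="soft",                       # average predicted class probabilities
)
scores = cross_val_score(ensemble, X, y, cv=5, scoring="accuracy")
print("cross-validated accuracy:", scores.mean())
```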

  3. DEEP: a general computational framework for predicting enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Kalnis, Panos; Bajic, Vladimir B.

    2014-01-01

    Transcription regulation in multicellular eukaryotes is orchestrated by a number of DNA functional elements located at gene regulatory regions. Some regulatory regions (e.g. enhancers) are located far away from the gene they affect. Identification of distal regulatory elements is a challenge for bioinformatics research. Although existing methodologies have increased the number of computationally predicted enhancers, performance inconsistency of computational models across different cell lines, class imbalance within the learning sets, and ad hoc rules for selecting enhancer candidates for supervised learning are some key questions that require further examination. In this study we developed DEEP, a novel ensemble prediction framework. DEEP integrates three components with diverse characteristics that streamline the analysis of enhancer properties in a great variety of cellular conditions. In our method we train many individual classification models that we combine to classify DNA regions as enhancers or non-enhancers. DEEP uses features derived from histone modification marks or attributes coming from sequence characteristics. Experimental results indicate that DEEP performs better than four state-of-the-art methods on the ENCODE data. We report the first computational enhancer prediction results on FANTOM5 data, where DEEP achieves 90.2% accuracy and 90% geometric mean (GM) of specificity and sensitivity across 36 different tissues. We further present results derived using in vivo-derived enhancer data from the VISTA database. DEEP-VISTA, when tested on an independent test set, achieved GM of 80.1% and accuracy of 89.64%. The DEEP framework is publicly available at http://cbrc.kaust.edu.sa/deep/.

  4. Aristotle and Social-Epistemic Rhetoric: The Systematizing of the Sophistic Legacy.

    Science.gov (United States)

    Allen, James E.

    While Aristotle's philosophical views are more foundational than those of many of the Older Sophists, Aristotle's rhetorical theories inherit and incorporate many of the central tenets ascribed to Sophistic rhetoric, albeit in a more systematic fashion, as represented in the "Rhetoric." However, Aristotle was more than just a rhetorical…

  5. NINJA: a noninvasive framework for internal computer security hardening

    Science.gov (United States)

    Allen, Thomas G.; Thomson, Steve

    2004-07-01

    Vulnerabilities are a growing problem in both the commercial and government sectors. The latest vulnerability information compiled by CERT/CC for the year ending Dec. 31, 2002 reported 4129 vulnerabilities, representing a 100% increase over 2001 [1] (the 2003 report had not been published at the time of this writing). It doesn't take long to realize that the growth rate of vulnerabilities greatly exceeds the rate at which the vulnerabilities can be fixed. It also doesn't take long to realize that our nation's networks are growing less secure at an accelerating rate. As organizations become aware of vulnerabilities they may initiate efforts to resolve them, but quickly realize that the size of the remediation project is greater than their current resources can handle. In addition, many IT tools that suggest solutions to the problems in reality only address "some" of the vulnerabilities, leaving the organization unsecured and back to square one in searching for solutions. This paper proposes an auditing framework called NINJA (an acronym for Network Investigation Notification Joint Architecture) for noninvasive daily scanning/auditing based on common security vulnerabilities that repeatedly occur in a network environment. This framework is used for performing regular audits in order to harden an organization's security infrastructure. The framework is based on the results obtained by the Network Security Assessment Team (NSAT), which emulates adversarial computer network operations for US Air Force organizations. Auditing is the most time consuming factor involved in securing an organization's network infrastructure. The framework discussed in this paper uses existing scripting technologies to maintain a security hardened system at a defined level of performance as specified by the computer security audit team. Mobile agents, which were under development at the time of this writing, are used at a minimum to improve the noninvasiveness of our scans. In general, noninvasive

  6. An Adaptive and Integrated Low-Power Framework for Multicore Mobile Computing

    Directory of Open Access Journals (Sweden)

    Jongmoo Choi

    2017-01-01

    Full Text Available Employing multicore processors in mobile computing, such as smartphones and IoT (Internet of Things) devices, is a double-edged sword. It provides the ample computing capability required by recent intelligent mobile services, including voice recognition, image processing, big data analysis, and deep learning. However, it also consumes a great deal of power, which can create thermal hot spots and put pressure on the limited energy resources of a mobile device. In this paper, we propose a novel framework that integrates two well-known low-power techniques, DPM (Dynamic Power Management) and DVFS (Dynamic Voltage and Frequency Scaling), for energy efficiency in multicore mobile systems. The key feature of the proposed framework is adaptability. By monitoring online resource usage such as CPU utilization and power consumption, the framework can orchestrate diverse DPM and DVFS policies according to workload characteristics. Experiments based on a real implementation using three mobile devices have shown that it can reduce power consumption by 22% to 79%, while having a negligible effect on workload performance.
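
    A minimal sketch of the adaptive orchestration idea, with hypothetical monitoring hooks (`read_utilization`, `read_power`, `apply_policy`) and purely illustrative thresholds; the paper's actual policies and decision rules are not reproduced here:

```python
# Sketch: an adaptive loop that picks a DPM/DVFS policy from monitored utilization and
# power. read_utilization(), read_power() and apply_policy() are hypothetical hooks;
# the thresholds below are illustrative, not the values used in the paper.
import random
import time

def read_utilization():            # hypothetical: fraction of busy time per core
    return [random.random() for _ in range(4)]

def read_power():                  # hypothetical: instantaneous power draw in watts
    return 1.0 + 3.0 * random.random()

def apply_policy(policy):          # hypothetical: would write governor/idle settings
    print("switching to", policy)

def choose_policy(util, power):
    busy = sum(util) / len(util)
    if busy < 0.2:
        return {"dpm": "deep-idle", "dvfs": "powersave"}       # park idle cores, lowest frequency
    if power > 3.0:
        return {"dpm": "core-offline", "dvfs": "conservative"}  # relieve a thermal hot spot
    return {"dpm": "none", "dvfs": "ondemand"}                  # latency-sensitive, keep cores on

if __name__ == "__main__":
    for _ in range(5):             # one monitoring period per iteration
        apply_policy(choose_policy(read_utilization(), read_power()))
        time.sleep(0.1)
```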

  7. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    Directory of Open Access Journals (Sweden)

    J. Bhardwaj

    2018-02-01

    Full Text Available New concepts and techniques are replacing traditional methods of water quality parameter measurement. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrated with five different water quality parameter sensor nodes and a soft computing framework for computational modelling. The soft computing framework uses Python applications for the user interface and fuzzy logic for decision making. Introducing multiple sensors in a water distribution network generates a huge number of data matrices, which are often highly complex, difficult to understand and too convoluted for effective decision making. Therefore, the proposed system framework also aims to reduce the complexity of the obtained sensor data matrices and to support decision making by water engineers through the soft computing framework. The aim of this proposed research is to provide a simple and efficient method to identify and detect the presence of contamination in a water distribution network using CPS applications.
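
    A minimal sketch of the fuzzy decision-making step described above, with purely illustrative membership functions and thresholds (turbidity and residual chlorine are assumed example parameters; the actual rule base of the proposed CPS is not reproduced here):

```python
# Sketch: a tiny Mamdani-style fuzzy rule set that flags contamination risk from two
# sensed parameters. The membership breakpoints are illustrative assumptions only.
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def contamination_risk(turbidity_ntu, chlorine_mg_l):
    turbidity_high = tri(turbidity_ntu, 1.0, 5.0, 10.0)      # "turbidity is high"
    chlorine_low = tri(chlorine_mg_l, -0.1, 0.0, 0.3)        # "residual chlorine is low"
    # Rule 1: IF turbidity is high AND chlorine is low THEN risk is high (min for AND).
    risk_high = min(turbidity_high, chlorine_low)
    # Rule 2: IF turbidity is high OR chlorine is low THEN risk is moderate (max for OR).
    risk_moderate = max(turbidity_high, chlorine_low)
    return {"high": risk_high, "moderate": risk_moderate}

print(contamination_risk(turbidity_ntu=4.2, chlorine_mg_l=0.1))
```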

  8. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation.

    Science.gov (United States)

    Mangado, Nerea; Ceresa, Mario; Duchateau, Nicolas; Kjer, Hans Martin; Vera, Sergio; Dejea Velardo, Hector; Mistrik, Pavel; Paulsen, Rasmus R; Fagertun, Jens; Noailly, Jérôme; Piella, Gemma; González Ballester, Miguel Ángel

    2016-08-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns constitutive parameters to all components of the finite element model. This model can then be used to study in silico the effects of the electrical stimulation of the cochlear implant. Results are shown on a total of 25 models of patients. In all cases, a final mesh suitable for finite element simulations was obtained, in an average time of 94 s. The framework has proven to be fast and robust, and is promising for a detailed prognosis of the cochlear implantation surgery.

  9. plasmaFoam: An OpenFOAM framework for computational plasma physics and chemistry

    Science.gov (United States)

    Venkattraman, Ayyaswamy; Verma, Abhishek Kumar

    2016-09-01

    As emphasized in the 2012 Roadmap for low temperature plasmas (LTP), scientific computing has emerged as an essential tool for the investigation and prediction of the fundamental physical and chemical processes associated with these systems. While several in-house and commercial codes exist, with each having its own advantages and disadvantages, a common framework that can be developed by researchers from all over the world will likely accelerate the impact of computational studies on advances in low-temperature plasma physics and chemistry. In this regard, we present a finite volume computational toolbox to perform high-fidelity simulations of LTP systems. This framework, primarily based on the OpenFOAM solver suite, allows us to enhance our understanding of multiscale plasma phenomena by performing massively parallel, three-dimensional simulations on unstructured meshes using well-established high performance computing tools that are widely used in the computational fluid dynamics community. In this talk, we will present preliminary results obtained using the OpenFOAM-based solver suite with benchmark three-dimensional simulations of microplasma devices including both dielectric and plasma regions. We will also discuss the future outlook for the solver suite.

  10. The predictors of economic sophistication: media, interpersonal communication and negative economic experiences

    NARCIS (Netherlands)

    Kalogeropoulos, A.; Albæk, E.; de Vreese, C.H.; van Dalen, A.

    2015-01-01

    In analogy to political sophistication, it is imperative that citizens have a certain level of economic sophistication, especially in times of heated debates about the economy. This study examines the impact of different influences (media, interpersonal communication and personal experiences) on

  11. VCC-SSF: Service-Oriented Security Framework for Vehicular Cloud Computing

    Directory of Open Access Journals (Sweden)

    Won Min Kang

    2015-02-01

    Full Text Available Recently, as vehicle computing technology has advanced, the paradigm of the vehicle has changed from a simple means of transportation to a smart vehicle for safety and convenience. In addition, the previous functions of the Intelligent Transportation System (ITS), such as traffic accident prevention and providing traffic volume information, have been combined with cloud computing. ITS services provide user-oriented broad services in the Vehicular Cloud Computing (VCC) environment through efficient traffic management, traffic accident prevention, and convenience services. However, existing vehicle services focus on providing services using sensing information inside the vehicle, and the system to provide services through an interface with the external infrastructure is insufficient. In addition, because wireless networks are used in VCC environments, there is a risk of leakage of important information from sensors inside the vehicle, such as driver personal identification and payment information at the time of goods purchase. We propose the VCC Service-oriented Security Framework (VCC-SSF) to address the limitations and security threats of VCC-based services. The proposed framework considers security for convenient and efficient VCC services and includes new user-oriented payment management and active accident management services. Furthermore, it provides authentication, encryption, access control, confidentiality, integrity, and privacy protection for user personal information and information inside the vehicle.

  12. Operator Decomposition Framework for Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Abdel-Khalik, Hany S.; Wang, Congjian; Bang, Young Suk [North Carolina State University, Raleigh (United States)

    2012-05-15

    This summary describes a new framework for perturbation theory intended to improve its performance, in terms of the associated computational cost and the complexity of implementation, for routine reactor calculations in support of design, analysis, and regulation. Since its first introduction in reactor analysis by Wigner, perturbation theory has assumed an aura of sophistication with regard to its implementation and its capabilities. Only a few reactor physicists, typically mathematically proficient, have contributed to its development, with the general body of the nuclear engineering community remaining unaware of its current status, capabilities, and challenges. Given its perceived sophistication and the small body of community users, the application of perturbation theory has been limited to investigatory analyses only. It is safe to say that the nuclear community is split into two groups: a small one which understands the theory, and a much bigger group with the perceived notion that perturbation theory is nothing but a fancy mathematical approach that has very little use in practice. Over the past three years, research has pursued two goals. First, reduce the computational cost of perturbation theory in order to enable its use for routine reactor calculations. Second, expose some of the myths about perturbation theory and present it in a form that is simple and relatable in order to stimulate the interest of nuclear practitioners, especially those who are currently working on the development of next generation reactor design and analysis tools. The operator decomposition approach has its roots in linear algebra and can be easily understood by code developers, especially those involved in the design of iterative numerical solution strategies

  13. PAUL AND SOPHISTIC RHETORIC: A PERSPECTIVE ON HIS ...

    African Journals Online (AJOL)

    use of modern rhetorical theories but analyses the letter in terms of the clas- ..... If a critical reader would have had the traditional anti-sophistic arsenal ..... pressions and that 'rhetoric' is mainly a matter of communicating these thoughts.

  14. Computational Thermodynamics and Kinetics-Based ICME Framework for High-Temperature Shape Memory Alloys

    Science.gov (United States)

    Arróyave, Raymundo; Talapatra, Anjana; Johnson, Luke; Singh, Navdeep; Ma, Ji; Karaman, Ibrahim

    2015-11-01

    Over the last decade, considerable interest in the development of High-Temperature Shape Memory Alloys (HTSMAs) for solid-state actuation has increased dramatically as key applications in the aerospace and automotive industry demand actuation temperatures well above those of conventional SMAs. Most of the research to date has focused on establishing the (forward) connections between chemistry, processing, (micro)structure, properties, and performance. Much less work has been dedicated to the development of frameworks capable of addressing the inverse problem of establishing necessary chemistry and processing schedules to achieve specific performance goals. Integrated Computational Materials Engineering (ICME) has emerged as a powerful framework to address this problem, although it has yet to be applied to the development of HTSMAs. In this paper, the contributions of computational thermodynamics and kinetics to ICME of HTSMAs are described. Some representative examples of the use of computational thermodynamics and kinetics to understand the phase stability and microstructural evolution in HTSMAs are discussed. Some very recent efforts at combining both to assist in the design of HTSMAs and limitations to the full implementation of ICME frameworks for HTSMA development are presented.

  15. Complexity and Intensionality in a Type-1 Framework for Computable Analysis

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2005-01-01

    This paper describes a type-1 framework for computable analysis designed to facilitate efficient implementations and discusses properties that have not been well studied before for type-1 approaches: the introduction of complexity measures for type-1 representations of real functions, and ways...

  16. Isocratean Discourse Theory and Neo-Sophistic Pedagogy: Implications for the Composition Classroom.

    Science.gov (United States)

    Blair, Kristine L.

    With the recent interest in the fifth century B.C. theories of Protagoras and Gorgias come assumptions about the philosophical affinity of the Greek educator Isocrates to this pair of older sophists. Isocratean education in discourse, with its emphasis on collaborative political discourse, falls within recent definitions of a sophist curriculum.…

  17. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [Michigan State Univ., East Lansing, MI (United States); Mokhov, Nikolai [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Niita, Koji [Research Organization for Information Science and Technology, Ibaraki-ken (Japan)

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90 or C. The module is largely independent of the radiation transport codes it is used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows calculations to be restarted from a saved checkpoint file. The checkpoint facility can be used in single process calculations as well as in the parallel regime. The framework corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.

  18. An Open Source Computational Framework for Uncertainty Quantification of Plasma Chemistry Models

    OpenAIRE

    Zaheri Sarabi, Shadi

    2017-01-01

    The current thesis deals with the development of a computational framework for performing plasma chemistry simulations and their uncertainty quantification analysis by suitably combining and extending existing open source computational tools. A plasma chemistry solver is implemented in the OpenFOAM C++ solver suite. The OpenFOAM plasma chemistry application solves the species conservation equations and the electron energy equation by accounting suitably for various production and loss terms b...

  19. The Unlock Project: a Python-based framework for practical brain-computer interface communication "app" development.

    Science.gov (United States)

    Brumberg, Jonathan S; Lorenz, Sean D; Galbraith, Byron V; Guenther, Frank H

    2012-01-01

    In this paper we present a framework for reducing the development time needed for creating applications for use in non-invasive brain-computer interfaces (BCI). Our framework is primarily focused on facilitating rapid software "app" development akin to current efforts in consumer portable computing (e.g. smart phones and tablets). This is accomplished by handling intermodule communication without direct user or developer implementation, instead relying on a core subsystem for communication of standard, internal data formats. We also provide a library of hardware interfaces for common mobile EEG platforms for immediate use in BCI applications. A use-case example is described in which a user with amyotrophic lateral sclerosis participated in an electroencephalography-based BCI protocol developed using the proposed framework. We show that our software environment is capable of running in real-time with updates occurring 50-60 times per second with limited computational overhead (5 ms system lag) while providing accurate data acquisition and signal analysis.

  20. Computer-Aided Chemical Product Design Framework: Design of High Performance and Environmentally Friendly Refrigerants

    DEFF Research Database (Denmark)

    Cignitti, Stefano; Zhang, Lei; Gani, Rafiqul

    properties and needs should carefully be selected for a given heat pump cycle to ensure that an optimum refrigerant is found? How can cycle performance and environmental criteria be integrated at the product design stage and not in post-design analysis? Computer-aided product design methods enable the possibility of designing novel molecules, mixtures and blends, such as refrigerants, through a systematic framework (Cignitti et al., 2015; Yunus et al., 2014). In this presentation a computer-aided framework is presented for chemical product design through mathematical optimization. Here, molecules, mixtures and blends are systematically designed through a decomposition-based solution method. Given a problem definition, a computer-aided molecular design (CAMD) problem is defined, which is formulated into a mixed integer nonlinear program (MINLP). The decomposed solution method then sequentially divides the MINLP

  1. Arcade: A Web-Java Based Framework for Distributed Computing

    Science.gov (United States)

    Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.

  2. A framework for different levels of integration of computational models into web-based virtual patients.

    Science.gov (United States)

    Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-23

    Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients' interactivity by enriching them with computational models of physiological and pathological processes. The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the model. The third element is the

  3. Dual-scan technique for the customization of zirconia computer-aided design/computer-aided manufacturing frameworks.

    Science.gov (United States)

    Andreiuolo, Rafael Ferrone; Sabrosa, Carlos Eduardo; Dias, Katia Regina H Cervantes

    2013-09-01

    The use of bi-layered all-ceramic crowns has continuously grown since the introduction of computer-aided design/computer-aided manufacturing (CAD/CAM) zirconia cores. Unfortunately, despite the outstanding mechanical properties of zirconia, problems related to porcelain cracking or chipping remain. One of the reasons for this is that ceramic copings are usually milled to uniform thicknesses of 0.3-0.6 mm around the whole tooth preparation. This may not provide uniform thickness or appropriate support for the veneering porcelain. To prevent these problems, the dual-scan technique demonstrates an alternative that allows the restorative team to customize zirconia CAD/CAM frameworks with adequate porcelain thickness and support in a simple manner.

  4. ProjectQ: an open source software framework for quantum computing

    Directory of Open Access Journals (Sweden)

    Damian S. Steiger

    2018-01-01

    Full Text Available We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through simulation and enables running them on actual quantum hardware using a back-end connecting to the IBM Quantum Experience cloud service. Through extension mechanisms, users can provide back-ends to further quantum hardware, and scientists working on quantum compilation can provide plug-ins for additional compilation, optimization, gate synthesis, and layout strategies.
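
    As a flavor of the Python-embedded domain-specific language, a minimal sketch of preparing and measuring a Bell pair with ProjectQ's default simulator back-end is shown below (modelled on the canonical example pattern from the ProjectQ documentation; treat the exact API details as indicative):

```python
# Sketch: allocate two qubits, entangle them, measure, and read out classical results
# using ProjectQ's default compiler engine and built-in simulator back-end.
from projectq import MainEngine
from projectq.ops import CNOT, H, Measure

eng = MainEngine()               # default compiler chain + simulator back-end
q1 = eng.allocate_qubit()
q2 = eng.allocate_qubit()

H | q1                           # put the first qubit into superposition
CNOT | (q1, q2)                  # entangle the pair
Measure | q1
Measure | q2

eng.flush()                      # send the (compiled) circuit to the back-end
print("measured:", int(q1), int(q2))   # outcomes are correlated: 00 or 11
```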

  5. Factors That Influence Adoption of Cloud Computing: An Empirical Study of Australian SMEs

    Directory of Open Access Journals (Sweden)

    Ishan Senarathna

    2018-06-01

    Full Text Available Cloud computing is a recent computing paradigm enabling organizations to have access to sophisticated computing services via the Internet on a fee-for-service basis. It provides Small and Medium-sized Enterprises (SMEs) with opportunities to become as technologically advanced as their larger counterparts, without significant financial outlays. This paper examined the important factors that influence SMEs' adoption of cloud computing technology. Drawing upon aspects of the Technology, Organization and Environment framework and Diffusion of Innovation Theory, we developed a research model of SMEs' adoption of cloud computing and tested it through an online survey of 149 Australian SMEs. Data were analyzed using multiple regression methods, with results showing that SMEs were influenced by factors related to advantaging their organizational capability (i.e., relative advantage, quality of service and awareness) rather than risk-related factors (i.e., security, privacy and flexibility). The findings offer insights to SME owners, cloud service providers and government in establishing cloud computing adoption strategies for SMEs.

  6. Hydra-Ring: a computational framework to combine failure probabilities

    Science.gov (United States)

    Diermanse, Ferdinand; Roscoe, Kathryn; IJmker, Janneke; Mens, Marjolein; Bouwer, Laurens

    2013-04-01

    This presentation discusses the development of a new computational framework for the safety assessment of flood defence systems: Hydra-Ring. Hydra-Ring computes the failure probability of a flood defence system, which is composed of a number of elements (e.g., dike segments, dune segments or hydraulic structures), taking all relevant uncertainties explicitly into account. This is a major step forward in comparison with the current Dutch practice, in which the safety assessment is done separately per individual flood defence section. The main advantage of the new approach is that it will result in a more balanced prioritization of required mitigating measures ('more value for money'). Failure of the flood defence system occurs if any element within the system fails. Hydra-Ring thus computes and combines failure probabilities of the following elements: - Failure mechanisms: a flood defence system can fail due to different failure mechanisms. - Time periods: failure probabilities are first computed for relatively small time scales (assessment of flood defense systems, Hydra-Ring can also be used to derive fragility curves, to assess the efficiency of flood mitigating measures, and to quantify the impact of climate change and land subsidence on flood risk. Hydra-Ring is being developed in the context of the Dutch situation. However, the computational concept is generic and the model is set up in such a way that it can be applied to other areas as well. The presentation will focus on the model concept and probabilistic computation techniques.
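
    For illustration, a minimal sketch of combining element failure probabilities into a system failure probability under an independence assumption; Hydra-Ring itself accounts for dependencies between elements, mechanisms and time periods, which this toy calculation ignores:

```python
# Sketch: "the system fails if any element fails". Under the simplifying assumption
# that element failures are independent, the system failure probability is
#   P_sys = 1 - prod_i (1 - p_i).
# Hydra-Ring combines probabilities while modelling dependencies; this toy version does not.
import math

def system_failure_probability(element_probabilities):
    survive = math.prod(1.0 - p for p in element_probabilities)
    return 1.0 - survive

# Example: three dike segments and one hydraulic structure with annual failure probabilities.
elements = [1e-4, 5e-5, 2e-4, 1e-3]
print(f"system failure probability: {system_failure_probability(elements):.3e}")
```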

  7. Cognitive ability rivals the effect of political sophistication on ideological voting

    DEFF Research Database (Denmark)

    Hebbelstrup Rye Rasmussen, Stig

    2016-01-01

    This article examines the impact of cognitive ability on ideological voting. We find, using a US sample and a Danish sample, that the effect of cognitive ability rivals the effect of the traditionally strongest predictor of ideological voting, political sophistication. Furthermore, the results are consistent with the effect of cognitive ability being partly mediated by political sophistication. Much of the effect of cognitive ability remains, however, and is not explained by differences in education or Openness to Experience either. The implications of these results for democratic theory are discussed.

  8. A comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements.

    Science.gov (United States)

    Abdelgaied, A; Fisher, J; Jennings, L M

    2018-02-01

    A more robust pre-clinical wear simulation framework is required in order to simulate the wider and higher ranges of activities observed in different patient populations, such as younger, more active patients. Such a framework will help to understand and address the reported higher failure rates for younger and more active patients (National Joint Registry, 2016). The current study has developed and validated a comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements (TKR). The input mechanical (elastic modulus and Poisson's ratio) and wear parameters of the moderately cross-linked ultra-high molecular weight polyethylene (UHMWPE) bearing material were independently measured in experimental studies under realistic test conditions, similar to the loading conditions found in total knee replacements. The wear predictions from the computational wear simulation were validated against direct experimental wear measurements for size 3 Sigma curved total knee replacements (DePuy, UK) in an independent experimental wear simulation study under three different daily activities: walking, deep squat, and stair ascent kinematic conditions. The measured compressive mechanical properties of the moderately cross-linked UHMWPE material were more than 20% lower than those reported in the literature under tensile test conditions. The pin-on-plate wear coefficient of moderately cross-linked UHMWPE was significantly dependent on the contact stress and the degree of cross-shear at the articulating surfaces. The computational wear predictions for the TKR from the current framework were consistent and in good agreement with the independent full TKR experimental wear simulation measurements, with a coefficient of determination of 0.94 for the framework. In addition, the comprehensive combined experimental and computational framework was able to explain the complex experimental wear trends from the three different daily

  9. Application of computer-aided multi-scale modelling framework – Aerosol case study

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Glarborg, Peter

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy and water. This trend is set to continue due to the substantial benefits computer-aided methods provide. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task involving numerous steps, expert skills and different modelling tools. This motivates the development of a computer-aided modelling framework that supports the user during model development, documentation, analysis, identification, application and re-use with the goal to increase the efficiency of the modelling

  10. A Framework for Modeling Competitive and Cooperative Computation in Retinal Processing

    Science.gov (United States)

    Moreno-Díaz, Roberto; de Blasio, Gabriel; Moreno-Díaz, Arminda

    2008-07-01

    The structure of the retina suggests that it should be treated (at least from the computational point of view) as a layered computer. Different retinal cells contribute to the coding of the signals down to the ganglion cells. Also, because of the nature of the specialization of some ganglion cells, the structure suggests that all these specialization processes should take place at the inner plexiform layer and that they should be of a local character, prior to a global integration and frequency-spike coding by the ganglion cells. The framework we propose consists of a layered computational structure, where outer layers provide essentially band-pass space-time filtered signals which are progressively delayed, at least for their formal treatment. Specialization is supposed to take place at the inner plexiform layer by the action of spatio-temporal microkernels (acting very locally) having a center-periphery space-time structure. The resulting signals are then integrated by the ganglion cells through macrokernel structures. Practically all types of specialization found in different vertebrate retinas, as well as the quasilinear behavior in some higher vertebrates, can be modeled and simulated within this framework. Finally, possible feedback from central structures is considered. Though its relevance to retinal processing is not definitive, it is included here for the sake of completeness, since it is a formal requisite for recursiveness.
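
    As an illustration of the center-periphery microkernels mentioned above, a small difference-of-Gaussians spatial kernel is sketched below (a standard idealization of center-surround receptive fields; the parameters are illustrative and not taken from the paper):

```python
# Sketch: a center-surround (difference-of-Gaussians) spatial kernel, a common
# idealization of the local "microkernels" described in the abstract. Parameters
# (sigma_center, sigma_surround, surround weight) are illustrative only.
import numpy as np

def dog_kernel(size=15, sigma_center=1.0, sigma_surround=3.0, surround_weight=0.9):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx**2 + yy**2
    center = np.exp(-r2 / (2 * sigma_center**2)) / (2 * np.pi * sigma_center**2)
    surround = np.exp(-r2 / (2 * sigma_surround**2)) / (2 * np.pi * sigma_surround**2)
    return center - surround_weight * surround

kernel = dog_kernel()
print("kernel sum (≈0 means balanced excitation/inhibition):", kernel.sum().round(4))
```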

  11. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    Science.gov (United States)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a

  12. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  13. A Framework for Security Transparency in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Umar Mukhtar Ismail

    2016-02-01

    Full Text Available Individuals and corporate users are persistently considering cloud adoption due to its significant benefits compared to traditional computing environments. The data and applications in the cloud are stored in an environment that is separated, managed and maintained externally to the organisation. Therefore, it is essential for cloud providers to demonstrate and implement adequate security practices to protect the data and processes put under their stewardship. Security transparency in the cloud is likely to become the core theme that underpins the systematic disclosure of security designs and practices that enhance customer confidence in using cloud service and deployment models. In this paper, we present a framework that enables a detailed analysis of security transparency for cloud based systems. In particular, we consider security transparency from three different levels of abstraction, i.e., conceptual, organisation and technical levels, and identify the relevant concepts within these levels. This allows us to provide an elaboration of the essential concepts at the core of transparency and analyse the means for implementing them from a technical perspective. Finally, an example from a real world migration context is given to provide a solid discussion on the applicability of the proposed framework.

  14. A framework using cluster-based hybrid network architecture for collaborative virtual surgery.

    Science.gov (United States)

    Qin, Jing; Choi, Kup-Sze; Poon, Wai-Sang; Heng, Pheng-Ann

    2009-12-01

    Research on collaborative virtual environments (CVEs) opens the opportunity for simulating the cooperative work in surgical operations. It is however a challenging task to implement a high performance collaborative surgical simulation system because of the difficulty in maintaining state consistency with minimum network latencies, especially when sophisticated deformable models and haptics are involved. In this paper, an integrated framework using cluster-based hybrid network architecture is proposed to support collaborative virtual surgery. Multicast transmission is employed to transmit updated information among participants in order to reduce network latencies, while system consistency is maintained by an administrative server. Reliable multicast is implemented using distributed message acknowledgment based on cluster cooperation and sliding window technique. The robustness of the framework is guaranteed by the failure detection chain which enables smooth transition when participants join and leave the collaboration, including normal and involuntary leaving. Communication overhead is further reduced by implementing a number of management approaches such as computational policies and collaborative mechanisms. The feasibility of the proposed framework is demonstrated by successfully extending an existing standalone orthopedic surgery trainer into a collaborative simulation system. A series of experiments have been conducted to evaluate the system performance. The results demonstrate that the proposed framework is capable of supporting collaborative surgical simulation.

  15. Master of Puppets: An Animation-by-Demonstration Computer Puppetry Authoring Framework

    Science.gov (United States)

    Cui, Yaoyuan; Mousas, Christos

    2018-03-01

    This paper presents Master of Puppets (MOP), an animation-by-demonstration framework that allows users to control the motion of virtual characters (puppets) in real time. In the first step, the user is asked to perform the necessary actions that correspond to the character's motions. The user's actions are recorded, and a hidden Markov model is used to learn the temporal profile of the actions. During the runtime of the framework, the user controls the motions of the virtual character based on the specified activities. The advantage of the MOP framework is that it recognizes and follows the progress of the user's actions in real time. Based on the forward algorithm, the method predicts the evolution of the user's actions, which corresponds to the evolution of the character's motion. This method treats characters as puppets that can perform only one motion at a time. This means that combinations of motion segments (motion synthesis), as well as the interpolation of individual motion sequences, are not provided as functionalities. By implementing the framework and presenting several computer puppetry scenarios, its efficiency and flexibility in animating virtual characters is demonstrated.
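
    The progress tracking relies on the hidden Markov model forward algorithm; a generic numpy sketch of that algorithm is given below (not the MOP implementation, and with made-up model matrices):

```python
# Sketch: the HMM forward algorithm used to follow the progress of a demonstrated
# action in real time. A, B and pi are illustrative placeholders; in MOP they would be
# learned from the user's recorded demonstrations.
import numpy as np

A = np.array([[0.8, 0.2, 0.0],          # transition probabilities between action phases
              [0.0, 0.8, 0.2],
              [0.0, 0.0, 1.0]])
B = np.array([[0.7, 0.2, 0.1],          # observation likelihoods per phase
              [0.2, 0.6, 0.2],
              [0.1, 0.2, 0.7]])
pi = np.array([1.0, 0.0, 0.0])          # the action always starts in phase 0

def forward_step(alpha, obs_symbol):
    """One online update: predict with A, correct with the new observation, normalize."""
    alpha = (alpha @ A) * B[:, obs_symbol]
    return alpha / alpha.sum()

alpha = pi * B[:, 0]
alpha /= alpha.sum()
for obs in [0, 0, 1, 1, 2, 2]:          # a stream of quantized sensor observations
    alpha = forward_step(alpha, obs)
    print("most likely phase:", int(np.argmax(alpha)), alpha.round(3))
```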

  16. The Relationship between Logistics Sophistication and Drivers of the Outsourcing of Logistics Activities

    Directory of Open Access Journals (Sweden)

    Peter Wanke

    2008-10-01

    Full Text Available A strong link has been established between operational excellence and the degree of sophistication of the logistics organization, a function of factors such as performance monitoring, investment in Information Technology [IT] and the formalization of the logistics organization, as proposed in the Bowersox, Daugherty, Dröge, Germain and Rogers (1992) Leading Edge model. At the same time, shippers have been increasingly outsourcing their logistics activities to third party providers. This paper, based on a survey of large Brazilian shippers, addresses a gap in the literature by investigating the relationship between dimensions of logistics organization sophistication and drivers of logistics outsourcing. To this end, the dimensions behind the logistics sophistication construct were first investigated. Results from factor analysis led to the identification of six dimensions of logistics sophistication. By means of multivariate logistic regression analyses it was possible to relate some of these dimensions, such as the formalization of the logistics organization, to certain drivers of the outsourcing of logistics activities of Brazilian shippers, such as cost savings. These results indicate the possibility of segmenting shippers according to characteristics of their logistics organization, which may be particularly useful to logistics service providers.

  17. Lexical Complexity Development from Dynamic Systems Theory Perspective: Lexical Density, Diversity, and Sophistication

    Directory of Open Access Journals (Sweden)

    Reza Kalantari

    2017-10-01

    Full Text Available This longitudinal case study explored Iranian EFL learners' lexical complexity (LC) through the lenses of Dynamic Systems Theory (DST). Fifty independent essays written by five intermediate to advanced female EFL learners in a TOEFL iBT preparation course over six months constituted the corpus of this study. Three Coh-Metrix indices (Graesser, McNamara, Louwerse, & Cai, 2004; McNamara & Graesser, 2012), three Lexical Complexity Analyzer indices (Lu, 2010, 2012; Lu & Ai, 2011), and four Vocabprofile indices (Cobb, 2000) were selected to measure different dimensions of LC. Results of repeated measures analysis of variance (RM ANOVA) indicated an improvement with regard to only lexical sophistication. Positive and significant relationships were found between time and mean values in the Academic Word List and Beyond-2000 as indicators of lexical sophistication. The remaining seven indices of LC, falling short of significance, tended to flatten over the course of this writing program. Correlation analyses among LC indices indicated that lexical density enjoyed positive correlations with lexical sophistication. However, lexical diversity revealed no significant correlations with either lexical density or lexical sophistication. This study suggests that the DST perspective provides a viable foundation for analyzing lexical complexity.
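
    For readers unfamiliar with the three constructs, a simplified sketch of how lexical density, diversity, and sophistication can be computed from a tokenized essay is given below; the study itself used Coh-Metrix, the Lexical Complexity Analyzer, and Vocabprofile rather than these toy definitions:

```python
# Sketch: simplified versions of the three lexical complexity dimensions.
#   density        = content words / all tokens
#   diversity      = unique word types / all tokens (type-token ratio)
#   sophistication = tokens outside a basic high-frequency word list / all tokens
# The word lists here are tiny placeholders, not the instruments used in the study.
FUNCTION_WORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "are", "was", "it"}
BASIC_WORDS = FUNCTION_WORDS | {"good", "people", "time", "make", "think", "very"}

def lexical_indices(text):
    tokens = [t.lower().strip(".,;:!?") for t in text.split() if t.strip(".,;:!?")]
    types = set(tokens)
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    sophisticated = [t for t in tokens if t not in BASIC_WORDS]
    n = len(tokens)
    return {
        "density": len(content) / n,
        "diversity": len(types) / n,
        "sophistication": len(sophisticated) / n,
    }

print(lexical_indices("The committee postponed the decision because the evidence was inconclusive."))
```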

  18. Autonomic computing meets SCADA security

    OpenAIRE

    Nazir, S; Patel, S; Patel, D

    2017-01-01

    © 2017 IEEE. National assets such as transportation networks, large manufacturing, business and health facilities, power generation, and distribution networks are critical infrastructures. The cyber threats to these infrastructures have become increasingly sophisticated, extensive and numerous. Conventional cyber security measures have proved useful in the past, but the increasing sophistication of attacks dictates the need for newer measures. The autonomic computing paradigm mimics the auton...

  19. A framework for model-based optimization of bioprocesses under uncertainty: Identifying critical parameters and operating variables

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development and application of a systematic model-based framework for bioprocess optimization, evaluated on a cellulosic ethanol production case study. The implementation of the framework involves the use of dynamic simulations, sophisticated uncertainty analysis (Monte...

  20. Comparative Evaluation of a Four-Implant-Supported Polyetherketoneketone Framework Prosthesis: A Three-Dimensional Finite Element Analysis Based on Cone Beam Computed Tomography and Computer-Aided Design.

    Science.gov (United States)

    Lee, Ki-Sun; Shin, Sang-Wan; Lee, Sang-Pyo; Kim, Jong-Eun; Kim, Jee-Hwan; Lee, Jeong-Yol

    The purpose of this pilot study was to evaluate and compare polyetherketoneketone (PEKK) with different framework materials for implant-supported prostheses by means of a three-dimensional finite element analysis (3D-FEA) based on cone beam computed tomography (CBCT) and computer-aided design (CAD) data. A geometric model that consisted of four maxillary implants supporting a prosthesis framework was constructed from CBCT and CAD data of a treated patient. Three different materials (zirconia, titanium, and PEKK) were selected, and their material properties were simulated using FEA software in the generated geometric model. In the PEKK framework (ie, low elastic modulus) group, the stress transferred to the implant and simulated adjacent tissue was reduced when compressive stress was dominant, but increased when tensile stress was dominant. This study suggests that the shock-absorbing effects of a resilient implant-supported framework are limited in some areas and that rigid framework material shows a favorable stress distribution and safety of overall components of the prosthesis.

  1. Conceptual Framework for Using Computers to Enhance Employee Engagement in Large Offices

    Science.gov (United States)

    Gill, Rob

    2010-01-01

    Using computers to engage with staff members on their organization's Employer of Choice (EOC) program as part of a human resource development (HRD) framework can add real value to that organization's reputation. EOC is an evolving principle for Australian business. It reflects the value and importance organizations place on their key stakeholders,…

  2. A computational framework for investigating the positional stability of aortic endografts.

    Science.gov (United States)

    Prasad, Anamika; Xiao, Nan; Gong, Xiao-Yan; Zarins, Christopher K; Figueroa, C Alberto

    2013-10-01

    Endovascular aneurysm repair (EVAR) techniques (Greenhalgh, N Engl J Med 362(20):1863-1871, 2010) have revolutionized the treatment of thoracic and abdominal aortic aneurysm disease, greatly reducing the perioperative mortality and morbidity associated with open surgical repair techniques. However, EVAR is not free of important complications, such as late device migration, endoleak formation and fracture of device components, that may result in adverse events such as aneurysm enlargement, the need for long-term imaging surveillance and secondary interventions, or even death. These complications result from the device's inability to withstand the hemodynamics of blood flow and to keep its originally intended post-operative position over time. Understanding the in vivo biomechanical working environment experienced by endografts is a critical factor in improving their long-term performance. To date, no study has investigated the mechanics of contact between device and aorta in a three-dimensional setting. In this work, we developed a comprehensive Computational Solid Mechanics and Computational Fluid Dynamics framework to investigate the mechanics of endograft positional stability. The main building blocks of this framework are: (1) three-dimensional non-planar aortic and stent-graft geometrical models, (2) realistic multi-material constitutive laws for aorta, stent, and graft, (3) physiological values for blood flow and pressure, and (4) a frictional model to describe the contact between the endograft and the aorta. We introduce a new metric for numerical quantification of the positional stability of the endograft. Lastly, in the results section, we test the framework by investigating the impact of several factors that are clinically known to affect endograft stability.

  3. Exploring methodological frameworks for a mental task-based near-infrared spectroscopy brain-computer interface.

    Science.gov (United States)

    Weyand, Sabine; Takehara-Nishiuchi, Kaori; Chau, Tom

    2015-10-30

    Near-infrared spectroscopy (NIRS) brain-computer interfaces (BCIs) enable users to interact with their environment using only cognitive activities. This paper presents the results of a comparison of four methodological frameworks used to select a pair of tasks to control a binary NIRS-BCI; specifically, three novel personalized task paradigms and the state-of-the-art prescribed task framework were explored. Three types of personalized task selection approaches were compared, including: user-selected mental tasks using weighted slope scores (WS-scores), user-selected mental tasks using pair-wise accuracy rankings (PWAR), and researcher-selected mental tasks using PWAR. These paradigms, along with the state-of-the-art prescribed mental task framework, where mental tasks are selected based on the most commonly used tasks in literature, were tested by ten able-bodied participants who took part in five NIRS-BCI sessions. The frameworks were compared in terms of their accuracy, perceived ease-of-use, computational time, user preference, and length of training. Most notably, researcher-selected personalized tasks resulted in significantly higher accuracies, while user-selected personalized tasks resulted in significantly higher perceived ease-of-use. It was also concluded that PWAR minimized the amount of data that needed to be collected; while, WS-scores maximized user satisfaction and minimized computational time. In comparison to the state-of-the-art prescribed mental tasks, our findings show that overall, personalized tasks appear to be superior to prescribed tasks with respect to accuracy and perceived ease-of-use. The deployment of personalized rather than prescribed mental tasks ought to be considered and further investigated in future NIRS-BCI studies. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. A collaborative computing framework of cloud network and WBSN applied to fall detection and 3-D motion reconstruction.

    Science.gov (United States)

    Lai, Chin-Feng; Chen, Min; Pan, Jeng-Shyang; Youn, Chan-Hyun; Chao, Han-Chieh

    2014-03-01

    As cloud computing and wireless body sensor network technologies gradually mature, ubiquitous healthcare services can prevent accidents instantly and effectively, as well as provide relevant information to reduce related processing time and cost. This study proposes a co-processing intermediary framework that integrates cloud and wireless body sensor networks, applied mainly to fall detection and 3-D motion reconstruction. The main focuses of this study include distributed computing and resource allocation for processing sensing data over the computing architecture, network conditions, and performance evaluation. Through this framework, the transmission and computing times of sensing data are reduced to enhance overall performance of the fall detection and 3-D motion reconstruction services.

  5. Library of sophisticated functions for analysis of nuclear spectra

    Science.gov (United States)

    Morháč, Miroslav; Matoušek, Vladislav

    2009-10-01

    In the paper we present a compact library for analysis of nuclear spectra. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting. The functions can process one- and two-dimensional spectra. The software described in the paper comprises a number of conventional as well as newly developed methods needed to analyze experimental data. Program summary: Program title: SpecAnalysLib 1.1 Catalogue identifier: AEDZ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 42 154 No. of bytes in distributed program, including test data, etc.: 2 379 437 Distribution format: tar.gz Programming language: C++ Computer: Pentium 3 PC 2.4 GHz or higher, Borland C++ Builder v. 6. A precompiled Windows version is included in the distribution package Operating system: Windows 32 bit versions RAM: 10 MB Word size: 32 bits Classification: 17.6 Nature of problem: The demand for advanced, highly effective experimental data analysis functions is enormous. The library package represents one approach to give physicists the possibility to use the advanced routines simply by calling them from their own programs. SpecAnalysLib is a collection of functions for analysis of one- and two-parameter γ-ray spectra, but they can be used for other types of data as well. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting. Solution method: The algorithms of background estimation are based on the Sensitive Non-linear Iterative Peak (SNIP) clipping algorithm. The smoothing algorithms are based on the convolution of the original data with several types of filters and algorithms based on discrete
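
    As an illustration of the background-elimination step, a heavily simplified one-dimensional SNIP clipping sketch in Python is given below; the library's own routines are far more elaborate (decreasing windows, higher-order filters, two-dimensional variants), and the toy spectrum is invented.

      # Minimal, simplified 1-D SNIP background estimation (illustration only).
      import numpy as np
      def snip_background(spectrum, iterations=20):
          """Estimate a slowly varying background under spectral peaks."""
          bg = spectrum.astype(float).copy()
          n = len(bg)
          for p in range(1, iterations + 1):            # increasing clipping window
              clipped = bg.copy()
              for i in range(p, n - p):
                  clipped[i] = min(bg[i], 0.5 * (bg[i - p] + bg[i + p]))
              bg = clipped
          return bg
      # Toy spectrum: two Gaussian peaks on a linear background plus Poisson noise.
      x = np.arange(512)
      spectrum = (0.05 * x + 40 * np.exp(-(x - 150) ** 2 / 50)
                  + 25 * np.exp(-(x - 320) ** 2 / 80)
                  + np.random.default_rng(1).poisson(5, 512))
      net_counts = spectrum - snip_background(spectrum)   # peaks with background removed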

  6. Framework for emotional mobile computation for creating entertainment experience

    Science.gov (United States)

    Lugmayr, Artur R.

    2007-02-01

    Ambient media are media that manifest in the natural environment of the consumer. The perceivable borders between the media and the context in which the media are used are becoming increasingly blurred. The consumer moves through a digital space of services throughout his daily life. As we develop towards an experience society, the central point in the development of services is the creation of a consumer experience. This paper reviews possibilities and potentials for creating entertainment experiences with mobile phone platforms. It reviews sensor networks capable of acquiring consumer behavior data, interactivity strategies, and psychological models for emotional computation on mobile phones, and lays the foundations of a nomadic experience society. The paper rounds off with a presentation of several possible service scenarios in the field of entertainment and leisure computation on mobiles. The goal of this paper is to present a framework and an evaluation of the possibilities of applying sensor technology on mobile platforms to create an enhanced consumer entertainment experience.

  7. Development of a computational framework on fluid-solid mixture flow simulations for the COMPASS code

    International Nuclear Information System (INIS)

    Zhang, Shuai; Morita, Koji; Shirakawa, Noriyuki; Yamamoto, Yuichi

    2010-01-01

    The COMPASS code is designed based on the moving particle semi-implicit method to simulate various complex mesoscale phenomena relevant to core disruptive accidents of sodium-cooled fast reactors. In this study, a computational framework for fluid-solid mixture flow simulations was developed for the COMPASS code. The passively moving solid model was used to simulate hydrodynamic interactions between fluid and solids. Mechanical interactions between solids were modeled by the distinct element method. A multi-time-step algorithm was introduced to couple these two calculations. The proposed computational framework for fluid-solid mixture flow simulations was verified by the comparison between experimental and numerical studies on the water-dam break with multiple solid rods. (author)
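
    The abstract names the coupling idea only at a high level, so the sketch below merely illustrates the structure of a multi-time-step loop in which one fluid (MPS) step brackets several finer DEM sub-steps; the step functions are empty stubs, not the COMPASS implementation.

      # Illustrative structure of a multi-time-step fluid-solid coupling loop (stubs only).
      def mps_fluid_step(state, dt_fluid):
          """Advance fluid particles and fluid-solid hydrodynamic forces (stub)."""
          return state
      def dem_solid_step(state, dt_solid):
          """Advance solid-solid contact forces and rigid-body motion (stub)."""
          return state
      def coupled_step(state, dt_fluid, n_substeps=10):
          dt_solid = dt_fluid / n_substeps          # DEM typically needs a finer time step
          state = mps_fluid_step(state, dt_fluid)   # hydrodynamic interaction (passively moving solids)
          for _ in range(n_substeps):               # mechanical interaction between solids
              state = dem_solid_step(state, dt_solid)
          return state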

  8. Synthetic analog computation in living cells.

    Science.gov (United States)

    Daniel, Ramiz; Rubens, Jacob R; Sarpeshkar, Rahul; Lu, Timothy K

    2013-05-30

    A central goal of synthetic biology is to achieve multi-signal integration and processing in living cells for diagnostic, therapeutic and biotechnology applications. Digital logic has been used to build small-scale circuits, but other frameworks may be needed for efficient computation in the resource-limited environments of cells. Here we demonstrate that synthetic analog gene circuits can be engineered to execute sophisticated computational functions in living cells using just three transcription factors. Such synthetic analog gene circuits exploit feedback to implement logarithmically linear sensing, addition, ratiometric and power-law computations. The circuits exhibit Weber's law behaviour as in natural biological systems, operate over a wide dynamic range of up to four orders of magnitude and can be designed to have tunable transfer functions. Our circuits can be composed to implement higher-order functions that are well described by both intricate biochemical models and simple mathematical functions. By exploiting analog building-block functions that are already naturally present in cells, this approach efficiently implements arithmetic operations and complex functions in the logarithmic domain. Such circuits may lead to new applications for synthetic biology and biotechnology that require complex computations with limited parts, need wide-dynamic-range biosensing or would benefit from the fine control of gene expression.
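
    As a generic illustration (not the paper's specific circuit models) of why log-linear sensing makes ratiometric and power-law operations cheap in the logarithmic domain:

      \begin{align*}
        y_i &= a \ln [I_i] && \text{(log-linear sensing of input $I_i$)}\\
        y_1 - y_2 &= a \ln \frac{[I_1]}{[I_2]} && \text{(ratiometric computation)}\\
        k\, y_1 &= a \ln [I_1]^{k} && \text{(power-law computation)}
      \end{align*}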

  9. Nurturing Opportunity Identification for Business Sophistication in a Cross-disciplinary Study Environment

    Directory of Open Access Journals (Sweden)

    Karine Oganisjana

    2012-12-01

    Full Text Available Opportunity identification is the key element of the entrepreneurial process; therefore, the issue of developing this skill in students is a crucial task in contemporary European education, which has recognized entrepreneurship as one of the lifelong learning key competences. The earlier opportunity identification becomes a habitual way of thinking and behavior across a broad range of contexts, the more likely it is that an entrepreneurial disposition will steadily reside in students. In order to nurture opportunity identification in students so that they are able to organize sophisticated businesses in the future, certain demands ought to be put forward to the teacher as well – the person who is to promote these qualities in their students. The paper reflects some findings of research conducted within the framework of a workplace learning project for the teachers of one of Riga's secondary schools (Latvia). The main goal of the project was to teach the teachers to identify hidden inner links between apparently unrelated things, phenomena and events within the 10th grade study curriculum, connect them together and create new opportunities. The creation and solution of cross-disciplinary tasks were the means for achieving this goal.

  10. Why is a computational framework for motivational and metacognitive control needed?

    Science.gov (United States)

    Sun, Ron

    2018-01-01

    This paper discusses, in the context of computational modelling and simulation of cognition, the relevance of deeper structures in the control of behaviour. Such deeper structures include motivational control of behaviour, which provides underlying causes for actions, and also metacognitive control, which provides higher-order processes for monitoring and regulation. It is argued that such deeper structures are important and thus cannot be ignored in computational cognitive architectures. A general framework based on the Clarion cognitive architecture is outlined that emphasises the interaction amongst action selection, motivation, and metacognition. The upshot is that it is necessary to incorporate all essential processes; short of that, the understanding of cognition can only be incomplete.

  11. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Directory of Open Access Journals (Sweden)

    Daniel Müllensiefen

    Full Text Available Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI), to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  12. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Science.gov (United States)

    Müllensiefen, Daniel; Gingras, Bruno; Musil, Jason; Stewart, Lauren

    2014-01-01

    Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  13. The New Toxicology of Sophisticated Materials: Nanotoxicology and Beyond

    Science.gov (United States)

    Maynard, Andrew D.; Warheit, David B.; Philbert, Martin A.

    2011-01-01

    It has long been recognized that the physical form of materials can mediate their toxicity—the health impacts of asbestiform materials, industrial aerosols, and ambient particulate matter are prime examples. Yet over the past 20 years, toxicology research has suggested complex and previously unrecognized associations between material physicochemistry at the nanoscale and biological interactions. With the rapid rise of the field of nanotechnology and the design and production of increasingly complex nanoscale materials, it has become ever more important to understand how the physical form and chemical composition of these materials interact synergistically to determine toxicity. As a result, a new field of research has emerged—nanotoxicology. Research within this field is highlighting the importance of material physicochemical properties in how dose is understood, how materials are characterized in a manner that enables quantitative data interpretation and comparison, and how materials move within, interact with, and are transformed by biological systems. Yet many of the substances that are the focus of current nanotoxicology studies are relatively simple materials that are at the vanguard of a new era of complex materials. Over the next 50 years, there will be a need to understand the toxicology of increasingly sophisticated materials that exhibit novel, dynamic and multifaceted functionality. If the toxicology community is to meet the challenge of ensuring the safe use of this new generation of substances, it will need to move beyond “nano” toxicology and toward a new toxicology of sophisticated materials. Here, we present a brief overview of the current state of the science on the toxicology of nanoscale materials and focus on three emerging toxicology-based challenges presented by sophisticated materials that will become increasingly important over the next 50 years: identifying relevant materials for study, physicochemical characterization, and

  14. Automated interpretable computational biology in the clinic: a framework to predict disease severity and stratify patients from clinical data

    Directory of Open Access Journals (Sweden)

    Soumya Banerjee

    2017-10-01

    Full Text Available We outline an automated computational and machine learning framework that predicts disease severity and stratifies patients. We apply our framework to available clinical data. Our algorithm automatically generates insights and predicts disease severity with minimal operator intervention. The computational framework presented here can be used to stratify patients, predict disease severity and propose novel biomarkers for disease. Insights from machine learning algorithms coupled with clinical data may help guide therapy, personalize treatment and help clinicians understand the change in disease over time. Computational techniques like these can be used in translational medicine in close collaboration with clinicians and healthcare providers. Our models are also interpretable, allowing clinicians with minimal machine learning experience to engage in model building. This work is a step towards automated machine learning in the clinic.
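
    The abstract does not name the underlying models, so the Python sketch below only illustrates the general pattern of interpretable severity prediction and patient stratification: a logistic regression whose coefficients can be read as candidate markers, followed by a split of patients into risk strata. All feature names and data are placeholders, not the clinical data used in the paper.

      # Toy interpretable severity model and risk stratification (placeholders throughout).
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      rng = np.random.default_rng(0)
      feature_names = ["crp", "creatinine", "age", "heart_rate"]   # hypothetical clinical features
      X = rng.normal(size=(300, 4))
      y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=300) > 0).astype(int)   # severe vs. mild
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = LogisticRegression().fit(X_tr, y_tr)
      for name, coef in zip(feature_names, model.coef_[0]):
          print(f"{name}: {coef:+.2f}")                 # readable effect sizes, candidate markers
      risk = model.predict_proba(X_te)[:, 1]
      strata = np.digitize(risk, [0.33, 0.66])          # stratify patients into low/medium/high risk
      print(np.bincount(strata, minlength=3))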

  15. Maximum likelihood as a common computational framework in tomotherapy

    International Nuclear Information System (INIS)

    Olivera, G.H.; Shepard, D.M.; Reckwerdt, P.J.; Ruchala, K.; Zachman, J.; Fitchard, E.E.; Mackie, T.R.

    1998-01-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. (author)
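
    The maximum-likelihood machinery referred to here is, at its core, the multiplicative EM update familiar from emission tomography. The Python sketch below shows that shared update for a generic Poisson model y ~ Poisson(Ax); the matrix sizes and data are synthetic, and this is an illustration of the common computational core, not the tomotherapy implementation.

      # Generic MLEM update for a Poisson measurement model (illustration only).
      import numpy as np
      def mlem(A, y, n_iter=50):
          """A: system matrix (detectors x voxels), y: measured counts."""
          x = np.ones(A.shape[1])                   # non-negative initial estimate
          sens = A.sum(axis=0)                      # sensitivity (column sums)
          for _ in range(n_iter):
              proj = A @ x                          # forward projection
              ratio = y / np.maximum(proj, 1e-12)   # measured / estimated counts
              x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
          return x                                  # same multiplicative update for image or dose reconstruction
      rng = np.random.default_rng(0)
      A = rng.random((200, 64))
      x_true = rng.random(64)
      y = rng.poisson(A @ x_true)
      x_hat = mlem(A, y)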

  16. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated x-ray equipment and computer technology. This book overcomes these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed including the preparation of patients, both physically and mentally, for scanning. Furthermore, the author also explains how to set up and run a computed tomography department, including advice on how the room should be designed

  17. Sophisticated approval voting, ignorance priors, and plurality heuristics: a behavioral social choice analysis in a Thurstonian framework.

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R; Tsetlin, Ilia

    2007-10-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the 2 front-runners and vote for their preferred candidate among these 2. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on 7 real world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all 7 elections and in the Internet experiment. PsycINFO Database Record (c) 2007 APA, all rights reserved.
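
    For readers unfamiliar with the decision rules being compared, a small Python sketch of the three behavioural rules follows. The utilities and front-runner set are invented; the "approve every candidate above the mean utility" form of sophisticated approval voting under an ignorance prior is the standard textbook prescription and is used here only as an illustration.

      # Toy illustration of the three voting rules compared in the study.
      utilities = {"A": 0.9, "B": 0.6, "C": 0.3, "D": 0.1}   # one voter's hypothetical utilities
      front_runners = {"B", "C"}                              # assumed perceived leaders
      def sophisticated_approval(u):
          """Approve every candidate whose utility exceeds the mean (ignorance prior)."""
          mean = sum(u.values()) / len(u)
          return {c for c, v in u.items() if v > mean}
      def sincere_plurality(u):
          """Vote only for the single favourite candidate."""
          return {max(u, key=u.get)}
      def strategic_plurality(u, leaders):
          """Vote for the preferred candidate among the two front-runners."""
          return {max(leaders, key=u.get)}
      print(sophisticated_approval(utilities))              # {'A', 'B'}
      print(sincere_plurality(utilities))                   # {'A'}
      print(strategic_plurality(utilities, front_runners))  # {'B'}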

  18. GLOFRIM v1.0-A globally applicable computational framework for integrated hydrological-hydrodynamic modelling

    NARCIS (Netherlands)

    Hoch, Jannis M.; Neal, Jeffrey C.; Baart, Fedor; Van Beek, Rens; Winsemius, Hessel C.; Bates, Paul D.; Bierkens, Marc F.P.

    2017-01-01

    We here present GLOFRIM, a globally applicable computational framework for integrated hydrological-hydrodynamic modelling. GLOFRIM facilitates spatially explicit coupling of hydrodynamic and hydrologic models and caters for an ensemble of models to be coupled. It currently encompasses the global

  19. Procles the Carthaginian: A North African Sophist in Pausanias’ Periegesis

    Directory of Open Access Journals (Sweden)

    Juan Pablo Sánchez Hernández

    2010-11-01

    Full Text Available Procles, cited by Pausanias (in the imperfect tense) about a display in Rome and for an opinion about Pyrrhus of Epirus, probably was not a historian of Hellenistic date, but a contemporary sophist whom Pausanias encountered in person in Rome.

  20. Does underground storage still require sophisticated studies?

    International Nuclear Information System (INIS)

    Marsily, G. de

    1997-01-01

    Most countries agree on the necessity of burying high- or medium-level wastes in geological layers situated a few hundred meters below ground level. The advantages and disadvantages of different types of rock such as salt, clay, granite and volcanic material are examined. Sophisticated studies are carried out to determine the best geological confinement, but questions arise about the time for which safety must be ensured. France has chosen 3 possible sites. These sites are geologically described in the article. The final site will be proposed after a testing phase of about 5 years in an underground facility. (A.C.)

  1. Planning Framework for Mesolevel Optimization of Urban Runoff Control Schemes

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qianqian; Blohm, Andrew; Liu, Bo

    2017-04-01

    A planning framework is developed to optimize runoff control schemes at scales relevant for regional planning at an early stage. The framework employs less sophisticated modeling approaches to allow a practical application in developing regions with limited data sources and computing capability. The methodology contains three interrelated modules: (1) the geographic information system (GIS)-based hydrological module, which aims at assessing local hydrological constraints and potential for runoff control according to regional land-use descriptions; (2) the grading module, which is built upon the method of fuzzy comprehensive evaluation. It is used to establish a priority ranking system to assist the allocation of runoff control targets at the subdivision level; and (3) the genetic algorithm-based optimization module, which is included to derive Pareto-based optimal solutions for mesolevel allocation with multiple competing objectives. The optimization approach describes the trade-off between different allocation plans and simultaneously ensures that all allocation schemes satisfy the minimum requirement on runoff control. Our results highlight the importance of considering the mesolevel allocation strategy in addition to measures at macrolevels and microlevels in urban runoff management. (C) 2016 American Society of Civil Engineers.
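
    To make the grading module concrete, here is a minimal Python sketch of a fuzzy comprehensive evaluation step (weighted composition of a membership matrix); the criteria, weights, and membership degrees are invented and are not taken from the paper.

      # Fuzzy comprehensive evaluation of one subdivision's runoff-control priority (toy values).
      import numpy as np
      criteria = ["imperviousness", "drainage capacity", "retrofit potential"]
      weights = np.array([0.5, 0.3, 0.2])           # relative importance of the criteria
      # Membership degrees in the grades (low, medium, high priority), one row per criterion.
      R = np.array([
          [0.1, 0.3, 0.6],
          [0.2, 0.5, 0.3],
          [0.0, 0.4, 0.6],
      ])
      B = weights @ R                                # composite membership in each grade
      grade = ["low", "medium", "high"][int(np.argmax(B))]
      print(B, "->", grade)                          # e.g. [0.11 0.38 0.51] -> high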

  2. A Framework for WWW Query Processing

    Science.gov (United States)

    Wu, Binghui Helen; Wharton, Stephen (Technical Monitor)

    2000-01-01

    Query processing is the most common operation in a DBMS. Sophisticated query processing has been mainly targeted at a single enterprise environment providing centralized control over data and metadata. Query submission by anonymous users on the web differs in that load balancing and DBMS access control become the key issues. This paper provides a solution by introducing a framework for WWW query processing. The success of this framework lies in the utilization of query optimization techniques and the ontological approach. This methodology has proved to be cost effective at the NASA Goddard Space Flight Center Distributed Active Archive Center (GDAAC).

  3. On the relevancy of efficient, integrated computer and network monitoring in HEP distributed online environment

    International Nuclear Information System (INIS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Javello, J.; Miere, Y.; Ruffinoni, D.; Albert, J.N.; Bellas, N.; Smith, G.

    1996-01-01

    Large Scientific Equipment is controlled by Computer Systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call them generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those systems dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as Client-Server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose Multi-layer System. (author)

  4. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    Science.gov (United States)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large Scientific Equipment is controlled by Computer Systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call them generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those systems dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as Client-Server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose Multi-layer System.

  5. A computational framework for the automated construction of glycosylation reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2014-01-01

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme biochemistry. All
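
    As a toy illustration of what a machine-readable enzyme definition plus single-reaction inference and one round of network extension might look like, consider the Python sketch below; the class fields, the string-based glycan notation, and the B4GALT1 specificity shown are simplifications invented for this example, not the framework's actual data model.

      # Toy enzyme definition and automated reaction inference (illustrative only).
      from dataclasses import dataclass
      @dataclass
      class Glycoenzyme:
          name: str
          donor: str            # monosaccharide added by the glycosyltransferase
          acceptor_motif: str   # substring the substrate must end with
          linkage: str          # linkage formed, e.g. "b1-4"
          def act(self, substrate: str):
              """Return the product glycan if the substrate matches the specificity."""
              if substrate.endswith(self.acceptor_motif):
                  return f"{self.donor}({self.linkage}){substrate}"
              return None
      b4galt = Glycoenzyme("B4GALT1", donor="Gal", acceptor_motif="GlcNAc", linkage="b1-4")
      def extend_network(glycans, enzymes):
          """One round of network reconstruction: apply every enzyme to every glycan."""
          reactions = []
          for g in glycans:
              for e in enzymes:
                  product = e.act(g)
                  if product is not None:
                      reactions.append((g, e.name, product))
          return reactions
      print(extend_network(["GlcNAc", "Man"], [b4galt]))   # [('GlcNAc', 'B4GALT1', 'Gal(b1-4)GlcNAc')]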

  6. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

    Full Text Available Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: (i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; (ii) automated N-linked glycosylation pathway construction; and (iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme

  7. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming indispensable for modeling and analyzing (critical) systems. However, the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need not only to capture the complex dynamic behavior of the system components, but must also be easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings including lack of modeling power, incapacity to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN-based reliability formalism is a powerful potential solution to modeling and analyzing various kinds of system component behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis.
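
    To give a feel for the discrete-time idea, here is a minimal Python sketch: component time-to-failure distributions are discretized into intervals and a deterministic system node (a two-component parallel system) is obtained by enumerating the joint table. The failure rates and mission time are invented, and real dynamic gates and dependencies would of course need the full BN machinery.

      # Discrete-time reliability of a 2-component parallel system (toy enumeration).
      import numpy as np
      n = 10                                        # number of time intervals (plus "no failure")
      t = np.linspace(0.0, 10.0, n + 1)
      def discretize_exponential(rate):
          """P(component fails in interval i); last entry = survives the mission."""
          cdf = 1.0 - np.exp(-rate * t)
          probs = np.diff(cdf)
          return np.append(probs, 1.0 - cdf[-1])
      pA = discretize_exponential(0.3)
      pB = discretize_exponential(0.5)
      # System failure interval = max of component failure intervals for a parallel system.
      p_system = np.zeros(n + 1)
      for i, pa in enumerate(pA):
          for j, pb in enumerate(pB):
              p_system[max(i, j)] += pa * pb
      unreliability = p_system[:n].cumsum()          # P(system failed by end of interval k)
      print(unreliability)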

  8. Finding the Fabulous Few: Why Your Program Needs Sophisticated Research.

    Science.gov (United States)

    Pfizenmaier, Emily

    1981-01-01

    Fund raising, it is argued, needs sophisticated prospect research. Professional prospect researchers play an important role in helping to identify prospective donors and also in helping to stimulate interest in gift giving. A sample of an individual work-up on a donor and a bibliography are provided. (MLW)

  9. A Computational Framework for Multiply-Connected and Electromagnetic Quantum Systems

    Science.gov (United States)

    O'Brien, Allyson

    In this dissertation, we develop the capabilities of the Finite Element Method (FEM) and Finite Element analysis (FEA) in the domain of computational quantum physics. We describe how FEM works and how it has been leveraged in quantum physics research over the last several decades. We derive new methods for modeling and analyzing quantum systems by using "holes" (cutouts) in the geometries of billiards in order to tune energy levels and energy level spacing. We address historical issues of the method in modeling systems with magnetic fields. These issues include nonconvergence of gauge choice as well as non-convergence of solutions at higher energy levels. By developing a set of tools and a framework to form various "admissible systems", we demonstrate that these issues stem from a misrepresentation of FEM algorithm design in quantum models. Through leveraging gauge-invariance in algorithm design, we describe how an appropriate unique gauge is identified for modeling various physical parameters. We then extend this idea into a framework that leverages various gauge selections in order to gain a much more complete picture of a quantum model and its various complementary observables. Finally, we show that this framework extends to modeling quantum systems that are bounded at realistically sized length-scales on the cusp of magnetic confinement. Through this work new limits on the canonical Dirichlet boundary conditions are defined.

  10. A unifying computational framework for stability and flexibility of arousal

    Directory of Open Access Journals (Sweden)

    Christin eKosse

    2014-10-01

    Full Text Available Arousal and consciousness flexibly adjust to salient cues, but remain stable despite noise and disturbance. Diverse, highly interconnected neural networks govern the underlying transitions of behavioural state; these networks are robust but very complex. Frameworks from systems engineering provide powerful tools for understanding functional logic behind component complexity. From a general systems viewpoint, a minimum of three communicating control modules may enable flexibility and stability to coexist. Comparators would subtract current arousal from desired arousal, producing an error signal. Regulators would compute control signals from this error. Generators would convert control signals into arousal, which is fed back to comparators, to make the system noise-proof through self-correction. Can specific neurons correspond to these control elements? To explore this, here we consider the brain-wide orexin/hypocretin network, which is experimentally established to be vital for flexible and stable arousal. We discuss whether orexin neurons may act as comparators, and their target neurons as regulators and generators. Experiments are proposed for testing such predictions, based on computational simulations showing that comparators, regulators, and generators have distinct temporal signatures of activity. If some regulators integrate orexin-communicated errors, robust arousal control may be achieved via integral feedback (a basic engineering strategy for tracking a set-point despite noise). An integral feedback view also suggests functional roles for specific molecular aspects, such as differing life-spans of orexin peptides. The proposed framework offers a unifying logic for molecular, cellular, and network details of arousal systems, and provides insight into behavioral state transitions, complex behaviour, and bases for disease.
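
    A minimal Python simulation of the comparator/regulator/generator loop with integral feedback is sketched below; the gains, noise level, and dynamics are invented for illustration and are not a model of the orexin system.

      # Integral feedback tracking a desired arousal set-point despite noise (toy model).
      import numpy as np
      def simulate(setpoint=1.0, k_i=0.5, noise_sd=0.1, steps=500, dt=0.1):
          rng = np.random.default_rng(0)
          arousal, integral = 0.0, 0.0
          trace = []
          for _ in range(steps):
              error = setpoint - arousal            # comparator: desired minus current arousal
              integral += error * dt                # regulator: integrates the error over time
              drive = k_i * integral                # control signal sent to the generator
              # generator: arousal relaxes toward the drive, with disturbance noise
              arousal += dt * (drive - arousal) + rng.normal(0.0, noise_sd) * np.sqrt(dt)
              trace.append(arousal)
          return np.array(trace)
      trace = simulate()
      print(trace[-50:].mean())                     # settles near the set-point despite noise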

  11. Framework and implementation for improving physics essential skills via computer-based practice: Vector math

    Science.gov (United States)

    Mikula, Brendon D.; Heckler, Andrew F.

    2017-06-01

    We propose a framework for improving accuracy, fluency, and retention of basic skills essential for solving problems relevant to STEM introductory courses, and implement the framework for the case of basic vector math skills over several semesters in an introductory physics course. Using an iterative development process, the framework begins with a careful identification of target skills and the study of specific student difficulties with these skills. It then employs computer-based instruction, immediate feedback, mastery grading, and well-researched principles from cognitive psychology such as interleaved training sequences and distributed practice. We implemented this with more than 1500 students over 2 semesters. Students completed the mastery practice for an average of about 13 min/week, for a total of about 2-3 h for the whole semester. Results reveal large (>1 SD) pretest to post-test gains in accuracy in vector skills, even compared to a control group, and these gains were retained at least 2 months after practice. We also find evidence of improved fluency, student satisfaction, and that awarding regular course credit results in higher participation and higher learning gains than awarding extra credit. In all, we find that simple computer-based mastery practice is an effective and efficient way to improve a set of basic and essential skills for introductory physics.

  12. Mississippi Curriculum Framework for Computer Information Systems Technology. Computer Information Systems Technology (Program CIP: 52.1201--Management Information Systems & Business Data). Computer Programming (Program CIP: 52.1201). Network Support (Program CIP: 52.1290--Computer Network Support Technology). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…

  13. A Flexible Computational Framework Using R and Map-Reduce for Permutation Tests of Massive Genetic Analysis of Complex Traits.

    Science.gov (United States)

    Mahjani, Behrang; Toor, Salman; Nettelblad, Carl; Holmgren, Sverker

    2017-01-01

    In quantitative trait locus (QTL) mapping, the significance of putative QTL is often determined using permutation testing. The computational needs to calculate the significance level are immense: 10^4 up to 10^8 or even more permutations can be needed. We have previously introduced the PruneDIRECT algorithm for multiple QTL scan with epistatic interactions. This algorithm has specific strengths for permutation testing. Here, we present a flexible, parallel computing framework for identifying multiple interacting QTL using the PruneDIRECT algorithm, which uses the map-reduce model as implemented in Hadoop. The framework is implemented in R, a widely used software tool among geneticists. This enables users to rearrange algorithmic steps to adapt genetic models, search algorithms, and parallelization steps to their needs in a flexible way. Our work underlines the maturity of accessing distributed parallel computing for computationally demanding bioinformatics applications through building workflows within existing scientific environments. We investigate the PruneDIRECT algorithm, comparing its performance to exhaustive search and the DIRECT algorithm using our framework on a public cloud resource. We find that PruneDIRECT is vastly superior for permutation testing, and perform 2 × 10^5 permutations for a 2D QTL problem in 15 hours, using 100 cloud processes. We show that our framework scales out almost linearly for a 3D QTL search.
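
    The framework itself is written in R on Hadoop; purely to illustrate the map-reduce pattern behind permutation testing, here is a generic Python sketch in which the "map" step computes the maximum association statistic for each permuted phenotype and the "reduce" step collects the null distribution. The data and the test statistic are synthetic placeholders, not the PruneDIRECT scan.

      # Map-reduce-style permutation test for the maximum association statistic (toy data).
      from multiprocessing import Pool
      import numpy as np
      rng = np.random.default_rng(0)
      genotypes = rng.integers(0, 2, size=(200, 500))    # individuals x markers
      phenotype = rng.normal(size=200)
      def max_statistic(y):
          """Map step: best single-marker association score for one (permuted) phenotype."""
          yc = y - y.mean()
          scores = np.abs(genotypes.T @ yc) / np.sqrt((yc ** 2).sum())
          return scores.max()
      def permuted_stat(seed):
          perm = np.random.default_rng(seed).permutation(phenotype)
          return max_statistic(perm)
      if __name__ == "__main__":
          observed = max_statistic(phenotype)
          with Pool() as pool:                           # reduce: gather the null distribution
              null = np.array(pool.map(permuted_stat, range(1000)))
          p_value = (1 + (null >= observed).sum()) / (1 + len(null))
          print(p_value)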

  14. EU-Korea FTA and Its Impact on V4 Economies. A Comparative Analysis of Trade Sophistication and Intra-Industry Trade

    Directory of Open Access Journals (Sweden)

    Michalski Bartosz

    2018-03-01

    Full Text Available This paper investigates selected short- and mid-term effects in trade in goods between the Visegrad countries (V4: the Czech Republic, Hungary, Poland and the Slovak Republic and the Republic of Korea under the framework of the Free Trade Agreement between the European Union and the Republic of Korea. This Agreement is described in the “Trade for All” (2015: 9 strategy as the most ambitious trade deal ever implemented by the EU. The primary purpose of our analysis is to identify, compare, and evaluate the evolution of the technological sophistication of bilateral exports and imports. Another dimension of the paper concentrates on the developments within intra-industry trade. Moreover, these objectives are approached taking into account the context of the South Korean direct investment inflow to the V4. The evaluation of technological sophistication is based on UNCTAD’s methodology, while the intensity of intra-industry trade is measured by the GL-index and identification of its subcategories (horizontal and vertical trade. The analysis covers the timespan 2001–2015. The novelty of the paper lies in the fact that the study of South Korean-V4 trade relations has not so far been carried out from this perspective. Thus this paper investigates interesting phenomena identified in the trade between the Republic of Korea (ROK and V4 economies. The main findings imply an impact of South Korean direct investments on trade. This is represented by the trade deficit of the V4 with ROK and the structure of bilateral trade in terms of its technological sophistication. South Korean investments might also have had positive consequences for the evolution of IIT, particularly in the machinery sector. The political interpretation indicates that they may strengthen common threats associated with the middle-income trap, particularly the technological gap and the emphasis placed on lower costs of production.
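
    For reference, the Grubel-Lloyd (GL) index mentioned here is, in its standard form (a general definition, not a value from the paper):

      \[
        GL_i \;=\; 1 - \frac{\lvert X_i - M_i \rvert}{X_i + M_i}, \qquad 0 \le GL_i \le 1,
      \]

    where X_i and M_i are exports and imports in product group i; values near 1 indicate two-way (intra-industry) trade and values near 0 one-way trade.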

  15. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    Science.gov (United States)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
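
    A minimal Python sketch of the kind of workflow described (lazy, chunked netCDF access with xarray/Dask and a site-specific indicator) is given below; the file name, the variable name "tasmax", the coordinates, and the 35 degC threshold are placeholders rather than the consultancy's actual pipeline.

      # Lazy, chunked analysis of a large climate dataset with xarray + Dask (placeholders).
      import xarray as xr
      from dask.distributed import Client
      if __name__ == "__main__":
          client = Client()                                  # local Dask cluster; could be a cloud cluster
          ds = xr.open_dataset("downscaled_cmip.nc", chunks={"time": 365})
          # Days above 35 degC per year at a single site, computed lazily and in parallel.
          site = ds["tasmax"].sel(lat=37.8, lon=-122.3, method="nearest")
          hot_days = (site > 35.0).groupby("time.year").sum("time")
          print(hot_days.compute())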

  16. Lexical Sophistication as a Multidimensional Phenomenon: Relations to Second Language Lexical Proficiency, Development, and Writing Quality

    Science.gov (United States)

    Kim, Minkyung; Crossley, Scott A.; Kyle, Kristopher

    2018-01-01

    This study conceptualizes lexical sophistication as a multidimensional phenomenon by reducing numerous lexical features of lexical sophistication into 12 aggregated components (i.e., dimensions) via a principal component analysis approach. These components were then used to predict second language (L2) writing proficiency levels, holistic lexical…

  17. Few remarks on chiral theories with sophisticated topology

    International Nuclear Information System (INIS)

    Golo, V.L.; Perelomov, A.M.

    1978-01-01

    Two classes of two-dimensional Euclidean chiral field theories are singled out: 1) the field phi(x) takes values in a compact Hermitian symmetric space; 2) the field phi(x) takes values in an orbit of the adjoint representation of a compact Lie group. The theories have sophisticated topological and rich analytical structures. They are considered with the help of topological invariants (topological charges). Explicit formulae for the topological charges are indicated, and a lower-bound estimate for the action is given.
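
    For orientation, the simplest example of such a topological charge is the winding number of the two-dimensional O(3) sigma model, quoted here only as a standard illustration; the paper indicates the explicit formulae for its own two classes of models.

      \[
        Q \;=\; \frac{1}{8\pi}\int d^2x\;\epsilon_{\mu\nu}\,
            \mathbf{n}\cdot\left(\partial_\mu\mathbf{n}\times\partial_\nu\mathbf{n}\right),
        \qquad \mathbf{n}^2 = 1,\quad Q \in \mathbb{Z}.
      \]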

  18. GLOFRIM v1.0 – A globally applicable computational framework for integrated hydrological–hydrodynamic modelling

    NARCIS (Netherlands)

    Hoch, J.M.; Neal, Jeffrey; Baart, Fedor; van Beek, L.P.H.; Winsemius, Hessel; Bates, Paul; Bierkens, M.F.P.

    2017-01-01

    We here present GLOFRIM, a globally applicable computational framework for integrated hydrological–hydrodynamic modelling. GLOFRIM facilitates spatially explicit coupling of hydrodynamic and hydrologic models and caters for an ensemble of models to be coupled. It currently encompasses the global

  19. Cloud computing-based energy optimization control framework for plug-in hybrid electric bus

    International Nuclear Information System (INIS)

    Yang, Chao; Li, Liang; You, Sixiong; Yan, Bingjie; Du, Xian

    2017-01-01

    Considering the complicated characteristics of traffic flow on a city bus route and the nonlinear vehicle dynamics, optimal energy management integrated with clustering and recognition of driving conditions in a plug-in hybrid electric bus is still a challenging problem. Motivated by this issue, this paper presents an innovative energy optimization control framework based on cloud computing for a plug-in hybrid electric bus. This framework, which includes an offline part and an online part, realizes driving-condition clustering in the offline part and energy management in the online part. In the offline part, utilizing the operating data transferred from a bus to the remote monitoring center, the K-means algorithm is adopted to cluster the driving conditions, and then Markov probability transfer matrixes are generated to predict the possible operating demand of the bus driver. Next, in the online part, the current driving condition is identified in real time by a well-trained support vector machine, and Markov chain-based driving behaviors are accordingly selected. With the stochastic inputs, a stochastic receding horizon control method is adopted to obtain the optimized energy management of the hybrid powertrain. Simulations and hardware-in-the-loop tests are carried out with the real-world city bus route, and the results show that the presented strategy can greatly improve the vehicle fuel economy and that, as the traffic flow data feedback increases, the fuel consumption of every plug-in hybrid electric bus running on a specific bus route tends to a stable minimum. - Highlights: • Cloud computing-based energy optimization control framework is proposed. • Driving cycles are clustered into 6 types by K-means algorithm. • Support vector machine is employed to realize the online recognition of driving condition. • Stochastic receding horizon control-based energy management strategy is designed for plug-in hybrid electric bus. • The proposed framework is verified by simulation and hardware-in-the-loop testing.
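
    A compact Python sketch of the offline part (clustering driving-condition segments with K-means and building a Markov transition matrix over the cluster labels) follows; the segment features, their values, and the choice of six clusters are placeholders, not the bus data used in the paper.

      # Offline clustering of driving-condition segments and Markov transition matrix (toy data).
      import numpy as np
      from sklearn.cluster import KMeans
      rng = np.random.default_rng(0)
      # Per-segment features, e.g. [mean speed, speed std, mean acceleration] (placeholders).
      segments = rng.normal(size=(2000, 3))
      kmeans = KMeans(n_clusters=6, n_init=10, random_state=0).fit(segments)
      labels = kmeans.labels_
      # Markov transition matrix between consecutive driving conditions.
      K = 6
      counts = np.zeros((K, K))
      for a, b in zip(labels[:-1], labels[1:]):
          counts[a, b] += 1
      row_sums = counts.sum(axis=1, keepdims=True)
      transition = counts / np.where(row_sums == 0, 1, row_sums)
      print(np.round(transition, 2))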

  20. STOCK EXCHANGE LISTING INDUCES SOPHISTICATION OF CAPITAL BUDGETING

    Directory of Open Access Journals (Sweden)

    Wesley Mendes-da-Silva

    2014-08-01

    Full Text Available This article compares capital budgeting techniques employed in listed and unlisted companies in Brazil. We surveyed the Chief Financial Officers (CFOs) of 398 listed companies and 300 large unlisted companies, and based on 91 respondents, the results suggest that the CFOs of listed companies tend to use less simplistic methods more often, for example: NPV and CAPM, and that CFOs of unlisted companies are less likely to estimate the cost of equity, despite being large companies. These findings indicate that stock exchange listing may require greater sophistication of the capital budgeting process.

  1. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    Full Text Available The inherently limited processing power and battery lifetime of mobile phones hinder the possible execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading of computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware, which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed by using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications which involve costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared to local execution only.
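
    The abstract describes adaptive partitioning and offloading only at a high level; the Python sketch below shows a toy cost-model decision of the kind such middleware must make (offload only when transfer plus remote execution beats local execution). The speed-up, bandwidth, and latency figures are invented and are not MACS parameters.

      # Toy offload decision: remote cost (transfer + cloud execution) vs. local cost.
      def should_offload(input_bytes, local_seconds, cloud_speedup=8.0,
                         bandwidth_bps=2e7, rtt_seconds=0.05):
          transfer = rtt_seconds + 8.0 * input_bytes / bandwidth_bps   # assume 20 Mbit/s link
          remote = transfer + local_seconds / cloud_speedup
          return remote < local_seconds
      # A 5 MB video-analysis task taking 12 s locally is worth offloading; a 1 ms UI task is not.
      print(should_offload(5e6, 12.0))    # True
      print(should_offload(2e3, 0.001))   # False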

  2. Reuse, Recycle, Reweigh: Combating Influenza through Efficient Sequential Bayesian Computation for Massive Data.

    Science.gov (United States)

    Tom, Jennifer A; Sinsheimer, Janet S; Suchard, Marc A

    Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, often compared using point estimates that fail to account for the variability within and correlation between the distributions these realizations approximate. However, although the initial concession to stratify generally precludes the more sensible analysis using a single joint hierarchical model, we can circumvent this outcome and capitalize on the intermediate realizations by extending the dynamic iterative reweighting MCMC algorithm. In doing so, we reuse the available realizations by reweighting them with importance weights, recycling them into a now tractable joint hierarchical model. We apply this technique to intermediate realizations generated from stratified analyses of 687 influenza A genomes spanning 13 years allowing us to revisit hypotheses regarding the evolutionary history of influenza within a hierarchical statistical framework.
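
    To illustrate the reuse-and-reweigh idea in isolation, here is a small Python sketch that reweights existing posterior draws with importance weights toward a different target density and checks the effective sample size; the Gaussian densities are simple stand-ins, not the influenza models analysed in the paper.

      # Importance reweighting of existing posterior draws toward a new target (toy densities).
      import numpy as np
      from scipy import stats
      rng = np.random.default_rng(0)
      draws = rng.normal(loc=0.3, scale=1.0, size=5000)    # realizations from the original analysis
      log_old = stats.norm(0.3, 1.0).logpdf(draws)         # density the draws actually target
      log_new = stats.norm(0.0, 0.7).logpdf(draws)         # density of the updated model
      log_w = log_new - log_old
      w = np.exp(log_w - log_w.max())
      w /= w.sum()
      ess = 1.0 / np.sum(w ** 2)                           # effective sample size after reweighting
      resampled = rng.choice(draws, size=2000, p=w)        # recycle the draws under the new model
      print(round(ess), resampled.mean())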

  3. MMA-EoS: A Computational Framework for Mineralogical Thermodynamics

    Science.gov (United States)

    Chust, T. C.; Steinle-Neumann, G.; Dolejš, D.; Schuberth, B. S. A.; Bunge, H.-P.

    2017-12-01

    We present a newly developed software framework, MMA-EoS, that evaluates phase equilibria and thermodynamic properties of multicomponent systems by Gibbs energy minimization, with application to mantle petrology. The code is versatile in terms of the equation-of-state and mixing properties and allows for the computation of properties of single phases, solution phases, and multiphase aggregates. Currently, the open program distribution contains equation-of-state formulations widely used, that is, Caloric-Murnaghan, Caloric-Modified-Tait, and Birch-Murnaghan-Mie-Grüneisen-Debye models, with published databases included. Through its modular design and easily scripted database, MMA-EoS can readily be extended with new formulations of equations-of-state and changes or extensions to thermodynamic data sets. We demonstrate the application of the program by reproducing and comparing physical properties of mantle phases and assemblages with previously published work and experimental data, successively increasing complexity, up to computing phase equilibria of six-component compositions. Chemically complex systems allow us to trace the budget of minor chemical components in order to explore whether they lead to the formation of new phases or extend stability fields of existing ones. Self-consistently computed thermophysical properties for a homogeneous mantle and a mechanical mixture of slab lithologies show no discernible differences that require a heterogeneous mantle structure as has been suggested previously. Such examples illustrate how thermodynamics of mantle mineralogy can advance the study of Earth's interior.
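
    As a toy illustration of Gibbs energy minimization at fixed pressure and temperature, the Python sketch below treats an assemblage of stoichiometric phases with fixed molar Gibbs energies, which reduces the minimization to a linear program; the phases are real mineral formulas, but the g values are invented and are not from the MMA-EoS database (solution phases would require nonlinear mixing terms).

      # Toy Gibbs energy minimization with element-balance constraints (linear program).
      import numpy as np
      from scipy.optimize import linprog
      phases = ["MgO", "SiO2", "MgSiO3", "Mg2SiO4"]
      g = np.array([-600.0, -900.0, -1550.0, -2180.0])   # molar Gibbs energies, kJ/mol (made up)
      # Element balance (rows: Mg, Si) and an Mg2SiO4-equivalent bulk composition.
      A_eq = np.array([[1, 0, 1, 2],
                       [0, 1, 1, 1]], dtype=float)
      b_eq = np.array([2.0, 1.0])
      res = linprog(c=g, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
      for name, n in zip(phases, res.x):
          if n > 1e-9:
              print(f"{name}: {n:.3f} mol")               # stable assemblage for these g values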

  4. Maxillary arch rehabilitation using implant-supported computer-assisted design-computer-assisted manufacturing-milled titanium framework

    Directory of Open Access Journals (Sweden)

    Tulika S Khanna

    2017-01-01

    Full Text Available Esthetic and functional rehabilitation of a completely edentulous maxillary arch with a fixed implant-supported prosthesis is a challenging task. Newer technologies such as computer-assisted design/computer-assisted manufacturing (CAD CAM) and cone beam computed tomography play an important role in achieving predictable results. Full-mouth porcelain-fused-to-metal (PFM) individual crowns on a CAD CAM-milled titanium framework provide a positive esthetic and functional outcome. This is a case report of the rehabilitation of a partially edentulous maxillary arch patient. Staged rehabilitation of this patient was planned. In the first stage, root canal treatment of key abutment teeth was done, nonsalvageable teeth were removed, and an immediate interim overdenture was provided. In the second stage, five Nobel Biocare dental implants were placed. After integration, impressions were made and a CAD CAM-milled titanium bar was fabricated. Individual PFM crowns were made and cemented. This method gives better esthetics compared to an acrylic-fused-to-metal hybrid prosthesis, with the advantage of retrievability, just like a screw-retained prosthesis. Hence, this technique is suitable for the rehabilitation of patients with high esthetic demands.

  5. Differential ethnic associations between maternal flexibility and play sophistication in toddlers born very low birth weight

    Science.gov (United States)

    Erickson, Sarah J.; Montague, Erica Q.; Maclean, Peggy C.; Bancroft, Mary E.; Lowe, Jean R.

    2013-01-01

    Children born very low birth weight (VLBW) are at risk for difficulties in the development of self-regulation and effective functional skills, and play serves as an important avenue of early intervention. The current study investigated associations between maternal flexibility and toddler play sophistication in Caucasian, Spanish speaking Hispanic, English speaking Hispanic, and Native American toddlers (18-22 months adjusted age) in a cross-sectional cohort of 73 toddlers born VLBW and their mothers. We found that the association between maternal flexibility and toddler play sophistication differed by ethnicity (F(3,65) = 3.34, p = .02). In particular, Spanish speaking Hispanic dyads evidenced a significant positive association between maternal flexibility and play sophistication of medium effect size. Results for Native Americans were parallel to those of Spanish speaking Hispanic dyads: the relationship between flexibility and play sophistication was positive and of small-medium effect size. Findings indicate that for Caucasians and English speaking Hispanics, flexibility evidenced a non-significant (negative and small effect size) association with toddler play sophistication. Significant follow-up contrasts revealed that the associations for Caucasian and English speaking Hispanic dyads were significantly different from those of the other two ethnic groups. Results remained unchanged after adjusting for the amount of maternal language, an index of maternal engagement and stimulation; and after adjusting for birth weight, gestational age, gender, test age, cognitive ability, as well as maternal age, education, and income. Our results provide preliminary evidence that ethnicity and acculturation may mediate the association between maternal interactive behavior such as flexibility and toddler developmental outcomes, as indexed by play sophistication. Addressing these association differences is particularly important in children born VLBW because interventions targeting parent interaction strategies such as

  6. Determining open cluster membership. A Bayesian framework for quantitative member classification

    Science.gov (United States)

    Stott, Jonathan J.

    2018-01-01

    Aims: My goal is to develop a quantitative algorithm for assessing open cluster membership probabilities. The algorithm is designed to work with single-epoch observations. In its simplest form, only one set of program images and one set of reference images are required. Methods: The algorithm is based on a two-stage joint astrometric and photometric assessment of cluster membership probabilities. The probabilities were computed within a Bayesian framework using any available prior information. Where possible, the algorithm emphasizes simplicity over mathematical sophistication. Results: The algorithm was implemented and tested against three observational fields using published survey data. M 67 and NGC 654 were selected as cluster examples while a third, cluster-free, field was used for the final test data set. The algorithm shows good quantitative agreement with the existing surveys and has a false-positive rate significantly lower than the astrometric or photometric methods used individually.
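    As a minimal sketch of the joint astrometric-photometric combination described above (not the paper's algorithm; the likelihood values, prior, and function name are illustrative assumptions), the member and field-star hypotheses can be combined with Bayes' rule:

```python
def membership_probability(l_astro_m, l_photo_m, l_astro_f, l_photo_f, prior_m=0.5):
    """Posterior probability that a star is a cluster member, combining
    astrometric and photometric likelihoods for the member (m) and field (f)
    hypotheses with a prior membership fraction."""
    post_m = l_astro_m * l_photo_m * prior_m
    post_f = l_astro_f * l_photo_f * (1.0 - prior_m)
    return post_m / (post_m + post_f)

# Example: proper motion strongly favours membership, photometry is ambiguous.
p = membership_probability(l_astro_m=0.8, l_photo_m=0.5,
                           l_astro_f=0.1, l_photo_f=0.5, prior_m=0.3)
print(f"membership probability: {p:.2f}")
```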

  7. A Framework for the Design of Computer-Assisted Simulation Training for Complex Police Situations

    Science.gov (United States)

    Söderström, Tor; Åström, Jan; Anderson, Greg; Bowles, Ron

    2014-01-01

    Purpose: The purpose of this paper is to report progress concerning the design of a computer-assisted simulation training (CAST) platform for developing decision-making skills in police students. The overarching aim is to outline a theoretical framework for the design of CAST to facilitate police students' development of search techniques in…

  8. A Distributed Computing Framework for Real-Time Detection of Stress and of Its Propagation in a Team.

    Science.gov (United States)

    Pandey, Parul; Lee, Eun Kyung; Pompili, Dario

    2016-11-01

    Stress is one of the key factors that impact the quality of our daily life: from productivity and efficiency in production processes to the ability of (civilian and military) individuals to make rational decisions. Also, stress can propagate from one individual to others working in close proximity or toward a common goal, e.g., in a military operation or workforce. Real-time assessment of the stress of individuals alone is, however, not sufficient, as understanding its source and the direction in which it propagates in a group of people is equally, if not more, important. A continuous, near real-time, in situ personal stress monitoring system to quantify the stress level of individuals and its direction of propagation in a team is envisioned. However, stress monitoring of an individual via his/her mobile device may not always be possible for extended periods of time due to the limited battery capacity of these devices. To overcome this challenge, a novel distributed mobile computing framework is proposed to organize the resources in the vicinity and form a mobile device cloud that enables offloading of computation tasks in the stress detection algorithm from resource-constrained devices (low residual battery, limited CPU cycles) to resource-rich devices. Our framework also supports computation parallelization and workflows, defining how data and tasks are divided and assigned among the entities of the framework. The direction of propagation and magnitude of influence of stress in a group of individuals are studied by applying real-time, in situ analysis of Granger causality. Tangible benefits (in terms of energy expenditure and execution time) of the proposed framework in comparison to a centralized framework are presented via thorough simulations and real experiments.
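    The propagation analysis above rests on Granger causality between individuals' stress time series. As a minimal illustration (not the authors' implementation), the sketch below uses the grangercausalitytests routine from statsmodels on synthetic signals; the signal construction, lag setting, and function name are assumptions.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

def stress_influence_pvalue(driver, target, maxlag=4):
    """Smallest p-value of the SSR F-test that `driver` Granger-causes `target`.
    statsmodels expects a two-column array [target, candidate driver]."""
    data = np.column_stack([target, driver])
    results = grangercausalitytests(data, maxlag=maxlag)
    return min(res[0]["ssr_ftest"][1] for res in results.values())

# Synthetic example: person B's stress follows person A's with a two-sample lag.
rng = np.random.default_rng(0)
stress_a = rng.normal(size=300)
stress_b = 0.8 * np.roll(stress_a, 2) + 0.2 * rng.normal(size=300)
print("p-value (A -> B):", stress_influence_pvalue(stress_a, stress_b))
```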

  9. Does a more sophisticated storm erosion model improve probabilistic erosion estimates?

    NARCIS (Netherlands)

    Ranasinghe, R.W.M.R.J.B.; Callaghan, D.; Roelvink, D.

    2013-01-01

    The dependency between the accuracy/uncertainty of storm erosion exceedance estimates obtained via a probabilistic model and the level of sophistication of the structural function (storm erosion model) embedded in the probabilistic model is assessed via the application of Callaghan et al.'s (2008)

  10. Hypercell : A bio-inspired information design framework for real-time adaptive spatial components

    NARCIS (Netherlands)

    Biloria, N.M.; Chang, J.R.

    2012-01-01

    Contemporary explorations within the evolutionary computational domain have been heavily instrumental in exploring biological processes of adaptation, growth and mutation. On the other hand a plethora of designers owing to the increasing sophistication in computer aided design software are equally

  11. Knowledge-based computer systems for radiotherapy planning.

    Science.gov (United States)

    Kalet, I J; Paluszynski, W

    1990-08-01

    Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs to simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning already has indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.

  12. Sophisticating a naive Liapunov function

    International Nuclear Information System (INIS)

    Smith, D.; Lewins, J.D.

    1985-01-01

    The art of the direct method of Liapunov to determine system stability is to construct a suitable Liapunov or V function, where V is to be positive definite (PD) and to shrink to a center, which may be conveniently chosen as the origin, and where its time derivative dV/dt is to be negative definite (ND). One aid to the art is to solve an approximation to the system equations in order to provide a candidate V function. It can happen, however, that dV/dt is not strictly ND but vanishes at a finite number of isolated points. Naively, one anticipates that stability has been demonstrated, since the trajectory of the system at such points is only momentarily tangential and immediately enters a region of inward directed trajectories. To demonstrate stability rigorously requires the construction of a sophisticated Liapunov function from what can be called the naive original choice. In this paper, the authors demonstrate the method of perturbing the naive function in the context of the well-known second-order oscillator and then apply the method to a more complicated problem based on a prompt jump model for a nuclear fission reactor
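    As a worked illustration of the perturbation idea (the assumed textbook form of a damped second-order oscillator x'' + x' + x = 0, not the paper's reactor model):

```latex
% Naive choice: the energy-like function
\begin{align}
  V_{0}(x,\dot{x}) &= \tfrac{1}{2}\bigl(x^{2} + \dot{x}^{2}\bigr),
  &
  \dot{V}_{0} &= -\dot{x}^{2} \le 0 ,
\end{align}
% which is only negative semi-definite: it vanishes wherever $\dot{x}=0$.
% Perturbing the naive function with a small cross term restores strict definiteness:
\begin{align}
  V_{\varepsilon}(x,\dot{x}) &= \tfrac{1}{2}\bigl(x^{2} + \dot{x}^{2}\bigr) + \varepsilon x\dot{x},
  &
  \dot{V}_{\varepsilon} &= -\varepsilon x^{2} - \varepsilon x\dot{x} - (1-\varepsilon)\dot{x}^{2},
\end{align}
% which is negative definite for sufficiently small $\varepsilon > 0$
% (here $0 < \varepsilon < 4/5$), while $V_{\varepsilon}$ remains positive definite.
```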

  13. Strategic sophistication of individuals and teams. Experimental evidence

    Science.gov (United States)

    Sutter, Matthias; Czermak, Simon; Feri, Francesco

    2013-01-01

    Many important decisions require strategic sophistication. We examine experimentally whether teams act more strategically than individuals. We let individuals and teams make choices in simple games, and also elicit first- and second-order beliefs. We find that teams play the Nash equilibrium strategy significantly more often, and their choices are more often a best response to stated first order beliefs. Distributional preferences make equilibrium play less likely. Using a mixture model, the estimated probability to play strategically is 62% for teams, but only 40% for individuals. A model of noisy introspection reveals that teams differ from individuals in higher order beliefs. PMID:24926100

  14. Computer-assisted radiology

    International Nuclear Information System (INIS)

    Lemke, H.U.

    1988-01-01

    New digital imaging modalities and more sophisticated image processing systems will have a profound effect on those areas of medicine concerned with imaging. This mainly means computer-assisted radiology (CAR) and implies a transition from analog film systems to digital imaging systems, integration of digital imaging modalities through picture archiving and communication systems (PACS) and the graduated employment of image-oriented medical work stations (MWS) for computer-assisted representation, communication, diagnosis, and therapy planning. (orig.) [de

  15. Second generation registry framework.

    Science.gov (United States)

    Bellgard, Matthew I; Render, Lee; Radochonski, Maciej; Hunter, Adam

    2014-01-01

    Information management systems are essential to capture data be it for public health and human disease, sustainable agriculture, or plant and animal biosecurity. In public health, the term patient registry is often used to describe information management systems that are used to record and track phenotypic data of patients. Appropriate design, implementation and deployment of patient registries enables rapid decision making and ongoing data mining ultimately leading to improved patient outcomes. A major bottleneck encountered is the static nature of these registries. That is, software developers are required to work with stakeholders to determine requirements, design the system, implement the required data fields and functionality for each patient registry. Additionally, software developer time is required for ongoing maintenance and customisation. It is desirable to deploy a sophisticated registry framework that can allow scientists and registry curators possessing standard computing skills to dynamically construct a complete patient registry from scratch and customise it for their specific needs with little or no need to engage a software developer at any stage. This paper introduces our second generation open source registry framework which builds on our previous rare disease registry framework (RDRF). This second generation RDRF is a new approach as it empowers registry administrators to construct one or more patient registries without software developer effort. New data elements for a diverse range of phenotypic and genotypic measurements can be defined at any time. Defined data elements can then be utilised in any of the created registries. Fine grained, multi-level user and workgroup access can be applied to each data element to ensure appropriate access and data privacy. We introduce the concept of derived data elements to assist the data element standards communities on how they might be best categorised. We introduce the second generation RDRF that

  16. RSSI-Based Distance Estimation Framework Using a Kalman Filter for Sustainable Indoor Computing Environments

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-11-01

    Full Text Available Given that location information is the key to providing a variety of services in sustainable indoor computing environments, it is required to obtain accurate locations. Locations can be estimated by three distances from three fixed points. Therefore, if the distance between two points can be measured or estimated accurately, the location in indoor environments can be estimated. To increase the accuracy of the measured distance, noise filtering, signal revision, and distance estimation processes are generally performed. This paper proposes a novel framework for estimating the distance between a beacon and an access point (AP) in a sustainable indoor computing environment. Diverse types of received signal strength indications (RSSIs) are used for WiFi, Bluetooth, and radio signals, and the proposed distance estimation framework is unique in that it is independent of the specific wireless signal involved, being based on the Bluetooth signal of the beacon. Generally, RSSI measurement, noise filtering, and revision are required for distance estimation using RSSIs. The employed RSSIs are first measured from an AP, with multiple APs sometimes used to increase the accuracy of the distance estimation. Owing to the inevitable presence of noise in the measured RSSIs, the application of noise filtering is essential, and further revision is used to address the inaccuracy and instability that characterizes RSSIs measured in an indoor environment. The revised RSSIs are then used to estimate the distance. The proposed distance estimation framework uses one AP to measure the RSSIs, a Kalman filter to eliminate noise, and a log-distance path loss model to revise the measured RSSIs. In the experimental implementation of the framework, both an RSSI filter and a Kalman filter were respectively used for noise elimination to comparatively evaluate the performance of the latter for the specific application. The Kalman filter was found to reduce the accumulated errors by 8
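    As a minimal sketch of the two ingredients named above (not the paper's implementation; the transmit power, path loss exponent, and sample readings are illustrative assumptions), a one-dimensional Kalman filter can smooth the RSSI stream before the log-distance path loss model converts it to a distance:

```python
def kalman_filter(rssis, process_var=1e-3, meas_var=4.0):
    """One-dimensional Kalman filter smoothing a stream of RSSI readings (dBm)."""
    x, p = rssis[0], 1.0              # state estimate and its variance
    smoothed = []
    for z in rssis:
        p += process_var               # predict: the true RSSI drifts slowly
        k = p / (p + meas_var)         # Kalman gain
        x += k * (z - x)               # update with the new measurement
        p *= (1.0 - k)
        smoothed.append(x)
    return smoothed

def rssi_to_distance(rssi, tx_power=-59.0, path_loss_exp=2.0):
    """Log-distance path loss model: d = 10 ** ((TxPower - RSSI) / (10 * n))."""
    return 10 ** ((tx_power - rssi) / (10.0 * path_loss_exp))

raw = [-63, -71, -66, -69, -74, -70, -68]   # hypothetical beacon readings (dBm)
filtered = kalman_filter(raw)
print(f"estimated distance: {rssi_to_distance(filtered[-1]):.2f} m")
```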

  17. Computers in nuclear medicine: introductory concepts

    International Nuclear Information System (INIS)

    Weber, D.A.

    1978-01-01

    Computers play an important role in image and data processing in nuclear medicine. Applications extend from relatively simple mathematical processing of in vitro specimen assays to more sophisticated image reconstruction procedures for emission tomography. The basic concepts and terminology associated with computer applications in image and data processing in nuclear medicine are presented here

  18. From E-commerce to Social Commerce: A Framework to Guide Enabling Cloud Computing

    OpenAIRE

    Baghdadi, Youcef

    2013-01-01

    Social commerce is doing commerce in a collaborative and participative way, by using social media, through an enterprise interactive interface that enables social interactions. Technologies such as Web 2.0, Cloud Computing and Service Oriented Architecture (SOA) enable social commerce. Yet, a framework for social commerce, putting Enterprise Social Interactions as central entities, would provide a strong business justification for social commerce design and adoption with these enabling techno...

  19. Development Strategies for Tourism Destinations: Tourism Sophistication vs. Resource Investments

    OpenAIRE

    Rainer Andergassen; Guido Candela

    2010-01-01

    This paper investigates the effectiveness of development strategies for tourism destinations. We argue that resource investments unambiguously increase tourism revenues and that increasing the degree of tourism sophistication, that is increasing the variety of tourism related goods and services, increases tourism activity and decreases the perceived quality of the destination's resource endowment, leading to an ambiguous effect on tourism revenues. We disentangle these two effects and charact...

  20. Application of a computer-aided framework for the design of CO2 capture and utilization processes

    DEFF Research Database (Denmark)

    Frauzem, Rebecca; Gani, Rafiqul

    , and (3) innovation, has been developed. In order to facilitate the implementation and ensure sustainability, this framework integrates a number of computer-aided methods and tools that are important for carbon dioxide capture and utilization. Applying this framework helps to address the questions about...... opportunities and products are explored. Second, the selected processing route is designed and analyzed by using simulation software and sustainability (economic, environmental and LCA) analysis tools. From this stage, hot spots and areas for improvement are also generated. Third, the targets for improvement...

  1. Feasibility Study of a Generalized Framework for Developing Computer-Aided Detection Systems-a New Paradigm.

    Science.gov (United States)

    Nemoto, Mitsutaka; Hayashi, Naoto; Hanaoka, Shouhei; Nomura, Yukihiro; Miki, Soichiro; Yoshikawa, Takeharu

    2017-10-01

    We propose a generalized framework for developing computer-aided detection (CADe) systems whose characteristics depend only on those of the training dataset. The purpose of this study is to show the feasibility of the framework. Two different CADe systems were experimentally developed by a prototype of the framework, but with different training datasets. The CADe systems include four components: preprocessing, candidate area extraction, candidate detection, and candidate classification. Four pretrained algorithms with dedicated optimization/setting methods corresponding to the respective components were prepared in advance. The pretrained algorithms were sequentially trained in the order of processing of the components. In this study, two different datasets, brain MRA with cerebral aneurysms and chest CT with lung nodules, were collected to develop two different types of CADe systems in the framework. The performances of the developed CADe systems were evaluated by threefold cross-validation. The CADe systems for detecting cerebral aneurysms in brain MRAs and for detecting lung nodules in chest CTs were successfully developed using the respective datasets. The framework was shown to be feasible by the successful development of the two different types of CADe systems. The feasibility of this framework shows promise for a new paradigm in the development of CADe systems: development of CADe systems without any lesion-specific algorithm design.
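    To make the evaluation step concrete, here is a minimal sketch (not the authors' code) of threefold cross-validation on candidate-level features such as those produced by the earlier pipeline stages; the feature matrix, labels, and classifier choice are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Hypothetical candidate-level features (e.g. size, sphericity, contrast)
# with lesion / non-lesion labels standing in for the real training data.
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 6))
y = rng.integers(0, 2, size=300)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
cv = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print("threefold CV accuracy:", scores.round(3))
```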

  2. Computer codes for RF cavity design

    International Nuclear Information System (INIS)

    Ko, K.

    1992-08-01

    In RF cavity design, numerical modeling is assuming an increasingly important role with the help of sophisticated computer codes and powerful yet affordable computers. A description of the cavity codes in use in the accelerator community has been given previously. The present paper will address the latest developments and discuss their applications to cavity tuning and matching problems

  3. Individual Tariffs for Mobile Services: Theoretical Framework and a Computational Case in Mobile Music

    OpenAIRE

    Chen, Hong; Pau, Louis-François

    2007-01-01

    textabstractThis paper introduces individual tariffs at service and content bundle level in mobile communications. It gives a theoretical framework (economic, sociological) as well as a computational game solution method. The user can be an individual or a community. Individual tariffs are decided through interactions between the user and the supplier. A numerical example from mobile music illustrates the concepts.

  4. Do organizations adopt sophisticated capital budgeting practices to deal with uncertainty in the investment decision? : A research note

    NARCIS (Netherlands)

    Verbeeten, Frank H M

    This study examines the impact of uncertainty on the sophistication of capital budgeting practices. While the theoretical applications of sophisticated capital budgeting practices (defined as the use of real option reasoning and/or game theory decision rules) have been well documented, empirical

  5. Framework Resources Multiply Computing Power

    Science.gov (United States)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  6. Computational medical imaging and hemodynamics framework for functional analysis and assessment of cardiovascular structures.

    Science.gov (United States)

    Wong, Kelvin K L; Wang, Defeng; Ko, Jacky K L; Mazumdar, Jagannath; Le, Thu-Thao; Ghista, Dhanjoo

    2017-03-21

    Cardiac dysfunction constitutes common cardiovascular health issues in society, and has been an investigation topic of strong focus by researchers in the medical imaging community. Diagnostic modalities based on echocardiography, magnetic resonance imaging, chest radiography and computed tomography are common techniques that provide cardiovascular structural information to diagnose heart defects. However, functional information of cardiovascular flow, which can in fact be used to support the diagnosis of many cardiovascular diseases with a myriad of hemodynamics performance indicators, remains unexplored to its full potential. Some of these indicators constitute important cardiac functional parameters affecting the cardiovascular abnormalities. With the advancement of computer technology that facilitates high speed computational fluid dynamics, the realization of a support diagnostic platform of hemodynamics quantification and analysis can be achieved. This article reviews the state-of-the-art medical imaging and high fidelity multi-physics computational analyses that together enable reconstruction of cardiovascular structures and hemodynamic flow patterns within them, such as of the left ventricle (LV) and carotid bifurcations. The combined medical imaging and hemodynamic analysis enables us to study the mechanisms of cardiovascular disease-causing dysfunctions, such as how (1) cardiomyopathy causes left ventricular remodeling and loss of contractility leading to heart failure, and (2) modeling of LV construction and simulation of intra-LV hemodynamics can enable us to determine the optimum procedure of surgical ventriculation to restore its contractility and health. This combined medical imaging and hemodynamics framework can potentially extend medical knowledge of cardiovascular defects and associated hemodynamic behavior and their surgical restoration, by means of an integrated medical image diagnostics and hemodynamic performance analysis framework.

  7. Press Play for Learning: A Framework to Guide Serious Computer Game Use in the Classroom

    Science.gov (United States)

    Southgate, Erica; Budd, Janene; Smith, Shamus

    2017-01-01

    Computer gaming is a global phenomenon and there has been rapid growth in "serious" games for learning. An emergent body of evidence demonstrates how serious games can be used in primary and secondary school classrooms. Despite the popularity of serious games and their pedagogical potential, there are few specialised frameworks to guide…

  8. Systematization and sophistication of a comprehensive sensitivity analysis program. Phase 2

    International Nuclear Information System (INIS)

    Oyamada, Kiyoshi; Ikeda, Takao

    2004-02-01

    This study developed a detailed estimation by adopting a comprehensive sensitivity analysis program for the reliability of TRU waste repository concepts in a crystalline rock condition. We examined each component and groundwater scenario of the geological repository and prepared systematic bases for examining the reliability from the point of comprehensiveness. Models and data were sophisticated to examine the reliability. Based on an existing TRU waste repository concept, the effects of parameters on nuclide migration were quantitatively classified. Those parameters, which will be decided quantitatively, include the site characteristics of the natural barrier and the design specifications of the engineered barriers. Considering the feasibility of those specifications, the reliability is re-examined for combinations of those parameters within a practical range. Future issues are: comprehensive representation of a hybrid geosphere model including the fractured medium and the permeable matrix medium, and sophistication of tools to develop reliable combinations of parameters. It is significant to continue this study because the disposal concepts and specifications of TRU-nuclide-containing waste at various sites shall be determined rationally and safely through these studies. (author)

  9. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments.

    Science.gov (United States)

    Mora, Higinio; Gil, David; Terol, Rafael Muñoz; Azorín, Jorge; Szymanski, Julian

    2017-10-10

    The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, which enable the development of sensors, embedded devices and other 'things' ready to understand the environment. In this paper, a distributed framework based on the internet of things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantages and novelties of the proposed system are the flexibility in computing the health application by using resources from available devices inside the body area network of the user. This proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study in order to validate our proposal that consists in monitoring footballers' heart rates during a football match. The real-time data acquired by these devices presents a clear social objective of being able to predict not only situations of sudden death but also possible injuries.

  10. A proposed framework for computational fluid dynamics code calibration/validation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1993-01-01

    The paper reviews the terminology and methodology that have been introduced during the last several years for building confidence in the predictions from Computational Fluid Dynamics (CFD) codes. Code validation terminology developed for nuclear reactor analyses and aerospace applications is reviewed and evaluated. Currently used terminology such as ''calibrated code,'' ''validated code,'' and a ''validation experiment'' is discussed along with the shortcomings and criticisms of these terms. A new framework is proposed for building confidence in CFD code predictions that overcomes some of the difficulties of past procedures and delineates the causes of uncertainty in CFD predictions. Building on previous work, new definitions of code verification and calibration are proposed. These definitions provide more specific requirements for the knowledge level of the flow physics involved and the solution accuracy of the given partial differential equations. As part of the proposed framework, categories are also proposed for flow physics research, flow modeling research, and the application of numerical predictions. The contributions of physical experiments, analytical solutions, and other numerical solutions are discussed, showing that each should be designed to achieve a distinctively separate purpose in building confidence in accuracy of CFD predictions. A number of examples are given for each approach to suggest methods for obtaining the highest value for CFD code quality assurance

  11. Reusability Framework for Cloud Computing

    OpenAIRE

    Singh, Sukhpal; Singh, Rishideep

    2012-01-01

    Cloud-based development is a challenging task for several software engineering projects, especially for those which need development with reusability. The present era of cloud computing allows new professional models for software development. Cloud computing is expected to be the upcoming trend of computing because of the speed of application deployment, shorter time to market, and lower cost of operation. Until a Cloud Computing Reusability Model is considered a fundamen...

  12. Home, Hearth and Computing.

    Science.gov (United States)

    Seelig, Anita

    1982-01-01

    Advantages of having children use microcomputers at school and home include learning about sophisticated concepts early in life without a great deal of prodding, playing games that expand knowledge, and becoming literate in computer knowledge needed later in life. Includes comments from parents on their experiences with microcomputers and…

  13. The Value of Multivariate Model Sophistication: An Application to pricing Dow Jones Industrial Average options

    DEFF Research Database (Denmark)

    Rombouts, Jeroen V.K.; Stentoft, Lars; Violante, Francesco

    We assess the predictive accuracy of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 248 multivariate models that differ... innovation for a Laplace innovation assumption improves the pricing in a smaller way. Apart from investigating directly the value of model sophistication in terms of dollar losses, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance.

  14. Computer codes for RF cavity design

    International Nuclear Information System (INIS)

    Ko, K.

    1992-01-01

    In RF cavity design, numerical modeling is assuming an increasingly important role with the help of sophisticated computer codes and powerful yet affordable computers. A description of the cavity codes in use in the accelerator community has been given previously. The present paper will address the latest developments and discuss their applications to cavity tuning and matching problems. (Author) 8 refs., 10 figs

  15. Sophistic Ethics in the Technical Writing Classroom: Teaching "Nomos," Deliberation, and Action.

    Science.gov (United States)

    Scott, J. Blake

    1995-01-01

    Claims that teaching ethics is particularly important to technical writing. Outlines a classical, sophistic approach to ethics based on the theories and pedagogies of Protagoras, Gorgias, and Isocrates, which emphasizes the Greek concept of "nomos," internal and external deliberation, and responsible action. Discusses problems and…

  16. Model-based Computer Aided Framework for Design of Process Monitoring and Analysis Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    In the manufacturing industry, for example, the pharmaceutical industry, a thorough understanding of the process is necessary in addition to a properly designed monitoring and analysis system (PAT system) to consistently obtain the desired end-product properties. A model-based computer-aided framework including the methods and tools through which the design of monitoring and analysis systems for product quality control can be generated, analyzed and/or validated, has been developed. Two important supporting tools developed as part of the framework are a knowledge base and a model library. The knowledge base provides the necessary information/data during the design of the PAT system while the model library generates additional or missing data needed for design and analysis. Optimization of the PAT system design is achieved in terms of product data analysis time and/or cost of monitoring equipment.

  17. GLOFRIM v1.0 - A globally applicable computational framework for integrated hydrological-hydrodynamic modelling

    Science.gov (United States)

    Hoch, Jannis M.; Neal, Jeffrey C.; Baart, Fedor; van Beek, Rens; Winsemius, Hessel C.; Bates, Paul D.; Bierkens, Marc F. P.

    2017-10-01

    We here present GLOFRIM, a globally applicable computational framework for integrated hydrological-hydrodynamic modelling. GLOFRIM facilitates spatially explicit coupling of hydrodynamic and hydrologic models and caters for an ensemble of models to be coupled. It currently encompasses the global hydrological model PCR-GLOBWB as well as the hydrodynamic models Delft3D Flexible Mesh (DFM; solving the full shallow-water equations and allowing for spatially flexible meshing) and LISFLOOD-FP (LFP; solving the local inertia equations and running on regular grids). The main advantages of the framework are its open and free access, its global applicability, its versatility, and its extensibility with other hydrological or hydrodynamic models. Before applying GLOFRIM to an actual test case, we benchmarked both DFM and LFP for a synthetic test case. Results show that for sub-critical flow conditions, discharge response to the same input signal is near-identical for both models, which agrees with previous studies. We subsequently applied the framework to the Amazon River basin to not only test the framework thoroughly, but also to perform a first-ever benchmark of flexible and regular grids at a large scale. Both DFM and LFP produce comparable results in terms of simulated discharge with LFP exhibiting slightly higher accuracy as expressed by a Kling-Gupta efficiency of 0.82 compared to 0.76 for DFM. However, benchmarking inundation extent between DFM and LFP over the entire study area, a critical success index of 0.46 was obtained, indicating that the models disagree as often as they agree. Differences between models in both simulated discharge and inundation extent are to a large extent attributable to the gridding techniques employed. In fact, the results show that both the numerical scheme of the inundation model and the gridding technique can contribute to deviations in simulated inundation extent as we control for model forcing and boundary conditions. This study shows
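    Since the discharge benchmark above is summarized with the Kling-Gupta efficiency, here is a minimal sketch of the standard KGE formula (not code from GLOFRIM; the example series are made up):

```python
import numpy as np

def kling_gupta_efficiency(sim, obs):
    """Kling-Gupta efficiency used to benchmark simulated against observed discharge."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]          # correlation component
    alpha = sim.std() / obs.std()            # variability ratio
    beta = sim.mean() / obs.mean()           # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([120., 150., 180., 210., 190., 160.])   # observed discharge (m3/s)
sim = np.array([115., 158., 170., 220., 185., 150.])   # simulated discharge (m3/s)
print(f"KGE = {kling_gupta_efficiency(sim, obs):.2f}")
```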

  18. ClimateSpark: An in-memory distributed computing framework for big climate data analytics

    Science.gov (United States)

    Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei

    2018-06-01

    The unprecedented growth of climate data creates new opportunities for climate studies, and yet big climate data pose a grand challenge to climatologists to efficiently manage and analyze big data. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. Chunking data structure improves parallel I/O efficiency, while a spatiotemporal index is built for the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal to facilitate the interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL query and Scala/Python notebook). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multiple-dimensional, array-based datasets in various geoscience domains.
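    As an illustration of the kind of Spark SQL interaction described above (a generic PySpark sketch, not ClimateSpark's actual API; the dataset path, table, and column names are hypothetical):

```python
from pyspark.sql import SparkSession

# Generic PySpark sketch of a spatiotemporal aggregation issued through Spark SQL.
spark = SparkSession.builder.appName("climate-query").getOrCreate()

temps = spark.read.parquet("/data/t2m_chunks.parquet")  # hypothetical chunked dataset
temps.createOrReplaceTempView("t2m")

monthly_mean = spark.sql("""
    SELECT year, month, AVG(value) AS mean_t2m
    FROM t2m
    WHERE lat BETWEEN 30 AND 60 AND lon BETWEEN -130 AND -60
    GROUP BY year, month
    ORDER BY year, month
""")
monthly_mean.show(12)
```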

  19. Sophisticated Fowl: The Complex Behaviour and Cognitive Skills of Chickens and Red Junglefowl

    Directory of Open Access Journals (Sweden)

    Laura Garnham

    2018-01-01

    Full Text Available The world’s most numerous bird, the domestic chicken, and their wild ancestor, the red junglefowl, have long been used as model species for animal behaviour research. Recently, this research has advanced our understanding of the social behaviour, personality, and cognition of fowl, and demonstrated their sophisticated behaviour and cognitive skills. Here, we overview some of this research, starting with describing research investigating the well-developed senses of fowl, before presenting how socially and cognitively complex they can be. The realisation that domestic chickens, our most abundant production animal, are behaviourally and cognitively sophisticated should encourage an increase in general appreciation of and fascination towards them. In turn, this should inspire increased use of them as both research and hobby animals, as well as improvements in their unfortunately often poor welfare.

  20. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  1. International Review of Frameworks for Standard Setting & Labeling Development

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Nan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Khanna, Nina Zheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fridley, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Romankiewicz, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-09-01

    As appliance energy efficiency standards and labeling (S&L) programs reach a broader geographic and product scope, a series of sophisticated and complex technical and economic analyses have been adopted by different countries in the world to support and enhance these growing S&L programs. The initial supporting techno-economic and impact analyses for S&L development make up a defined framework and process for setting and developing appropriate appliance efficiency standards and labeling programs. This report reviews in-depth the existing framework for standards setting and label development in the well-established programs of the U.S., Australia and the EU to identify and evaluate major trends in how and why key analyses are undertaken and to understand major similarities and differences between each of the frameworks.

  2. Assessing Epistemic Sophistication by Considering Domain-Specific Absolute and Multiplicistic Beliefs Separately

    Science.gov (United States)

    Peter, Johannes; Rosman, Tom; Mayer, Anne-Kathrin; Leichner, Nikolas; Krampen, Günter

    2016-01-01

    Background: Particularly in higher education, not only a view of science as a means of finding absolute truths (absolutism), but also a view of science as generally tentative (multiplicism) can be unsophisticated and obstructive for learning. Most quantitative epistemic belief inventories neglect this and understand epistemic sophistication as…

  3. GeoFramework: A Modeling Framework for Solid Earth Geophysics

    Science.gov (United States)

    Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.

    2003-12-01

    As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and to make modeling software available to non-modeling specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now possible. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework, recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks which are not generally available in existing frameworks and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together. The following codes are now being reengineered within the context of Pyre: Tecton, 3-D FE Visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic

  4. Framework for generating expert systems to perform computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1985-01-01

    At Los Alamos we are developing a framework to generate knowledge-based expert systems for performing automated risk analyses upon a subject system. The expert system is a computer program that models experts' knowledge about a topic, including facts, assumptions, insights, and decision rationale. The subject system, defined as the collection of information, procedures, devices, and real property upon which the risk analysis is to be performed, is a member of the class of systems that have three identifying characteristics: a set of desirable assets (or targets), a set of adversaries (or threats) desiring to obtain or to do harm to the assets, and a set of protective mechanisms to safeguard the assets from the adversaries. Risk analysis evaluates both vulnerability to and the impact of successful threats against the targets by determining the overall effectiveness of the subject system safeguards, identifying vulnerabilities in that set of safeguards, and determining cost-effective improvements to the safeguards. As a testbed, we evaluate the inherent vulnerabilities and risks in a system of computer security safeguards. The method considers safeguards protecting four generic targets (physical plant of the computer installation, its hardware, its software, and its documents and displays) against three generic threats (natural hazards, direct human actions requiring the presence of the adversary, and indirect human actions wherein the adversary is not on the premises-perhaps using such access tools as wiretaps, dialup lines, and so forth). Our automated procedure to assess the effectiveness of computer security safeguards differs from traditional risk analysis methods
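    To illustrate the generic target/threat/safeguard structure described above (a toy sketch with made-up impact, likelihood, and effectiveness scores, not the Los Alamos expert system):

```python
from itertools import product

# Hypothetical scores: impact of losing each target, likelihood of each threat,
# and effectiveness of the safeguards protecting a given (target, threat) pair.
targets = {"physical plant": 0.9, "hardware": 0.8, "software": 0.6, "documents": 0.5}
threats = {"natural hazard": 0.2, "direct human action": 0.5, "indirect human action": 0.4}
safeguard_effectiveness = {
    ("software", "indirect human action"): 0.3,   # weak spot in this toy example
}

def risk(target, threat):
    """Residual risk falls as safeguard effectiveness rises (default 0.7)."""
    eff = safeguard_effectiveness.get((target, threat), 0.7)
    return targets[target] * threats[threat] * (1.0 - eff)

for tgt, thr in product(targets, threats):
    print(f"{tgt:15s} vs {thr:22s}: risk = {risk(tgt, thr):.3f}")
```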

  5. Evolution of Computational Toxicology-from Primitive ...

    Science.gov (United States)

    Presentation at the Health Canada seminar in Ottawa, ON, Canada on Nov. 15, 2016 on the Evolution of Computational Toxicology - from Primitive Beginnings to Sophisticated Application

  6. A Survey of Software Infrastructures and Frameworks for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Christoph Endres

    2005-01-01

    Full Text Available In this survey, we discuss 29 software infrastructures and frameworks which support the construction of distributed interactive systems. They range from small projects with one implemented prototype to large scale research efforts, and they come from the fields of Augmented Reality (AR), Intelligent Environments, and Distributed Mobile Systems. In their own way, they can all be used to implement various aspects of the ubiquitous computing vision as described by Mark Weiser [60]. This survey is meant as a starting point for new projects, in order to choose an existing infrastructure for reuse, or to get an overview before designing a new one. It tries to provide a systematic, relatively broad (and necessarily not very deep) overview, while pointing to relevant literature for in-depth study of the systems discussed.

  7. Particle Tracking and Simulation on the .NET Framework

    International Nuclear Information System (INIS)

    Nishimura, Hiroshi; Scarvie, Tom

    2006-01-01

    Particle tracking and simulation studies are becoming increasingly complex. In addition to the use of more sophisticated graphics, interactive scripting is becoming popular. Compatibility with different control systems requires network and database capabilities. It is not a trivial task to fulfill all the various requirements without sacrificing runtime performance. We evaluated the effectiveness of the .NET framework by converting a C++ simulation code to C#. The portability to other platforms is mentioned in terms of Mono

  8. Virtual Memory Introspection Framework for Cyber Threat Detection in Virtual Environment

    Directory of Open Access Journals (Sweden)

    Himanshu Upadhyay

    2018-01-01

    Full Text Available In today’s information-based world, it is increasingly important to safeguard the data owned by any organization, be it intellectual property or personal information. With the ever increasing sophistication of malware, it is imperative to come up with automated and advanced methods of attack vector recognition and isolation. Existing methods are not dynamic enough to adapt to the behavioral complexity of new malware. Widely used operating systems, especially Linux, have a popular perception of being more secure than other operating systems (e.g. Windows), but this is not necessarily true. The open source nature of the Linux operating system is a double-edged sword; malicious actors having full access to the kernel code does not reassure the IT world about Linux’s vulnerabilities. Recent widely reported hacking attacks on reputable organizations have mostly been on Linux servers. Most new malware is able to neutralize existing defenses on the Linux operating system. A radical solution for malware detection is needed, one which cannot be detected or damaged by malicious code. In this paper, we propose a novel framework design that uses virtualization to isolate and monitor Linux environments. The framework uses the well-known Xen hypervisor to host server environments and uses a Virtual Memory Introspection framework to capture process behavior. The behavioral data is analyzed using sophisticated machine learning algorithms to flag potential cyber threats. The framework can be enhanced to have self-healing properties: any compromised hosts are immediately replaced by their uncompromised versions, limiting the exposure to the wider enterprise network.

  9. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments

    Directory of Open Access Journals (Sweden)

    Higinio Mora

    2017-10-01

    Full Text Available The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, which enable the development of sensors, embedded devices and other ‘things’ ready to understand the environment. In this paper, a distributed framework based on the internet of things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantages and novelties of the proposed system are the flexibility in computing the health application by using resources from available devices inside the body area network of the user. This proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study in order to validate our proposal that consists in monitoring footballers’ heart rates during a football match. The real-time data acquired by these devices presents a clear social objective of being able to predict not only situations of sudden death but also possible injuries.

  10. Microbase2.0: A Generic Framework for Computationally Intensive Bioinformatics Workflows in the Cloud

    OpenAIRE

    Flanagan Keith; Nakjang Sirintra; Hallinan Jennifer; Harwood Colin; Hirt Robert P.; Pocock Matthew R.; Wipat Anil

    2012-01-01

    As bioinformatics datasets grow ever larger, and analyses become increasingly complex, there is a need for data handling infrastructures to keep pace with developing technology. One solution is to apply Grid and Cloud technologies to address the computational requirements of analysing high throughput datasets. We present an approach for writing new applications, or wrapping existing ones, and a reference implementation of a framework, Microbase2.0, for executing those applications using Grid and C...

  11. A conceptual framework of computations in mid-level vision.

    Science.gov (United States)

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words-or, rather, descriptors-capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations.

  12. A conceptual framework of computations in mid-level vision

    Directory of Open Access Journals (Sweden)

    Jonas eKubilius

    2014-12-01

    Full Text Available If a picture is worth a thousand words, as an English idiom goes, what should those words – or, rather, descriptors – capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model’s behavior and its limitations.

  13. A conceptual framework of computations in mid-level vision

    Science.gov (United States)

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044

  14. MCdevelop - a universal framework for Stochastic Simulations

    Science.gov (United States)

    Slawinska, M.; Jadach, S.

    2011-03-01

    We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory it makes them easy to parallelize. The efficient development, testing and running in parallel of SS software requires a convenient framework to develop software source code, deploy and monitor batch jobs, merge and analyse results from multiple parallel jobs, even before the production runs are terminated. Throughout the years of development of stochastic simulations for HEP, a sophisticated framework featuring all the above-mentioned functionality has been implemented. MCdevelop represents its latest version, written mostly in C++ (GNU compiler gcc). It uses Autotools to build binaries (optionally managed within the KDevelop 3.5.3 Integrated Development Environment (IDE)). It uses the open-source ROOT package for histogramming, graphics and the mechanism of persistency for the C++ objects. MCdevelop helps to run multiple parallel jobs on any computer cluster with an NQS-type batch system. Program summary: Program title: MCdevelop; Catalogue identifier: AEHW_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHW_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http

  15. Evaluation Framework for Search Instruments

    International Nuclear Information System (INIS)

    Warren, Glen A.; Smith, Leon E.; Cooper, Matt W.; Kaye, William R.

    2005-01-01

    A framework for quantitatively evaluating current and proposed gamma-ray search instrument designs has been developed. The framework is designed to generate a large library of "virtual neighborhoods" that can be used to test and evaluate nearly any gamma-ray sensor type. Calculating nuisance-source emissions and combining various sources to create a large number of random virtual scenes places a significant computational burden on the development of the framework. To reduce this burden, a number of radiation transport simplifications have been made which maintain the essential physics ingredients for the quantitative assessment of search instruments while significantly reducing computational times. The various components of the framework, from the simulation and benchmarking of nuisance-source emissions to the computational engine for generating the gigabytes of simulated search scenes, are discussed.

  16. Practical applications of soft computing in engineering

    CERN Document Server

    2001-01-01

    Soft computing has been presented not only with the theoretical developments but also with a large variety of realistic applications to consumer products and industrial systems. Application of soft computing has provided the opportunity to integrate human-like vagueness and real-life uncertainty into an otherwise hard computer program. This book highlights some of the recent developments in practical applications of soft computing in engineering problems. All the chapters have been sophisticatedly designed and revised by international experts to achieve wide but in-depth coverage. Contents: Au

  17. How You Can Protect Public Access Computers "and" Their Users

    Science.gov (United States)

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  18. A probabilistic computational framework for bridge network optimal maintenance scheduling

    International Nuclear Information System (INIS)

    Bocchini, Paolo; Frangopol, Dan M.

    2011-01-01

    This paper presents a probabilistic computational framework for the Pareto optimization of the preventive maintenance applications to bridges of a highway transportation network. The bridge characteristics are represented by their uncertain reliability index profiles. The in/out of service states of the bridges are simulated taking into account their correlation structure. Multi-objective Genetic Algorithms have been chosen as the numerical tool for the solution of the optimization problem. The design variables of the optimization are the preventive maintenance schedules of all the bridges of the network. The two conflicting objectives are the minimization of the total present maintenance cost and the maximization of the network performance indicator. The final result is the Pareto front of optimal solutions among which managers can choose, depending on engineering and economic factors. A numerical example illustrates the application of the proposed approach.
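
    A minimal sketch of the two-objective selection step described above: candidate maintenance schedules are scored on total cost and network performance, and the non-dominated (Pareto-optimal) set is kept. The schedule encoding and objective functions below are placeholders, not the paper's reliability-based network model, and a plain non-domination filter stands in for the Genetic Algorithm:

```python
# Toy Pareto-front extraction for maintenance schedules scored on two
# conflicting objectives (both columns converted to "smaller is better").
import numpy as np

rng = np.random.default_rng(0)
n_bridges, horizon = 5, 20
# A schedule: for each bridge, the year of its (single) preventive intervention.
schedules = rng.integers(1, horizon, size=(200, n_bridges))

def total_cost(s):
    return np.sum(1.0 / s)                   # earlier maintenance: higher present cost

def network_performance(s):
    return -np.mean((s / horizon) ** 2)      # later maintenance: lower performance

objs = np.array([[total_cost(s), -network_performance(s)] for s in schedules])
pareto = []
for i, p in enumerate(objs):
    # p is dominated if some other point is no worse in both objectives
    # and strictly better in at least one.
    dominated = np.any(np.all(objs <= p, axis=1) & np.any(objs < p, axis=1))
    if not dominated:
        pareto.append(i)
print(f"{len(pareto)} non-dominated schedules out of {len(schedules)}")
```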

  19. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers

    Science.gov (United States)

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers, as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues, since in Finnish vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers, either in the auditory brainstem response or in behavioral tasks; musically sophisticated speakers do, however, show enhanced pitch discrimination compared to Finnish speakers with less musical experience, and show greater duration modulation in a complex task. These results are consistent with a ceiling effect set for certain sound features which corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in a real-world musical situation. These results have implications for research into the specificity of plasticity in the auditory system, as well as into the interaction of specific language features with musical experience. PMID:28450829

  20. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers.

    Science.gov (United States)

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers, as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues, since in Finnish vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers, either in the auditory brainstem response or in behavioral tasks; musically sophisticated speakers do, however, show enhanced pitch discrimination compared to Finnish speakers with less musical experience, and show greater duration modulation in a complex task. These results are consistent with a ceiling effect set for certain sound features which corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in a real-world musical situation. These results have implications for research into the specificity of plasticity in the auditory system, as well as into the interaction of specific language features with musical experience.

  1. Cloud computing can simplify HIT infrastructure management.

    Science.gov (United States)

    Glaser, John

    2011-08-01

    Software as a Service (SaaS), built on cloud computing technology, is emerging as the forerunner in IT infrastructure because it helps healthcare providers reduce capital investments. Cloud computing leads to predictable, monthly, fixed operating expenses for hospital IT staff. Outsourced cloud computing facilities are state-of-the-art data centers boasting some of the most sophisticated networking equipment on the market. The SaaS model helps hospitals safeguard against technology obsolescence, minimizes maintenance requirements, and simplifies management.

  2. A Distributed OpenCL Framework using Redundant Computation and Data Replication

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Junghyun [Seoul National University, Korea]; Jo, Gangwon [Seoul National University, Korea]; Jung, Jaehoon [Seoul National University, Korea]; Lee, Jaejin [Seoul National University, Korea]

    2016-01-01

    Applications written solely in OpenCL or CUDA cannot execute on a cluster as a whole. Most previous approaches that extend these programming models to clusters are based on a common idea: designating a centralized host node and coordinating the other nodes with the host for computation. However, the centralized host node is a serious performance bottleneck when the number of nodes is large. In this paper, we propose a scalable and distributed OpenCL framework called SnuCL-D for large-scale clusters. SnuCL-D's remote device virtualization provides an OpenCL application with an illusion that all compute devices in a cluster are confined in a single node. To reduce the amount of control-message and data communication between nodes, SnuCL-D replicates the OpenCL host program execution and data in each node. We also propose a new OpenCL host API function and a queueing optimization technique that significantly reduce the overhead incurred by the previous centralized approaches. To show the effectiveness of SnuCL-D, we evaluate SnuCL-D with a microbenchmark and eleven benchmark applications on a large-scale CPU cluster and a medium-scale GPU cluster.

  3. Integrating surrogate models into subsurface simulation framework allows computation of complex reactive transport scenarios

    Science.gov (United States)

    De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael

    2017-04-01

    Reactive transport simulations - where geochemical reactions are coupled with hydrodynamic transport of reactants - are extremely time consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such requirements may seem inappropriate and probably constitute the main limitation for their wide application. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models obtained from a set of pre-calculated "full physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a precision loss; however, this appears justified in the presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add such capability to any multiphase finite volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453. [2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M
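
    To make the surrogate idea concrete, the sketch below fits a fast regressor to a synthetic ensemble of pre-computed "full physics" results and then evaluates it over many grid cells. The authors work in R within their own simulation framework, so this Python/scikit-learn fragment, its variable names, and its input/output quantities are illustrative assumptions only:

```python
# Conceptual surrogate sketch: learn the geochemical response from an ensemble
# of pre-calculated runs, then call the cheap surrogate inside the transport loop.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
# Pre-calculated ensemble: (porosity, pCO2, salinity) -> dissolved calcite [mol]
X_train = rng.uniform([0.05, 1.0, 0.0], [0.35, 50.0, 2.0], size=(500, 3))
y_train = (X_train[:, 0] * np.log(X_train[:, 1]) /
           (1.0 + X_train[:, 2]))              # stand-in for the full-physics result

surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, y_train)

# Inside a transport time step the surrogate replaces the geochemical solver:
cell_states = rng.uniform([0.05, 1.0, 0.0], [0.35, 50.0, 2.0], size=(10_000, 3))
reacted = surrogate.predict(cell_states)       # fast batch evaluation per cell
print(reacted[:5])
```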

  4. Computer-generated movies as an analytic tool

    International Nuclear Information System (INIS)

    Elliott, R.L.

    1978-01-01

    One of the problems faced by the users of large, sophisticated modeling programs at the Los Alamos Scientific Laboratory (LASL) is the analysis of the results of their calculations. One of the more productive and frequently spectacular methods is the production of computer-generated movies. An overview of the generation of computer movies at LASL is presented. The hardware, software, and generation techniques are briefly discussed

  5. Reacting to Neighborhood Cues?: Political Sophistication Moderates the Effect of Exposure to Immigrants

    DEFF Research Database (Denmark)

    Danckert, Bolette; Dinesen, Peter Thisted; Sønderskov, Kim Mannemar

    2017-01-01

    is founded on politically sophisticated individuals having a greater comprehension of news and other mass-mediated sources, which makes them less likely to rely on neighborhood cues as sources of information relevant for political attitudes. Based on a unique panel data set with fine-grained information...

  6. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Full Text Available Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users become more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs) such as limited battery lifetime, limited processing capabilities, and limited storage capacity by offloading the execution and workload to other, richer systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters based on which the frameworks are implemented, such as the offloading method and the level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.
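
    Offloading frameworks of the kind surveyed here typically rest on a cost model that compares local execution against remote execution plus data transfer. The fragment below is a generic, made-up decision rule (all constants are arbitrary), not any particular framework's policy:

```python
# Illustrative offloading decision: offload when both estimated time and
# device energy are lower than for local execution.
def should_offload(cycles, data_bytes,
                   f_local=1.0e9,        # device CPU speed [cycles/s]
                   f_cloud=8.0e9,        # cloud CPU speed  [cycles/s]
                   bandwidth=2.0e6,      # uplink [bytes/s]
                   p_compute=0.9,        # device power while computing [W]
                   p_transmit=1.3):      # device power while transmitting [W]
    t_local = cycles / f_local
    t_offload = data_bytes / bandwidth + cycles / f_cloud
    e_local = p_compute * t_local
    e_offload = p_transmit * (data_bytes / bandwidth)  # device idles while cloud computes
    return t_offload < t_local and e_offload < e_local

# A compute-heavy task with little data favours offloading; a data-heavy one does not.
print(should_offload(cycles=5e9, data_bytes=50_000))      # True
print(should_offload(cycles=1e8, data_bytes=20_000_000))  # False
```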

  7. Implementing Computer Algebra Enabled Questions for the Assessment and Learning of Mathematics

    Science.gov (United States)

    Sangwin, Christopher J.; Naismith, Laura

    2008-01-01

    We present principles for the design of an online system to support computer algebra enabled questions for use within the teaching and learning of mathematics in higher education. The introduction of a computer algebra system (CAS) into a computer aided assessment (CAA) system affords sophisticated response processing of student provided answers.…

  8. Computational modeling of Metal-Organic Frameworks

    Science.gov (United States)

    Sung, Jeffrey Chuen-Fai

    In this work, the metal-organic frameworks MIL-53(Cr), DMOF-2,3-NH2Cl, DMOF-2,5-NH2Cl, and HKUST-1 were modeled using molecular mechanics and electronic structure. The effect of electronic polarization on the adsorption of water in MIL-53(Cr) was studied using molecular dynamics simulations of water-loaded MIL-53 systems with both polarizable and non-polarizable force fields. Molecular dynamics simulations of the full systems and DFT calculations on representative framework clusters were utilized to study the difference in nitrogen adsorption between DMOF-2,3-NH2Cl and DMOF-2,5-NH2Cl. Finally, the control of proton conduction in HKUST-1 by complexation of molecules to the Cu open metal site was investigated using the MS-EVB methodology.

  9. Financial Sophistication and the Distribution of the Welfare Cost of Inflation

    OpenAIRE

    Paola Boel; Gabriele Camera

    2009-01-01

    The welfare cost of anticipated inflation is quantified in a calibrated model of the U.S. economy that exhibits tractable equilibrium dispersion in wealth and earnings. Inflation does not generate large losses in societal welfare, yet its impact varies noticeably across segments of society depending also on the financial sophistication of the economy. If money is the only asset, then inflation hurts mostly the wealthier and more productive agents, while those poorer and less productive may ev...

  10. European questionnaire on the use of computer programmes in radiation dosimetry

    International Nuclear Information System (INIS)

    Gualdrini, G.; Tanner, R.; Terrisol, M.

    1999-01-01

    Combining measurements with supplementary calculations, also in the field of radiation dosimetry, can reduce the necessary experimental effort and thus save time and money, provided that computational methods are used which are well suited to reproduce experimental data with satisfactory quality. The dramatic increase in computing power in recent years now permits the use of computational tools for dosimetry also in routine applications. Many institutions dealing with radiation protection, however, have small groups which, in addition to their routine work, often cannot afford to specialise in the field of computational dosimetry. This means that not only experts but increasingly also casual users employ complicated computational tools such as general-purpose transport codes. This massive use of computer programmes in radiation protection and dosimetry applications motivated the Concerted Action Investigation and Quality Assurance of Numerical Methods in Radiation Protection Dosimetry of the 4th framework programme of the European Commission to prepare, distribute and evaluate a questionnaire on the use of such codes. A significant number of scientists from nearly all the countries of the European Community (and some countries outside Europe) contributed to the questionnaire, which allowed a satisfactory overview of the state of the art in this field to be obtained. The results obtained from the questionnaire and summarised in the present Report are felt to be indicative of the situation regarding the use of sophisticated computer codes within the European Community, although the group of participating scientists may not be a representative sample in a strict statistical sense [it

  11. Freedom and necessity in computer aided composition: A thinking framework and its application

    Science.gov (United States)

    Kretz, Johannes

    This paper presents some of the author's experiences with computer aided composition (CAC): the modeling of physical movements is used to obtain plausible musical gestures in interaction with constraint programming (rule based expert systems) in order to achieve precisely structured, consistent musical material with strong inner logic and syntax in pitch material. The "Constraints Engine" by Michael Laurson implemented in OpenMusic (IRCAM) or PWGL (Sibelius Academy) can be used to set up an interactive framework for composition, which offers a balance of freedom (allowing chance operations and arbitrary decisions of the composer) and necessity (through strict rules as well as through criteria for optimization). Computer Aided Composition is moving far beyond being "algorithmic" or "mechanical". This paper proposes an approach based on evolutionary epistemology (by the Austrian biologist and philosopher Rupert Riedl). The aim is a holistic synthesis of artistic freedom and coherent structures similar to the grown order of nature.
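
    The actual Constraints Engine runs inside OpenMusic/PWGL; as a language-agnostic cartoon of the "freedom plus strict rules" idea, the following Python fragment backtracks over a scale to find a melody that satisfies a couple of hard rules while leaving the remaining choices to chance:

```python
# Toy constraint-based melody search: strict local and global rules, free
# (randomized) choice among the candidates that satisfy them. This is only a
# plain-Python illustration, not the rule system used in OpenMusic/PWGL.
import random

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]   # C major, MIDI note numbers

def ok(seq):
    """Local rules: no repeated notes, melodic steps of at most a fifth."""
    return all(0 < abs(b - a) <= 7 for a, b in zip(seq, seq[1:]))

def search(seq, length):
    if len(seq) == length:
        return seq if seq[-1] == 60 else None      # global rule: end on C
    candidates = SCALE[:]
    random.shuffle(candidates)                     # the "freedom" part: chance
    for note in candidates:
        if ok(seq + [note]):
            result = search(seq + [note], length)
            if result:
                return result
    return None

print(search([60], 8))
```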

  12. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework.

    Science.gov (United States)

    Lewis, Steven; Csordas, Attila; Killcoyne, Sarah; Hermjakob, Henning; Hoopmann, Michael R; Moritz, Robert L; Deutsch, Eric W; Boyle, John

    2012-12-05

    For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources.
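
    The search pattern lends itself naturally to map/reduce: map each spectrum to scored peptide candidates, then reduce by keeping the best score per spectrum. The sketch below shows that shape in plain Python with a toy fragment-matching score; it is not Hydra's Java/Hadoop implementation and it does not implement the K-score algorithm:

```python
# Conceptual map/reduce sketch of spectrum-to-peptide matching.
from collections import defaultdict

peptide_db = [
    ("PEPTIDER", 955.5, [110.1, 223.4, 500.2]),   # (sequence, mass, fragment m/z)
    ("SAMPLEK",  954.9, [150.0, 347.0, 600.3]),
    ("TESTPEP",  800.1, [101.3, 250.7]),
]

def mapper(spectrum, db, precursor_tol=0.7, frag_tol=0.3):
    sid, precursor_mass, peaks = spectrum
    for seq, mass, fragments in db:
        if abs(mass - precursor_mass) <= precursor_tol:           # mass filter
            score = sum(any(abs(p - f) <= frag_tol for f in fragments) for p in peaks)
            yield sid, (score, seq)

def reducer(pairs):
    best = defaultdict(lambda: (-1, None))
    for sid, scored in pairs:
        best[sid] = max(best[sid], scored)        # keep the best-scoring peptide
    return dict(best)

spectra = [("scan_1", 955.2, [110.0, 223.5, 347.1]),
           ("scan_2", 800.0, [101.2, 250.6])]
pairs = [kv for s in spectra for kv in mapper(s, peptide_db)]
print(reducer(pairs))
```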

  13. Bio-inspired varying subspace based computational framework for a class of nonlinear constrained optimal trajectory planning problems.

    Science.gov (United States)

    Xu, Y; Li, N

    2014-09-01

    Biological species have produced many simple but efficient rules in their complex and critical survival activities such as hunting and mating. A common feature observed in several biological motion strategies is that the predator only moves along paths in a carefully selected or iteratively refined subspace (or manifold), which might be able to explain why these motion strategies are effective. In this paper, a unified linear algebraic formulation representing such a predator-prey relationship is developed to simplify the construction and refinement process of the subspace (or manifold). Specifically, the following three motion strategies are studied and modified: motion camouflage, constant absolute target direction and local pursuit. The framework constructed based on this varying subspace concept could significantly reduce the computational cost in solving a class of nonlinear constrained optimal trajectory planning problems, particularly for the case with severe constraints. Two non-trivial examples, a ground robot and a hypersonic aircraft trajectory optimization problem, are used to show the capabilities of the algorithms in this new computational framework.
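
    As a geometric cartoon of one of the strategies named above, the fragment below confines a pursuer to the line joining a fixed reference point and the moving target (the motion-camouflage constraint) while it closes the gap; it is not the paper's linear-algebraic formulation or its optimal trajectory planner:

```python
# Toy discrete-time motion camouflage: the pursuer always lies on the
# reference-target line (a one-dimensional subspace) while closing in.
import numpy as np

ref = np.array([0.0, 0.0])                 # fixed reference point
target = np.array([10.0, 5.0])
target_vel = np.array([0.3, -0.1])
pursuer = ref.copy()

for step in range(40):
    target = target + target_vel           # prey moves freely
    lam = min(1.0, (step + 1) / 40)        # fraction of the ref-target segment covered
    pursuer = ref + lam * (target - ref)   # pursuer constrained to the line
print("final separation:", np.linalg.norm(target - pursuer))
```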

  14. Bio-inspired varying subspace based computational framework for a class of nonlinear constrained optimal trajectory planning problems

    International Nuclear Information System (INIS)

    Xu, Y; Li, N

    2014-01-01

    Biological species have produced many simple but efficient rules in their complex and critical survival activities such as hunting and mating. A common feature observed in several biological motion strategies is that the predator only moves along paths in a carefully selected or iteratively refined subspace (or manifold), which might be able to explain why these motion strategies are effective. In this paper, a unified linear algebraic formulation representing such a predator–prey relationship is developed to simplify the construction and refinement process of the subspace (or manifold). Specifically, the following three motion strategies are studied and modified: motion camouflage, constant absolute target direction and local pursuit. The framework constructed based on this varying subspace concept could significantly reduce the computational cost in solving a class of nonlinear constrained optimal trajectory planning problems, particularly for the case with severe constraints. Two non-trivial examples, a ground robot and a hypersonic aircraft trajectory optimization problem, are used to show the capabilities of the algorithms in this new computational framework. (paper)

  15. Quantum Walks for Computer Scientists

    CERN Document Server

    Venegas-Andraca, Salvador

    2008-01-01

    Quantum computation, one of the latest joint ventures between physics and the theory of computation, is a scientific field whose main goals include the development of hardware and algorithms based on the quantum mechanical properties of those physical systems used to implement such algorithms. Solving difficult tasks (for example, the Satisfiability Problem and other NP-complete problems) requires the development of sophisticated algorithms, many of which employ stochastic processes as their mathematical basis. Discrete random walks are a popular choice among those stochastic processes. Inspir

  16. Computer Network Attack and the Use of Force in International Law: Thoughts on a Normative Framework

    Science.gov (United States)

    1999-06-01


  17. Putin’s Russia: Russian Mentality and Sophisticated Imperialism in Military Policies

    OpenAIRE

    Szénási, Lieutenant-Colonel Endre

    2016-01-01

    According to my experiences, the Western world hopelessly fails to understand Russian mentality, or misinterprets it. During my analysis of the Russian way of thinking I devoted special attention to the examination of military mentality. I have connected the issue of the Russian way of thinking to the contemporary imperial policies of Putin’s Russia.  I have also attempted to prove the level of sophistication of both. I hope that a better understanding of both the Russian mentality and imperi...

  18. GLOFRIM v1.0 – A globally applicable computational framework for integrated hydrological–hydrodynamic modelling

    Directory of Open Access Journals (Sweden)

    J. M. Hoch

    2017-10-01

    Full Text Available We here present GLOFRIM, a globally applicable computational framework for integrated hydrological–hydrodynamic modelling. GLOFRIM facilitates spatially explicit coupling of hydrodynamic and hydrologic models and caters for an ensemble of models to be coupled. It currently encompasses the global hydrological model PCR-GLOBWB as well as the hydrodynamic models Delft3D Flexible Mesh (DFM; solving the full shallow-water equations and allowing for spatially flexible meshing) and LISFLOOD-FP (LFP; solving the local inertia equations and running on regular grids). The main advantages of the framework are its open and free access, its global applicability, its versatility, and its extensibility with other hydrological or hydrodynamic models. Before applying GLOFRIM to an actual test case, we benchmarked both DFM and LFP for a synthetic test case. Results show that for sub-critical flow conditions, discharge response to the same input signal is near-identical for both models, which agrees with previous studies. We subsequently applied the framework to the Amazon River basin not only to test the framework thoroughly, but also to perform a first-ever benchmark of flexible and regular grids on a large scale. Both DFM and LFP produce comparable results in terms of simulated discharge, with LFP exhibiting slightly higher accuracy as expressed by a Kling–Gupta efficiency of 0.82 compared to 0.76 for DFM. However, benchmarking inundation extent between DFM and LFP over the entire study area, a critical success index of 0.46 was obtained, indicating that the models disagree as often as they agree. Differences between models in both simulated discharge and inundation extent are to a large extent attributable to the gridding techniques employed. In fact, the results show that both the numerical scheme of the inundation model and the gridding technique can contribute to deviations in simulated inundation extent as we control for model forcing and boundary
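
    For reference, the two skill scores quoted in the abstract can be written out directly; the arrays below are synthetic stand-ins for simulated and observed discharge and for binary inundation maps:

```python
# Kling-Gupta efficiency for discharge and critical success index for
# binary inundation maps, evaluated on synthetic data.
import numpy as np

def kling_gupta_efficiency(sim, obs):
    r = np.corrcoef(sim, obs)[0, 1]              # correlation
    alpha = np.std(sim) / np.std(obs)            # variability ratio
    beta = np.mean(sim) / np.mean(obs)           # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def critical_success_index(sim_wet, obs_wet):
    hits = np.sum(sim_wet & obs_wet)
    false_alarms = np.sum(sim_wet & ~obs_wet)
    misses = np.sum(~sim_wet & obs_wet)
    return hits / (hits + false_alarms + misses)

obs = np.sin(np.linspace(0, 6, 200)) + 2.0
sim = 0.9 * obs + 0.1 + np.random.default_rng(0).normal(0, 0.05, 200)
print(round(kling_gupta_efficiency(sim, obs), 2))

obs_map = np.random.default_rng(1).random((50, 50)) > 0.5
sim_map = obs_map ^ (np.random.default_rng(2).random((50, 50)) > 0.8)
print(round(critical_success_index(sim_map, obs_map), 2))
```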

  19. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries.

    Science.gov (United States)

    Drawert, Brian; Engblom, Stefan; Hellander, Andreas

    2012-06-22

    Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to a mature geometry and mesh handling external software (Comsol Multiphysics) provides for a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much as an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods may be tested in a realistic setting already at
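
    At the core of RDME-type solvers is a stochastic simulation (Gillespie) kernel executed per mesh voxel. The fragment below shows that kernel for a single well-mixed voxel with two reactions; diffusive jumps between voxels, the unstructured mesh, and URDME's Matlab/Comsol interfaces are all omitted from this toy version:

```python
# Minimal Gillespie (SSA) loop for one well-mixed voxel.
import numpy as np

rng = np.random.default_rng(0)
# Reactions:  A + B -> C  (rate k1),   C -> A + B  (rate k2)
state = np.array([100, 80, 0])          # copy numbers of A, B, C
k1, k2 = 0.001, 0.05
stoich = np.array([[-1, -1, +1],
                   [+1, +1, -1]])

t, t_end = 0.0, 50.0
while t < t_end:
    a = np.array([k1 * state[0] * state[1], k2 * state[2]])  # propensities
    a0 = a.sum()
    if a0 == 0:
        break
    t += rng.exponential(1.0 / a0)                 # time to next event
    reaction = rng.choice(2, p=a / a0)             # which event fires
    state += stoich[reaction]
print("t =", round(t, 2), "state =", state)
```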

  20. eCodonOpt: a systematic computational framework for optimizing codon usage in directed evolution experiments

    OpenAIRE

    Moore, Gregory L.; Maranas, Costas D.

    2002-01-01

    We present a systematic computational framework, eCodonOpt, for designing parental DNA sequences for directed evolution experiments through codon usage optimization. Given a set of homologous parental proteins to be recombined at the DNA level, the optimal DNA sequences encoding these proteins are sought for a given diversity objective. We find that the free energy of annealing between the recombining DNA sequences is a much better descriptor of the extent of crossover formation than sequence...

  1. The relation between maturity and sophistication shall be properly dealt with in nuclear power development

    International Nuclear Information System (INIS)

    Li Yongjiang

    2009-01-01

    The paper analyses the advantages and disadvantages of the second generation improved technologies and the third generation technologies mainly developed in China in terms of safety and economy. The paper also discusses the maturity of the second generation improved technologies and the sophistication of the third generation technologies, respectively. Meanwhile, the paper proposes that the advantages and disadvantages of the second generation improved and third generation technologies should be carefully weighed, and that the relationship between maturity and sophistication should be properly dealt with at the current stage. A two-step strategy should be adopted to solve the problem of insufficient nuclear power capacity while tracking and developing the third generation technologies, so as to ensure the sound and fast development of nuclear power. (authors)

  2. Simply computing for seniors

    CERN Document Server

    Clark, Linda

    2011-01-01

    Step-by-step instructions for seniors to get up and running on a home PC Answering the call for an up-to-date, straightforward computer guide targeted specifically for seniors, this helpful book includes easy-to-follow tutorials that escort you through the basics and shows you how to get the most out of your PC. Boasting an elegant, full-color interior with a clean, sophisticated look and feel, the layout makes it easy for you to find the information you need quickly. Author Linda Clark has earned her highly respected reputation through years of teaching computers at both the beginnin

  3. The InSAR Scientific Computing Environment (ISCE): A Python Framework for Earth Science

    Science.gov (United States)

    Rosen, P. A.; Gurrola, E. M.; Agram, P. S.; Sacco, G. F.; Lavalle, M.

    2015-12-01

    The InSAR Scientific Computing Environment (ISCE, funded by NASA ESTO) provides a modern computing framework for geodetic image processing of InSAR data from a diverse array of radar satellites and aircraft. ISCE is both a modular, flexible, and extensible framework for building software components and applications and a toolbox of applications for processing raw or focused InSAR and Polarimetric InSAR data. The ISCE framework contains object-oriented Python components layered to construct Python InSAR components that manage legacy Fortran/C InSAR programs. Components are independently configurable in a layered manner to provide maximum control. Polymorphism is used to define a workflow in terms of abstract facilities for each processing step that are realized by specific components at run-time. This enables a single workflow to work on either raw or focused data from all sensors. ISCE can serve as the core of a production center to process Level-0 radar data to Level-3 products, but is amenable to interactive processing approaches that allow scientists to experiment with data to explore new ways of doing science with InSAR data. The NASA-ISRO SAR (NISAR) Mission will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystems. ISCE is planned as the foundational element in processing NISAR data, enabling a new class of analyses that take greater advantage of the long time and large spatial scales of these new data. NISAR will be but one mission in a constellation of radar satellites in the future delivering such data. ISCE currently supports all publicly available strip map mode space-borne SAR data since ERS and is expected to include support for upcoming missions. ISCE has been incorporated into two prototype cloud-based systems that have demonstrated its elasticity in addressing larger data processing problems in a "production" context and its ability to be

  4. Optimizing the Performance of Data Analytics Frameworks

    NARCIS (Netherlands)

    Ghit, B.I.

    2017-01-01

    Data analytics frameworks enable users to process large datasets while hiding the complexity of scaling out their computations on large clusters of thousands of machines. Such frameworks parallelize the computations, distribute the data, and tolerate server failures by deploying their own runtime

  5. A computational framework for cortical microtubule dynamics in realistically shaped plant cells.

    Directory of Open Access Journals (Sweden)

    Bandan Chakrabortty

    2018-02-01

    Full Text Available Plant morphogenesis is strongly dependent on the directional growth and the subsequent oriented division of individual cells. It has been shown that the plant cortical microtubule array plays a key role in controlling both these processes. This ordered structure emerges as the collective result of stochastic interactions between large numbers of dynamic microtubules. To elucidate this complex self-organization process a number of analytical and computational approaches to study the dynamics of cortical microtubules have been proposed. To date, however, these models have been restricted to two dimensional planes or geometrically simple surfaces in three dimensions, which strongly limits their applicability as plant cells display a wide variety of shapes. This limitation is even more acute, as both local as well as global geometrical features of cells are expected to influence the overall organization of the array. Here we describe a framework for efficiently simulating microtubule dynamics on triangulated approximations of arbitrary three dimensional surfaces. This allows the study of microtubule array organization on realistic cell surfaces obtained by segmentation of microscopic images. We validate the framework against expected or known results for the spherical and cubical geometry. We then use it to systematically study the individual contributions of global geometry, cell-edge induced catastrophes and cell-face induced stability to array organization in a cuboidal geometry. Finally, we apply our framework to analyze the highly non-trivial geometry of leaf pavement cells of Arabidopsis thaliana, Nicotiana benthamiana and Hedera helix. We show that our simulations can predict multiple features of the microtubule array structure in these cells, revealing, among others, strong constraints on the orientation of division planes.
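
    The self-organization discussed here builds on the dynamic instability of individual microtubules. As a minimal, assumed-parameter illustration of that ingredient only (not of the triangulated-surface framework itself), the following fragment simulates one filament switching stochastically between growth and shrinkage:

```python
# Toy two-state dynamic-instability model for a single microtubule:
# growth vs. shrinkage with catastrophe/rescue switching.
import numpy as np

rng = np.random.default_rng(3)
v_grow, v_shrink = 0.05, 0.08      # length change per time step (arbitrary units)
r_cat, r_res = 0.005, 0.02         # catastrophe / rescue probabilities per step

length, growing = 0.0, True
trace = []
for _ in range(20_000):
    if growing:
        length += v_grow
        if rng.random() < r_cat:
            growing = False        # catastrophe: switch to shrinkage
    else:
        length = max(0.0, length - v_shrink)
        if length == 0.0 or rng.random() < r_res:
            growing = True         # rescue (or re-nucleation at zero length)
    trace.append(length)
print("mean length:", round(np.mean(trace), 2))
```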

  6. A computational framework for cortical microtubule dynamics in realistically shaped plant cells

    KAUST Repository

    Chakrabortty, Bandan; Blilou, Ikram; Scheres, Ben; Mulder, Bela M.

    2018-01-01

    Plant morphogenesis is strongly dependent on the directional growth and the subsequent oriented division of individual cells. It has been shown that the plant cortical microtubule array plays a key role in controlling both these processes. This ordered structure emerges as the collective result of stochastic interactions between large numbers of dynamic microtubules. To elucidate this complex self-organization process a number of analytical and computational approaches to study the dynamics of cortical microtubules have been proposed. To date, however, these models have been restricted to two dimensional planes or geometrically simple surfaces in three dimensions, which strongly limits their applicability as plant cells display a wide variety of shapes. This limitation is even more acute, as both local as well as global geometrical features of cells are expected to influence the overall organization of the array. Here we describe a framework for efficiently simulating microtubule dynamics on triangulated approximations of arbitrary three dimensional surfaces. This allows the study of microtubule array organization on realistic cell surfaces obtained by segmentation of microscopic images. We validate the framework against expected or known results for the spherical and cubical geometry. We then use it to systematically study the individual contributions of global geometry, cell-edge induced catastrophes and cell-face induced stability to array organization in a cuboidal geometry. Finally, we apply our framework to analyze the highly non-trivial geometry of leaf pavement cells of Arabidopsis thaliana, Nicotiana benthamiana and Hedera helix. We show that our simulations can predict multiple features of the microtubule array structure in these cells, revealing, among others, strong constraints on the orientation of division planes.

  7. A computational framework for cortical microtubule dynamics in realistically shaped plant cells

    KAUST Repository

    Chakrabortty, Bandan

    2018-02-02

    Plant morphogenesis is strongly dependent on the directional growth and the subsequent oriented division of individual cells. It has been shown that the plant cortical microtubule array plays a key role in controlling both these processes. This ordered structure emerges as the collective result of stochastic interactions between large numbers of dynamic microtubules. To elucidate this complex self-organization process a number of analytical and computational approaches to study the dynamics of cortical microtubules have been proposed. To date, however, these models have been restricted to two dimensional planes or geometrically simple surfaces in three dimensions, which strongly limits their applicability as plant cells display a wide variety of shapes. This limitation is even more acute, as both local as well as global geometrical features of cells are expected to influence the overall organization of the array. Here we describe a framework for efficiently simulating microtubule dynamics on triangulated approximations of arbitrary three dimensional surfaces. This allows the study of microtubule array organization on realistic cell surfaces obtained by segmentation of microscopic images. We validate the framework against expected or known results for the spherical and cubical geometry. We then use it to systematically study the individual contributions of global geometry, cell-edge induced catastrophes and cell-face induced stability to array organization in a cuboidal geometry. Finally, we apply our framework to analyze the highly non-trivial geometry of leaf pavement cells of Arabidopsis thaliana, Nicotiana benthamiana and Hedera helix. We show that our simulations can predict multiple features of the microtubule array structure in these cells, revealing, among others, strong constraints on the orientation of division planes.

  8. South American regional ionospheric maps computed by GESA: A pilot service in the framework of SIRGAS

    Science.gov (United States)

    Brunini, C.; Meza, A.; Gende, M.; Azpilicueta, F.

    2008-08-01

    SIRGAS (Geocentric Reference Frame for the Americas) is an international enterprise of the geodetic community that aims to realize the Terrestrial Reference Frame in the countries of the Americas. In order to fulfill this commitment, SIRGAS manages a network of continuously operating GNSS receivers totalling around one hundred sites in the Caribbean, Central, and South American region. Although the network was not planned for ionospheric studies, its potential to be used for such a purpose was recently recognized and SIRGAS started a pilot experiment devoted to establishing a regular service for computing and releasing regional vertical TEC (vTEC) maps based on GNSS data. Since July 2005, the GESA (Geodesia Espacial y Aeronomía) laboratory belonging to the Facultad de Ciencias Astronómicas y Geofísicas of the Universidad Nacional de La Plata has computed hourly maps of vertical Total Electron Content (vTEC) in the framework of the SIRGAS pilot experiment. These maps exploit all the GNSS data available in the South American region and are computed with the LPIM (La Plata Ionospheric Model). LPIM implements a de-biasing procedure that improves data calibration in relation to other procedures commonly used for such purposes. After calibration, slant TEC measurements are converted to vertical and mapped using local-time and modip latitude. The use of modip latitude smooths the spatial variability of vTEC, especially in the South American low-latitude region, and hence allows for better vTEC interpolation. This contribution summarizes the results obtained by GESA in the framework of the SIRGAS pilot experiment.
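
    Converting the calibrated slant TEC along each receiver-satellite line of sight to vertical TEC is commonly done with a single-layer mapping function, as sketched below; the shell height and the example numbers are assumptions, and LPIM's bias-calibration step is not reproduced here:

```python
# Single-layer mapping function: vTEC = sTEC * cos(z'), with
# sin(z') = R / (R + h) * sin(z) at the ionospheric shell height h.
import numpy as np

def slant_to_vertical_tec(stec, elevation_deg, h_ion_km=450.0, r_earth_km=6371.0):
    z = np.radians(90.0 - elevation_deg)                    # zenith angle at receiver
    sin_zp = r_earth_km / (r_earth_km + h_ion_km) * np.sin(z)
    return stec * np.sqrt(1.0 - sin_zp ** 2)                # cos(z') factor

print(slant_to_vertical_tec(stec=48.0, elevation_deg=30.0))  # TEC units
```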

  9. behaviorism: a framework for dynamic data visualization.

    Science.gov (United States)

    Forbes, Angus Graeme; Höllerer, Tobias; Legrady, George

    2010-01-01

    While a number of information visualization software frameworks exist, creating new visualizations, especially those that involve novel visualization metaphors, interaction techniques, data analysis strategies, and specialized rendering algorithms, is still often a difficult process. To facilitate the creation of novel visualizations we present a new software framework, behaviorism, which provides a wide range of flexibility when working with dynamic information on visual, temporal, and ontological levels, while at the same time providing appropriate abstractions that allow developers to quickly create prototypes that can then easily be turned into robust systems. The core of the framework is a set of three interconnected graphs, each with associated operators: a scene graph for high-performance 3D rendering, a data graph for different layers of semantically linked heterogeneous data, and a timing graph for sophisticated control of scheduling, interaction, and animation. In particular, the timing graph provides a unified system to add behaviors to both data and visual elements, as well as to the behaviors themselves. To evaluate the framework we look briefly at three different projects, all of which required novel visualizations in different domains and worked with dynamic data in different ways: an interactive ecological simulation, an information art installation, and an information visualization technique.

  10. Computational framework for risk-based planning of inspections, maintenance, and condition monitoring using discrete Bayesian networks

    DEFF Research Database (Denmark)

    Nielsen, Jannie Sønderkær; Sørensen, John Dalsgaard

    2018-01-01

    This paper presents a computational framework for risk-based planning of inspections and repairs for deteriorating components. Two distinct types of decision rules are used to model decisions: simple decision rules that depend on constants or observed variables (e.g. inspection outcomes), and advanced decision rules that depend on expected life-cycle costs. For advanced decision rules, simulations are performed to estimate the expected costs, and dBNs are used within the simulations for decision-making. Information from inspections and condition monitoring is included if available. An example in the paper demonstrates the framework and the implemented strategies and decision rules, including various types of condition-based maintenance. The strategies using advanced decision rules lead to reduced costs compared to the simple decision rules when condition monitoring is applied, and the value of condition monitoring
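
    A toy version of the "simple decision rule" idea can be written as a Monte Carlo life-cycle simulation: a component deteriorates stochastically, is inspected at a fixed interval, and is repaired whenever the observed deterioration exceeds a threshold. All costs and rates below are invented, and no Bayesian network is involved in this reduced sketch:

```python
# Monte Carlo life-cycle cost of a threshold-based inspect-and-repair rule.
import numpy as np

rng = np.random.default_rng(42)

def simulate(threshold, interval, years=30,
             c_insp=1.0, c_rep=20.0, c_fail=500.0, fail_level=10.0):
    d, cost = 0.0, 0.0
    for year in range(1, years + 1):
        d += rng.gamma(shape=1.5, scale=0.4)        # stochastic annual deterioration
        if d >= fail_level:                         # failure before detection
            cost += c_fail
            d = 0.0
            continue
        if year % interval == 0:                    # scheduled inspection
            cost += c_insp
            observed = d + rng.normal(0, 0.3)       # imperfect measurement
            if observed > threshold:
                cost += c_rep
                d = 0.0                             # repair restores the component
    return cost

costs = [np.mean([simulate(th, 3) for _ in range(2000)]) for th in (4.0, 6.0, 8.0)]
print(dict(zip((4.0, 6.0, 8.0), np.round(costs, 1))))
```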

  11. Application of a computer-aided framework for the design of CO2 capture and utilization processes

    DEFF Research Database (Denmark)

    Frauzem, Rebecca; Woodley, John; Gani, Rafiqul

    2017-01-01

    Carbon dioxide capture and utilization is a vital element of carbon dioxide emission reduction to address global warming. An integrated, computer-aided framework has been developed to achieve this. This framework adopts a three-stage approach to sustainable process synthesis-design: (i) process...... synthesis, (ii) process design and (iii) innovative design. In the first stage, reaction path synthesis is used to determine the reactions and products that are considered in processing route selection and/or generation. Various scenarios are then considered for the superstructure optimization. The selected...... steps. Then, the superstructure optimization is performed on a network containing 13 likely products giving 30 feasible processing routes, considering different scenarios and objectives. Stages 2 and 3 have been applied to the optimal solution of the first scenario, which selects the production...

  12. Research Directions for AI in Computer Games

    OpenAIRE

    Fairclough, Chris; Fagan, Michael; Cunningham, Padraig; Mac Namee, Brian

    2001-01-01

    The computer games industry is now bigger than the film industry. Until recently, technology in games was driven by a desire to achieve real-time, photo-realistic graphics. To a large extent, this has now been achieved. As game developers look for new and innovative technologies to drive games development, AI is coming to the fore. This paper will examine how sophisticated AI techniques, such as those being used in mainstream academic research, can be applied to computer games ...

  13. Computing challenges of the CMS experiment

    International Nuclear Information System (INIS)

    Krammer, N.; Liko, D.

    2017-01-01

    The success of the LHC experiments is due to the magnificent performance of the detector systems and the excellent operation of the computing systems. The CMS offline software and computing system is successfully fulfilling the LHC Run 2 requirements. The increased data rate of future LHC operation, together with high pileup interactions, makes improvements in the usage of the current computing facilities and the adoption of new technologies necessary. Especially for the challenge of the future HL-LHC, a more flexible and sophisticated computing model is needed. In this presentation, I will discuss the current computing system used in LHC Run 2 and future computing facilities for the HL-LHC runs using flexible computing technologies like commercial and academic computing clouds. The cloud resources are highly virtualized and can be deployed for a variety of computing tasks, providing the capacity for the increasing needs of large-scale scientific computing.

  14. XACC - eXtreme-scale Accelerator Programming Framework

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately, the software infrastructure required to enable this is largely lacking. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates alongside existing conventional applications. It is a pluggable framework for programming languages developed for next-gen computing hardware architectures like quantum and neuromorphic computing. It lets computational scientists efficiently off-load classically intractable work to attached accelerators through user-friendly Kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.

  15. Close to the Clothes : Materiality and Sophisticated Archaism in Alexander van Slobbe’s Design Practices

    NARCIS (Netherlands)

    Baronian, M.-A.

    This article looks at the work of contemporary Dutch fashion designer Alexander van Slobbe (1959) and examines how, since the 1990s, his fashion practices have consistently and consciously put forward a unique reflection on questions related to materiality, sophisticated archaism, luxury,

  16. Close to the Clothes: Materiality and Sophisticated Archaism in Alexander van Slobbe’s Design Practices

    NARCIS (Netherlands)

    Baronian, M.-A.

    This article looks at the work of contemporary Dutch fashion designer Alexander van Slobbe (1959) and examines how, since the 1990s, his fashion practices have consistently and consciously put forward a unique reflection on questions related to materiality, sophisticated archaism, luxury,

  17. An FPGA-Based Quantum Computing Emulation Framework Based on Serial-Parallel Architecture

    Directory of Open Access Journals (Sweden)

    Y. H. Lee

    2016-01-01

    Full Text Available Hardware emulation of quantum systems can mimic more efficiently the parallel behaviour of quantum computations, thus allowing higher processing speed-up than software simulations. In this paper, an efficient hardware emulation method that employs a serial-parallel hardware architecture targeted for field programmable gate array (FPGA is proposed. Quantum Fourier transform and Grover’s search are chosen as case studies in this work since they are the core of many useful quantum algorithms. Experimental work shows that, with the proposed emulation architecture, a linear reduction in resource utilization is attained against the pipeline implementations proposed in prior works. The proposed work contributes to the formulation of a proof-of-concept baseline FPGA emulation framework with optimization on datapath designs that can be extended to emulate practical large-scale quantum circuits.
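
    Grover's search, one of the two case studies, has a compact state-vector description that an emulator can be validated against. The numpy fragment below is that software reference (for three qubits and a single marked state), not the serial-parallel FPGA datapath itself:

```python
# State-vector reference for Grover's search on n qubits.
import numpy as np

n, marked = 3, 5                      # 3 qubits, search for basis state |101>
N = 2 ** n
state = np.full(N, 1.0 / np.sqrt(N))  # uniform superposition after Hadamards

oracle = np.eye(N)
oracle[marked, marked] = -1.0         # phase flip on the marked item
diffusion = 2.0 * np.full((N, N), 1.0 / N) - np.eye(N)   # inversion about the mean

for _ in range(int(np.floor(np.pi / 4 * np.sqrt(N)))):    # ~2 iterations for N = 8
    state = diffusion @ (oracle @ state)

probs = np.abs(state) ** 2
print("P(marked) =", round(probs[marked], 3))   # close to 1 after ~sqrt(N) steps
```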

  18. An Introduction to Computational Fluid Mechanics by Example

    CERN Document Server

    Biringen, Sedat

    2011-01-01

    This new book builds on the original classic textbook entitled An Introduction to Computational Fluid Mechanics by C. Y. Chow, which was originally published in 1979. In the decades that have passed since that book was published, the field of computational fluid dynamics has seen a number of changes in both the sophistication of the algorithms used and the computer hardware and software available. This new book incorporates the latest algorithms in the solution techniques and supports this by using numerous examples of applications to a broad range of industries from mechanical

  19. Framework of nuclear safety and safety assessment

    International Nuclear Information System (INIS)

    Furuta, Kazuo

    2007-01-01

    Since enormous energy is released by the nuclear chain reaction, mainly in the form of radiation, a great potential risk accompanies the utilization of nuclear energy. Safety has therefore been a critical issue from the very beginning of its development. Though the framework of nuclear safety established at an early developmental stage of nuclear engineering is still valid, more comprehensive approaches are required in light of events such as Three Mile Island, Chernobyl, and JCO. This article gives a brief view of the most basic principles of how nuclear safety is achieved, principles which were introduced and refined in nuclear engineering but are applicable also to other engineering domains in general. (author)

  20. Algorithms for image processing and computer vision

    CERN Document Server

    Parker, J R

    2010-01-01

    A cookbook of algorithms for common image processing applications Thanks to advances in computer hardware and software, algorithms have been developed that support sophisticated image processing without requiring an extensive background in mathematics. This bestselling book has been fully updated with the newest of these, including 2D vision methods in content-based searches and the use of graphics cards as image processing computational aids. It's an ideal reference for software engineers and developers, advanced programmers, graphics programmers, scientists, and other specialists wh

  1. Multi-Disciplinary Computational Aerodynamics

    Science.gov (United States)

    2016-01-01

    one matching of the structural mesh and the surface mesh for the fluid is not always possible. More sophisticated approaches such as thin plate splines ... reference, the flexible wing is representative of a thin aluminum plate. These particular choices are both tractable for LES computations and ... [68] Mayori, A. and Rockwell, D., "Interaction of a streamwise vortex with a thin plate: a source of turbulent buffeting," AIAA J., Vol. 32, No. 10, 1994.

  2. All for One: Integrating Budgetary Methods by Computer.

    Science.gov (United States)

    Herman, Jerry J.

    1994-01-01

    With the advent of high speed and sophisticated computer programs, all budgetary systems can be combined in one fiscal management information system. Defines and provides examples for the four budgeting systems: (1) function/object; (2) planning, programming, budgeting system; (3) zero-based budgeting; and (4) site-based budgeting. (MLF)

  3. Reliable, Memory Speed Storage for Cluster Computing Frameworks

    Science.gov (United States)

    2014-06-16

    data, the data needs to be written to a storage system. Nectar [24] also uses lineage for a specific framework (DryadLINQ) with the goal of saving space ... Nectar [24] also uses the concept of lineage, but it does so only for a specific programming framework (DryadLINQ [44]), and in the context of a traditional, replicated file system. Nectar is a data reuse system for DryadLINQ queries whose goals are to save space and to avoid redundant

  4. A Snapshot of Serial Rape: An Investigation of Criminal Sophistication and Use of Force on Victim Injury and Severity of the Assault.

    Science.gov (United States)

    de Heer, Brooke

    2016-02-01

    Prior research on rapes reported to law enforcement has identified criminal sophistication and the use of force against the victim as possible unique identifiers to serial rape versus one-time rape. This study sought to contribute to the current literature on reported serial rape by investigating how the level of criminal sophistication of the rapist and use of force used were associated with two important outcomes of rape: victim injury and overall severity of the assault. In addition, it was evaluated whether rapist and victim ethnicity affected these relationships. A nation-wide sample of serial rape cases reported to law enforcement collected by the Federal Bureau of Investigation (FBI) was analyzed (108 rapists, 543 victims). Results indicated that serial rapists typically used a limited amount of force against the victim and displayed a high degree of criminal sophistication. In addition, the more criminally sophisticated the perpetrator was, the more sexual acts he performed on his victim. Finally, rapes between a White rapist and White victim were found to exhibit higher levels of criminal sophistication and were more severe in terms of number and types of sexual acts committed. These findings provide a more in-depth understanding of serial rape that can inform both academics and practitioners in the field about contributors to victim injury and severity of the assault. © The Author(s) 2014.

  5. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Kozacik, Stephen [EM Photonics, Inc., Newark, DE (United States)

    2017-05-15

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  6. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  7. Computing in an academic radiation therapy department

    International Nuclear Information System (INIS)

    Gottlieb, C.F.; Houdek, P.V.; Fayos, J.V.

    1985-01-01

    The authors conceptualized the different computer functions in radiotherapy as follows: 1) treatment planning and dosimetry, 2) data and word processing, 3) radiotherapy information system (data bank), 4) statistical analysis, 5) data acquisition and equipment control, 6) telecommunication, and 7) financial management. They successfully implemented the concept of distributed computing using multiple mini and personal computers. The authors' computer practice supports data and word processing, graphics, communication, automated data acquisition and control, and portable computing. The computers are linked together into a local computer network which permits sharing of information, peripherals, and unique programs among our systems, while preserving the individual function and identity of each machine. Furthermore, the architecture of our network allows direct access to any other computer network, providing inexpensive use of the most modern and sophisticated software and hardware resources.

  8. Text genres and registers the computation of linguistic features

    CERN Document Server

    Fang, Chengyu Alex

    2015-01-01

    This book is a description of some of the most recent advances in text classification as part of a concerted effort to achieve computer understanding of human language. In particular, it addresses state-of-the-art developments in the computation of higher-level linguistic features, ranging from etymology to grammar and syntax for the practical task of text classification according to genres, registers and subject domains. Serving as a bridge between computational methods and sophisticated linguistic analysis, this book will be of particular interest to academics and students of computational linguistics as well as professionals in natural language engineering.

  9. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has come up as a new form of distributed computing. Unlike other computing paradigms like Grids, which tend to be based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework as well as some novel techniques, based on standard Grid protocols, we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  10. RIO: a new computational framework for accurate initial data of binary black holes

    Science.gov (United States)

    Barreto, W.; Clemente, P. C. M.; de Oliveira, H. P.; Rodriguez-Mueller, B.

    2018-06-01

    We present a computational framework (Rio) in the ADM 3+1 approach for numerical relativity. This work enables us to carry out high resolution calculations for initial data of two arbitrary black holes. We use the transverse conformal treatment, the Bowen-York and the puncture methods. For the numerical solution of the Hamiltonian constraint we use the domain decomposition and the spectral decomposition of Galerkin-Collocation. The nonlinear numerical code solves the set of equations for the spectral modes using the standard Newton-Raphson method, LU decomposition and Gaussian quadratures. We show the convergence of the Rio code. This code allows for easy deployment of large calculations. We show how the spin of one of the black holes is manifest in the conformal factor.
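    To make the solution strategy above concrete, here is a minimal, illustrative Newton-Raphson loop over a vector of spectral coefficients using LU decomposition, in the spirit of (but not taken from) the Rio solver; the residual and Jacobian below are toy placeholders, not the Hamiltonian-constraint equations.

    ```python
    # Illustrative sketch only: Newton-Raphson over spectral modes with LU solves.
    # The residual/Jacobian are toy stand-ins, not the actual constraint equations.
    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    def newton_spectral(residual, jacobian, a0, tol=1e-12, max_iter=50):
        """Solve residual(a) = 0 for the vector of spectral coefficients a."""
        a = a0.copy()
        for _ in range(max_iter):
            r = residual(a)
            if np.linalg.norm(r) < tol:
                break
            J = jacobian(a)                  # dense Jacobian of the spectral system
            a -= lu_solve(lu_factor(J), r)   # LU decomposition, as in the abstract
        return a

    # Toy example: a_i**3 - 1 = 0 for every mode, so the solution is all ones.
    residual = lambda a: a**3 - 1.0
    jacobian = lambda a: np.diag(3.0 * a**2)
    print(newton_spectral(residual, jacobian, np.full(8, 0.5)))
    ```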

  11. Guest editorial: Brain/neuronal computer games interfaces and interaction

    OpenAIRE

    Coyle, D.; Principe, J.; Lotte, F.; Nijholt, Antinus

    2013-01-01

    Nowadays brainwave or electroencephalogram (EEG) controlled game controllers are adding new options to satisfy the continual demand for new ways to interact with games, following trends such as the Nintendo® Wii, Microsoft® Kinect and Playstation® Move which are based on accelerometers and motion capture. EEG-based brain-computer games interaction is controlled through brain-computer interface (BCI) technology which requires sophisticated signal processing to produce a low communication ban...

  12. A holistic framework for understanding acceptance of remote patient management (RPM) systems by non-professional users

    NARCIS (Netherlands)

    Puuronen, S.; Vasilyeva, E.; Pechenizkiy, M.; Tesanovic, A.

    2010-01-01

    The successful integration of Information and Communication Technologies (ICT) in healthcare facilitates the use of the sophisticated medical equipment and computer applications by medical practitioners. If earlier medical systems were mainly used by the health professionals (e.g. medical staff or

  13. Towards a Framework to Improve the Quality of Teaching and Learning: Consciousness and Validation in Computer Engineering Science, UCT

    Science.gov (United States)

    Lévano, Marcos; Albornoz, Andrea

    2016-01-01

    This paper aims to propose a framework to improve the quality in teaching and learning in order to develop good practices to train professionals in the career of computer engineering science. To demonstrate the progress and achievements, our work is based on two principles for the formation of professionals, one based on the model of learning…

  14. A Computational Framework for Quantifying and Optimizing the Performance of Observational Networks in 4D-Var Data Assimilation

    Science.gov (United States)

    Cioaca, Alexandru

    A deep scientific understanding of complex physical systems, such as the atmosphere, can be achieved neither by direct measurements nor by numerical simulations alone. Data assimilation is a rigorous procedure to fuse information from a priori knowledge of the system state, the physical laws governing the evolution of the system, and real measurements, all with associated error statistics. Data assimilation produces best (a posteriori) estimates of model states and parameter values, and results in considerably improved computer simulations. The acquisition and use of observations in data assimilation raises several important scientific questions related to optimal sensor network design, quantification of data impact, pruning redundant data, and identifying the most beneficial additional observations. These questions originate in operational data assimilation practice, and have started to attract considerable interest in the recent past. This dissertation advances the state of knowledge in four dimensional variational (4D-Var) data assimilation by developing, implementing, and validating a novel computational framework for estimating observation impact and for optimizing sensor networks. The framework builds on the powerful methodologies of second-order adjoint modeling and the 4D-Var sensitivity equations. Efficient computational approaches for quantifying the observation impact include matrix free linear algebra algorithms and low-rank approximations of the sensitivities to observations. The sensor network configuration problem is formulated as a meta-optimization problem. Best values for parameters such as sensor location are obtained by optimizing a performance criterion, subject to the constraint posed by the 4D-Var optimization. Tractable computational solutions to this "optimization-constrained" optimization problem are provided. The results of this work can be directly applied to the deployment of intelligent sensors and adaptive observations, as well as
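    A hedged, minimal sketch of the 4D-Var idea underlying this framework: a scalar linear model, a background-plus-observation cost function, and its gradient obtained from the adjoint of the model. All values and the plain gradient-descent loop are invented for illustration; the dissertation's second-order adjoints and observation-impact machinery are not reproduced here.

    ```python
    # Minimal, hypothetical 4D-Var sketch for a scalar linear model x_{k+1} = m * x_k.
    m = 0.95                      # model propagator (assumed)
    xb, B = 1.0, 0.5              # background state and its error variance (assumed)
    obs = {2: 0.80, 5: 0.70}      # time step -> observed value (assumed)
    R = 0.1                       # observation error variance (assumed)

    def cost_and_grad(x0):
        # forward sweep: propagate the state and store the trajectory
        x, traj = x0, {}
        for k in range(1, max(obs) + 1):
            x = m * x
            traj[k] = x
        J = 0.5 * (x0 - xb) ** 2 / B \
            + 0.5 * sum((traj[k] - obs[k]) ** 2 / R for k in obs)
        # gradient: for this linear model the adjoint sensitivity d traj[k]/d x0 = m**k
        g = (x0 - xb) / B + sum((traj[k] - obs[k]) / R * m ** k for k in obs)
        return J, g

    x0 = xb                       # start the minimization from the background
    for _ in range(200):          # plain gradient descent stands in for a real optimizer
        J, g = cost_and_grad(x0)
        x0 -= 0.05 * g
    print("analysis x0 =", round(x0, 4), "final cost =", round(J, 4))
    ```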

  15. Computational Psychosomatics and Computational Psychiatry: Toward a Joint Framework for Differential Diagnosis.

    Science.gov (United States)

    Petzschner, Frederike H; Weber, Lilian A E; Gard, Tim; Stephan, Klaas E

    2017-09-15

    This article outlines how a core concept from theories of homeostasis and cybernetics, the inference-control loop, may be used to guide differential diagnosis in computational psychiatry and computational psychosomatics. In particular, we discuss 1) how conceptualizing perception and action as inference-control loops yields a joint computational perspective on brain-world and brain-body interactions and 2) how the concrete formulation of this loop as a hierarchical Bayesian model points to key computational quantities that inform a taxonomy of potential disease mechanisms. We consider the utility of this perspective for differential diagnosis in concrete clinical applications. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  16. The sophisticated control of the tram bogie on track

    Directory of Open Access Journals (Sweden)

    Radovan DOLECEK

    2015-09-01

    Full Text Available The paper deals with routing control algorithms for a new conception of tram vehicle bogie. The main goals of these research activities are to reduce wear of the rail wheels and track, to reduce traction energy losses, and to increase running comfort. A testing experimental tram vehicle with a special bogie construction powered by a traction battery is utilized for these purposes. This vehicle has a rotary bogie with independent rotating wheels driven by permanent magnet synchronous motors and a solid axle. The wheel forces in the bogie are measured by a large number of sensors placed on the testing experimental tram vehicle. The designed control algorithms are now implemented in the vehicle superset control system. The traction requirements and track characteristics affect these control algorithms. This control, including sophisticated routing, brings further improvements, which are verified and corrected according to individual traction and driving characteristics, and opens new possibilities.

  17. Cuby: An Integrative Framework for Computational Chemistry

    Czech Academy of Sciences Publication Activity Database

    Řezáč, Jan

    2016-01-01

    Roč. 37, č. 13 (2016), s. 1230-1237 ISSN 0192-8651 R&D Projects: GA ČR GP13-01214P Institutional support: RVO:61388963 Keywords : software framework * workflow automation * QM/MM * datasets * Ruby Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 3.229, year: 2016

  18. Use of Computer-Assisted Technologies (CAT) to Enhance Social, Communicative, and Language Development in Children with Autism Spectrum Disorders

    Science.gov (United States)

    Ploog, Bertram O.; Scharf, Alexa; Nelson, DeShawn; Brooks, Patricia J.

    2013-01-01

    Major advances in multimedia computer technology over the past decades have made sophisticated computer games readily available to the public. This, combined with the observation that most children, including those with autism spectrum disorders (ASD), show an affinity to computers, has led researchers to recognize the potential of computer…

  19. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data

  20. Computers. A perspective on their usefulness in nuclear medicine

    International Nuclear Information System (INIS)

    Loken, M.K.; Williams, L.E.; Ponto, R.A.; Ganatra, R.D.; Raikar, U.; Samuel, A.M.

    1977-01-01

    To date, many symposia have been held on computer applications in nuclear medicine. Despite all of these efforts, an appraisal of the true utility of computers in the day-to-day practice of nuclear medicine is yet to be achieved. Now that the technology of data storage and processing in nuclear medicine has reached a high degree of sophistication, as evidenced by many reports in the literature, the time has come to develop a perspective on the proper place of computers in nuclear medicine practice. The paper summarizes various uses of a dedicated computer (Nuclear Data Med II) at our two institutions and comments on its clinical utility. (author)

  1. Progress in Computational Physics (PiCP) Volume 1 Wave Propagation in Periodic Media

    CERN Document Server

    Ehrhardt, Matthias

    2010-01-01

    Progress in Computational Physics is a new e-book series devoted to recent research trends in computational physics. It contains chapters contributed by outstanding experts in the modeling of physical problems. The series focuses on interdisciplinary computational perspectives of current physical challenges, new numerical techniques for the solution of mathematical wave equations and describes certain real-world applications. With the help of powerful computers and sophisticated methods of numerical mathematics it is possible to simulate many ultramodern devices, e.g. photonic crystal structures,

  2. A Current Logical Framework: The Propositional Fragment

    National Research Council Canada - National Science Library

    Watkins, Kevin

    2003-01-01

    We present the propositional fragment CLF of the Concurrent Logical Framework (CLF). CLF extends the Linear Logical Framework to allow the natural representation of concurrent computations in an object language...

  3. Interfacing the Paramesh Computational Libraries to the Cactus Computational Framework, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We will design and implement an interface between the Paramesh computational libraries, developed and used by groups at NASA GSFC, and the Cactus computational...

  4. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.

  5. Porting plasma physics simulation codes to modern computing architectures using the libmrc framework

    Science.gov (United States)

    Germaschewski, Kai; Abbott, Stephen

    2015-11-01

    Available computing power has continued to grow exponentially even after single-core performance saturated in the last decade. The increase has since been driven by more parallelism, both using more cores and having more parallelism in each core, e.g. in GPUs and Intel Xeon Phi. Adapting existing plasma physics codes is challenging, in particular as there is no single programming model that covers current and future architectures. We will introduce the open-source libmrc framework that has been used to modularize and port three plasma physics codes: the extended MHD code MRCv3 with implicit time integration and curvilinear grids; the OpenGGCM global magnetosphere model; and the particle-in-cell code PSC. libmrc consolidates basic functionality needed for simulations based on structured grids (I/O, load balancing, time integrators), and also introduces a parallel object model that makes it possible to maintain multiple implementations of computational kernels, on e.g. conventional processors and GPUs. It handles data layout conversions and enables us to port performance-critical parts of a code to a new architecture step-by-step, while the rest of the code can remain unchanged. We will show examples of the performance gains and some physics applications.
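    The "multiple implementations of a kernel behind one object" idea can be sketched as follows; the registry, the kernel names, and the saxpy example are invented for illustration and are not libmrc's actual (C) API.

    ```python
    # Hypothetical sketch of a parallel-object-model style kernel registry:
    # one logical operation, several interchangeable back-end implementations
    # selected at run time. Names and interfaces are invented for illustration.
    import numpy as np

    class SaxpyKernel:
        registry = {}

        @classmethod
        def register(cls, name):
            def wrap(fn):
                cls.registry[name] = fn
                return fn
            return wrap

        @classmethod
        def create(cls, name):
            return cls.registry[name]

    @SaxpyKernel.register("fortran_like")
    def saxpy_loop(a, x, y):
        out = np.empty_like(y)
        for i in range(len(y)):          # explicit loop, stand-in for legacy kernel code
            out[i] = a * x[i] + y[i]
        return out

    @SaxpyKernel.register("vectorized")
    def saxpy_vec(a, x, y):
        return a * x + y                 # stand-in for an accelerated/GPU kernel

    kernel = SaxpyKernel.create("vectorized")   # implementation chosen at run time
    print(kernel(2.0, np.ones(4), np.arange(4.0)))
    ```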

  6. 3D flow computations under a reactor vessel closure head

    International Nuclear Information System (INIS)

    Daubert, O.; Bonnin, O.; Hofmann, F.; Hecker, M.

    1995-12-01

    The flow under a vessel cover of a pressurised water reactor is investigated by using several computations and a physical model. The case presented here is turbulent, isothermal and incompressible. Computations are made with the N3S code using a k-epsilon model. Comparisons between numerical and experimental results are on the whole satisfying. Some local improvements are expected either with more sophisticated turbulence models or with mesh refinements automatically computed by using the adaptive meshing technique which has just been implemented in N3S for 3D cases. (authors). 6 refs., 7 figs
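    For reference, the textbook form of the k-epsilon transport equations reads as follows; the exact variant and constants actually used in the N3S code may differ.

    ```latex
    % Standard k-epsilon model (textbook form; N3S specifics may differ)
    \begin{align}
    \frac{\partial k}{\partial t} + U_j \frac{\partial k}{\partial x_j}
      &= \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)
         \frac{\partial k}{\partial x_j}\right] + P_k - \varepsilon, \\
    \frac{\partial \varepsilon}{\partial t} + U_j \frac{\partial \varepsilon}{\partial x_j}
      &= \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)
         \frac{\partial \varepsilon}{\partial x_j}\right]
         + C_{\varepsilon 1}\frac{\varepsilon}{k} P_k
         - C_{\varepsilon 2}\frac{\varepsilon^2}{k},
      \qquad \nu_t = C_\mu \frac{k^2}{\varepsilon}.
    \end{align}
    ```

    Here the usual constants are C_mu = 0.09, C_eps1 = 1.44, C_eps2 = 1.92, sigma_k = 1.0, sigma_eps = 1.3, and P_k denotes the turbulence production term.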

  7. Discovering and validating biological hypotheses from coherent patterns in functional genomics data

    Energy Technology Data Exchange (ETDEWEB)

    Joachimiak, Marcin Pawel

    2008-08-12

    The area of transcriptomics analysis is among the more established in computational biology, having evolved in both technology and experimental design. Transcriptomics has a strong impetus to develop sophisticated computational methods due to the large amounts of available whole-genome datasets for many species and because of powerful applications in regulatory network reconstruction as well as elucidation and modeling of cellular transcriptional responses. While gene expression microarray data can be noisy and comparisons across experiments challenging, there are a number of sophisticated methods that aid in arriving at statistically and biologically significant conclusions. As such, computational transcriptomics analysis can provide guidance for analysis of results from newer experimental technologies. More recently, search methods have been developed to identify modules of genes, which exhibit coherent expression patterns in only a subset of experimental conditions. The latest advances in these methods allow the integration of multiple data types and datasets, both experimental and computational, within a single statistical framework accounting for data confidence and relevance to specific biological questions. Such frameworks provide a unified environment for the exploration of specific biological hypotheses and for the discovery of coherent data patterns along with the evidence supporting them.

  8. Abstraction/Representation Theory for heterotic physical computing.

    Science.gov (United States)

    Horsman, D C

    2015-07-28

    We give a rigorous framework for the interaction of physical computing devices with abstract computation. Device and program are mediated by the non-logical representation relation; we give the conditions under which representation and device theory give rise to commuting diagrams between logical and physical domains, and the conditions for computation to occur. We give the interface of this new framework with currently existing formal methods, showing in particular its close relationship to refinement theory, and the implications for questions of meaning and reference in theoretical computer science. The case of hybrid computing is considered in detail, addressing in particular the example of an Internet-mediated social machine, and the abstraction/representation framework used to provide a formal distinction between heterotic and hybrid computing. This forms the basis for future use of the framework in formal treatments of non-standard physical computers. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  9. Optimization of Network Topology in Computer-Aided Detection Schemes Using Phased Searching with NEAT in a Time-Scaled Framework.

    Science.gov (United States)

    Tan, Maxine; Pu, Jiantao; Zheng, Bin

    2014-01-01

    In the field of computer-aided mammographic mass detection, many different features and classifiers have been tested. Frequently, the relevant features and optimal topology for the artificial neural network (ANN)-based approaches at the classification stage are unknown, and thus determined by trial-and-error experiments. In this study, we analyzed a classifier that evolves ANNs using genetic algorithms (GAs), which combines feature selection with the learning task. The classifier named "Phased Searching with NEAT in a Time-Scaled Framework" was analyzed using a dataset with 800 malignant and 800 normal tissue regions in a 10-fold cross-validation framework. The classification performance measured by the area under a receiver operating characteristic (ROC) curve was 0.856 ± 0.029. The result was also compared with four other well-established classifiers that include fixed-topology ANNs, support vector machines (SVMs), linear discriminant analysis (LDA), and bagged decision trees. The results show that Phased Searching outperformed the LDA and bagged decision tree classifiers, and was only significantly outperformed by SVM. Furthermore, the Phased Searching method required fewer features and discarded superfluous structure or topology, thus incurring a lower feature computational and training and validation time requirement. Analyses performed on the network complexities evolved by Phased Searching indicate that it can evolve optimal network topologies based on its complexification and simplification parameter selection process. From the results, the study also concluded that the three classifiers - SVM, fixed-topology ANN, and Phased Searching with NeuroEvolution of Augmenting Topologies (NEAT) in a Time-Scaled Framework - are performing comparably well in our mammographic mass detection scheme.
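    The evaluation protocol described above (10-fold cross-validated area under the ROC curve for several classifiers) can be illustrated with off-the-shelf tools on synthetic data; the NEAT-based Phased Searching classifier and the mammographic dataset themselves are not reproduced here.

    ```python
    # Illustrative only: 10-fold cross-validated ROC AUC for two stock classifiers
    # (an SVM and a fixed-topology neural network) on synthetic data.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=1600, n_features=30, random_state=0)
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

    models = {
        "SVM": make_pipeline(StandardScaler(), SVC()),
        "fixed-topology ANN": make_pipeline(
            StandardScaler(), MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000)),
    }
    for name, model in models.items():
        auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
        print(f"{name}: AUC = {auc.mean():.3f} +/- {auc.std():.3f}")
    ```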

  10. TelluSim: A Python Plug-in Based Computational Framework for Spatially Distributed Environmental and Earth Sciences Modelling

    Science.gov (United States)

    Willgoose, G. R.

    2008-12-01

    TelluSim is a Python-based computational framework for integrating and manipulating modules written in a variety of computer languages. TelluSim consists of a main program that dynamically, at run time, assembles a series of modules. These modules can be written in any language that can be accessed by Python. Currently we have modules in Fortran and Python, with C to be supported soon. New modules are incorporated as plug-ins, as is done for a browser or Photoshop, simply by copying the module binary into a plug-in directory. TelluSim automatically generates a GUI for parameter and state I/O, and automatically creates the intermodule communication mechanisms needed for the computations. A decision to use Python was arrived at after detailed trials using other languages including C, Tcl/Tk and Fortran. An important aspect of the design of TelluSim was to minimise the overhead in interfacing the modules with TelluSim, and minimise any requirement for recoding of existing software, thus eliminating a major disadvantage of more complex frameworks (e.g. JAMS, openMI). Several significant Fortran codes developed by the author have been incorporated as part of the design process and as proof of concept. In particular the SIBERIA landform evolution code (a high performance F90 code, including parallel capability) has been broken up into a series of TelluSim modules, so that SIBERIA now consists of a Python script of 20 lines. These 20 lines assemble and run the underlying modules (about 50,000 lines of Fortran code). The presentation will discuss in more detail the design of TelluSim, and our experiences of the advantages and disadvantages of using Python relative to other approaches.
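    The plug-in mechanism described above (drop a module into a directory, have the framework discover and chain it at run time) might look roughly like the following sketch; the plugins directory, the run(state) convention, and the state dictionary are invented for illustration and are not TelluSim's actual API.

    ```python
    # Hypothetical sketch of run-time plug-in discovery and chaining.
    import importlib.util
    import pathlib

    def load_plugins(plugin_dir="plugins"):
        """Discover and import every *.py file found in the plug-in directory."""
        modules = []
        root = pathlib.Path(plugin_dir)
        if not root.exists():
            return modules
        for path in sorted(root.glob("*.py")):
            spec = importlib.util.spec_from_file_location(path.stem, path)
            mod = importlib.util.module_from_spec(spec)
            spec.loader.exec_module(mod)       # import the plug-in source file
            modules.append(mod)
        return modules

    state = {"elevation": [1.0, 2.0, 3.0]}     # shared model state (assumed layout)
    for mod in load_plugins():
        state = mod.run(state)                 # each plug-in transforms the shared state
    print(state)
    ```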

  11. A communicating Thread -CT- case study: JIWY

    NARCIS (Netherlands)

    Jovanovic, D.S.; Hilderink, G.H.; Broenink, Johannes F.; Pascoe, P.W.J.; Loader, R.; Sunderman, V.

    2002-01-01

    This JIWY demonstrator is constructed in the context of the development of a design framework and software tools to efficiently support mechatronic engineers in developing sophisticated control computer code out of a set of control laws. We use the CSP-based Communicating Threads -CT- library as the

  12. Touchable Computing: Computing-Inspired Bio-Detection.

    Science.gov (United States)

    Chen, Yifan; Shi, Shaolong; Yao, Xin; Nakano, Tadashi

    2017-12-01

    We propose a new computing-inspired bio-detection framework called touchable computing (TouchComp). Under the rubric of TouchComp, the best solution is the cancer to be detected, the parameter space is the tissue region at high risk of malignancy, and the agents are the nanorobots loaded with contrast medium molecules for tracking purpose. Subsequently, the cancer detection procedure (CDP) can be interpreted from the computational optimization perspective: a population of externally steerable agents (i.e., nanorobots) locate the optimal solution (i.e., cancer) by moving through the parameter space (i.e., tissue under screening), whose landscape (i.e., a prescribed feature of tissue environment) may be altered by these agents but the location of the best solution remains unchanged. One can then infer the landscape by observing the movement of agents by applying the "seeing-is-sensing" principle. The term "touchable" emphasizes the framework's similarity to controlling by touching the screen with a finger, where the external field for controlling and tracking acts as the finger. Given this analogy, we aim to answer the following profound question: can we look to the fertile field of computational optimization algorithms for solutions to achieve effective cancer detection that are fast, accurate, and robust? Along this line of thought, we consider the classical particle swarm optimization (PSO) as an example and propose the PSO-inspired CDP, which differs from the standard PSO by taking into account realistic in vivo propagation and controlling of nanorobots. Finally, we present comprehensive numerical examples to demonstrate the effectiveness of the PSO-inspired CDP for different blood flow velocity profiles caused by tumor-induced angiogenesis. The proposed TouchComp bio-detection framework may be regarded as one form of natural computing that employs natural materials to compute.
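    For orientation, a minimal sketch of standard particle swarm optimization is given below; the PSO-inspired CDP described above additionally models in vivo propagation and external steering of the nanorobots, which this toy example does not capture.

    ```python
    # A minimal, standard PSO sketch (reference only, not the modified in vivo CDP).
    import numpy as np

    def pso(objective, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        rng = np.random.default_rng(0)
        x = rng.uniform(-5, 5, (n_particles, dim))    # positions in the parameter space
        v = np.zeros_like(x)
        pbest, pbest_val = x.copy(), objective(x)
        gbest = pbest[pbest_val.argmin()]
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = x + v
            val = objective(x)
            better = val < pbest_val
            pbest[better], pbest_val[better] = x[better], val[better]
            gbest = pbest[pbest_val.argmin()]
        return gbest

    # Toy landscape whose optimum plays the role of the "cancer" to be located.
    sphere = lambda x: np.sum((x - np.array([1.0, -2.0])) ** 2, axis=1)
    print(pso(sphere))   # converges near (1, -2)
    ```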

  13. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework

  14. A Simulation-As-A-Service Framework Facilitating WebGIS-Based Installation Planning

    Science.gov (United States)

    Zheng, Z.; Chang, Z. Y.; Fei, Y. F.

    2017-09-01

    Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for the proper spatial deployment and functional configuration of facilities, so that they form a cohesive and supportive system that meets users' operational needs. Based on a requirement analysis, we propose a framework that combines GIS and agent-based simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, agent-based simulation runs as a service on the server and exposes basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service is able to utilize various kinds of geoprocessing services in the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits, such as ease of use, on-demand access, shared understanding, and improved performance. Finally, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.

  15. Coalescent: an open-science framework for importance sampling in coalescent theory.

    Science.gov (United States)

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only
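    The estimator that such frameworks organize can be illustrated with a generic importance-sampling sketch; the coalescent-specific proposal distributions (the standard competing proposals mentioned above) are far more involved and are not reproduced here.

    ```python
    # Generic importance-sampling sketch: estimate an integral under a target
    # density using samples from a proposal density and importance weights.
    import numpy as np

    rng = np.random.default_rng(1)

    def importance_estimate(target_pdf, proposal_sample, proposal_pdf, n=100_000):
        """Estimate the target's normalizing integral (should be ~1 for a proper pdf)."""
        x = proposal_sample(n)
        w = target_pdf(x) / proposal_pdf(x)        # importance weights
        return w.mean(), w.std() / np.sqrt(n)      # estimate and its Monte Carlo error

    # Toy example: integrate a standard normal density with a wider normal proposal.
    target   = lambda x: np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
    proposal = lambda x: np.exp(-0.5 * (x / 2)**2) / (2 * np.sqrt(2 * np.pi))
    sampler  = lambda n: rng.normal(0.0, 2.0, n)
    print(importance_estimate(target, sampler, proposal))   # close to (1.0, small error)
    ```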

  16. Coalescent: an open-science framework for importance sampling in coalescent theory

    Directory of Open Access Journals (Sweden)

    Susanta Tewari

    2015-08-01

    Full Text Available Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency

  17. V&V framework

    Energy Technology Data Exchange (ETDEWEB)

    Hills, Richard G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Maniaci, David Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Naughton, Jonathan W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-09-01

    A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) Program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.

  18. Tunable intraparticle frameworks for creating complex heterostructured nanoparticle libraries

    Science.gov (United States)

    Fenton, Julie L.; Steimle, Benjamin C.; Schaak, Raymond E.

    2018-05-01

    Complex heterostructured nanoparticles with precisely defined materials and interfaces are important for many applications. However, rationally incorporating such features into nanoparticles with rigorous morphology control remains a synthetic bottleneck. We define a modular divergent synthesis strategy that progressively transforms simple nanoparticle synthons into increasingly sophisticated products. We introduce a series of tunable interfaces into zero-, one-, and two-dimensional copper sulfide nanoparticles using cation exchange reactions. Subsequent manipulation of these intraparticle frameworks yielded a library of 47 distinct heterostructured metal sulfide derivatives, including particles that contain asymmetric, patchy, porous, and sculpted nanoarchitectures. This generalizable mix-and-match strategy provides predictable retrosynthetic pathways to complex nanoparticle features that are otherwise inaccessible.

  19. A two-stage multi-view learning framework based computer-aided diagnosis of liver tumors with contrast enhanced ultrasound images.

    Science.gov (United States)

    Guo, Le-Hang; Wang, Dan; Qian, Yi-Yi; Zheng, Xiao; Zhao, Chong-Ke; Li, Xiao-Long; Bo, Xiao-Wan; Yue, Wen-Wen; Zhang, Qi; Shi, Jun; Xu, Hui-Xiong

    2018-04-04

    With the fast development of artificial intelligence techniques, we proposed a novel two-stage multi-view learning framework for the contrast-enhanced ultrasound (CEUS) based computer-aided diagnosis for liver tumors, which adopted only three typical CEUS images selected from the arterial phase, portal venous phase and late phase. In the first stage, the deep canonical correlation analysis (DCCA) was performed on three image pairs between the arterial and portal venous phases, arterial and delayed phases, and portal venous and delayed phases respectively, which then generated total six-view features. While in the second stage, these multi-view features were then fed to a multiple kernel learning (MKL) based classifier to further promote the diagnosis result. Two MKL classification algorithms were evaluated in this MKL-based classification framework. We evaluated proposed DCCA-MKL framework on 93 lesions (47 malignant cancers vs. 46 benign tumors). The proposed DCCA-MKL framework achieved the mean classification accuracy, sensitivity, specificity, Youden index, false positive rate, and false negative rate of 90.41 ± 5.80%, 93.56 ± 5.90%, 86.89 ± 9.38%, 79.44 ± 11.83%, 13.11 ± 9.38% and 6.44 ± 5.90%, respectively, by soft margin MKL classifier. The experimental results indicate that the proposed DCCA-MKL framework achieves best performance for discriminating benign liver tumors from malignant liver cancers. Moreover, it is also proved that the three-phase CEUS image based CAD is feasible for liver tumors with the proposed DCCA-MKL framework.
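    As a rough, hedged stand-in for the two-stage pipeline, the sketch below uses sklearn's linear CCA in place of deep CCA (DCCA) and a single-kernel SVM in place of the MKL classifier, on synthetic data; it only illustrates the paired-view-features-then-kernel-classifier flow, not the authors' method.

    ```python
    # Shallow stand-in: linear CCA replaces DCCA, a single-kernel SVM replaces MKL.
    import numpy as np
    from sklearn.cross_decomposition import CCA
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n = 93                                   # same order as the 93 lesions above
    y = rng.integers(0, 2, n)                # benign vs malignant labels (synthetic)
    arterial = rng.standard_normal((n, 64)) + y[:, None]
    portal   = rng.standard_normal((n, 64)) + y[:, None]
    delayed  = rng.standard_normal((n, 64)) + y[:, None]

    def paired_view_features(view_a, view_b, n_components=8):
        cca = CCA(n_components=n_components)
        za, zb = cca.fit_transform(view_a, view_b)
        return np.hstack([za, zb])           # two of the "six views" per image pair

    features = np.hstack([paired_view_features(arterial, portal),
                          paired_view_features(arterial, delayed),
                          paired_view_features(portal, delayed)])
    acc = cross_val_score(SVC(kernel="rbf"), features, y, cv=5)
    print(f"accuracy = {acc.mean():.2f} +/- {acc.std():.2f}")
    ```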

  20. ClimateSpark: An In-memory Distributed Computing Framework for Big Climate Data Analytics

    Science.gov (United States)

    Hu, F.; Yang, C. P.; Duffy, D.; Schnase, J. L.; Li, Z.

    2016-12-01

    Massive array-based climate data is being generated from global surveillance systems and model simulations. It is widely used to analyze environmental problems such as climate change, natural hazards, and public health. However, extracting the underlying information from these big climate datasets is challenging due to both data- and computing-intensive issues in data processing and analysis. To tackle the challenges, this paper proposes ClimateSpark, an in-memory distributed computing framework to support big climate data processing. In ClimateSpark, a spatiotemporal index is developed to enable Apache Spark to treat the array-based climate data (e.g. netCDF4, HDF4) as native formats, which are stored in the Hadoop Distributed File System (HDFS) without any preprocessing. Based on the index, spatiotemporal query services are provided to retrieve datasets according to a defined geospatial and temporal bounding box. The data subsets will be read out, and a data partition strategy will be applied to equally split the queried data across the computing nodes and store them in memory as climateRDDs for processing. By leveraging Spark SQL and User Defined Functions (UDFs), the climate data analysis operations can be conducted in intuitive SQL. ClimateSpark is evaluated by two use cases using the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. One use case is to conduct the spatiotemporal query and visualize the subset results in animation; the other one is to compare different climate model outputs using a Taylor-diagram service. Experimental results show that ClimateSpark can significantly accelerate data query and processing, and enable complex analysis services served in the SQL-style fashion.
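    A hypothetical sketch of the query style described above, expressed with stock PySpark: a spatiotemporal bounding-box filter plus a registered UDF over a toy table. The table, columns, and UDF are invented; ClimateSpark's actual spatiotemporal index and netCDF4/HDF4 readers are not shown, and running the sketch assumes a local Spark installation.

    ```python
    # Toy bounding-box query in Spark SQL; stand-in rows replace real array chunks.
    from pyspark.sql import SparkSession
    from pyspark.sql.types import DoubleType

    spark = SparkSession.builder.appName("climate-bbox-demo").getOrCreate()

    rows = [(10.0, 20.0, "2015-01-01", 281.4),
            (42.5, -71.0, "2015-01-01", 265.2),
            (43.0, -70.5, "2015-06-01", 290.1)]
    df = spark.createDataFrame(rows, ["lat", "lon", "time", "t2m"])
    df.createOrReplaceTempView("merra_t2m")

    # register a simple UDF, analogous to analysis operations exposed through SQL
    spark.udf.register("to_celsius", lambda kelvin: kelvin - 273.15, DoubleType())

    result = spark.sql("""
        SELECT lat, lon, time, to_celsius(t2m) AS t2m_c
        FROM merra_t2m
        WHERE lat BETWEEN 40 AND 45 AND lon BETWEEN -75 AND -70
          AND time BETWEEN '2015-01-01' AND '2015-12-31'
    """)
    result.show()
    ```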

  1. Neutron Computed Tomography of Freeze/thaw Phenomena in Polymer Electrolyte Fuel Cells

    Energy Technology Data Exchange (ETDEWEB)

    Matthew M. Mech; Jack Brenizer; Kenan Unlu; A.K. Heller

    2008-12-12

    This report summarizes the final year's progress of the three-year NEER program. The overall objectives of this program were to 1) design and construct a sophisticated high-resolution neutron computed tomography (NCT) facility, 2) develop novel and sophisticated liquid water and ice quantification analysis software for computed tomography, and 3) apply the advanced software and NCT capability to study liquid and ice distribution in polymer electrolyte fuel cells (PEFCs) under cold-start conditions. These objectives have been accomplished by the research team, enabling a new capability for advanced 3D image quantification with neutron imaging for fuel cell and other applications. The NCT water quantification methodology and software will greatly add to the capabilities of the neutron imaging community, and the quantified liquid water and ice distribution provided by its application to PEFCs will enhance understanding and guide design in the fuel cell community.

  2. COMPUTER-ASSISTED CONTROL OF ACADEMIC PERFORMANCE IN ENGINEERING GRAPHICS WITHIN THE FRAMEWORK OF DISTANCE LEARNING PROGRAMMES

    Directory of Open Access Journals (Sweden)

    Tel'noy Viktor Ivanovich

    2012-10-01

    Full Text Available The subject matter of the article is the development of computer-assisted technologies and their integration into academic activity for the purpose of controlling academic performance within the framework of distance learning programmes. The article is a brief overview of the software programme designed for monitoring the academic performance of students enrolled in distance learning programmes. The software is developed in Delphi 7.0 for the Windows operating system. The strength of the proposed software consists in the availability of two modes of operation that differ in the principle of problem selection and in timing parameters. Interim academic performance assessment is to be performed through computerized testing procedures that draw on a database of testing assignments implemented in the eLearning Server environment. Identification of students is to be performed through the installation of video cameras at students' workplaces.

  3. TRUSTED CLOUD COMPUTING FRAMEWORK FOR HEALTHCARE SECTOR

    OpenAIRE

    Mervat Adib Bamiah; Sarfraz Nawaz Brohi; Suriayati Chuprat; Jamalul-lail Ab Manan

    2014-01-01

    Cloud computing is rapidly evolving due to its efficient characteristics such as cost-effectiveness, availability and elasticity. Healthcare organizations and consumers lose control when they outsource their sensitive data and computing resources to a third party Cloud Service Provider (CSP), which may raise security and privacy concerns related to data loss and misuse threats. Lack of consumers’ knowledge about their data storage location may lead to violating rules and r...

  4. Low Level RF Including a Sophisticated Phase Control System for CTF3

    CERN Document Server

    Mourier, J; Nonglaton, J M; Syratchev, I V; Tanner, L

    2004-01-01

    CTF3 (CLIC Test Facility 3), currently under construction at CERN, is a test facility designed to demonstrate the key feasibility issues of the CLIC (Compact LInear Collider) two-beam scheme. When completed, this facility will consist of a 150 MeV linac followed by two rings for bunch-interleaving, and a test stand where 30 GHz power will be generated. In this paper, the work that has been carried out on the linac's low power RF system is described. This includes, in particular, a sophisticated phase control system for the RF pulse compressor to produce a flat-top rectangular pulse over 1.4 µs.

  5. Applications of X-ray Computed Tomography and Emission Computed Tomography

    International Nuclear Information System (INIS)

    Seletchi, Emilia Dana; Sutac, Victor

    2005-01-01

    Computed Tomography is a non-destructive imaging method that allows visualization of internal features within non-transparent objects such as sedimentary rocks. Filtering techniques have been applied to circumvent the artifacts and achieve high-quality images for quantitative analysis. High-resolution X-ray computed tomography (HRXCT) can be used to identify the position of the growth axis in speleothems by detecting subtle changes in calcite density between growth bands. HRXCT imagery reveals the three-dimensional variability of coral banding providing information on coral growth and climate over the past several centuries. The Nuclear Medicine imaging technique uses a radioactive tracer, several radiation detectors, and sophisticated computer technologies to understand the biochemical basis of normal and abnormal functions within the brain. The goal of Emission Computed Tomography (ECT) is to accurately determine the three-dimensional radioactivity distribution resulting from the radiopharmaceutical uptake inside the patient instead of the attenuation coefficient distribution from different tissues as obtained from X-ray Computer Tomography. ECT is a very useful tool for investigating the cognitive functions. Because of the low radiation doses associated with Positron Emission Tomography (PET), this technique has been applied in clinical research, allowing the direct study of human neurological diseases. (authors)

  6. Towards Information Security Metrics Framework for Cloud Computing

    OpenAIRE

    Muhammad Imran Tariq

    2012-01-01

    Cloud computing has recently emerged as a new computing paradigm which basically aims to provide customized, reliable, dynamic services over the internet. Cost and security are influential issues in deploying cloud computing in large enterprises. Privacy and security are very important issues in terms of user trust and legal compliance. Information Security (IS) metrics are the best tools used to measure the efficiency, performance, effectiveness and impact of the security constraints. It is very hard...

  7. Causal Inference for Cross-Modal Action Selection: A Computational Study in a Decision Making Framework.

    Science.gov (United States)

    Daemi, Mehdi; Harris, Laurence R; Crawford, J Douglas

    2016-01-01

    Animals try to make sense of sensory information from multiple modalities by categorizing them into perceptions of individual or multiple external objects or internal concepts. For example, the brain constructs sensory, spatial representations of the locations of visual and auditory stimuli in the visual and auditory cortices based on retinal and cochlear stimulations. Currently, it is not known how the brain compares the temporal and spatial features of these sensory representations to decide whether they originate from the same or separate sources in space. Here, we propose a computational model of how the brain might solve such a task. We reduce the visual and auditory information to time-varying, finite-dimensional signals. We introduce controlled, leaky integrators as working memory that retains the sensory information for the limited time-course of task implementation. We propose our model within an evidence-based, decision-making framework, where the alternative plan units are saliency maps of space. A spatiotemporal similarity measure, computed directly from the unimodal signals, is suggested as the criterion to infer common or separate causes. We provide simulations that (1) validate our model against behavioral, experimental results in tasks where the participants were asked to report common or separate causes for cross-modal stimuli presented with arbitrary spatial and temporal disparities. (2) Predict the behavior in novel experiments where stimuli have different combinations of spatial, temporal, and reliability features. (3) Illustrate the dynamics of the proposed internal system. These results confirm our spatiotemporal similarity measure as a viable criterion for causal inference, and our decision-making framework as a viable mechanism for target selection, which may be used by the brain in cross-modal situations. Further, we suggest that a similar approach can be extended to other cognitive problems where working memory is a limiting factor, such
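    A toy, illustrative sketch (not the authors' implementation): two unimodal signals retained by leaky integrators standing in for working memory, and a simple correlation-based spatiotemporal similarity score deciding between a common cause and separate causes; the leak rate and threshold are arbitrary.

    ```python
    # Toy causal-inference sketch: leaky integration of two signals plus a
    # correlation-based similarity criterion for "common" vs "separate" causes.
    import numpy as np

    def leaky_integrate(signal, leak=0.1, dt=1.0):
        y, out = 0.0, []
        for s in signal:
            y += dt * (-leak * y + s)     # dy/dt = -leak * y + input
            out.append(y)
        return np.array(out)

    rng = np.random.default_rng(0)
    t = np.arange(100)
    source = np.sin(0.2 * t)
    visual   = source + 0.1 * rng.standard_normal(t.size)
    auditory = source + 0.1 * rng.standard_normal(t.size)   # same cause in this toy case

    v, a = leaky_integrate(visual), leaky_integrate(auditory)
    similarity = np.corrcoef(v, a)[0, 1]
    decision = "common cause" if similarity > 0.8 else "separate causes"
    print(f"similarity = {similarity:.2f} -> {decision}")
    ```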

  8. A Computational Framework to Optimize Subject-Specific Hemodialysis Blood Flow Rate to Prevent Intimal Hyperplasia

    Science.gov (United States)

    Mahmoudzadeh, Javid; Wlodarczyk, Marta; Cassel, Kevin

    2017-11-01

    Development of excessive intimal hyperplasia (IH) in the cephalic vein of renal failure patients who receive chronic hemodialysis treatment results in vascular access failure and multiple treatment complications. Specifically, cephalic arch stenosis (CAS) is known to exacerbate hypertensive blood pressure, thrombosis, and subsequent cardiovascular incidents that would necessitate costly interventional procedures with low success rates. It has been hypothesized that excessive blood flow rate post access maturation which strongly violates the venous homeostasis is the main hemodynamic factor that orchestrates the onset and development of CAS. In this article, a computational framework based on a strong coupling of computational fluid dynamics (CFD) and shape optimization is proposed that aims to identify the effective blood flow rate on a patient-specific basis that avoids the onset of CAS while providing the adequate blood flow rate required to facilitate hemodialysis. This effective flow rate can be achieved through implementation of Miller's surgical banding method after the maturation of the arteriovenous fistula and is rooted in the relaxation of wall stresses back to a homeostatic target value. The results are indicative that this optimized hemodialysis blood flow rate is, in fact, a subject-specific value that can be assessed post vascular access maturation and prior to the initiation of chronic hemodialysis treatment as a mitigative action against CAS-related access failure. This computational technology can be employed for individualized dialysis treatment.

  9. DOE research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-12-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication to numerical models whose execution is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex; consequently, it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  10. Comparison of stresses on homogeneous spheroids in the optical stretcher computed with geometrical optics and generalized Lorenz-Mie theory.

    Science.gov (United States)

    Boyde, Lars; Ekpenyong, Andrew; Whyte, Graeme; Guck, Jochen

    2012-11-20

    We present two electromagnetic frameworks to compare the surface stresses on spheroidal particles in the optical stretcher (a dual-beam laser trap that can be used to capture and deform biological cells). The first model is based on geometrical optics (GO) and limited in its applicability to particles that are much greater than the incident wavelength. The second framework is more sophisticated and hinges on the generalized Lorenz-Mie theory (GLMT). Despite the difference in complexity between both theories, the stress profiles computed with GO and GLMT are in good agreement with each other (relative errors are on the order of 1-10%). Both models predict a diminishing of the stresses for larger wavelengths and a strong increase of the stresses for shorter laser-cell distances. Results indicate that surface stresses on a spheroid with an aspect ratio of 1.2 hardly differ from the stresses on a sphere of similar size. Knowledge of the surface stresses and whether or not they redistribute during the stretching process is of crucial importance in real-time applications of the stretcher that aim to discern the viscoelastic properties of cells for purposes of cell characterization, sorting, and medical diagnostics.

  11. Building problem solving environments with the arches framework

    Energy Technology Data Exchange (ETDEWEB)

    Debardeleben, Nathan [Los Alamos National Laboratory]; Sass, Ron [University of North Carolina]; Stanzione, Daniel, Jr. [ASU]; Ligon, Walter, III [Clemson University]

    2009-01-01

    The computational problems that scientists face are rapidly escalating in size and scope. Moreover, the computer systems used to solve these problems are becoming significantly more complex than the familiar, well-understood sequential model on their desktops. While it is possible to re-train scientists to use emerging high-performance computing (HPC) models, it is much more effective to provide them with a higher-level programming environment that has been specialized to their particular domain. By fostering interaction between HPC specialists and the domain scientists, problem-solving environments (PSEs) provide a collaborative environment. A PSE allows scientists to focus on expressing their computational problem while the PSE and associated tools support mapping that domain-specific problem to a high-performance computing system. This article describes Arches, an object-oriented framework for building domain-specific PSEs. The framework was designed to support a wide range of problem domains and to be extensible to support very different high-performance computing targets. To demonstrate this flexibility, two PSEs have been developed from the Arches framework to solve problems in two different domains and target very different computing platforms. The Coven PSE supports parallel applications that require large-scale parallelism found in cost-effective Beowulf clusters. In contrast, RCADE targets FPGA-based reconfigurable computing and was originally designed to aid NASA Earth scientists studying satellite instrument data.

  12. A flexible framework for process-based hydraulic and water ...

    Science.gov (United States)

    Background Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. While popular, the GI models are generally relatively simplistic. However, GI model predictions are being relied upon by many municipalities and state/local agencies to make decisions about grey vs. green infrastructure improvement planning. Adding complexity to GI modeling frameworks may preclude their use in simpler urban planning situations. Therefore, the goal here was to develop a sophisticated, yet flexible tool that could be used by design engineers and researchers to capture and explore the effects of design factors and media properties on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool that is capable of simulating GI system components and the specific biophysical processes affecting contaminants, such as reactions and particle-associated transport, accurately while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. Framework Features The process-based model framework developed here can be used to model a diverse range of GI practices such as green roofs, retention ponds, bioretention, infiltration trenches, permeable pavement, and other practices.

  13. Designing and implementing the logical security framework for e-commerce based on service oriented architecture

    OpenAIRE

    Luhach, Ashish Kr.; Dwivedi, Sanjay K; Jha, C K

    2014-01-01

    Rapid evolution of information technology has contributed to the evolution of more sophisticated E-commerce systems with better transaction times and protection. Currently used E-commerce models lack quality properties such as logical security because of their poor design and their inability to face highly equipped and trained intruders. This article proposes a security framework for small and medium-sized E-commerce, based on service-oriented architecture, and gives an analysis of the emin...

  14. Liquidity Risk meets Economic Capital and RAROC. A framework for measuring liquidity risk in banks.

    OpenAIRE

    Loebnitz, K.

    2011-01-01

    Liquidity risk is a crucial and inherent feature of the business model of banks. While banks and regulators use sophisticated mathematical methods to measure a bank's solvency risk, they use relatively simple tools for a bank's liquidity risk such as coverage ratios, sensitivity analyses, and scenario analyses. In this thesis we present a more rigorous framework that allows us to measure a bank's liquidity risk within the standard economic capital and RAROC setting. In particular, we introduc...

  15. Real-life applications with membrane computing

    CERN Document Server

    Zhang, Gexiang; Gheorghe, Marian

    2017-01-01

    This book thoroughly investigates the underlying theoretical basis of membrane computing models, and reveals their latest applications. In addition, to date there have been no illustrative case studies or complex real-life applications that capitalize on the full potential of the sophisticated membrane systems computational apparatus; gaps that this book remedies. By studying various complex applications – including engineering optimization, power systems fault diagnosis, mobile robot controller design, and complex biological systems involving data modeling and process interactions – the book also extends the capabilities of membrane systems models with features such as formal verification techniques, evolutionary approaches, and fuzzy reasoning methods. As such, the book offers a comprehensive and up-to-date guide for all researchers, PhDs and undergraduate students in the fields of computer science, engineering and the bio-sciences who are interested in the applications of natural computing models.

  16. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  17. Dal computer crime al computer-related crime

    Directory of Open Access Journals (Sweden)

    Antonio Apruzzese

    2007-04-01

    Full Text Available Nowadays, digital identity theft has become one of the most lucrative illegitimate businesses. Also known as “phishing”, it consists in unauthorized access to an individual’s personal financial data with the aim of capturing information related to online banking and online financial services. Initially, individuals were the victims of such scams; currently the attention is directed at computer networks. “Pharming” and “keylogging” are among the latest and most sophisticated data-processing techniques used by computer crime fraudsters. The latest entries are the “botnets”: herds of infected machines, usually managed by a single command centre, which can cause serious damage to network systems. Botnets have made large-scale identity theft much simpler to realize. Organized crime is becoming more and more involved in this new crime world, which can easily assure huge profits. The Italian State Police, in order to respond more effectively to this rising challenge, has created, with the Postal and Communication Police, an agency highly specialized in combating this new phenomenon

  18. Exploiting heterogeneous publicly available data sources for drug safety surveillance: computational framework and case studies.

    Science.gov (United States)

    Koutkias, Vassilis G; Lillo-Le Louët, Agnès; Jaulent, Marie-Christine

    2017-02-01

    Driven by the need of pharmacovigilance centres and companies to routinely collect and review all available data about adverse drug reactions (ADRs) and adverse events of interest, we introduce and validate a computational framework exploiting dominant as well as emerging publicly available data sources for drug safety surveillance. Our approach relies on appropriate query formulation for data acquisition and subsequent filtering, transformation and joint visualization of the obtained data. We acquired data from the FDA Adverse Event Reporting System (FAERS), PubMed and Twitter. In order to assess the validity and the robustness of the approach, we elaborated on two important case studies, namely, clozapine-induced cardiomyopathy/myocarditis versus haloperidol-induced cardiomyopathy/myocarditis, and apixaban-induced cerebral hemorrhage. The analysis of the obtained data provided interesting insights (identification of potential patient and health-care professional experiences regarding ADRs in Twitter, information/arguments against an ADR existence across all sources), while illustrating the benefits (complementing data from multiple sources to strengthen/confirm evidence) and the underlying challenges (selecting search terms, data presentation) of exploiting heterogeneous information sources, thereby advocating the need for the proposed framework. This work contributes in establishing a continuous learning system for drug safety surveillance by exploiting heterogeneous publicly available data sources via appropriate support tools.
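
    To make the data-acquisition step concrete, the following is a minimal sketch of querying two of the sources named above (FAERS via the public openFDA endpoint, and PubMed via NCBI E-utilities) for reports linking a drug to an adverse event. The endpoint parameters and field names are illustrative assumptions, not the authors' implementation.

```python
# Sketch of joint data acquisition from FAERS (openFDA) and PubMed.
# Search-field names (e.g. patient.drug.medicinalproduct) are assumptions.
import requests

def query_faers(drug, reaction, limit=10):
    # openFDA adverse-event endpoint; query syntax assumed for illustration.
    url = "https://api.fda.gov/drug/event.json"
    query = (f'patient.drug.medicinalproduct:"{drug}" AND '
             f'patient.reaction.reactionmeddrapt:"{reaction}"')
    resp = requests.get(url, params={"search": query, "limit": limit}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("results", [])

def query_pubmed(drug, reaction, retmax=10):
    # NCBI E-utilities search; returns PubMed IDs for joint mentions.
    url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
    params = {"db": "pubmed", "term": f"{drug} AND {reaction}",
              "retmode": "json", "retmax": retmax}
    resp = requests.get(url, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]

if __name__ == "__main__":
    print(len(query_faers("CLOZAPINE", "MYOCARDITIS")), "FAERS reports")
    print(query_pubmed("clozapine", "myocarditis"))
```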

  19. Performance Assessment Strategies: A computational framework for conceptual design of large roofs

    Directory of Open Access Journals (Sweden)

    Michela Turrin

    2014-01-01

    Full Text Available Using engineering performance evaluations to explore design alternatives during the conceptual phase of architectural design helps to understand the relationships between form and performance, and is crucial for developing well-performing final designs. Computer-aided conceptual design has the potential to aid the design team in discovering and highlighting these relationships, especially by means of procedural and parametric geometry to support the generation of geometric designs, and building performance simulation tools to support performance assessments. However, current tools and methods for computer-aided conceptual design in architecture neither explicitly reveal nor allow backtracking of the relationships between the performance and the geometry of a design. They currently support post-engineering rather than early design decisions and the design exploration process. Focusing on large roofs, this research aims at developing a computational design approach to support designers in performance-driven explorations. The approach is meant to facilitate multidisciplinary integration and the learning process of the designer, not to constrain the process in precompiled procedures or hard engineering formulations, nor to automate it by delegating design creativity to computational procedures. PAS (Performance Assessment Strategies) as a method is the main output of the research. It consists of a framework including guidelines and an extensible library of procedures for parametric modelling, structured in three parts. Pre-PAS provides guidelines for defining a design strategy leading toward the parameterization process. Model-PAS provides guidelines, procedures and scripts for building the parametric models. Explore-PAS supports the assessment of solutions based on numeric evaluations and performance simulations, until a suitable design solution is identified. PAS has been developed based on action research. Several case studies

  20. Computational prediction of neoantigens: do we need more data or new approaches?

    DEFF Research Database (Denmark)

    Eklund, Aron Charles; Szallasi, Zoltan Imre

    2018-01-01

    Personalized cancer immunotherapy may benefit from improved computational algorithms for identifying neoantigens. Recent results demonstrate that machine learning can improve accuracy. Additional improvements may require more genomic data paired with in vitro T cell reactivity measurements, and more sophisticated algorithms that take into account T cell receptor specificity.

  1. Data mining in soft computing framework: a survey.

    Science.gov (United States)

    Mitra, S; Pal, S K; Mitra, P

    2002-01-01

    The present article provides a survey of the available literature on data mining using soft computing. A categorization has been provided based on the different soft computing tools and their hybridizations used, the data mining function implemented, and the preference criterion selected by the model. The utility of the different soft computing methodologies is highlighted. Generally fuzzy sets are suitable for handling the issues related to understandability of patterns, incomplete/noisy data, mixed media information and human interaction, and can provide approximate solutions faster. Neural networks are nonparametric, robust, and exhibit good learning and generalization capabilities in data-rich environments. Genetic algorithms provide efficient search algorithms to select a model, from mixed media data, based on some preference criterion/objective function. Rough sets are suitable for handling different types of uncertainty in data. Some challenges to data mining and the application of soft computing methodologies are indicated. An extensive bibliography is also included.
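
    As a concrete illustration of one soft-computing tool surveyed above, the following is a minimal fuzzy c-means clustering sketch, in which each data point receives graded membership in every cluster rather than a hard label. It is written in plain NumPy for illustration and is not taken from the survey itself.

```python
# Fuzzy c-means: alternate between membership and center updates.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, eps=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # squared distances to each center, kept strictly positive
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2) + 1e-12
        U_new = 1.0 / (d2 ** (1.0 / (m - 1)))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < eps:
            U = U_new
            break
        U = U_new
    return centers, U

if __name__ == "__main__":
    X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 4.0])
    centers, U = fuzzy_c_means(X, c=2)
    print("centers:\n", centers)
    print("first point memberships:", U[0])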

  2. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Science.gov (United States)

    Abdul Wahab, Ainuddin Wahid; Han, Qi; Bin Abdul Rahman, Zulkanain

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, the constraints on carrying out forensics in MCC are tied to the autonomous cloud hosting companies and their policies of restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC. PMID:25097880

  3. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Suleman Khan

    2014-01-01

    Full Text Available Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, the constraints on carrying out forensics in MCC are tied to the autonomous cloud hosting companies and their policies of restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  4. A comprehensive review on adaptability of network forensics frameworks for mobile cloud computing.

    Science.gov (United States)

    Khan, Suleman; Shiraz, Muhammad; Wahab, Ainuddin Wahid Abdul; Gani, Abdullah; Han, Qi; Rahman, Zulkanain Bin Abdul

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, the constraints on carrying out forensics in MCC are tied to the autonomous cloud hosting companies and their policies of restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  5. Methodology and computational framework used for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement accident analysis

    International Nuclear Information System (INIS)

    Mueller, C.; Roglans-Ribas, J.; Folga, S.; Huttenga, A.; Jackson, R.; TenBrook, W.; Russell, J.

    1994-01-01

    A methodology, computational framework, and integrated PC-based database have been developed to assess the risks of facility accidents in support of the US Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement. The methodology includes the following interrelated elements: (1) screening of storage and treatment processes and related waste inventories to determine risk-dominant facilities across the DOE complex, (2) development and frequency estimation of the risk-dominant sequences of accidents, and (3) determination of the evolution of and final compositions of radiological or chemically hazardous source terms predicted to be released as a function of the storage inventory or treatment process throughput. The computational framework automates these elements to provide source term input for the second part of the analysis which includes (1) development or integration of existing site-specific demographics and meteorological data and calculation of attendant unit-risk factors and (2) assessment of the radiological or toxicological consequences of accident releases to the general public and to the occupational work force
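
    The following is a minimal sketch of how the two parts of such an analysis fit together: accident frequencies and released source terms from the first part are combined with site-specific unit-risk factors from the second part to estimate public consequences. All numbers below are illustrative placeholders, not values from the PEIS.

```python
# Risk roll-up: risk = frequency x consequence, consequence = source term x unit-risk factor.
accidents = [
    # (label, frequency per year, release fraction, inventory in curies) - made-up values
    ("tank fire",        1e-4, 1e-3, 5.0e4),
    ("drum drop/breach", 1e-2, 1e-5, 2.0e3),
]
unit_risk = 3.0e-6   # assumed latent-cancer-fatality risk per curie released

for label, freq, release_frac, inventory in accidents:
    source_term = release_frac * inventory          # curies released per event
    consequence = source_term * unit_risk           # expected fatalities per event
    risk = freq * consequence                       # expected fatalities per year
    print(f"{label:18s} source term {source_term:9.3g} Ci  risk {risk:.3g}/yr")
```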

  6. A framework for the computer-aided planning and optimisation of manufacturing processes for components with functional graded properties

    Science.gov (United States)

    Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.

    2014-05-01

    In this contribution a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules: the "Component Description", the "Expert System" for the synthesis of several process chains, and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model by a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling and forming processes. The dependencies between the component and the applied manufacturing processes, as well as between the processes themselves, need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information of the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, and the result of the best parameterisation is used as a representative value. Finally, the process chain that is capable of manufacturing a functionally graded component in an optimal way with regard to the property distributions of the component description is presented by means of a dedicated specification technique.

  7. Development of a technique for three-dimensional image reconstruction from emission computed tomograms (ECT)

    International Nuclear Information System (INIS)

    Gerischer, R.

    1987-01-01

    The described technique for three-dimensional image reconstruction from ECT sections is based on a simple procedure, which can be carried out with the aid of any standard-type computer used in nuclear medicine and requires no sophisticated arithmetic approach. (TRV) [de]

  8. Integrated Computer-aided Framework for Sustainable Chemical Product Design and Evaluation

    DEFF Research Database (Denmark)

    Kalakul, Sawitree; Cignitti, Stefano; Zhang, Lei

    2016-01-01

    This work proposes an integrated model-based framework for chemical product design and evaluation based on which the software, VPPD-Lab (The Virtual Product-Process Design Laboratory) has been developed. The framework allows the following options: (1) design a product using design templates...

  9. Reciprocal Estimation of Pedestrian Location and Motion State toward a Smartphone Geo-Context Computing Solution

    Directory of Open Access Journals (Sweden)

    Jingbin Liu

    2015-06-01

    Full Text Available The rapid advance in mobile communications has made information and services ubiquitously accessible. Location and context information have become essential for the effectiveness of services in the era of mobility. This paper proposes the concept of geo-context that is defined as an integral synthesis of geographical location, human motion state and mobility context. A geo-context computing solution consists of a positioning engine, a motion state recognition engine, and a context inference component. In the geo-context concept, the human motion states and mobility context are associated with the geographical location where they occur. A hybrid geo-context computing solution is implemented that runs on a smartphone, and it utilizes measurements of multiple sensors and signals of opportunity that are available within a smartphone. Pedestrian location and motion states are estimated jointly under the framework of hidden Markov models, and they are used in a reciprocal manner to improve their estimation performance of one another. It is demonstrated that pedestrian location estimation has better accuracy when its motion state is known, and in turn, the performance of motion state recognition can be improved with increasing reliability when the location is given. The geo-context inference is implemented simply with the expert system principle, and more sophisticated approaches will be developed.
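
    A minimal sketch of the joint-estimation idea follows: location zone and motion state are combined into a single hidden state and decoded with one Viterbi pass, so that the transition structure (e.g., only walking pedestrians change zone) lets each quantity disambiguate the other. All transition and emission probabilities below are illustrative assumptions, not the paper's calibrated models.

```python
# Joint HMM over (zone, motion state) decoded with Viterbi.
import numpy as np

zones, motions = ["corridor", "room"], ["walking", "standing"]
states = [(z, m) for z in zones for m in motions]          # 4 joint states

def idx(z, m):
    return zones.index(z) * len(motions) + motions.index(m)

A = np.full((4, 4), 0.02)                                  # transition matrix
for z in zones:
    A[idx(z, "standing"), idx(z, "standing")] = 0.80       # standing tends to stay put
    A[idx(z, "walking"), idx(z, "walking")] = 0.55         # walking persists
    other = zones[1 - zones.index(z)]
    A[idx(z, "walking"), idx(other, "walking")] = 0.35     # only walkers change zone
A /= A.sum(axis=1, keepdims=True)

# Emission likelihoods for two observation symbols:
# 0 = strong Wi-Fi + low accelerometer variance, 1 = weak Wi-Fi + high variance.
B = np.array([[0.2, 0.8], [0.7, 0.3], [0.6, 0.4], [0.9, 0.1]])

def viterbi(obs, pi):
    logA, logB = np.log(A), np.log(B)
    delta = np.log(pi) + logB[:, obs[0]]
    back = []
    for o in obs[1:]:
        scores = delta[:, None] + logA                     # scores[i, j]: i -> j
        back.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + logB[:, o]
    path = [int(delta.argmax())]
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi([0, 1, 1, 0], pi=np.full(4, 0.25)))
```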

  10. Status report on SHARP coupling framework.

    Energy Technology Data Exchange (ETDEWEB)

    Caceres, A.; Tautges, T. J.; Lottes, J.; Fischer, P.; Rabiti, C.; Smith, M. A.; Siegel, A.; Yang, W. S.; Palmiotti, G.

    2008-05-30

    This report presents the software engineering effort under way at ANL towards a comprehensive integrated computational framework (SHARP) for high-fidelity simulations of sodium-cooled fast reactors. The primary objective of this framework is to provide accurate and flexible analysis tools to nuclear reactor designers by simulating multiphysics phenomena happening in complex reactor geometries. Ideally, the coupling among different physics modules (such as neutronics, thermal-hydraulics, and structural mechanics) needs to be tight to preserve the accuracy achieved in each module. However, fast reactor cores in steady-state mode represent a special case where weak coupling between neutronics and thermal-hydraulics is usually adequate. Our framework design allows for both options. Another requirement for the SHARP framework has been to implement various coupling algorithms that are parallel and scalable, since nuclear reactor core simulations are among the most memory- and computationally intensive, requiring the use of leadership-class petascale platforms. This report details our progress toward achieving these goals. Specifically, we demonstrate coupling independently developed parallel codes in a manner that does not compromise performance or portability, while minimizing the impact on individual developers. This year, our focus has been on developing a lightweight and loosely coupled framework targeted at UNIC (our neutronics code) and Nek (our thermal-hydraulics code). However, the framework design is not limited to just using these two codes.

  11. Purification through Emotions: The Role of Shame in Plato's "Sophist" 230B4-E5

    Science.gov (United States)

    Candiotto, Laura

    2018-01-01

    This article proposes an analysis of Plato's "Sophist" (230b4--e5) that underlines the bond between the logical and the emotional components of the Socratic "elenchus", with the aim of depicting the social valence of this philosophical practice. The use of emotions characterizing the 'elenctic' method described by Plato is…

  12. Discrete computational mechanics for stiff phenomena

    KAUST Repository

    Michels, Dominik L.

    2016-11-28

    Many natural phenomena which occur in the realm of visual computing and computational physics, like the dynamics of cloth, fibers, fluids, and solids as well as collision scenarios are described by stiff Hamiltonian equations of motion, i.e. differential equations whose solution spectra simultaneously contain extremely high and low frequencies. This usually impedes the development of physically accurate and at the same time efficient integration algorithms. We present a straightforward computationally oriented introduction to advanced concepts from classical mechanics. We provide an easy to understand step-by-step introduction from variational principles over the Euler-Lagrange formalism and the Legendre transformation to Hamiltonian mechanics. Based on such solid theoretical foundations, we study the underlying geometric structure of Hamiltonian systems as well as their discrete counterparts in order to develop sophisticated structure preserving integration algorithms to efficiently perform high fidelity simulations.
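
    The following is a minimal sketch (not from the paper) of why structure-preserving integration matters for stiff Hamiltonian systems: on a stiff harmonic oscillator H(q, p) = p^2/2 + k q^2/2 with large stiffness k, explicit Euler pumps energy into the system and blows up, while symplectic (semi-implicit) Euler keeps the energy bounded at the same step size.

```python
# Explicit vs. symplectic Euler on a stiff harmonic oscillator.
k, dt, steps = 1.0e4, 1.0e-3, 5000          # stiff spring, fixed step size

def energy(q, p):
    return 0.5 * p**2 + 0.5 * k * q**2

def explicit_euler(q, p):
    for _ in range(steps):
        q, p = q + dt * p, p - dt * k * q   # both updates use old values
    return energy(q, p)

def symplectic_euler(q, p):
    for _ in range(steps):
        p = p - dt * k * q                  # momentum update first...
        q = q + dt * p                      # ...then position with the new momentum
    return energy(q, p)

q0, p0 = 1.0, 0.0
print("initial energy   :", energy(q0, p0))
print("explicit Euler   :", explicit_euler(q0, p0))   # grows by many orders of magnitude
print("symplectic Euler :", symplectic_euler(q0, p0)) # stays near the initial energy
```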

  13. Designing Educational Games for Computer Programming: A Holistic Framework

    Science.gov (United States)

    Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios

    2014-01-01

    Computer science has been evolving continuously over the past decades. This has brought forth new knowledge that should be incorporated into curricula, and new learning strategies must be adopted for the successful teaching of all its sub-domains. For example, computer programming is a vital knowledge area within computer science with a constantly changing curriculum…

  14. A non-voxel-based broad-beam (NVBB) framework for IMRT treatment planning.

    Science.gov (United States)

    Lu, Weiguo

    2010-12-07

    We present a novel framework that enables very large scale intensity-modulated radiation therapy (IMRT) planning with limited computation resources, with improvements in cost, plan quality and planning throughput. Current IMRT optimization uses a voxel-based beamlet superposition (VBS) framework that requires pre-calculation and storage of a large amount of beamlet data, resulting in large temporal and spatial complexity. We developed a non-voxel-based broad-beam (NVBB) framework for IMRT capable of direct treatment parameter optimization (DTPO). In this framework, both the objective function and its derivative are evaluated based on the continuous viewpoint, abandoning 'voxel' and 'beamlet' representations. Thus pre-calculation and storage of beamlets are no longer needed. The NVBB framework has linear complexities (O(N^3)) in both space and time. The low-memory, full-computation and data-parallelization nature of the framework enables its efficient implementation on the graphics processing unit (GPU). We implemented the NVBB framework and incorporated it with the TomoTherapy treatment planning system (TPS). The new TPS runs on a single workstation with one GPU card (NVBB-GPU). Extensive verification/validation tests were performed in house and via third parties. Benchmarks on dose accuracy, plan quality and throughput were compared with the commercial TomoTherapy TPS that is based on the VBS framework and uses a computer cluster with 14 nodes (VBS-cluster). For all tests, the dose accuracy of these two TPSs is comparable (within 1%). Plan qualities were comparable with no clinically significant difference for most cases, except that superior target uniformity was seen in the NVBB-GPU for some cases. However, the planning time using the NVBB-GPU was reduced many-fold over the VBS-cluster. In conclusion, we developed a novel NVBB framework for IMRT optimization. The continuous viewpoint and DTPO nature of the algorithm eliminate the need for beamlets and lead to better plan

  15. A non-voxel-based broad-beam (NVBB) framework for IMRT treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Lu Weiguo, E-mail: wlu@tomotherapy.co [TomoTherapy Inc., 1240 Deming Way, Madison, WI 53717 (United States)

    2010-12-07

    We present a novel framework that enables very large scale intensity-modulated radiation therapy (IMRT) planning with limited computation resources, with improvements in cost, plan quality and planning throughput. Current IMRT optimization uses a voxel-based beamlet superposition (VBS) framework that requires pre-calculation and storage of a large amount of beamlet data, resulting in large temporal and spatial complexity. We developed a non-voxel-based broad-beam (NVBB) framework for IMRT capable of direct treatment parameter optimization (DTPO). In this framework, both the objective function and its derivative are evaluated based on the continuous viewpoint, abandoning 'voxel' and 'beamlet' representations. Thus pre-calculation and storage of beamlets are no longer needed. The NVBB framework has linear complexities (O(N^3)) in both space and time. The low-memory, full-computation and data-parallelization nature of the framework enables its efficient implementation on the graphics processing unit (GPU). We implemented the NVBB framework and incorporated it with the TomoTherapy treatment planning system (TPS). The new TPS runs on a single workstation with one GPU card (NVBB-GPU). Extensive verification/validation tests were performed in house and via third parties. Benchmarks on dose accuracy, plan quality and throughput were compared with the commercial TomoTherapy TPS that is based on the VBS framework and uses a computer cluster with 14 nodes (VBS-cluster). For all tests, the dose accuracy of these two TPSs is comparable (within 1%). Plan qualities were comparable with no clinically significant difference for most cases, except that superior target uniformity was seen in the NVBB-GPU for some cases. However, the planning time using the NVBB-GPU was reduced many-fold over the VBS-cluster. In conclusion, we developed a novel NVBB framework for IMRT optimization. The continuous viewpoint and DTPO nature of the algorithm eliminate the need for beamlets

  16. The reactor physics computer programs in PC's era

    International Nuclear Information System (INIS)

    Nainer, O.; Serghiuta, D.

    1995-01-01

    The main objective of reactor physics analysis is the evaluation of the flux and power distribution over the reactor core. For CANDU reactors, sophisticated computer programs, such as FMDP and RFSP, were developed 20 years ago for mainframe computers. These programs were adapted to work on workstations with UNIX or DOS, but they lack a feature that could improve their use: user friendliness. To use these programs, users need to deal with a great amount of information contained in sophisticated files. Modifying a model is a great challenge: it is necessary to bear in mind all the geometrical dimensions and, accordingly, to modify the core model to match the new requirements, and all this must be done in a line input file. For a DOS platform, using an average-performance PC system, would it be possible to represent and modify all the geometrical and physical parameters in a meaningful way, on screen, using an intuitive graphical user interface; to reduce the real time elapsed in performing complex fuel-management analysis 'at home'; and to avoid rewriting the mainframe version of the program? The author's answer is a fuel-management computer package operating on a PC, 3 times faster than on a CDC-Cyber 830 mainframe (486DX/33MHz/8MbRAM) or 20 times faster (Pentium-PC), respectively. (author). 5 refs., 1 tab., 5 figs

  17. InSAR Scientific Computing Environment

    Science.gov (United States)

    Rosen, Paul A.; Sacco, Gian Franco; Gurrola, Eric M.; Zabker, Howard A.

    2011-01-01

    This computing environment is the next generation of geodetic image processing technology for repeat-pass Interferometric Synthetic Aperture Radar (InSAR) sensors, identified by the community as a needed capability to provide flexibility and extensibility in reducing measurements from radar satellites and aircraft to new geophysical products. This software allows users of interferometric radar data the flexibility to process from Level 0 to Level 4 products using a variety of algorithms and for a range of available sensors. There are many radar satellites in orbit today delivering to the science community data of unprecedented quantity and quality, making possible large-scale studies in climate research, natural hazards, and the Earth's ecosystem. The proposed DESDynI mission, now under consideration by NASA for launch later in this decade, would provide time series and multi-image measurements that permit 4D models of Earth surface processes so that, for example, climate-induced changes over time would become apparent and quantifiable. This advanced data processing technology, applied to a global data set such as from the proposed DESDynI mission, enables a new class of analyses at time and spatial scales unavailable using current approaches. This software implements an accurate, extensible, and modular processing system designed to realize the full potential of InSAR data from future missions such as the proposed DESDynI, existing radar satellite data, as well as data from the NASA UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar), and other airborne platforms. The processing approach has been re-thought in order to enable multi-scene analysis by adding new algorithms and data interfaces, to permit user-reconfigurable operation and extensibility, and to capitalize on codes already developed by NASA and the science community. The framework incorporates modern programming methods based on recent research, including object-oriented scripts controlling legacy and

  18. Computational intelligence synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting-edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: Covers all the aspects

  19. Knowledge Framework Implementation with Multiple Architectures - 13090

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyay, H.; Lagos, L.; Quintero, W.; Shoffner, P. [Applied Research Center, Florida International University, Miami, FL 33174 (United States); DeGregory, J. [Office of D and D and Facility Engineering, Environmental Management, Department of Energy (United States)

    2013-07-01

    Multiple kinds of knowledge management systems are operational in public and private enterprises, in large and small organizations, and with a variety of business models, which makes the design, implementation and operation of integrated knowledge systems very difficult. In recent years, there has been sweeping advancement in the information technology area, leading to the development of sophisticated frameworks and architectures. These platforms need to be used for the development of integrated knowledge management systems that provide a common platform for sharing knowledge across the enterprise, thereby reducing operational inefficiencies and delivering cost savings. This paper discusses the knowledge framework and architecture that can be used for system development and its application to the real-life needs of the nuclear industry. A case study of deactivation and decommissioning (D and D) is discussed with the Knowledge Management Information Tool platform and framework. D and D work is a high-priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with DOE sites, the Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent the D and D knowledge and expertise from being lost over time from the evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. (authors)

  20. Knowledge Framework Implementation with Multiple Architectures - 13090

    International Nuclear Information System (INIS)

    Upadhyay, H.; Lagos, L.; Quintero, W.; Shoffner, P.; DeGregory, J.

    2013-01-01

    Multiple kinds of knowledge management systems are operational in public and private enterprises, in large and small organizations, and with a variety of business models, which makes the design, implementation and operation of integrated knowledge systems very difficult. In recent years, there has been sweeping advancement in the information technology area, leading to the development of sophisticated frameworks and architectures. These platforms need to be used for the development of integrated knowledge management systems that provide a common platform for sharing knowledge across the enterprise, thereby reducing operational inefficiencies and delivering cost savings. This paper discusses the knowledge framework and architecture that can be used for system development and its application to the real-life needs of the nuclear industry. A case study of deactivation and decommissioning (D and D) is discussed with the Knowledge Management Information Tool platform and framework. D and D work is a high-priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with DOE sites, the Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent the D and D knowledge and expertise from being lost over time from the evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. (authors)

  1. A SIMULATION-AS-A-SERVICE FRAMEWORK FACILITATING WEBGIS BASED INSTALLATION PLANNING

    Directory of Open Access Journals (Sweden)

    Z. Zheng

    2017-09-01

    Full Text Available Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for properly deploying facilities in space and configuring their functions so that they form a cohesive and mutually supportive system that meets users' operational needs. Based on requirements analysis, we propose a framework that combines GIS and agent simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, agent-based simulation runs as a service on the server and exposes basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service is able to utilize various kinds of geoprocessing services in the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits, such as ease of use, on-demand access, shared understanding, and improved performance. Finally, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.
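
    The following is a minimal client-side sketch of the service interface just described: configure a scenario, start the simulation, then poll for agent trajectories to render in a WebGIS front end. The base URL, endpoint paths and JSON fields are hypothetical placeholders, not the actual service's API.

```python
# Hypothetical simulation-as-a-service client: configure, run, retrieve.
import time
import requests

BASE = "https://example.org/sim-service"        # hypothetical service root

def run_scenario(facilities, duration_h=24):
    scenario = {"facilities": facilities, "durationHours": duration_h}
    sim = requests.post(f"{BASE}/scenarios", json=scenario, timeout=30).json()
    sim_id = sim["id"]
    requests.post(f"{BASE}/scenarios/{sim_id}/start", timeout=30)
    while True:
        status = requests.get(f"{BASE}/scenarios/{sim_id}", timeout=30).json()
        if status["state"] in ("finished", "failed"):
            break
        time.sleep(2)
    # retrieve agent trajectories for rendering on the map client
    return requests.get(f"{BASE}/scenarios/{sim_id}/trajectories", timeout=30).json()

if __name__ == "__main__":
    result = run_scenario([{"name": "depot", "lon": 116.38, "lat": 39.90}])
    print(len(result.get("features", [])), "trajectory features")
```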

  2. Money Laundering Detection Framework to Link the Disparate and Evolving Schemes

    Directory of Open Access Journals (Sweden)

    Murad Mehmet

    2013-09-01

    Full Text Available Money launderers hide traces of their transactions with the involvement of entities that participate in sophisticated schemes. Money laundering detection requires unraveling concealed connections among multiple but seemingly unrelated human money laundering networks, ties among the actors of those schemes, and the amounts of funds transferred among those entities. The links among small networks, whether financial or social, are the primary factor that facilitates money laundering. Hence, the analysis of relations among money laundering networks is required to reveal the full structure of complex schemes. We propose a framework that uses sequence matching, case-based analysis, social network analysis, and complex event processing to detect money laundering. Our framework captures each ongoing scheme as an event, and associations among such sequences of events capture the complex relationships among evolving money laundering schemes. The framework can detect multiple associated money laundering networks even in the absence of some evidence. We validated the accuracy of detecting evolving money laundering schemes using a multi-phase test methodology. Our tests used data generated from real-life cases, extrapolated with a scheme generator that we implemented based on real-life schemes.
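
    A minimal sketch of the link-analysis idea follows: treat each transaction as an event, union the accounts it connects into groups, and flag groups whose total transferred amount crosses a threshold. This is a crude stand-in for the framework's sequence matching and complex event processing, shown with made-up data.

```python
# Link seemingly unrelated accounts via shared transactions (union-find),
# then flag connected groups whose aggregate flow exceeds a threshold.
from collections import defaultdict

transactions = [                      # (source account, target account, amount) - made up
    ("A", "B", 9000), ("B", "C", 8500), ("C", "OFFSHORE1", 8000),
    ("X", "Y", 200),  ("D", "B", 9500),
]

parent = {}

def find(a):
    parent.setdefault(a, a)
    while parent[a] != a:
        parent[a] = parent[parent[a]]     # path halving
        a = parent[a]
    return a

def union(a, b):
    parent[find(a)] = find(b)

for src, dst, amount in transactions:
    union(src, dst)

totals = defaultdict(float)
for src, dst, amount in transactions:
    totals[find(src)] += amount

THRESHOLD = 20000
for root, total in totals.items():
    members = sorted(a for a in parent if find(a) == root)
    flag = "FLAG" if total >= THRESHOLD else "ok"
    print(f"{flag:4s} total={total:8.0f} accounts={members}")
```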

  3. Novel opportunities for computational biology and sociology in drug discovery

    Science.gov (United States)

    Yao, Lixia; Evans, James A.; Rzhetsky, Andrey

    2013-01-01

    Current drug discovery is impossible without sophisticated modeling and computation. In this review we outline previous advances in computational biology and, by tracing the steps involved in pharmaceutical development, explore a range of novel, high-value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy–industry links for scientific and human benefit. Attention to these opportunities could promise punctuated advance and will complement the well-established computational work on which drug discovery currently relies. PMID:20349528

  4. Novel opportunities for computational biology and sociology in drug discovery

    Science.gov (United States)

    Yao, Lixia

    2009-01-01

    Drug discovery today is impossible without sophisticated modeling and computation. In this review we touch on previous advances in computational biology and, by tracing the steps involved in pharmaceutical development, explore a range of novel, high-value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy-industry ties for scientific and human benefit. Attention to these opportunities could promise punctuated advance, and will complement the well-established computational work on which drug discovery currently relies. PMID:19674801

  5. High performance computing in linear control

    International Nuclear Information System (INIS)

    Datta, B.N.

    1993-01-01

    Remarkable progress has been made in both the theory and the applications of all important areas of control. The theory is rich and very sophisticated. Some beautiful applications of control theory are presently being made in aerospace, biomedical engineering, industrial engineering, robotics, economics, power systems, etc. Unfortunately, the same assessment of progress does not hold in general for computations in control theory, which is lagging behind other areas of science and engineering in this respect. Nowadays there is a revolution going on in the world of high-performance scientific computing. Many powerful computers with vector and parallel processing have been built and have become available in recent years. These supercomputers offer very high computation speeds. Highly efficient software, based on powerful algorithms, has been developed for use on these advanced computers and has also contributed to increased performance. While workers in many areas of science and engineering have taken great advantage of these hardware and software developments, control scientists and engineers, unfortunately, have not been able to take much advantage of these developments.

  6. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery Using a Probabilistic Learning Framework

    Science.gov (United States)

    Basu, Saikat; Ganguly, Sangram; Michaelis, Andrew; Votava, Petr; Roy, Anshuman; Mukhopadhyay, Supratik; Nemani, Ramakrishna

    2015-01-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets, which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) for tree-cover delineation for the whole of Continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.
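
    The following is a toy sketch of the pipeline's final stage: per-pixel tree probabilities (here a stand-in logistic score on a greenness feature instead of the trained neural network) are smoothed with a neighborhood average as a crude surrogate for the CRF's contextual dependencies, then thresholded to a tree-cover mask. It is illustrative only and uses synthetic band values.

```python
# Per-pixel probability + neighborhood smoothing + threshold.
import numpy as np

def tree_probability(red, nir):
    ndvi = (nir - red) / (nir + red + 1e-6)                 # greenness feature
    return 1.0 / (1.0 + np.exp(-10.0 * (ndvi - 0.3)))       # assumed logistic score

def smooth(prob):
    padded = np.pad(prob, 1, mode="edge")
    out = np.zeros_like(prob)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy: 1 + dy + prob.shape[0],
                          1 + dx: 1 + dx + prob.shape[1]]
    return out / 9.0                                        # 3x3 mean filter

rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.3, (64, 64))                      # synthetic red band
nir = rng.uniform(0.2, 0.6, (64, 64))                       # synthetic near-infrared band
mask = smooth(tree_probability(red, nir)) > 0.5
print("tree-cover fraction:", mask.mean())
```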

  7. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery using a Probabilistic Learning Framework

    Science.gov (United States)

    Basu, S.; Ganguly, S.; Michaelis, A.; Votava, P.; Roy, A.; Mukhopadhyay, S.; Nemani, R. R.

    2015-12-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) for tree-cover delineation for the whole of Continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.

  8. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thinking...

  9. VisRseq: R-based visual framework for analysis of sequencing data

    OpenAIRE

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven JM

    2015-01-01

    Background Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many biologists with the knowledge needed to interpret these data must rely on programming experts. Results We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for ...

  10. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and to use them to run large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of graphics processing units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computing resources to run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes as small spatial and computational work units. A relational database system is used to manage data connections and the work queue for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
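
    The following is a minimal server-side sketch of the work-queue idea: small spatial work units are stored in a relational table, handed out to volunteer browsers on request, and marked done when results come back. The table layout and unit contents are illustrative assumptions, not the platform's actual schema.

```python
# Work-queue sketch backed by a relational database (SQLite here).
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE work_units (
    id INTEGER PRIMARY KEY, subbasin TEXT, params TEXT,
    status TEXT DEFAULT 'pending', result TEXT)""")
db.executemany("INSERT INTO work_units (subbasin, params) VALUES (?, ?)",
               [(f"subbasin-{i}", '{"rain_mm": 25}') for i in range(5)])

def claim_unit():
    """Hand the next pending unit to a volunteer node (e.g. via an HTTP API)."""
    row = db.execute("SELECT id, subbasin, params FROM work_units "
                     "WHERE status = 'pending' LIMIT 1").fetchone()
    if row is None:
        return None
    db.execute("UPDATE work_units SET status = 'running' WHERE id = ?", (row[0],))
    return row

def complete_unit(unit_id, result):
    db.execute("UPDATE work_units SET status = 'done', result = ? WHERE id = ?",
               (result, unit_id))

unit = claim_unit()
complete_unit(unit[0], '{"runoff_mm": 7.4}')
print(db.execute("SELECT subbasin, status, result FROM work_units").fetchall())
```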

  11. VIP - A Framework-Based Approach to Robot Vision

    Directory of Open Access Journals (Sweden)

    Gerd Mayer

    2008-11-01

    Full Text Available For robot perception, video cameras are very valuable sensors, but the computer vision methods applied to extract information from camera images are usually computationally expensive. Integrating computer vision methods into a robot control architecture requires balancing the exploitation of camera images with the need to preserve reactivity and robustness. We claim that better software support is needed in order to facilitate and simplify the application of computer vision and image processing methods on autonomous mobile robots. In particular, such support must address a simplified specification of image processing architectures, control and synchronization issues of image processing steps, and the integration of the image processing machinery into the overall robot control architecture. This paper introduces the video image processing (VIP) framework, a software framework for multithreaded control flow modeling in robot vision.

  12. VIP - A Framework-Based Approach to Robot Vision

    Directory of Open Access Journals (Sweden)

    Hans Utz

    2006-03-01

    Full Text Available For robot perception, video cameras are very valuable sensors, but the computer vision methods applied to extract information from camera images are usually computationally expensive. Integrating computer vision methods into a robot control architecture requires balancing the exploitation of camera images with the need to preserve reactivity and robustness. We claim that better software support is needed in order to facilitate and simplify the application of computer vision and image processing methods on autonomous mobile robots. In particular, such support must address a simplified specification of image processing architectures, control and synchronization issues of image processing steps, and the integration of the image processing machinery into the overall robot control architecture. This paper introduces the video image processing (VIP) framework, a software framework for multithreaded control flow modeling in robot vision.

  13. Practical clinical applications of the computer in nuclear medicine

    International Nuclear Information System (INIS)

    Price, R.R.; Erickson, J.J.; Patton, J.A.; Jones, J.P.; Lagan, J.E.; Rollo, F.D.

    1978-01-01

    The impact of the computer on the practice of nuclear medicine has been felt primarily in the area of rapid dynamic studies. At this time it is difficult to find a clinic which routinely performs computer processing of static images. The general-purpose digital computer is a sophisticated and flexible instrument. The number of applications for which one can use the computer to augment data acquisition, analysis, or display is essentially unlimited. In this light, the purpose of this exhibit is not to describe all possible applications of the computer in nuclear medicine but rather to illustrate those applications which have generally been accepted as practical in the routine clinical environment. Specifically, we have chosen examples of computer-augmented cardiac and renal function studies as well as examples of relative organ blood flow studies. In addition, a short description of basic computer components and terminology, along with a few examples of non-imaging applications, is presented.

  14. Effect of various veneering techniques on mechanical strength of computer-controlled zirconia framework designs.

    Science.gov (United States)

    Kanat, Burcu; Cömlekoğlu, Erhan M; Dündar-Çömlekoğlu, Mine; Hakan Sen, Bilge; Ozcan, Mutlu; Ali Güngör, Mehmet

    2014-08-01

    The objectives of this study were to evaluate the fracture resistance (FR), flexural strength (FS), and shear bond strength (SBS) of zirconia framework material veneered with different methods and to assess the stress distributions using finite element analysis (FEA). Zirconia frameworks fabricated in the forms of crowns for FR, bars for FS, and disks for SBS (N = 90, n = 10) were veneered with either (a) file splitting (CAD-on) (CD), (b) layering (L), or (c) overpressing (P) methods. For crown specimens, stainless steel dies (N = 30; 1 mm chamfer) were scanned using the labside contrast spray. A bilayered design was produced for CD, whereas a reduced design (1 mm) was used for L and P to support the veneer by computer-aided design and manufacturing. For bar (1.5 × 5 × 25 mm³) and disk (2.5 mm diameter, 2.5 mm height) specimens, zirconia blocks were sectioned under water cooling with a low-speed diamond saw and sintered. To prepare the suprastructures in the appropriate shapes for the three mechanical tests, nano-fluorapatite ceramic was layered and fired for L, fluorapatite ceramic was pressed for P, and the milled lithium-disilicate ceramics were fused with zirconia by a thixotropic glass ceramic for CD and then sintered for crystallization of the veneering ceramic. Crowns were then cemented to the metal dies. All specimens were stored at 37°C, 100% humidity for 48 hours. Mechanical tests were performed, and data were statistically analyzed (ANOVA, Tukey's, α = 0.05). Stereomicroscopy and scanning electron microscopy (SEM) were used to evaluate the failure modes and surface structure. FEA modeling of the crowns was obtained. Mean FR values (N ± SD) of CD (4408 ± 608) and L (4323 ± 462) were higher than P (2507 ± 594) (p < 0.05) in the mechanical tests, whereas a layering technique increased the FR when an anatomical core design was employed. File splitting (CAD-on) or layering veneering ceramic on zirconia with a reduced framework design may reduce ceramic chipping.

  15. A scalable computational framework for establishing long-term behavior of stochastic reaction networks.

    Directory of Open Access Journals (Sweden)

    Ankit Gupta

    2014-06-01

    Full Text Available Reaction networks are systems in which the populations of a finite number of species evolve through predefined interactions. Such networks are found as modeling tools in many biological disciplines such as biochemistry, ecology, epidemiology, immunology, systems biology and synthetic biology. It is now well-established that, for small population sizes, stochastic models for biochemical reaction networks are necessary to capture randomness in the interactions. The tools for analyzing such models, however, still lag far behind their deterministic counterparts. In this paper, we bridge this gap by developing a constructive framework for examining the long-term behavior and stability properties of the reaction dynamics in a stochastic setting. In particular, we address the problems of determining ergodicity of the reaction dynamics, which is analogous to having a globally attracting fixed point for deterministic dynamics. We also examine when the statistical moments of the underlying process remain bounded with time and when they converge to their steady state values. The framework we develop relies on a blend of ideas from probability theory, linear algebra and optimization theory. We demonstrate that the stability properties of a wide class of biological networks can be assessed from our sufficient theoretical conditions that can be recast as efficient and scalable linear programs, well-known for their tractability. It is notably shown that the computational complexity is often linear in the number of species. We illustrate the validity, the efficiency and the wide applicability of our results on several reaction networks arising in biochemistry, systems biology, epidemiology and ecology. The biological implications of the results as well as an example of a non-ergodic biological network are also discussed.
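
    To give a flavour of how such a condition can be recast as a linear program, the hedged sketch below checks a Foster-Lyapunov style drift condition for a hypothetical two-species network whose mean dynamics are linear; it illustrates the idea only and is not the paper's exact construction.

```python
# Hedged illustration (not the paper's exact construction): for a network whose
# mean dynamics are linear, d/dt E[x] = A E[x] + b, a sufficient condition for
# bounded first moments is the existence of c > 0 with c^T A < 0 componentwise.
# Searching for such a c is a linear (feasibility) program.
import numpy as np
from scipy.optimize import linprog

A = np.array([[-1.0, 0.5],
              [0.2, -1.0]])  # hypothetical rate matrix of a two-species network
n = A.shape[0]
eps = 1e-3

res = linprog(c=np.zeros(n),             # pure feasibility problem, no objective
              A_ub=A.T, b_ub=-eps * np.ones(n),
              bounds=[(1.0, None)] * n,  # c_i >= 1 keeps the vector positive
              method="highs")
print("drift condition satisfied:", res.success, "c =", res.x)
```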

  16. Building a Semantic Framework for eScience

    Science.gov (United States)

    Movva, S.; Ramachandran, R.; Maskey, M.; Li, X.

    2009-12-01

    The e-Science vision focuses on the use of advanced computing technologies to support scientists. Recent research efforts in this area have focused primarily on “enabling” use of infrastructure resources for both data and computational access, especially in the Geosciences. One of the gaps in existing e-Science efforts has been the failure to incorporate stable semantic technologies within the design process itself. In this presentation, we describe our effort in designing a framework for e-Science built using a Service Oriented Architecture. Our framework provides users capabilities to create science workflows and mine distributed data. Our e-Science framework is being designed around a mass market tool to promote reusability across many projects. Semantics is an integral part of this framework, and our design goal is to leverage the latest stable semantic technologies. The use of these stable semantic technologies will provide the users of our framework useful features such as: allowing search engines to find their content with RDFa tags; creating an RDF triple data store for their content; creating RDF end points to share with others; and semantically mashing their content with other online content available as RDF end points.
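
    The sketch below is a minimal illustration of the kind of semantic annotation described, publishing workflow metadata as RDF triples with rdflib; the namespace, resource names and properties are invented for the example.

```python
# Hedged sketch: expose workflow metadata as RDF triples with rdflib.
# The namespace, identifiers and properties below are made up for illustration.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/escience/")
g = Graph()
g.bind("ex", EX)

workflow = URIRef(EX["workflow/42"])
g.add((workflow, RDF.type, EX.ScienceWorkflow))
g.add((workflow, EX.title, Literal("Distributed data mining run")))
g.add((workflow, EX.usesDataset, URIRef(EX["dataset/storm-tracks"])))

print(g.serialize(format="turtle"))   # content ready for an RDF end point
```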

  17. Application of advanced virtual reality and 3D computer assisted technologies in tele-3D-computer assisted surgery in rhinology.

    Science.gov (United States)

    Klapan, Ivica; Vranjes, Zeljko; Prgomet, Drago; Lukinović, Juraj

    2008-03-01

    The real-time requirement means that the simulation should be able to follow the actions of the user that may be moving in the virtual environment. The computer system should also store in its memory a three-dimensional (3D) model of the virtual environment. In that case a real-time virtual reality system will update the 3D graphic visualization as the user moves, so that up-to-date visualization is always shown on the computer screen. Upon completion of the tele-operation, the surgeon compares the preoperative and postoperative images and models of the operative field, and studies video records of the procedure itself. Using intraoperative records, animated images of the real tele-procedure performed can be designed. Virtual surgery offers the possibility of preoperative planning in rhinology. The intraoperative use of the computer in real time requires the development of appropriate hardware and software to connect the medical instrumentarium to the computer and to operate the computer through the connected instrumentarium and sophisticated multimedia interfaces.

  18. Department of Energy research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-08-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication to numerical models, the execution of which is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex, and consequently it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  19. Towards a framework of smart city diplomacy

    Science.gov (United States)

    Mursitama, T. N.; Lee, L.

    2018-03-01

    This article addresses the impact of globalization on contemporary society, particularly the role of the city, which is becoming increasingly important. Three distinct yet intertwined aspects, namely decentralization, technology, and paradiplomacy, are antecedents of a city's competitiveness. A city has more power and authority in creating the wealth and prosperity of its society by utilizing technology. For the smart city, technology is an important enabler, but we argue that merely possessing sophisticated technology and applying it is not enough. The smart city needs to build smart diplomacy at the sub-national level. In this article, we extend the discussion of the smart city by proposing a new framework of smart city diplomacy as a way to integrate information technology, public policy and international relations, which will be the main contribution to literature and practice.

  20. Fermilab advanced computer program multi-microprocessor project

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Biel, J.

    1985-06-01

    Fermilab's Advanced Computer Program is constructing a powerful 128 node multi-microprocessor system for data analysis in high-energy physics. The system will use commercial 32-bit microprocessors programmed in Fortran-77. Extensive software supports easy migration of user applications from a uniprocessor environment to the multiprocessor and provides sophisticated program development, debugging, and error handling and recovery tools. This system is designed to be readily copied, providing computing cost effectiveness of below $2200 per VAX 11/780 equivalent. The low cost, commercial availability, compatibility with off-line analysis programs, and high data bandwidths (up to 160 MByte/sec) make the system an ideal choice for applications to on-line triggers as well as an offline data processor

  1. Hybrid Intrusion Forecasting Framework for Early Warning System

    Science.gov (United States)

    Kim, Sehun; Shin, Seong-Jun; Kim, Hyunwoo; Kwon, Ki Hoon; Han, Younggoo

    Recently, cyber attacks have become a serious hindrance to the stability of the Internet. These attacks exploit the interconnectivity of networks, propagate in an instant, and have become more sophisticated and evolutionary. Traditional Internet security systems such as firewalls, IDS and IPS are limited in their ability to detect recent cyber attacks in advance, as these systems respond to Internet attacks only after the attacks inflict serious damage. In this paper, we propose a hybrid intrusion forecasting system framework for an early warning system. The proposed system utilizes three types of forecasting methods: time-series analysis, probabilistic modeling, and data mining. By combining these methods, it is possible to take advantage of the forecasting technique of each while overcoming their drawbacks. Experimental results show that the hybrid intrusion forecasting method outperforms each of the three forecasting methods.
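
    A minimal, hedged illustration of the hybrid idea follows: a time-series forecast and a simple probabilistic estimate are combined into one prediction. The data, models and weights are made up and do not reproduce the paper's actual components.

```python
# Illustrative sketch only: combine a time-series forecast with a simple
# probabilistic estimate, in the spirit of the hybrid approach.
import numpy as np

attacks = np.array([3, 4, 2, 5, 7, 6, 9, 8, 11, 10])  # hypothetical daily counts

# 1) time-series component: exponentially weighted moving average forecast
alpha = 0.5
ewma = attacks[0]
for x in attacks[1:]:
    ewma = alpha * x + (1 - alpha) * ewma

# 2) probabilistic component: Poisson mean estimated from the recent window
poisson_rate = attacks[-5:].mean()

# 3) hybrid forecast: weighted combination of the two predictors
w = 0.6
forecast = w * ewma + (1 - w) * poisson_rate
print(f"next-day attack forecast: {forecast:.1f}")
```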

  2. InteractiveROSETTA: a graphical user interface for the PyRosetta protein modeling suite.

    Science.gov (United States)

    Schenkelberg, Christian D; Bystroff, Christopher

    2015-12-15

    Modern biotechnical research is becoming increasingly reliant on computational structural modeling programs to develop novel solutions to scientific questions. Rosetta is one such protein modeling suite that has already demonstrated wide applicability to a number of diverse research projects. Unfortunately, Rosetta is largely a command-line-driven software package which restricts its use among non-computational researchers. Some graphical interfaces for Rosetta exist, but typically are not as sophisticated as commercial software. Here, we present InteractiveROSETTA, a graphical interface for the PyRosetta framework that presents easy-to-use controls for several of the most widely used Rosetta protocols alongside a sophisticated selection system utilizing PyMOL as a visualizer. InteractiveROSETTA is also capable of interacting with remote Rosetta servers, facilitating sophisticated protocols that are not accessible in PyRosetta or which require greater computational resources. InteractiveROSETTA is freely available at https://github.com/schenc3/InteractiveROSETTA/releases and relies upon a separate download of PyRosetta which is available at http://www.pyrosetta.org after obtaining a license (free for academic use). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
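
    For orientation, the snippet below shows the kind of basic command-line PyRosetta session that InteractiveROSETTA wraps in a graphical interface; it assumes a local PyRosetta installation and license, and it is only a minimal sketch, not part of InteractiveROSETTA itself.

```python
# Minimal PyRosetta usage of the kind InteractiveROSETTA wraps in a GUI
# (a sketch; requires a local PyRosetta installation and license).
from pyrosetta import init, pose_from_sequence, create_score_function

init()                                    # start the Rosetta runtime
pose = pose_from_sequence("ACDEFGHIKL")   # build an extended peptide pose
scorefxn = create_score_function("ref2015")  # standard full-atom score function
print("total score:", scorefxn(pose))
```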

  3. A flexible framework for secure and efficient program obfuscation.

    Energy Technology Data Exchange (ETDEWEB)

    Solis, John Hector

    2013-03-01

    In this paper, we present a modular framework for constructing a secure and efficient program obfuscation scheme. Our approach, inspired by the obfuscation with respect to oracle machines model of [4], retains an interactive online protocol with an oracle, but relaxes the original computational and storage restrictions. We argue this is reasonable given the computational resources of modern personal devices. Furthermore, we relax the information-theoretic security requirement to computational security in order to utilize established cryptographic primitives. With this additional flexibility we are free to explore different cryptographic building blocks. Our approach combines authenticated encryption with private information retrieval to construct a secure program obfuscation framework. We give a formal specification of our framework, based on desired functionality and security properties, and provide an example instantiation. In particular, we implement AES in Galois/Counter Mode for authenticated encryption and the Gentry-Ramzan [13] constant communication-rate private information retrieval scheme. We present our implementation results and show that non-trivial sized programs can be realized, but scalability is quickly limited by computational overhead. Finally, we include a discussion on security considerations when instantiating specific modules.
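
    As a hedged illustration of the authenticated-encryption building block, the snippet below uses AES in Galois/Counter Mode via the Python cryptography package; the private information retrieval component and the framework's actual wiring are not shown, and the data is invented.

```python
# Hedged sketch of the authenticated-encryption building block (AES-GCM)
# using the `cryptography` package; the PIR component is not shown.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)
aesgcm = AESGCM(key)
nonce = os.urandom(12)                       # 96-bit nonce, never reuse per key
program_block = b"obfuscated instruction block"
aad = b"block-id:17"                         # authenticated, but not encrypted

ciphertext = aesgcm.encrypt(nonce, program_block, aad)
plaintext = aesgcm.decrypt(nonce, ciphertext, aad)
assert plaintext == program_block
```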

  4. Virtual Reality Hypermedia Design Frameworks for Science Instruction.

    Science.gov (United States)

    Maule, R. William; Oh, Byron; Check, Rosa

    This paper reports on a study that conceptualizes a research framework to aid software design and development for virtual reality (VR) computer applications for instruction in the sciences. The framework provides methodologies for the processing, collection, examination, classification, and presentation of multimedia information within hyperlinked…

  5. Computational optogenetics: A novel continuum framework for the photoelectrochemistry of living systems

    Science.gov (United States)

    Wong, Jonathan; Abilez, Oscar J.; Kuhl, Ellen

    2012-06-01

    Electrical stimulation is currently the gold standard treatment for heart rhythm disorders. However, electrical pacing is associated with technical limitations and unavoidable potential complications. Recent developments now enable the stimulation of mammalian cells with light using a novel technology known as optogenetics. The optical stimulation of genetically engineered cells has significantly changed our understanding of electrically excitable tissues, paving the way towards controlling heart rhythm disorders by means of photostimulation. Controlling these disorders, in turn, restores coordinated force generation to avoid sudden cardiac death. Here, we report a novel continuum framework for the photoelectrochemistry of living systems that allows us to decipher the mechanisms by which this technology regulates the electrical and mechanical function of the heart. Using a modular multiscale approach, we introduce a non-selective cation channel, channelrhodopsin-2, into a conventional cardiac muscle cell model via an additional photocurrent governed by a light-sensitive gating variable. Upon optical stimulation, this channel opens and allows sodium ions to enter the cell, inducing electrical activation. In side-by-side comparisons with conventional heart muscle cells, we show that photostimulation directly increases the sodium concentration, which indirectly decreases the potassium concentration in the cell, while all other characteristics of the cell remain virtually unchanged. We integrate our model cells into a continuum model for excitable tissue using a nonlinear parabolic second-order partial differential equation, which we discretize in time using finite differences and in space using finite elements. To illustrate the potential of this computational model, we virtually inject our photosensitive cells into different locations of a human heart, and explore its activation sequences upon photostimulation. Our computational optogenetics tool box allows us to
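
    A minimal sketch of the modelling idea is given below: a channelrhodopsin-2-like photocurrent with a light-sensitive gating variable is added to a leaky membrane equation and integrated with forward Euler. Parameter values are arbitrary, and the sketch is far simpler than the authors' full cell and tissue model.

```python
# Minimal sketch (not the authors' full model): a ChR2-like photocurrent with a
# light-sensitive gating variable g, added to a leaky membrane equation and
# integrated with forward Euler.
import numpy as np

dt, T = 0.01, 200.0                  # ms
t = np.arange(0.0, T, dt)
light = ((t > 50) & (t < 150)).astype(float)   # optical stimulus window

C, g_leak, E_leak = 1.0, 0.1, -70.0            # membrane parameters (arbitrary)
g_chr2, E_chr2 = 0.4, 0.0                      # ChR2 conductance and reversal
tau_on, tau_off = 2.0, 10.0                    # gating time constants

V = np.full_like(t, E_leak)
g = np.zeros_like(t)
for i in range(1, t.size):
    tau = tau_on if light[i] else tau_off
    dg = (light[i] - g[i - 1]) / tau           # light-driven gating kinetics
    I_chr2 = g_chr2 * g[i - 1] * (V[i - 1] - E_chr2)
    I_leak = g_leak * (V[i - 1] - E_leak)
    g[i] = g[i - 1] + dt * dg
    V[i] = V[i - 1] - dt * (I_leak + I_chr2) / C

print("peak depolarization (mV):", V.max())
```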

  6. A Secure Authenticate Framework for Cloud Computing Environment

    OpenAIRE

    Nitin Nagar; Pradeep k. Jatav

    2014-01-01

    Cloud computing has an important aspect for companies to build and deploy their infrastructure and applications. Data storage service in cloud computing is easy compared to other data storage services. At the same time, cloud security in the cloud environment is a challenging task. Security issues range from missing system configuration and lack of proper updates to unwise user actions on remote data storage, and they can expose users’ private data and information to unwanted access.

  7. A scalable fully implicit framework for reservoir simulation on parallel computers

    KAUST Repository

    Yang, Haijian

    2017-11-10

    The modeling of multiphase fluid flow in porous medium is of interest in the field of reservoir simulation. The promising numerical methods in the literature are mostly based on the explicit or semi-implicit approach, which both have certain stability restrictions on the time step size. In this work, we introduce and study a scalable fully implicit solver for the simulation of two-phase flow in a porous medium with capillarity, gravity and compressibility, which is free from the limitations of the conventional methods. In the fully implicit framework, a mixed finite element method is applied to discretize the model equations for the spatial terms, and the implicit Backward Euler scheme with adaptive time stepping is used for the temporal integration. The resultant nonlinear system arising at each time step is solved in a monolithic way by using a Newton–Krylov type method. The corresponding linear system from the Newton iteration is large sparse, nonsymmetric and ill-conditioned, consequently posing a significant challenge to the fully implicit solver. To address this issue, the family of additive Schwarz preconditioners is taken into account to accelerate the convergence of the linear system, and thereby improves the robustness of the outer Newton method. Several test cases in one, two and three dimensions are used to validate the correctness of the scheme and examine the performance of the newly developed algorithm on parallel computers.
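
    To illustrate the overall solver structure only, the toy example below takes one backward Euler step of a small nonlinear diffusion problem and solves the resulting nonlinear system monolithically with a Jacobian-free Newton-Krylov method; scipy stands in for the parallel preconditioned solver, and the physics is not the reservoir model of the paper.

```python
# Toy illustration of the solver structure only: one backward Euler step of a
# small nonlinear diffusion problem, solved monolithically with a Jacobian-free
# Newton-Krylov method (scipy stands in for a parallel PETSc-style solver).
import numpy as np
from scipy.optimize import newton_krylov

n, dt, dx = 50, 0.1, 1.0 / 50
u_old = np.exp(-100 * (np.linspace(0, 1, n) - 0.5) ** 2)  # saturation-like field

def residual(u):
    # F(u) = (u - u_old)/dt - d/dx( D(u) du/dx ), with nonlinear D(u) = u^2
    flux = np.zeros(n + 1)
    d = 0.5 * (np.r_[u[0], u] ** 2 + np.r_[u, u[-1]] ** 2)   # face diffusivity
    flux[1:-1] = -d[1:-1] * (u[1:] - u[:-1]) / dx             # zero-flux walls
    return (u - u_old) / dt + (flux[1:] - flux[:-1]) / dx

u_new = newton_krylov(residual, u_old, f_tol=1e-8)
print("mass before/after:", u_old.sum() * dx, u_new.sum() * dx)
```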

  8. A scalable fully implicit framework for reservoir simulation on parallel computers

    KAUST Repository

    Yang, Haijian; Sun, Shuyu; Li, Yiteng; Yang, Chao

    2017-01-01

    The modeling of multiphase fluid flow in porous medium is of interest in the field of reservoir simulation. The promising numerical methods in the literature are mostly based on the explicit or semi-implicit approach, which both have certain stability restrictions on the time step size. In this work, we introduce and study a scalable fully implicit solver for the simulation of two-phase flow in a porous medium with capillarity, gravity and compressibility, which is free from the limitations of the conventional methods. In the fully implicit framework, a mixed finite element method is applied to discretize the model equations for the spatial terms, and the implicit Backward Euler scheme with adaptive time stepping is used for the temporal integration. The resultant nonlinear system arising at each time step is solved in a monolithic way by using a Newton–Krylov type method. The corresponding linear system from the Newton iteration is large sparse, nonsymmetric and ill-conditioned, consequently posing a significant challenge to the fully implicit solver. To address this issue, the family of additive Schwarz preconditioners is taken into account to accelerate the convergence of the linear system, and thereby improves the robustness of the outer Newton method. Several test cases in one, two and three dimensions are used to validate the correctness of the scheme and examine the performance of the newly developed algorithm on parallel computers.

  9. Provenance for Runtime Workflow Steering and Validation in Computational Seismology

    Science.gov (United States)

    Spinuso, A.; Krischer, L.; Krause, A.; Filgueira, R.; Magnoni, F.; Muraleedharan, V.; David, M.

    2014-12-01

    Provenance systems may be offered by modern workflow engines to collect metadata about the data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long lasting cross correlation analysis and high resolution simulations, the immediate notification of logical errors and the rapid access to intermediate results, can produce reactions which foster a more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine grained provenance and the development of provenance aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc). This work looks at the adoption of W3C-PROV concepts and data model within a user driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community understandings. Moreover, it can contain user defined terms and annotations. The current implementation of the system is supported by the EU-Funded VERCE (http://verce.eu). It provides, as well as the provenance generation mechanisms, a prototypal browser-based user interface and a web API built on top of a NoSQL storage
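
    A hedged sketch of recording W3C-PROV style lineage for a single processing step is shown below, using the Python prov package; the identifiers are invented for illustration, and the actual VERCE data model is richer.

```python
# Hedged sketch of W3C-PROV style lineage for one processing step, using the
# `prov` package; identifiers below are invented for illustration.
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("seis", "http://example.org/seismology/")

raw = doc.entity("seis:waveform-raw")
xcorr = doc.activity("seis:cross-correlation-run-7")
result = doc.entity("seis:correlation-stack")

doc.used(xcorr, raw)               # the activity read the raw traces
doc.wasGeneratedBy(result, xcorr)  # and produced the stacked result

print(doc.serialize(indent=2))     # PROV-JSON, ready for a document store
```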

  10. An Application of a Game Development Framework in Higher Education

    Directory of Open Access Journals (Sweden)

    Alf Inge Wang

    2009-01-01

    Full Text Available This paper describes how a game development framework was used as a learning aid in a software engineering course. Games can be used within higher education in various ways to promote student participation, enable variation in how lectures are taught, and improve student interest. In this paper, we describe a case study at the Norwegian University of Science and Technology (NTNU) where a game development framework was applied to make students learn software architecture by developing a computer game. We provide a model for how game development frameworks can be integrated with a software engineering or computer science course. We describe important requirements to consider when choosing a game development framework for a course and an evaluation of four frameworks based on these requirements. Further, we describe some extensions we made to the existing game development framework to let the students focus more on software architectural issues than on the technical implementation issues. Finally, we describe a case study of how a game development framework was integrated in a software architecture course and the experiences from doing so.

  11. A Framework for the Evaluation of CASE Tool Learnability in Educational Environments

    Science.gov (United States)

    Senapathi, Mali

    2005-01-01

    The aim of the research is to derive a framework for the evaluation of Computer Aided Software Engineering (CASE) tool learnability in educational environments. Drawing from the literature of Human Computer Interaction and educational research, a framework for evaluating CASE tool learnability in educational environments is derived. The two main…

  12. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A.; Duro, Richard

    2016-01-01

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies, and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to consider the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location. PMID:27399711

  13. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Directory of Open Access Journals (Sweden)

    Gervasio Varela

    2016-07-01

    Full Text Available This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies, and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to consider the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  14. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems.

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A; Duro, Richard

    2016-07-07

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies, and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to consider the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.
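
    The sketch below conveys the basic idea in miniature (it is not the framework's actual API): application code targets one abstract UI, and a concrete renderer is chosen at runtime from whatever interaction devices the environment reports.

```python
# Minimal sketch of the idea (not the framework's actual API): the application
# codes against one abstract UI, and a concrete renderer is picked at runtime
# from whatever interaction devices the environment offers.
from abc import ABC, abstractmethod

class AbstractOutput(ABC):
    @abstractmethod
    def notify(self, message: str) -> None: ...

class ScreenOutput(AbstractOutput):
    def notify(self, message: str) -> None:
        print(f"[screen] {message}")

class VoiceOutput(AbstractOutput):
    def notify(self, message: str) -> None:
        print(f"[text-to-speech] {message}")

def pick_renderer(available_devices: list[str]) -> AbstractOutput:
    return VoiceOutput() if "speaker" in available_devices else ScreenOutput()

ui = pick_renderer(["speaker", "motion-sensor"])   # discovered at runtime
ui.notify("Front door opened")                     # same call, any modality
```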

  15. A self-description data framework for Tokamak control system design

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Ming; Zhang, Jing [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); School of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Zheng, Wei, E-mail: zhengwei@hust.edu.cn [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); School of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Hu, Feiran; Zhuang, Ge [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); School of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China)

    2015-10-15

    Highlights: • The SDD framework can be applied to different Tokamak devices. • We explain how configuration settings of control systems are described in SDD models, namely components and connections. • Evolving SDD models are stored in a dynamic schema database. • The SDD editor supports plug-and-play SDD models. - Abstract: A Tokamak device consists of numerous control systems, which need to be integrated. CODAC (Control, Data Access and Communication) system requires the configuration settings of these control systems to carry out the integration smoothly. SDD (Self-description data) is designed to describe the static configuration of control systems. ITER CODAC group has released an SDD software package for control system designers to manage the static configuration, but it is specific for ITER plant control systems. Following the idea of ITER SDD, we developed a flexible and scalable SDD framework to develop SDD software for J-TEXT and other sophisticated devices. The SDD framework describes the configuration settings of various control systems, including physical and logical elements and their relation information, in SDD models which are classified into Components and Connections. The framework is composed of three layers: the MongoDB database, an open-source, dynamic schema, NoSQL (Not Only SQL) database; the SDD service, which maps SDD models to MongoDB and handles the transaction and business logic; the SDD applications, which can be used to create and maintain SDD information, and generate various kinds of output using the stored SDD information.
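
    As a hedged illustration of how Component and Connection models might live in the dynamic-schema MongoDB layer, the snippet below inserts two hypothetical documents with pymongo; the collection and field names are invented, not taken from the J-TEXT SDD software.

```python
# Hedged sketch of storing Component and Connection models in MongoDB;
# collection and field names are invented, not the actual J-TEXT SDD schema.
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["sdd"]

component_id = db.components.insert_one({
    "name": "poloidal-field-controller",
    "type": "plant-system-controller",
    "signals": [{"name": "PF1-current", "unit": "A", "datatype": "float64"}],
}).inserted_id

db.connections.insert_one({
    "from": component_id,
    "to": "central-CODAC-gateway",
    "protocol": "EPICS Channel Access",
})
```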

  16. A self-description data framework for Tokamak control system design

    International Nuclear Information System (INIS)

    Zhang, Ming; Zhang, Jing; Zheng, Wei; Hu, Feiran; Zhuang, Ge

    2015-01-01

    Highlights: • The SDD framework can be applied to different Tokamak devices. • We explain how configuration settings of control systems are described in SDD models, namely components and connections. • Evolving SDD models are stored in a dynamic schema database. • The SDD editor supports plug-and-play SDD models. - Abstract: A Tokamak device consists of numerous control systems, which need to be integrated. CODAC (Control, Data Access and Communication) system requires the configuration settings of these control systems to carry out the integration smoothly. SDD (Self-description data) is designed to describe the static configuration of control systems. ITER CODAC group has released an SDD software package for control system designers to manage the static configuration, but it is specific for ITER plant control systems. Following the idea of ITER SDD, we developed a flexible and scalable SDD framework to develop SDD software for J-TEXT and other sophisticated devices. The SDD framework describes the configuration settings of various control systems, including physical and logical elements and their relation information, in SDD models which are classified into Components and Connections. The framework is composed of three layers: the MongoDB database, an open-source, dynamic schema, NoSQL (Not Only SQL) database; the SDD service, which maps SDD models to MongoDB and handles the transaction and business logic; the SDD applications, which can be used to create and maintain SDD information, and generate various kinds of output using the stored SDD information.

  17. Cartoon computation: quantum-like computing without quantum mechanics

    International Nuclear Information System (INIS)

    Aerts, Diederik; Czachor, Marek

    2007-01-01

    We present a computational framework based on geometric structures. No quantum mechanics is involved, and yet the algorithms perform tasks analogous to quantum computation. Tensor products and entangled states are not needed-they are replaced by sets of basic shapes. To test the formalism we solve in geometric terms the Deutsch-Jozsa problem, historically the first example that demonstrated the potential power of quantum computation. Each step of the algorithm has a clear geometric interpretation and allows for a cartoon representation. (fast track communication)

  18. A dataflow meta-computing framework for event processing in the H1 experiment

    International Nuclear Information System (INIS)

    Campbell, A.; Gerhards, R.; Mkrtchyan, T.; Levonian, S.; Grab, C.; Martyniak, J.; Nowak, J.

    2001-01-01

    Linux-based networked PC clusters are replacing both the VME non-uniform direct memory access systems and the SMP shared memory systems used previously for online event filtering and reconstruction. To allow optimal use of the distributed resources of PC clusters, an open software framework is presently being developed based on a dataflow paradigm for event processing. This framework allows for the distribution of the data of physics events and associated calibration data to multiple computers from multiple input sources for processing, and the subsequent collection of the processed events at multiple outputs. The basis of the system is the event repository, basically a first-in first-out event store which may be read and written in a manner similar to sequential file access. Events are stored in and transferred between repositories as suitably large sequences to enable high throughput. Multiple readers can read simultaneously from a single repository to receive event sequences, and multiple writers can insert event sequences into a repository. Hence repositories are used for event distribution and collection. To support synchronisation of the event flow the repository implements barriers. A barrier must be written by all the writers of a repository before any reader can read the barrier. A reader must read a barrier before it may receive data from behind it. Only after all readers have read the barrier is the barrier removed from the repository. A barrier may also have attached data. In this way calibration data can be distributed to all processing units. The repositories are implemented as multi-threaded CORBA objects in C++, and CORBA is used for all data transfers. Job setup scripts are written in Python, and interactive status and histogram display is provided by a Java program. Jobs run under the PBS batch system, providing shared use of resources for online triggering, offline mass reprocessing and user analysis jobs.
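
    The toy sketch below captures the repository idea in a few lines (it is not the H1 CORBA implementation): events flow through a FIFO store in sequences, and a barrier is only released to readers after every writer has written it.

```python
# Toy sketch of the repository idea (not the H1 CORBA implementation): events
# flow through a FIFO store, and a barrier is only released to readers after
# every writer has written it.
import queue

class Repository:
    def __init__(self, n_writers):
        self.fifo = queue.Queue()
        self.n_writers = n_writers
        self.pending_barrier = 0

    def write_events(self, events):
        self.fifo.put(("events", events))

    def write_barrier(self, payload=None):
        self.pending_barrier += 1
        if self.pending_barrier == self.n_writers:  # all writers reached it
            self.fifo.put(("barrier", payload))
            self.pending_barrier = 0

    def read(self):
        return self.fifo.get()

repo = Repository(n_writers=2)
repo.write_events([1, 2, 3])
repo.write_barrier({"calib": "run-42"})   # writer 1 reaches the barrier
repo.write_barrier({"calib": "run-42"})   # writer 2 releases it
print(repo.read())                        # event sequence first
print(repo.read())                        # then the barrier with attached data
```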

  19. A framework for AI-based nuclear design support system

    International Nuclear Information System (INIS)

    Furuta, Kazuo; Kondo, Shunsuke

    1991-01-01

    Nowadays many computer programs are being developed and used for the analytic tasks in nuclear reactor design, but experienced designers are still responsible for most of the synthetic tasks which are not amenable to algorithmic computer processes. Artificial intelligence (AI) is a promising technology to deal with these intractable tasks in design. In development of AI-based design support systems, it is desirable to choose a comprehensive framework based on the scientific theory of design. In this work a framework for AI-based design support systems for nuclear reactor design will be proposed based on an exploration model of design. The fundamental architectures of this framework will be described especially on knowledge representation, context management and design planning. (author)

  20. Social networks a framework of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2014-01-01

    This volume provides the audience with an updated, in-depth and highly coherent material on the conceptually appealing and practically sound information technology of Computational Intelligence applied to the analysis, synthesis and evaluation of social networks. The volume involves studies devoted to key issues of social networks including community structure detection in networks, online social networks, knowledge growth and evaluation, and diversity of collaboration mechanisms.  The book engages a wealth of methods of Computational Intelligence along with well-known techniques of linear programming, Formal Concept Analysis, machine learning, and agent modeling.  Human-centricity is of paramount relevance and this facet manifests in many ways including personalized semantics, trust metric, and personal knowledge management; just to highlight a few of these aspects. The contributors to this volume report on various essential applications including cyber attacks detection, building enterprise social network...

  1. Phoneme-based speech segmentation using hybrid soft computing framework

    CERN Document Server

    Sarma, Mousmita

    2014-01-01

    The book discusses intelligent system design using soft computing and similar systems and their interdisciplinary applications. It also focuses on the recent trends to use soft computing as a versatile tool for designing a host of decision support systems.

  2. Computer forensics an essential guide for accountants, lawyers, and managers

    CERN Document Server

    Sheetz, Michael

    2013-01-01

    Would your company be prepared in the event of: * Computer-driven espionage * A devastating virus attack * A hacker's unauthorized access * A breach of data security? As the sophistication of computer technology has grown, so has the rate of computer-related criminal activity. Subsequently, American corporations now lose billions of dollars a year to hacking, identity theft, and other computer attacks. More than ever, businesses and professionals responsible for the critical data of countless customers and employees need to anticipate and safeguard against computer intruders and attacks. The first book to successfully speak to the nontechnical professional in the fields of business and law on the topic of computer crime, Computer Forensics: An Essential Guide for Accountants, Lawyers, and Managers provides valuable advice on the hidden difficulties that can blindside companies and result in damaging costs. Written by industry expert Michael Sheetz, this important book provides readers with an honest look at t...

  3. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. The hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.
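
    As a hedged toy example of one such building block, the snippet below computes common hand-crafted EMG features and feeds them to a support vector machine; the signals are synthetic, and no claim is made about the exact pipelines of the reviewed systems.

```python
# Hedged toy example: hand-crafted EMG features fed to a support vector
# machine; synthetic signals stand in for real recordings.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def features(sig):
    return [np.mean(np.abs(sig)),           # mean absolute value
            np.sqrt(np.mean(sig ** 2)),     # RMS
            np.sum(np.abs(np.diff(sig)))]   # waveform length

# synthetic "rest" vs "contraction" windows
X = np.array([features(rng.normal(0, amp, 256))
              for amp in ([0.2] * 40 + [1.0] * 40)])
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel="rbf").fit(X[::2], y[::2])      # train on even windows
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
```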

  4. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  5. The role of computers in developing countries with reference to East Africa

    International Nuclear Information System (INIS)

    Shayo, L.K.

    1984-01-01

    The role of computers in economic and technological development is examined with particular reference to developing countries. It is stressed that these countries must exploit the potential of computers in their drive to catch up in the development race. The shortage of qualified EDP personnel is singled out as one of the most critical factors behind any unsatisfactory state of computer applications. A computerization policy based on the demands for information created by the sophistication of the development process, and supported by a sufficient core of qualified local manpower, is recommended. The situation in East Africa is discussed and recommendations for training and for the production of telematics equipment are made. (author)

  6. MoCog1: A computer simulation of recognition-primed human decision making

    Science.gov (United States)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straight-forward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  7. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Full Text Available Agents can play a key role in bringing suitable cloud services to the customer based on their requirements. In agent-based cloud computing, an agent performs negotiation, coordination, cooperation and collaboration on behalf of the customer to make decisions in an efficient manner. However, agent-based cloud computing has some security issues, such as (a) the addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service by creating flooding attacks on other involved agents; and (c) misuse of some of the exceptions in the agent interaction protocol, such as Not-Understood and Cancel_Meta, which may lead to terminating the connections of all the other agents participating in the negotiating services. This paper also proposes algorithms to solve these issues to ensure that there will be no intervention of any malicious activities during the agent interaction.

  8. Advancing E-Commerce Personalization: Process Framework and Case Study

    NARCIS (Netherlands)

    Kaptein, M.C.; Parvinen, P.

    2015-01-01

    Personalization is widely used in e-commerce, and as computational power increases, personalization is now within reach for many online vendors. We describe a process framework to structure our knowledge of online personalization both from academia and from applied attempts. This framework is

  9. "SOCRATICS" AS ADDRESSEES OF ISOCRATES’ EPIDEICTIC SPEECHES (Against the Sophists, Encomium of Helen, Busiris)

    Directory of Open Access Journals (Sweden)

    Anna Usacheva

    2012-06-01

    Full Text Available This article analyses the three epideictic orations of Isocrates which are in themselves a precious testimony of the quality of intellectual life at the close of the fourth century before Christ. To this period belong also the Socratics who are generally seen as an important link between Socrates and Plato. The author of this article proposes a more productive approach to the study of Antisthenes, Euclid of Megara and other so-called Socratics, revealing them not as independent thinkers but rather as adherents of the sophistic school and also as teachers, thereby, including them among those who took part in the educative activity of their time

  10. Forward with Hoare

    Science.gov (United States)

    Gordon, Mike; Collavizza, Hélène

    Hoare's celebrated paper entitled "An Axiomatic Basis for Computer Programming" appeared in 1969, so the Hoare formula P{S}Q is now 40 years old! That paper introduced Hoare Logic, which is still the basis for program verification today, but is now mechanised inside sophisticated verification systems. We aim here to give an accessible introduction to methods for proving Hoare formulae based both on the forward computation of postconditions and on the backward computation of preconditions. Although precondition methods are better known, computing postconditions provides a verification framework that encompasses methods ranging from symbolic execution to full deductive proof of correctness.
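
    As a small worked example in this spirit, the sketch below computes the weakest precondition of an assignment by substitution, wp(x := e, Q) = Q[x := e], using sympy; it is only an illustration, not the mechanised verification systems discussed in the paper.

```python
# Small illustration in the spirit of the paper: the weakest precondition of an
# assignment is substitution, wp(x := e, Q) = Q[x := e], computed with sympy.
import sympy as sp

x, y = sp.symbols("x y")

def wp_assign(var, expr, post):
    """Weakest precondition of `var := expr` for postcondition `post`."""
    return post.subs(var, expr)

# Program:  x := x + y      Postcondition:  x > 10
Q = sp.Gt(x, 10)
P = wp_assign(x, x + y, Q)
print("wp(x := x + y, x > 10) =", P)   # x + y > 10
```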

  11. Combinatorial-topological framework for the analysis of global dynamics

    Science.gov (United States)

    Bush, Justin; Gameiro, Marcio; Harker, Shaun; Kokubu, Hiroshi; Mischaikow, Konstantin; Obayashi, Ippei; Pilarczyk, Paweł

    2012-12-01

    We discuss an algorithmic framework based on efficient graph algorithms and algebraic-topological computational tools. The framework is aimed at automatic computation of a database of global dynamics of a given m-parameter semidynamical system with discrete time on a bounded subset of the n-dimensional phase space. We introduce the mathematical background, which is based upon Conley's topological approach to dynamics, describe the algorithms for the analysis of the dynamics using rectangular grids both in phase space and parameter space, and show two sample applications.
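
    A hedged toy version of the combinatorial step is sketched below: phase space is covered by a rectangular grid, the induced multivalued map becomes a directed graph, and recurrent dynamics are read off its strongly connected components; the Conley index computations of the full framework are not shown.

```python
# Hedged toy version of the combinatorial step: cover phase space with a
# rectangular grid, build the induced multivalued map as a directed graph, and
# read recurrent dynamics off its strongly connected components.
import numpy as np
import networkx as nx

f = lambda x: 3.7 * x * (1 - x)          # logistic map on [0, 1]
n_boxes = 200
edges = np.linspace(0.0, 1.0, n_boxes + 1)

G = nx.DiGraph()
for i in range(n_boxes):
    # image of box i, slightly padded to outer-approximate the true image
    samples = f(np.linspace(edges[i], edges[i + 1], 10))
    lo, hi = samples.min() - 1e-9, samples.max() + 1e-9
    targets = np.nonzero((edges[1:] > lo) & (edges[:-1] < hi))[0]
    G.add_edges_from((i, int(j)) for j in targets)

recurrent = [c for c in nx.strongly_connected_components(G)
             if len(c) > 1 or next(iter(c)) in G[next(iter(c))]]
print("number of recurrent grid components:", len(recurrent))
```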

  12. Combinatorial-topological framework for the analysis of global dynamics.

    Science.gov (United States)

    Bush, Justin; Gameiro, Marcio; Harker, Shaun; Kokubu, Hiroshi; Mischaikow, Konstantin; Obayashi, Ippei; Pilarczyk, Paweł

    2012-12-01

    We discuss an algorithmic framework based on efficient graph algorithms and algebraic-topological computational tools. The framework is aimed at automatic computation of a database of global dynamics of a given m-parameter semidynamical system with discrete time on a bounded subset of the n-dimensional phase space. We introduce the mathematical background, which is based upon Conley's topological approach to dynamics, describe the algorithms for the analysis of the dynamics using rectangular grids both in phase space and parameter space, and show two sample applications.

  13. Standardized evaluation framework for evaluating coronary artery stenosis detection, stenosis quantification and lumen segmentation algorithms in computed tomography angiography.

    Science.gov (United States)

    Kirişli, H A; Schaap, M; Metz, C T; Dharampal, A S; Meijboom, W B; Papadopoulou, S L; Dedic, A; Nieman, K; de Graaf, M A; Meijs, M F L; Cramer, M J; Broersen, A; Cetin, S; Eslami, A; Flórez-Valencia, L; Lor, K L; Matuszewski, B; Melki, I; Mohr, B; Oksüz, I; Shahzad, R; Wang, C; Kitslaar, P H; Unal, G; Katouzian, A; Örkisz, M; Chen, C M; Precioso, F; Najman, L; Masood, S; Ünay, D; van Vliet, L; Moreno, R; Goldenberg, R; Vuçini, E; Krestin, G P; Niessen, W J; van Walsum, T

    2013-12-01

    Though conventional coronary angiography (CCA) has been the standard of reference for diagnosing coronary artery disease in the past decades, computed tomography angiography (CTA) has rapidly emerged, and is nowadays widely used in clinical practice. Here, we introduce a standardized evaluation framework to reliably evaluate and compare the performance of the algorithms devised to detect and quantify the coronary artery stenoses, and to segment the coronary artery lumen in CTA data. The objective of this evaluation framework is to demonstrate the feasibility of dedicated algorithms to: (1) (semi-)automatically detect and quantify stenosis on CTA, in comparison with quantitative coronary angiography (QCA) and CTA consensus reading, and (2) (semi-)automatically segment the coronary lumen on CTA, in comparison with expert's manual annotation. A database consisting of 48 multicenter multivendor cardiac CTA datasets with corresponding reference standards are described and made available. The algorithms from 11 research groups were quantitatively evaluated and compared. The results show that (1) some of the current stenosis detection/quantification algorithms may be used for triage or as a second-reader in clinical practice, and that (2) automatic lumen segmentation is possible with a precision similar to that obtained by experts. The framework is open for new submissions through the website, at http://coronary.bigr.nl/stenoses/. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Towards a Formal Framework for Computational Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl Kristian; Sassone, Vladimiro

    2006-01-01

    We define a mathematical measure for the quantitative comparison of probabilistic computational trust systems, and use it to compare a well-known class of algorithms based on the so-called beta model. The main novelty is that our approach is formal, rather than based on experimental simulation....
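
    For concreteness, the sketch below shows the beta model that the compared algorithms build on: positive and negative interaction outcomes update the parameters of a Beta distribution whose mean serves as the trust estimate. It is an illustration of the model, not of the paper's comparison measure.

```python
# Hedged sketch of the beta model underlying the compared algorithms: outcomes
# update Beta distribution parameters, and the mean is the trust estimate.
class BetaTrust:
    def __init__(self):
        self.alpha = 1.0   # prior pseudo-count of positive outcomes
        self.beta = 1.0    # prior pseudo-count of negative outcomes

    def observe(self, positive: bool) -> None:
        if positive:
            self.alpha += 1
        else:
            self.beta += 1

    def expected_trust(self) -> float:
        return self.alpha / (self.alpha + self.beta)

t = BetaTrust()
for outcome in [True, True, False, True, True]:
    t.observe(outcome)
print(f"expected trust after 5 interactions: {t.expected_trust():.2f}")  # 0.71
```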

  15. Brain-Computer Interface Games: Towards a Framework.

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Nijholt, Antinus; Poel, Mannes; Herrlich, Marc; Malaka, Rainer; Masuch, Maic

    2012-01-01

    The brain-computer interface (BCI) community started to consider games as potential applications while the games community started to consider BCI as a game controller. However, there is a discrepancy between the BCI games developed by the two communities. In this paper, we propose a preliminary BCI

  16. A programming environment for distributed complex computing. An overview of the Framework for Interdisciplinary Design Optimization (FIDO) project. NASA Langley TOPS exhibit H120b

    Science.gov (United States)

    Townsend, James C.; Weston, Robert P.; Eidson, Thomas M.

    1993-01-01

    The Framework for Interdisciplinary Design Optimization (FIDO) is a general programming environment for automating the distribution of complex computing tasks over a networked system of heterogeneous computers. For example, instead of manually passing a complex design problem between its diverse specialty disciplines, the FIDO system provides for automatic interactions between the discipline tasks and facilitates their communications. The FIDO system networks all the computers involved into a distributed heterogeneous computing system, so they have access to centralized data and can work on their parts of the total computation simultaneously in parallel whenever possible. Thus, each computational task can be done by the most appropriate computer. Results can be viewed as they are produced and variables changed manually for steering the process. The software is modular in order to ease migration to new problems: different codes can be substituted for each of the current code modules with little or no effect on the others. The potential for commercial use of FIDO rests in the capability it provides for automatically coordinating diverse computations on a networked system of workstations and computers. For example, FIDO could provide the coordination required for the design of vehicles or electronics or for modeling complex systems.

  17. Framework for AI-based nuclear reactor design support system

    International Nuclear Information System (INIS)

    Furuta, Kazuo; Kondo, Shunsuke

    1992-01-01

    Nowadays many computer programs are being developed and used for the analytic tasks in nuclear reactor design, but experienced designers are still responsible for most of the synthetic tasks which are not amenable to algorithmic computer processes. Artificial intelligence (AI) is a promising technology to deal with these intractable tasks in design. In development of AI-based design support systems, it is desirable to choose a comprehensive framework based on the scientific theory of design. In this work a framework for AI-based design support systems for nuclear reactor design will be proposed based on an explorative abduction model of design. The fundamental architectures of this framework will be described especially on knowledge representation, context management and design planning. (author)

  18. Advancing e-commerce personalization : Process framework and case study

    NARCIS (Netherlands)

    Kaptein, Maurits; Parvinen, Petri

    2015-01-01

    Personalization is widely used in e-commerce, and as computational power increases, personalization is now within reach for many online vendors. We describe a process framework to structure our knowledge of online personalization —both from academia and from applied attempts. This framework is

  19. Multi-binding site model-based curve-fitting program for the computation of RIA data

    International Nuclear Information System (INIS)

    Malan, P.G.; Ekins, R.P.; Cox, M.G.; Long, E.M.R.

    1977-01-01

    In this paper, a comparison will be made of model-based and empirical curve-fitting procedures. The implementation of a multiple binding-site curve-fitting model which will successfully fit a wide range of assay data, and which can be run on a mini-computer is described. The latter sophisticated model also provides estimates of binding site concentrations and the values of the respective equilibrium constants present: the latter have been used for refining assay conditions using computer optimisation techniques. (orig./AJ) [de
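
    A hedged modern analogue of such model-based fitting is sketched below: a two-binding-site curve is fitted with scipy's curve_fit on synthetic data; the model form, parameters and data are illustrative and are not the assay data or the mini-computer implementation described above.

```python
# Hedged sketch of model-based fitting of a two-binding-site curve with
# scipy.optimize.curve_fit; the data and parameter values are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def two_site(x, bmax1, k1, bmax2, k2):
    """Bound fraction from two independent binding sites."""
    return bmax1 * x / (k1 + x) + bmax2 * x / (k2 + x)

true = (1.0, 0.5, 0.4, 8.0)
conc = np.logspace(-2, 2, 25)
rng = np.random.default_rng(1)
bound = two_site(conc, *true) + rng.normal(0, 0.01, conc.size)

popt, pcov = curve_fit(two_site, conc, bound, p0=(1, 1, 1, 1))
print("estimated (Bmax1, K1, Bmax2, K2):", np.round(popt, 2))
```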

  20. Brain-Computer Interface Games: Towards a Framework

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Nijholt, Antinus; Poel, Mannes; Nakatsu, Ryohei; Rauterberg, Matthias; Ciancarini, Paolo

    2015-01-01

    The brain-computer interface (BCI) community has started to consider games as potential applications, while the game community has started to consider BCI as a game controller. However, there is a discrepancy between the BCI games developed by the two communities. This not only adds to the workload

  1. Commissioning the CMS alignment and calibration framework

    International Nuclear Information System (INIS)

    Futyan, David

    2010-01-01

    The CMS experiment has developed a powerful framework to ensure the precise and prompt alignment and calibration of its components, which is a major prerequisite for achieving optimal performance in physics analysis. The prompt alignment and calibration strategy harnesses computing resources both at the Tier-0 site and the CERN Analysis Facility (CAF) to ensure fast turnaround for updating the corresponding database payloads. An essential element is the creation of dedicated data streams concentrating the specific event information required by the various alignment and calibration workflows. The resulting low latency is required for feeding the resulting constants into the prompt reconstruction process, which is essential for achieving swift physics analysis of the LHC data. This report discusses the implementation and the computational aspects of the alignment and calibration framework. Recent commissioning campaigns with cosmic muons, beam halo and simulated data have been used to gain detailed experience with this framework, and results of this validation are reported.

  2. Enabling Computational Dynamics in Distributed Computing Environments Using a Heterogeneous Computing Template

    Science.gov (United States)

    2011-08-09

    heterogeneous computing concept advertised recently as the paradigm capable of delivering exascale flop rates by the end of the decade. In this framework... [3] Skaugen, K., Petascale to Exascale: Extending Intel’s HPC Commitment: http://download.intel.com

  3. Computer facilities for ISABELLE data handling

    International Nuclear Information System (INIS)

    Kramer, M.A.; Love, W.A.; Miller, R.J.; Zeller, M.

    1977-01-01

    The analysis of data produced by ISABELLE experiments will need a large system of computers. An official group of prospective users and operators of that system should begin planning now. Included in the array will be a substantial computer system at each ISABELLE intersection in use. These systems must include enough computer power to keep experimenters aware of the health of the experiment. This will require at least one very fast sophisticated processor in the system, the size depending on the experiment. Other features of the intersection systems must be a good, high speed graphic display, ability to record data on magnetic tape at 500 to 1000 KB, and a high speed link to a central computer. The operating system software must support multiple interactive users. A substantially larger capacity computer system, shared by the six intersection region experiments, must be available with good turnaround for experimenters while ISABELLE is running. A computer support group will be required to maintain the computer system and to provide and maintain software common to all experiments. Special superfast computing hardware or special function processors constructed with microprocessor circuitry may be necessary both in the data gathering and data processing work. Thus both the local and central processors should be chosen with the possibility of interfacing such devices in mind

  4. An Efficient Framework for EEG Analysis with Application to Hybrid Brain Computer Interfaces Based on Motor Imagery and P300

    Directory of Open Access Journals (Sweden)

    Jinyi Long

    2017-01-01

    Full Text Available The hybrid brain computer interface (BCI) based on motor imagery (MI) and P300 has been a preferred strategy aiming to improve detection performance by combining the features of each. However, current methods used for combining these two modalities optimize them separately, which does not result in optimal performance. Here, we present an efficient framework to optimize them together by concatenating the features of MI and P300 in a block diagonal form. Then a linear classifier under a dual spectral norm regularizer is applied to the combined features. Under this framework, the hybrid features of MI and P300 can be learned, selected, and combined together directly. Experimental results on the data set of a hybrid BCI based on MI and P300 are provided to illustrate the competitive performance of the proposed method against other conventional methods. This provides evidence that the method used here contributes to the discrimination performance of the brain state in hybrid BCI.
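    As a rough illustration of the block-diagonal feature combination described above (not the authors' implementation), the sketch below arranges each trial's MI and P300 feature vectors on separate rows of a block-diagonal matrix and trains a plain linear classifier on the flattened result; the synthetic data and the substitution of ordinary L2-regularized logistic regression for the paper's dual-spectral-norm-regularized classifier are simplifying assumptions.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n_trials, d_mi, d_p300 = 200, 12, 20

      # Synthetic stand-ins for per-trial MI and P300 feature vectors.
      X_mi = rng.normal(size=(n_trials, d_mi))
      X_p300 = rng.normal(size=(n_trials, d_p300))
      y = rng.integers(0, 2, size=n_trials)

      def block_diagonal_features(f_mi, f_p300):
          """Arrange one trial's MI and P300 features in a block-diagonal matrix,
          so a single linear (matrix-valued) classifier can weight both modalities."""
          block = np.zeros((2, f_mi.size + f_p300.size))
          block[0, :f_mi.size] = f_mi
          block[1, f_mi.size:] = f_p300
          return block

      # Flatten the block-diagonal representation for an ordinary linear model.
      X = np.stack([block_diagonal_features(a, b).ravel()
                    for a, b in zip(X_mi, X_p300)])

      # Stand-in classifier: plain L2-regularized logistic regression instead of the
      # dual-spectral-norm-regularized linear classifier described in the abstract.
      clf = LogisticRegression(max_iter=1000).fit(X, y)
      print("training accuracy:", clf.score(X, y))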

  5. A general framework for reasoning on inconsistency

    CERN Document Server

    Martinez, Maria Vanina; Subrahmanian, VS; Amgoud, Leila

    2013-01-01

    This SpringerBrief proposes a general framework for reasoning about inconsistency in a wide variety of logics, including inconsistency resolution methods that have not yet been studied.  The proposed framework allows users to specify preferences on how to resolve inconsistency when there are multiple ways to do so. This empowers users to resolve inconsistency in data leveraging both their detailed knowledge of the data as well as their application needs. The brief shows that the framework is well-suited to handle inconsistency in several logics, and provides algorithms to compute preferred opt

  6. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains the conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra, particularly as it applies to computing, and concentrates on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasize

  7. A Framework for Enterprise Operating Systems Based on Zachman Framework

    Science.gov (United States)

    Ostadzadeh, S. Shervin; Rahmani, Amir Masoud

    Nowadays, the Operating System (OS) isn't only the software that runs your computer. In the typical information-driven organization, the operating system is part of a much larger platform for applications and data that extends across the LAN, WAN and Internet. An OS cannot be an island unto itself; it must work with the rest of the enterprise. Enterprise-wide applications require an Enterprise Operating System (EOS). The use of enterprise operating systems has brought about an inevitable tendency for organizations to organize their information activities in a comprehensive way. In this respect, Enterprise Architecture (EA) has proven to be the leading option for the development and maintenance of enterprise operating systems. EA clearly provides a thorough outline of the whole information system comprising an enterprise. To establish such an outline, a logical framework needs to be laid upon the entire information system. The Zachman Framework (ZF) has been widely accepted as a standard scheme for identifying and organizing the descriptive representations that have prominent roles in enterprise-wide system development. In this paper, we propose a framework based on ZF for enterprise operating systems. The presented framework helps developers to design and justify completely integrated business, IT and operating systems, which results in improved project success rates.

  8. Security and Cloud Outsourcing Framework for Economic Dispatch

    International Nuclear Information System (INIS)

    Sarker, Mushfiqur R.; Wang, Jianhui

    2017-01-01

    The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm for solving these issues consists of in-house high-performance computing infrastructures, which have the drawbacks of high capital expenditure, maintenance costs, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to potential cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and the problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the cloud-based approach outperforms the in-house infrastructure in both performance and cost.
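    The masking idea can be illustrated with a toy sketch (not the paper's actual transformation): the problem owner applies a secret invertible change of variables to a small dispatch-like linear program, the "cloud" solves only the masked problem, and the owner undoes the substitution to recover the true solution. All numbers and the specific masking scheme below are illustrative assumptions.

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(1)

      # Toy economic-dispatch-like LP: minimize c@x subject to A_ub@x <= b_ub, x >= 0.
      c = np.array([3.0, 5.0, 4.0])
      A_ub = np.array([[1.0, 1.0, 1.0],
                       [-1.0, -1.0, -1.0]])
      b_ub = np.array([100.0, -80.0])            # total generation between 80 and 100

      # Owner-side masking: substitute x = Q @ y with a secret invertible matrix Q.
      Q = rng.normal(size=(3, 3)) + 3 * np.eye(3)  # well-conditioned secret matrix
      c_m = Q.T @ c
      A_m = np.vstack([A_ub @ Q, -Q])              # fold x >= 0 into masked inequalities
      b_m = np.concatenate([b_ub, np.zeros(3)])

      # The "cloud" solves the masked LP without seeing c, A_ub, b_ub directly.
      res = linprog(c_m, A_ub=A_m, b_ub=b_m, bounds=(None, None))

      # Owner recovers the true dispatch from the masked solution.
      x = Q @ res.x
      print("recovered dispatch:", np.round(x, 3), "cost:", round(c @ x, 3))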

  9. A Framework for Low-Communication 1-D FFT

    Directory of Open Access Journals (Sweden)

    Ping Tak Peter Tang

    2013-01-01

    Full Text Available In high-performance computing on distributed-memory systems, communication often represents a significant part of the overall execution time. The relative cost of communication will certainly continue to rise as compute-density growth follows the current technology and industry trends. Design of lower-communication alternatives to fundamental computational algorithms has become an important field of research. For distributed 1-D FFT, communication cost has hitherto remained high as all industry-standard implementations perform three all-to-all internode data exchanges (also called global transposes. These communication steps indeed dominate execution time. In this paper, we present a mathematical framework from which many single-all-to-all and easy-to-implement 1-D FFT algorithms can be derived. For large-scale problems, our implementation can be twice as fast as leading FFT libraries on state-of-the-art computer clusters. Moreover, our framework allows tradeoff between accuracy and performance, further boosting performance if reduced accuracy is acceptable.
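    The kind of factorization such frameworks build on can be sketched in a few lines (this is the classical N = N1·N2 decomposition, not the authors' single-all-to-all algorithm): row FFTs, a pointwise twiddle multiplication, and column FFTs reproduce the full transform, and in a distributed setting only the data reshuffle between the two FFT stages requires internode communication.

      import numpy as np

      # Factor the transform length: a length-N FFT becomes N2 FFTs of length N1,
      # a pointwise twiddle multiplication, and N1 FFTs of length N2.
      N1, N2 = 8, 16
      N = N1 * N2
      rng = np.random.default_rng(2)
      x = rng.standard_normal(N) + 1j * rng.standard_normal(N)

      A = x.reshape(N1, N2)                      # A[n1, n2] = x[N2*n1 + n2]
      stage1 = np.fft.fft(A, axis=0)             # length-N1 FFTs -> index k1
      k1 = np.arange(N1)[:, None]
      n2 = np.arange(N2)[None, :]
      stage2 = stage1 * np.exp(-2j * np.pi * k1 * n2 / N)   # twiddle factors
      stage3 = np.fft.fft(stage2, axis=1)        # length-N2 FFTs -> index k2

      # stage3[k1, k2] equals the full DFT bin k1 + N1*k2.
      reference = np.fft.fft(x).reshape(N2, N1).T
      assert np.allclose(stage3, reference)
      print("factored FFT matches numpy.fft.fft")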

  10. An evaluation framework and comparative analysis of the widely used first programming languages.

    Directory of Open Access Journals (Sweden)

    Muhammad Shoaib Farooq

    Full Text Available Computer programming is the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue in the presence of many choices. Many efforts have been made to design a good FPL; however, there is no adequate way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate the existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework, and have also computed their suitability scores.

  11. An evaluation framework and comparative analysis of the widely used first programming languages.

    Science.gov (United States)

    Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan

    2014-01-01

    Computer programming is the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue in the presence of many choices. Many efforts have been made to design a good FPL; however, there is no adequate way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate the existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework, and have also computed their suitability scores.
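    A minimal sketch of such a customizable scoring function is given below; the criteria, weights and scores are hypothetical placeholders rather than the values used in the published framework.

      # Hypothetical criteria scores (0-10) and customizable weights; the actual
      # criteria and weighting scheme of the published framework may differ.
      criteria_scores = {
          "Python": {"readability": 9, "simple_io": 8, "error_messages": 7, "tool_support": 8},
          "Java":   {"readability": 6, "simple_io": 5, "error_messages": 6, "tool_support": 9},
          "C++":    {"readability": 5, "simple_io": 5, "error_messages": 4, "tool_support": 8},
      }

      def suitability_score(scores: dict, weights: dict) -> float:
          """Weighted average of per-criterion scores, normalized by total weight."""
          total_weight = sum(weights.values())
          return sum(scores[c] * w for c, w in weights.items()) / total_weight

      # An instructor can customize the weights to reflect local teaching priorities.
      weights = {"readability": 3, "simple_io": 2, "error_messages": 2, "tool_support": 1}
      ranking = sorted(criteria_scores,
                       key=lambda lang: suitability_score(criteria_scores[lang], weights),
                       reverse=True)
      for lang in ranking:
          print(f"{lang}: {suitability_score(criteria_scores[lang], weights):.2f}")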

  12. COMPSs-Mobile: parallel programming for mobile-cloud computing

    OpenAIRE

    Lordan Gomis, Francesc-Josep; Badia Sala, Rosa Maria

    2016-01-01

    The advent of Cloud and the popularization of mobile devices have led us to a shift in computing access. Computing users will have an interaction display while the real computation will be performed remotely, in the Cloud. COMPSs-Mobile is a framework that aims to ease the development of energy-efficient and high-performing applications for this environment. The framework provides an infrastructure-unaware programming model that allows developers to code regular Android applications that, ...

  13. SnowCloud - a Framework to Predict Streamflow in Snowmelt-dominated Watersheds Using Cloud-based Computing

    Science.gov (United States)

    Sproles, E. A.; Crumley, R. L.; Nolin, A. W.; Mar, E.; Lopez-Moreno, J. J.

    2017-12-01

    Streamflow in snowy mountain regions is extraordinarily challenging to forecast, and prediction efforts are hampered by the lack of timely snow data—particularly in data sparse regions. SnowCloud is a prototype web-based framework that integrates remote sensing, cloud computing, interactive mapping tools, and a hydrologic model to offer a new paradigm for delivering key data to water resource managers. We tested the skill of SnowCloud to forecast monthly streamflow with one month lead time in three snow-dominated headwaters. These watersheds represent a range of precipitation/runoff schemes: the Río Elqui in northern Chile (200 mm/yr, entirely snowmelt); the John Day River, Oregon, USA (635 mm/yr, primarily snowmelt); and the Río Aragon in the northern Spain (850 mm/yr, snowmelt dominated). Model skill corresponded to snowpack contribution with Nash-Sutcliffe Efficiencies of 0.86, 0.52, and 0.21 respectively. SnowCloud does not require the user to possess advanced programming skills or proprietary software. We access NASA's MOD10A1 snow cover product to calculate the snow metrics globally using Google Earth Engine's geospatial analysis and cloud computing service. The analytics and forecast tools are provided through a web-based portal that requires only internet access and minimal training. To test the efficacy of SnowCloud we provided the tools and a series of tutorials in English and Spanish to water resource managers in Chile, Spain, and the United States. Participants assessed their user experience and provided feedback, and the results of our multi-cultural assessment are also presented. While our results focus on SnowCloud, they outline methods to develop cloud-based tools that function effectively across cultures and languages. Our approach also addresses the primary challenges of science-based computing; human resource limitations, infrastructure costs, and expensive proprietary software. These challenges are particularly problematic in developing

  14. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. The hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis. PMID:24490979

  15. Hi-Jack: a novel computational framework for pathway-based inference of host–pathogen interactions

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2015-03-09

    Motivation: Pathogens infect their host and hijack the host machinery to produce more progeny pathogens. Obligate intracellular pathogens, in particular, require resources of the host to replicate. Therefore, infections by these pathogens lead to alterations in the metabolism of the host, shifting in favor of pathogen protein production. Some computational approaches for identifying mechanisms of host-pathogen interactions have been proposed, but it seems the problem has yet to be approached from the metabolite-hijacking angle. Results: We propose a novel computational framework, Hi-Jack, for inferring pathway-based interactions between a host and a pathogen that relies on the idea of metabolite hijacking. Hi-Jack searches metabolic network data from hosts and pathogens, and identifies candidate reactions where hijacking occurs. A novel scoring function ranks candidate hijacked reactions and identifies pathways in the host that interact with pathways in the pathogen, as well as the associated frequently hijacked metabolites. We also describe host-pathogen interaction principles that can be used in the future for subsequent studies. Our case study on Mycobacterium tuberculosis (Mtb) revealed pathways in the human host (e.g. carbohydrate metabolism, lipid metabolism and pathways related to amino acid metabolism) that are likely to be hijacked by the pathogen. In addition, we report interesting potential pathway interconnections between human and Mtb, such as the linkage of human fatty acid biosynthesis with Mtb biosynthesis of unsaturated fatty acids, or the linkage of the human pentose phosphate pathway with lipopolysaccharide biosynthesis in Mtb. © The Author 2015. Published by Oxford University Press. All rights reserved.

  16. Component-based framework for subsurface simulations

    International Nuclear Information System (INIS)

    Palmer, B J; Fang, Yilin; Hammond, Glenn; Gurumoorthi, Vidhya

    2007-01-01

    Simulations in the subsurface environment represent a broad range of phenomena covering an equally broad range of scales. Developing modelling capabilities that can integrate models representing different phenomena acting at different scales presents formidable challenges from both the algorithmic and the computer science perspective. This paper will describe the development of an integrated framework that will be used to combine different models into a single simulation. Initial work has focused on creating two frameworks, one for performing smooth particle hydrodynamics (SPH) simulations of fluid systems, the other for performing grid-based continuum simulations of reactive subsurface flow. The SPH framework is based on a parallel code developed for doing pore scale simulations; the continuum grid-based framework is based on the STOMP (Subsurface Transport Over Multiple Phases) code developed at PNNL. Future work will focus on combining the frameworks together to perform multiscale, multiphysics simulations of reactive subsurface flow.

  17. A multi-GPU real-time dose simulation software framework for lung radiotherapy.

    Science.gov (United States)

    Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A

    2012-09-01

    Medical simulation frameworks facilitate both the preoperative and postoperative analysis of the patient's pathophysical condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU-based computational framework coupled with inter-process communication methods is introduced for simulating the radiation dose delivery on a deformable 3D volumetric lung model and its real-time visualization. The model deformation and the corresponding dose calculation are allocated among the GPUs in a task-specific manner and performed in a pipelined fashion. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and back-end patient database repository is also discussed. Real-time simulation of the dose delivered is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in the hardware memory. Variations in the delivered dose and computational speedup for variations in the data dimensions are investigated using D70 and D90 as well as gEUD as metrics for a set of 14 patients. Computational speed-up increased with an increase in the beam dimensions when compared with a CPU-based commercial software, while the error in the dose calculation was ... The proposed lung model-based radiotherapy simulation framework is thus an effective tool for performing both real-time and retrospective analyses.

  18. A PC [personal computer]-based version of KENO V.a

    International Nuclear Information System (INIS)

    Nigg, D.A.; Atkinson, C.A.; Briggs, J.B.; Taylor, J.T.

    1990-01-01

    The use of personal computers (PCs) and engineering workstations for complex scientific computations has expanded rapidly in the last few years. This trend is expected to continue in the future with the introduction of increasingly sophisticated microprocessors and microcomputer systems. For a number of reasons, including security, economy, user convenience, and productivity, an integrated system of neutronics and radiation transport software suitable for operation in an IBM PC-class environment has been under development at the Idaho National Engineering Laboratory (INEL) for the past 3 yr. Nuclear cross-section data and resonance parameters are preprocessed from the Evaluated Nuclear Data Files Version 5 (ENDF/B-V) and supplied in a form suitable for use in a PC-based spectrum calculation and multigroup cross-section generation module. This module produces application-specific data libraries that can then be used in various neutron transport and diffusion theory code modules. This paper discusses several details of the Monte Carlo criticality module, which is based on the well-known highly-sophisticated KENO V.a package developed at Oak Ridge National Laboratory and previously released in mainframe form by the Radiation Shielding Information Center (RSIC). The conversion process and a variety of benchmarking results are described

  19. artG4: A Generic Framework for Geant4 Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Arvanitis, Tasha [Harvey Mudd Coll.; Lyon, Adam [Fermilab

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy to use framework for writing Geant4 based simulations called 'artg4'. This framework is a layer on top of the art framework.

  20. The Guided System Development Framework

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2011-01-01

    The Service-Oriented Computing paradigm has had significant influence on the Internet. With the emergence of this paradigm, it is important to provide tools that help developers designing and verifying such systems. In this article, we present the Guided System Development (GSD) Framework that aids...

  1. An esthetics rehabilitation with computer-aided design/ computer-aided manufacturing technology.

    Science.gov (United States)

    Mazaro, Josá Vitor Quinelli; de Mello, Caroline Cantieri; Zavanelli, Adriana Cristina; Santiago, Joel Ferreira; Amoroso, Andressa Paschoal; Pellizzer, Eduardo Piza

    2014-07-01

    This paper describes a case of rehabilitation involving a Computer-Aided Design/Computer-Aided Manufacturing (CAD-CAM) system in implant-supported and dental-supported prostheses using zirconia as the framework. CAD-CAM technology has developed considerably over the last few years, becoming a reality in dental practice. Among the widely used systems are those based on zirconia, which demonstrate important physical and mechanical properties, including high strength, adequate fracture toughness, biocompatibility and esthetics, and are indicated for unitary prosthetic restorations and posterior and anterior frameworks. All the modeling was performed using the CAD-CAM system, and the prostheses were cemented using the resin cement best suited to each situation. The rehabilitation of the maxillary arch using a zirconia framework demonstrated satisfactory esthetic and functional results after a 12-month follow-up and revealed no biological or technical complications. This article shows the importance of using CAD/CAM technology in the manufacture of dental and implant-supported prostheses.

  2. National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Van Arsdall, P.J. LLNL

    1998-01-01

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance

  3. Sustainable Chemical Process Development through an Integrated Framework

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; Kumar Tula, Anjan; Anantpinijwatna, Amata

    2016-01-01

    This paper describes the development and the application of a general integrated framework based on systematic model-based methods and computer-aided tools with the objective to achieve more sustainable process designs and to improve the process understanding. The developed framework can be appli...... studies involve multiphase reaction systems for the synthesis of active pharmaceutical ingredients....

  4. A computer control system for a research reactor

    International Nuclear Information System (INIS)

    Crawford, K.C.; Sandquist, G.M.

    1987-01-01

    Most reactor applications until now, have not required computer control of core output. Commercial reactors are generally operated at a constant power output to provide baseline power. However, if commercial reactor cores are to become load following over a wide range, then centralized digital computer control is required to make the entire facility respond as a single unit to continual changes in power demand. Navy and research reactors are much smaller and simpler and are operated at constant power levels as required, without concern for the number of operators required to operate the facility. For navy reactors, centralized digital computer control may provide space savings and reduced personnel requirements. Computer control offers research reactors versatility to efficiently change a system to develop new ideas. The operation of any reactor facility would be enhanced by a controller that does not panic and is continually monitoring all facility parameters. Eventually very sophisticated computer control systems may be developed which will sense operational problems, diagnose the problem, and depending on the severity of the problem, immediately activate safety systems or consult with operators before taking action

  5. Introduction into the Virtual Olympic Games Framework for online communities.

    Science.gov (United States)

    Stoilescu, Dorian

    2009-06-01

    This paper presents the design of the Virtual Olympic Games Framework (VOGF), a computer application designated for athletics, health care, general well-being, nutrition and fitness, which offers multiple benefits for its participants. A special interest in starting the design of the framework was in exploring how people can connect and participate together using existing computer technologies (i.e. gaming consoles, exercise equipment with computer interfaces, devices of measuring health, speed, force and distance and Web 2.0 applications). A stationary bike set-up offering information to users about their individual health and athletic performances has been considered as a starting model. While this model is in the design stage, some preliminary findings are encouraging, suggesting the potential for various fields: sports, medicine, theories of learning, technologies and cybercultural studies. First, this framework would allow participants to perform a variety of sports and improve their health. Second, this would involve creating an online environment able to store health information and sport performances correlated with accessing multi-media data and research about performing sports. Third, participants could share experiences with other athletes, coaches and researchers. Fourth, this framework also provides support for the research community in their future investigations.

  6. A clinically driven variant prioritization framework outperforms purely computational approaches for the diagnostic analysis of singleton WES data.

    Science.gov (United States)

    Stark, Zornitza; Dashnow, Harriet; Lunke, Sebastian; Tan, Tiong Y; Yeung, Alison; Sadedin, Simon; Thorne, Natalie; Macciocca, Ivan; Gaff, Clara; Oshlack, Alicia; White, Susan M; James, Paul A

    2017-11-01

    Rapid identification of clinically significant variants is key to the successful application of next generation sequencing technologies in clinical practice. The Melbourne Genomics Health Alliance (MGHA) variant prioritization framework employs a gene prioritization index based on clinician-generated a priori gene lists, and a variant prioritization index (VPI) based on rarity, conservation and protein effect. We used data from 80 patients who underwent singleton whole exome sequencing (WES) to test the ability of the framework to rank causative variants highly, and compared it against the performance of other gene and variant prioritization tools. Causative variants were identified in 59 of the patients. Using the MGHA prioritization framework the average rank of the causative variant was 2.24, with 76% ranked as the top priority variant, and 90% ranked within the top five. Using clinician-generated gene lists resulted in ranking causative variants an average of 8.2 positions higher than prioritization based on variant properties alone. This clinically driven prioritization approach significantly outperformed purely computational tools, placing a greater proportion of causative variants top or in the top 5 (permutation P-value=0.001). Clinicians included 40 of the 49 WES diagnoses in their a priori list of differential diagnoses (81%). The lists generated by PhenoTips and Phenomizer contained 14 (29%) and 18 (37%) of these diagnoses respectively. These results highlight the benefits of clinically led variant prioritization in increasing the efficiency of singleton WES data analysis and have important implications for developing models for the funding and delivery of genomic services.
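    The two-index idea can be sketched as follows; the field names, weights and gene symbols are hypothetical and are not taken from the MGHA implementation. Membership in the clinician-generated gene list dominates the ranking, with a variant-level score (rarity, conservation, protein effect) breaking ties.

      # Hypothetical variant records; field names and weightings are illustrative only.
      variants = [
          {"id": "v1", "gene": "GENE_A", "pop_freq": 0.0001, "conservation": 0.9, "protein_effect": "missense"},
          {"id": "v2", "gene": "GENE_B", "pop_freq": 0.02,   "conservation": 0.3, "protein_effect": "synonymous"},
          {"id": "v3", "gene": "GENE_C", "pop_freq": 0.0,    "conservation": 0.8, "protein_effect": "frameshift"},
      ]
      clinician_gene_list = {"GENE_A", "GENE_C"}          # a priori differential diagnoses

      EFFECT_WEIGHT = {"frameshift": 1.0, "missense": 0.6, "synonymous": 0.1}

      def variant_priority(v):
          """Variant-level score from rarity, conservation and predicted protein effect."""
          rarity = 1.0 - min(v["pop_freq"] * 100, 1.0)    # rarer variants score higher
          return 0.4 * rarity + 0.3 * v["conservation"] + 0.3 * EFFECT_WEIGHT[v["protein_effect"]]

      def combined_priority(v):
          """Clinically driven prioritization: gene-list membership dominates the ranking."""
          gene_priority = 1.0 if v["gene"] in clinician_gene_list else 0.0
          return (gene_priority, variant_priority(v))

      for rank, v in enumerate(sorted(variants, key=combined_priority, reverse=True), start=1):
          print(rank, v["id"], v["gene"])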

  7. STAF: A Powerful and Sophisticated CAI System.

    Science.gov (United States)

    Loach, Ken

    1982-01-01

    Describes the STAF (Science Teacher's Authoring Facility) computer-assisted instruction system developed at Leeds University (England), focusing on STAF language and major program features. Although programs for the system emphasize physical chemistry and organic spectroscopy, the system and language are general purpose and can be used in any…

  8. Computational studies of adsorption in metal organic frameworks and interaction of nanoparticles in condensed phases

    Energy Technology Data Exchange (ETDEWEB)

    Annapureddy, HVR; Motkuri, RK; Nguyen, PTM; Truong, TB; Thallapally, PK; McGrail, BP; Dang, LX

    2014-02-05

    In this review, we describe recent efforts to systematically study nano-structured metal organic frameworks (MOFs), also known as metal organic heat carriers, with particular emphasis on their application in heating and cooling processes. We used both molecular dynamics and grand canonical Monte Carlo simulation techniques to gain a molecular-level understanding of the adsorption mechanism of gases in these porous materials. We investigated the uptake of various gases such as refrigerants R12 and R143a. We also evaluated the effects of temperature and pressure on the uptake mechanism. Our computed results compared reasonably well with available measurements from experiments, thus validating our potential models and approaches. In addition, we investigated the structural, diffusive and adsorption properties of different hydrocarbons in Ni-2(dhtp). Finally, to elucidate the mechanism of nanoparticle dispersion in condensed phases, we studied the interactions among nanoparticles in various liquids, such as n-hexane, water and methanol.

  9. A novel framework for diagnosing automatic tool changer and tool life based on cloud computing

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2016-03-01

    Full Text Available Tool change is one of the most frequently performed machining processes, and if there is improper percussion as the tool's position is changed, the spindle bearing can be damaged. A spindle malfunction can cause problems, such as a tool being dropped or bias in a machined hole. The measures currently taken by available machine tools to avoid such issues only involve determining whether the clamping state of the tool is correct, using the spindle and the air adhesion method, which is also used to satisfy the high precision required of mechanical components. Therefore, they cannot be used with every type of machine tool; in addition, improper tapping of the spindle during an automatic tool change cannot be detected. This study therefore proposes a new type of diagnostic framework that combines cloud computing and vibration sensors, in which tool changes are automatically diagnosed and abnormalities identified, thereby enhancing the reliability and productivity of the machine and equipment.

  10. Detection of Malware and Kernel-Level Rootkits in Cloud Computing Environments

    OpenAIRE

    Win, Thu Yein; Tianfield, Huaglory; Mair, Quentin

    2016-01-01

    Cyberattacks targeted at the virtualization infrastructure underlying cloud computing services have become increasingly sophisticated. This paper presents a novel malware and rootkit detection system which protects the guests against different attacks. It combines system call monitoring and system call hashing on the guest kernel together with Support Vector Machines (SVM)-based external monitoring on the host. We demonstrate the effectiveness of our solution by evaluating it against well-known use...

  11. Message Passing Framework for Globally Interconnected Clusters

    International Nuclear Information System (INIS)

    Hafeez, M; Riaz, N; Asghar, S; Malik, U A; Rehman, A

    2011-01-01

    In prevailing technology trends it is apparent that network requirements and technologies will continue to advance. Therefore, the need for a High Performance Computing (HPC) based implementation for interconnecting clusters is clear if clusters are to scale. Grid computing provides a global infrastructure that interconnects clusters of dispersed computing resources over the Internet. On the other hand, the leading model for HPC programming is the Message Passing Interface (MPI). Compared to Grid computing, MPI is better suited to solving most complex computational problems. MPI itself is restricted to a single cluster. It does not support message passing over the Internet to use the computing resources of different clusters in an optimal way. We propose a model that provides message passing capabilities between parallel applications over the Internet. The proposed model is based on the Architecture for Java Universal Message Passing (A-JUMP) framework and an Enterprise Service Bus (ESB) named the High Performance Computing Bus. The HPC Bus is built using ActiveMQ and is responsible for communication and message passing in an asynchronous manner. The asynchronous mode of communication provides assurance of message delivery as well as a fault-tolerance mechanism for message passing. The idea presented in this paper effectively utilizes wide-area intercluster networks. It also provides scheduling, dynamic resource discovery and allocation, and sub-clustering of resources for different jobs. A performance analysis and a comparison of the proposed framework with P2P-MPI are also presented in this paper.

  12. A sophisticated simulation for the fracture behavior of concrete material using XFEM

    Science.gov (United States)

    Zhai, Changhai; Wang, Xiaomin; Kong, Jingchang; Li, Shuang; Xie, Lili

    2017-10-01

    The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.

  13. Bioinformatics process management: information flow via a computational journal

    Directory of Open Access Journals (Sweden)

    Lushington Gerald

    2007-12-01

    Full Text Available Abstract This paper presents the Bioinformatics Computational Journal (BCJ), a framework for conducting and managing computational experiments in bioinformatics and computational biology. These experiments often involve a series of computations, data searches, filters, and annotations which can benefit from a structured environment. Systems to manage computational experiments exist, ranging from libraries with standard data models to elaborate schemes to chain together input and output between applications. Yet, although such frameworks are available, their use is not widespread; ad hoc scripts are often required to bind applications together. The BCJ explores another solution to this problem through a computer-based environment suitable for on-site use, which builds on the traditional laboratory notebook paradigm. It provides an intuitive, extensible paradigm designed for expressive composition of applications. Extensive features facilitate sharing data, computational methods, and entire experiments. By focusing on the bioinformatics and computational biology domain, the scope of the computational framework was narrowed, permitting us to implement a capable set of features for this domain. This report discusses the features deemed critical by our system and other projects, along with design issues. We illustrate the use of our implementation of the BCJ on two domain-specific examples.

  14. A Framework for Online Conformance Checking

    DEFF Research Database (Denmark)

    Burattin, Andrea; Carmona, Josep

    2017-01-01

    is quantified after the completion of the process instance. In this paper we propose a framework for online conformance checking: not only do we quantify (non-)conformant behavior as the execution is running, we also restrict the computation to constant time complexity per event analyzed, thus enabling the online analysis of a stream of events. The framework is instantiated with ideas coming from the theory of regions and state similarity. An implementation is available in ProM and promising results have been obtained.
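    A minimal sketch of the constant-time-per-event idea (not the ProM implementation) is shown below: the process model is precompiled into a transition table, so each streamed event costs a single dictionary lookup that either advances the current state or is counted as a deviation.

      # Minimal streaming conformance sketch: a toy process model compiled into a
      # transition table; each event is a constant-time lookup.
      transitions = {
          ("start", "register"): "registered",
          ("registered", "check"): "checked",
          ("checked", "approve"): "approved",
          ("checked", "reject"): "rejected",
      }

      class OnlineConformanceChecker:
          def __init__(self):
              self.state = "start"
              self.deviations = 0
              self.events_seen = 0

          def observe(self, activity: str) -> bool:
              """Process one event from the stream; returns True if it conforms."""
              self.events_seen += 1
              nxt = transitions.get((self.state, activity))
              if nxt is None:
                  self.deviations += 1          # non-conformant event; keep the state
                  return False
              self.state = nxt
              return True

          def conformance(self) -> float:
              return 1.0 - self.deviations / max(self.events_seen, 1)

      checker = OnlineConformanceChecker()
      for event in ["register", "approve", "check", "approve"]:   # "approve" arrives too early once
          checker.observe(event)
      print("running conformance:", checker.conformance())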

  15. A Classification Framework for Large-Scale Face Recognition Systems

    OpenAIRE

    Zhou, Ziheng; Deravi, Farzin

    2009-01-01

    This paper presents a generic classification framework for large-scale face recognition systems. Within the framework, a data sampling strategy is proposed to tackle the data imbalance when image pairs are sampled from thousands of face images for preparing a training dataset. A modified kernel Fisher discriminant classifier is proposed to make it computationally feasible to train the kernel-based classification method using tens of thousands of training samples. The framework is tested in an...

  16. Teaching programming to non-STEM novices: a didactical study of computational thinking and non-STEM computing education

    DEFF Research Database (Denmark)

    Spangsberg, Thomas Hvid

    research approach. Computational thinking plays a significant role in computing education but it is still unclear how it should be interpreted to best serve its purpose. Constructionism and Computational Making seems to be promising frameworks to do this. In regards to specific teaching activities...

  17. Rational Multiparty Computation

    OpenAIRE

    Wallrabenstein, John Ross

    2014-01-01

    The field of rational cryptography considers the design of cryptographic protocols in the presence of rational agents seeking to maximize local utility functions. This departs from the standard secure multiparty computation setting, where players are assumed to be either honest or malicious. We detail the construction of both a two-party and a multiparty game theoretic framework for constructing rational cryptographic protocols. Our framework specifies the utility function assumptions neces...

  18. Perceptual organization in computer vision - A review and a proposal for a classificatory structure

    Science.gov (United States)

    Sarkar, Sudeep; Boyer, Kim L.

    1993-01-01

    The evolution of perceptual organization in biological vision, and its necessity in advanced computer vision systems, arises from the characteristic that perception, the extraction of meaning from sensory input, is an intelligent process. This is particularly so for high order organisms and, analogically, for more sophisticated computational models. The role of perceptual organization in computer vision systems is explored. This is done from four vantage points. First, a brief history of perceptual organization research in both humans and computer vision is offered. Next, a classificatory structure in which to cast perceptual organization research to clarify both the nomenclature and the relationships among the many contributions is proposed. Thirdly, the perceptual organization work in computer vision in the context of this classificatory structure is reviewed. Finally, the array of computational techniques applied to perceptual organization problems in computer vision is surveyed.

  19. VisRseq: R-based visual framework for analysis of sequencing data.

    Science.gov (United States)

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2015-01-01

    Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However the computational tool set for further analyses often requires significant computational expertise to use and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.

  20. Moving the IT Infrastructure to the Cloud

    Directory of Open Access Journals (Sweden)

    Oswaldo Moscoso-Zea

    2018-03-01

    Full Text Available Cloud computing services are nowadays advertised as an emerging business model. Moreover, these services bring innovative solutions to an increasingly sophisticated and competitive market. However, their adoption can be significantly slowed by organizations’ concerns related to security, privacy, and trust. The challenge involves such questions as where to start, which provider the company should choose, or whether it is even worthwhile. Thus, this paper proposes an improved unified framework, based on a previous study in which a 6-step process framework was introduced. The improved framework adds one new step for security and control after the migration process. In the end, a 7-step framework is proposed, aimed at addressing organizations’ concerns when they decide to adopt cloud computing services, with a follow-up step. This additional step is intended to help IT directors make sure everything is working properly in a methodical way, in order to achieve a successful cloud computing migration. It is an effective solution that is gaining momentum and popularity among competitive organizations.

  1. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators, and digital signal processing chains are dynamically loaded and unloaded to provide wireless communications services on demand. In particular, each new user session request requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources in SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
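    A toy sketch of one such allocation strategy is shown below, assuming a simple first-fit rule and made-up capacity figures; it only illustrates how resource occupation can be measured per cluster, not the algorithms evaluated in the paper.

      # Toy first-fit allocator for illustration only: each cluster exposes a pool of
      # GOPS (giga-operations per second); a session request asks for the capacity its
      # SDR transceiver chain needs and is placed in the first cluster that fits.
      clusters = [{"id": i, "capacity_gops": 400.0, "used_gops": 0.0} for i in range(4)]

      def allocate(session_demand_gops: float):
          """Return the cluster id hosting the new transceiver chain, or None if rejected."""
          for cluster in clusters:
              if cluster["capacity_gops"] - cluster["used_gops"] >= session_demand_gops:
                  cluster["used_gops"] += session_demand_gops
                  return cluster["id"]
          return None                                # blocked session request

      requests = [120.0, 250.0, 90.0, 300.0, 180.0, 60.0]
      placements = [allocate(r) for r in requests]
      print("placements:", placements)
      print("occupation:", [round(c["used_gops"] / c["capacity_gops"], 2) for c in clusters])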

  2. ASDEX Upgrade Discharge Control System—A real-time plasma control framework

    International Nuclear Information System (INIS)

    Treutterer, W.; Cole, R.; Lüddecke, K.; Neu, G.; Rapson, C.; Raupp, G.; Zasche, D.; Zehetbauer, T.

    2014-01-01

    Highlights: • The ASDEX Upgrade Discharge Control System (DCS) is a comprehensive control system to conduct fusion experiments. • DCS supports real-time diagnostic integration, adaptable feedback schemes, actuator management and exception handling. • DCS offers workflow management, logging and archiving, self-monitoring and inter-process communication. • DCS is based on a distributed, modular software framework architecture designed for real-time operation. • DCS is composed of re-usable generic but highly customisable components. - Abstract: ASDEX Upgrade is a fusion experiment with a size and complexity to allow extrapolation of technical and physical conditions and requirements to devices like ITER and even beyond. In addressing advanced physics topics it makes extensive use of sophisticated real-time control methods. It comprises real-time diagnostic integration, dynamically adaptable multivariable feedback schemes, actuator management including load distribution schemes and a powerful monitoring and pulse supervision concept based on segment scheduling and exception handling. The Discharge Control System (DCS) supplies all this functionality on base of a modular software framework architecture designed for real-time operation. It provides system-wide services like workflow management, logging and archiving, self-monitoring and inter-process communication on Linux, VxWorks and Solaris operating systems. By default DCS supports distributed computing, and a communication layer allows multi-directional signal transfer and data-driven process synchronisation over shared memory as well as over a number of real-time networks. The entire system is built following the same common design concept combining a rich set of re-usable generic but highly customisable components with a configuration-driven component deployment method. We will give an overview on the architectural concepts as well as on the outstanding capabilities of DCS in the domains of inter

  3. ASDEX Upgrade Discharge Control System—A real-time plasma control framework

    Energy Technology Data Exchange (ETDEWEB)

    Treutterer, W., E-mail: Wolfgang.Treutterer@ipp.mpg.de [Max-Planck-Institut für Plasmaphysik, EURATOM Association, Boltzmannstraße 2, 85748 Garching (Germany); Cole, R.; Lüddecke, K. [Unlimited Computer Systems GmbH, Iffeldorf (Germany); Neu, G.; Rapson, C.; Raupp, G.; Zasche, D.; Zehetbauer, T. [Max-Planck-Institut für Plasmaphysik, EURATOM Association, Boltzmannstraße 2, 85748 Garching (Germany)

    2014-03-15

    Highlights: • The ASDEX Upgrade Discharge Control System (DCS) is a comprehensive control system to conduct fusion experiments. • DCS supports real-time diagnostic integration, adaptable feedback schemes, actuator management and exception handling. • DCS offers workflow management, logging and archiving, self-monitoring and inter-process communication. • DCS is based on a distributed, modular software framework architecture designed for real-time operation. • DCS is composed of re-usable generic but highly customisable components. - Abstract: ASDEX Upgrade is a fusion experiment with a size and complexity to allow extrapolation of technical and physical conditions and requirements to devices like ITER and even beyond. In addressing advanced physics topics it makes extensive use of sophisticated real-time control methods. It comprises real-time diagnostic integration, dynamically adaptable multivariable feedback schemes, actuator management including load distribution schemes and a powerful monitoring and pulse supervision concept based on segment scheduling and exception handling. The Discharge Control System (DCS) supplies all this functionality on base of a modular software framework architecture designed for real-time operation. It provides system-wide services like workflow management, logging and archiving, self-monitoring and inter-process communication on Linux, VxWorks and Solaris operating systems. By default DCS supports distributed computing, and a communication layer allows multi-directional signal transfer and data-driven process synchronisation over shared memory as well as over a number of real-time networks. The entire system is built following the same common design concept combining a rich set of re-usable generic but highly customisable components with a configuration-driven component deployment method. We will give an overview on the architectural concepts as well as on the outstanding capabilities of DCS in the domains of inter

  4. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  5. A High-Throughput Computational Framework for Identifying Significant Copy Number Aberrations from Array Comparative Genomic Hybridisation Data

    Directory of Open Access Journals (Sweden)

    Ian Roberts

    2012-01-01

    Full Text Available Reliable identification of copy number aberrations (CNA) from comparative genomic hybridization data would be improved by the availability of a generalised method for processing large datasets. To this end, we developed swatCGH, a data analysis framework and region detection heuristic for computational grids. swatCGH analyses sequentially displaced (sliding) windows of neighbouring probes and applies adaptive thresholds of varying stringency to identify the 10% of each chromosome that contains the most frequently occurring CNAs. We used the method to analyse a published dataset, comparing data preprocessed using four different DNA segmentation algorithms, and two methods for prioritising the detected CNAs. The consolidated list of the most commonly detected aberrations confirmed the value of swatCGH as a simplified high-throughput method for identifying biologically significant CNA regions of interest.
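    The sliding-window idea can be sketched as follows; the window length, step and threshold choices are illustrative assumptions, and the published heuristic is more elaborate.

      import numpy as np

      # Sketch of the sliding-window idea only, on simulated probe log2 ratios.
      rng = np.random.default_rng(4)
      log2_ratios = rng.normal(0.0, 0.2, size=1000)       # probes along one chromosome
      log2_ratios[300:340] += 0.8                          # a simulated gain

      def sliding_window_means(values, window=10, step=2):
          """Mean log2 ratio of sequentially displaced windows of neighbouring probes."""
          starts = np.arange(0, len(values) - window + 1, step)
          return starts, np.array([values[s:s + window].mean() for s in starts])

      starts, window_means = sliding_window_means(log2_ratios)

      # Adaptive threshold: keep the 10% of windows with the largest absolute deviation.
      threshold = np.quantile(np.abs(window_means), 0.90)
      candidate_windows = starts[np.abs(window_means) >= threshold]
      print(f"{len(candidate_windows)} candidate windows; "
            f"gain region flagged: {np.any((candidate_windows >= 290) & (candidate_windows <= 340))}")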

  6. A Probabilistic Framework for Curve Evolution

    DEFF Research Database (Denmark)

    Dahl, Vedrana Andersen

    2017-01-01

    approach include ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-sets) curves. The experimental results on composed and natural images demonstrate...

  7. PSE For Solvent Applications: A Generic Computer-aided Solvent Selection and Design Framework

    DEFF Research Database (Denmark)

    Mitrofanov, Igor; Sin, Gürkan; Gani, Rafiqul

    system engineering view that emphasizes a systematic and generic solution framework to solvent selection problems is presented. The framework integrates different methods and tools to manage the complexity and solve a wide range of problems in efficient and flexible manner. Its software implementation...

  8. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  9. A computational framework for converting textual clinical diagnostic criteria into the quality data model.

    Science.gov (United States)

    Hong, Na; Li, Dingcheng; Yu, Yue; Xiu, Qiongying; Liu, Hongfang; Jiang, Guoqian

    2016-10-01

    Constructing standard and computable clinical diagnostic criteria is an important but challenging research field in the clinical informatics community. The Quality Data Model (QDM) is emerging as a promising information model for standardizing clinical diagnostic criteria. To develop and evaluate automated methods for converting textual clinical diagnostic criteria in a structured format using QDM. We used a clinical Natural Language Processing (NLP) tool known as cTAKES to detect sentences and annotate events in diagnostic criteria. We developed a rule-based approach for assigning the QDM datatype(s) to an individual criterion, whereas we invoked a machine learning algorithm based on the Conditional Random Fields (CRFs) for annotating attributes belonging to each particular QDM datatype. We manually developed an annotated corpus as the gold standard and used standard measures (precision, recall and f-measure) for the performance evaluation. We harvested 267 individual criteria with the datatypes of Symptom and Laboratory Test from 63 textual diagnostic criteria. We manually annotated attributes and values in 142 individual Laboratory Test criteria. The average performance of our rule-based approach was 0.84 of precision, 0.86 of recall, and 0.85 of f-measure; the performance of CRFs-based classification was 0.95 of precision, 0.88 of recall and 0.91 of f-measure. We also implemented a web-based tool that automatically translates textual Laboratory Test criteria into the QDM XML template format. The results indicated that our approaches leveraging cTAKES and CRFs are effective in facilitating diagnostic criteria annotation and classification. Our NLP-based computational framework is a feasible and useful solution in developing diagnostic criteria representation and computerization. Copyright © 2016 Elsevier Inc. All rights reserved.
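    The rule-based datatype assignment step can be sketched as below; the keyword rules are hypothetical stand-ins for the cTAKES-based rules described above, and the CRF attribute-tagging stage is not shown.

      import re

      # Hypothetical keyword rules for illustration; the published approach uses cTAKES
      # annotations and a richer rule set, plus a CRF model for attribute extraction.
      DATATYPE_RULES = {
          "Laboratory Test": [r"\bserum\b", r"\bplasma\b", r"\blevel\b", r"\b(mg|mmol|iu)/(dl|l)\b"],
          "Symptom":         [r"\bpain\b", r"\bfever\b", r"\bfatigue\b", r"\brash\b"],
      }

      def assign_qdm_datatypes(criterion: str) -> list:
          """Return every QDM datatype whose keyword rules match the criterion text."""
          text = criterion.lower()
          matched = [dt for dt, patterns in DATATYPE_RULES.items()
                     if any(re.search(p, text) for p in patterns)]
          return matched or ["Unclassified"]

      criteria = [
          "Serum ferritin level greater than 500 mg/dl",
          "Persistent fever for more than 5 days",
      ]
      for c in criteria:
          print(assign_qdm_datatypes(c), "<-", c)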

  10. User-customized brain computer interfaces using Bayesian optimization.

    Science.gov (United States)

    Bashashati, Hossein; Ward, Rabab K; Bashashati, Ali

    2016-04-01

    The brain characteristics of different people are not the same. Brain computer interfaces (BCIs) should thus be customized for each individual person. In motor-imagery based synchronous BCIs, a number of parameters (referred to as hyper-parameters) including the EEG frequency bands, the channels and the time intervals from which the features are extracted should be pre-determined based on each subject's brain characteristics. To determine the hyper-parameter values, previous work has relied on manual or semi-automatic methods that are not applicable to high-dimensional search spaces. In this paper, we propose a fully automatic, scalable and computationally inexpensive algorithm that uses Bayesian optimization to tune these hyper-parameters. We then build different classifiers trained on the sets of hyper-parameter values proposed by the Bayesian optimization. A final classifier aggregates the results of the different classifiers. We have applied our method to 21 subjects from three BCI competition datasets. We have conducted rigorous statistical tests, and have shown the positive impact of hyper-parameter optimization in improving the accuracy of BCIs. Furthermore, we have compared our results to those reported in the literature. Unlike the best reported results in the literature, which are based on more sophisticated feature extraction and classification methods, and rely on prestudies to determine the hyper-parameter values, our method has the advantage of being fully automated, uses less sophisticated feature extraction and classification methods, and yields similar or superior results compared to the best performing designs in the literature.
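
    The hyper-parameter tuning described above can be reproduced in spirit with any general-purpose Bayesian optimizer. The sketch below assumes the scikit-optimize (skopt) package and an invented search space and surrogate objective; a real objective would return the negative cross-validated accuracy of a classifier trained on features extracted with the candidate bands, channels and time windows.

    ```python
    # Minimal Bayesian-optimization sketch for BCI-style hyper-parameter tuning.
    # Assumes the scikit-optimize (skopt) package; the search space and the
    # objective are illustrative placeholders, not the authors' pipeline.
    from skopt import gp_minimize
    from skopt.space import Integer, Real

    # Hypothetical hyper-parameters: EEG band edges (Hz) and epoch window (s).
    space = [
        Real(4.0, 12.0, name="band_low_hz"),
        Real(16.0, 40.0, name="band_high_hz"),
        Integer(1, 4, name="window_s"),
    ]

    def objective(params):
        """Return the negative estimated accuracy for a candidate setting.

        A real implementation would band-pass filter the EEG, extract features
        from the chosen window, train a classifier, and cross-validate.  A
        smooth synthetic surrogate stands in here so the sketch runs on its own.
        """
        low, high, window = params
        return -(1.0 - abs(low - 8.0) / 20.0 - abs(high - 30.0) / 60.0 - abs(window - 2) / 10.0)

    result = gp_minimize(objective, space, n_calls=25, random_state=0)
    print("best setting:", result.x, "estimated accuracy:", -result.fun)
    ```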

  11. Computing Diffeomorphic Paths for Large Motion Interpolation.

    Science.gov (United States)

    Seo, Dohyung; Jeffrey, Ho; Vemuri, Baba C

    2013-06-01

    In this paper, we introduce a novel framework for computing a path of diffeomorphisms between a pair of input diffeomorphisms. Direct computation of a geodesic path on the space of diffeomorphisms Diff(Ω) is difficult, mainly because of the infinite dimensionality of Diff(Ω). Our proposed framework, to some degree, bypasses this difficulty using the quotient map of Diff(Ω) to the quotient space Diff(M)/Diff(M)_μ obtained by quotienting out the subgroup of volume-preserving diffeomorphisms Diff(M)_μ. This quotient space was recently identified in the mathematics literature as the unit sphere in a Hilbert space, a space with well-known geometric properties. Our framework leverages this recent result by computing the diffeomorphic path in two stages. First, we project the given diffeomorphism pair onto this sphere and then compute the geodesic path between these projected points. Second, we lift the geodesic on the sphere back to the space of diffeomorphisms, by solving a quadratic programming problem with bilinear constraints using the augmented Lagrangian technique with penalty terms. In this way, we can estimate the path of diffeomorphisms, first, staying in the space of diffeomorphisms, and second, preserving shapes/volumes in the deformed images along the path as much as possible. We have applied our framework to interpolate intermediate frames of frame-sub-sampled video sequences. In the reported experiments, our approach compares favorably with the popular Large Deformation Diffeomorphic Metric Mapping (LDDMM) framework.
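
    The first stage above reduces to computing a great-circle geodesic between two points on the unit sphere of a Hilbert space. The following minimal numpy sketch illustrates that step on a finite-dimensional stand-in; the projection onto the sphere and the lifting back to diffeomorphisms described in the record are not reproduced here.

    ```python
    # Spherical geodesic (slerp) between two unit vectors, illustrating the
    # first stage of the framework above on a finite-dimensional stand-in for
    # the Hilbert sphere.  The projection and lifting stages are omitted.
    import numpy as np

    def sphere_geodesic(p, q, num_steps=5):
        """Return points along the great-circle arc from p to q (unit norm)."""
        p = p / np.linalg.norm(p)
        q = q / np.linalg.norm(q)
        theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
        if np.isclose(theta, 0.0):
            return [p.copy() for _ in range(num_steps)]
        ts = np.linspace(0.0, 1.0, num_steps)
        return [
            (np.sin((1 - t) * theta) * p + np.sin(t * theta) * q) / np.sin(theta)
            for t in ts
        ]

    path = sphere_geodesic(np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
    print([np.round(x, 3) for x in path])
    ```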

  12. Impact of sophisticated fog spray models on accident analyses

    International Nuclear Information System (INIS)

    Roblyer, S.P.; Owzarski, P.C.

    1978-01-01

    The N-Reactor confinement system release dose to the public in a postulated accident is reduced by washing the confinement atmosphere with fog sprays. This allows a low-pressure release of confinement atmosphere containing fission products through filters and out an elevated stack. The current accident analysis required revision of the CORRAL code and other codes such as CONTEMPT to properly model the N Reactor confinement into a system of multiple fog-sprayed compartments. In revising these codes, more sophisticated models for the fog sprays and iodine plateout were incorporated to remove some of the conservatism in the steam condensing rate, fission product washout and iodine plateout assumptions used in previous studies. The CORRAL code, which was used to describe the transport and deposition of airborne fission products in LWR containment systems for the Rasmussen Study, was revised to describe fog spray removal of molecular iodine (I2) and particulates in multiple compartments for sprays having individual characteristics of on-off times, flow rates, fall heights, and drop sizes in changing containment atmospheres. During postulated accidents, the code determined the fission product removal rates internally rather than from input decontamination factors. A discussion is given of how the calculated plateout and washout rates vary with time throughout the analysis. The results of the accident analyses indicated that more credit could be given to fission product washout and plateout. An important finding was that the release of fission products to the atmosphere and adsorption of fission products on the filters were significantly lower than previous studies had indicated.

  13. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    Science.gov (United States)

    de Sá, Fábio P; Zina, Juliana; Haddad, Célio F B

    2016-01-01

    Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  14. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    Directory of Open Access Journals (Sweden)

    Fábio P de Sá

    Full Text Available Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  15. A computational framework for ultrastructural mapping of neural circuitry.

    Directory of Open Access Journals (Sweden)

    James R Anderson

    2009-03-01

    Full Text Available Circuitry mapping of metazoan neural systems is difficult because canonical neural regions (regions containing one or more copies of all components are large, regional borders are uncertain, neuronal diversity is high, and potential network topologies so numerous that only anatomical ground truth can resolve them. Complete mapping of a specific network requires synaptic resolution, canonical region coverage, and robust neuronal classification. Though transmission electron microscopy (TEM remains the optimal tool for network mapping, the process of building large serial section TEM (ssTEM image volumes is rendered difficult by the need to precisely mosaic distorted image tiles and register distorted mosaics. Moreover, most molecular neuronal class markers are poorly compatible with optimal TEM imaging. Our objective was to build a complete framework for ultrastructural circuitry mapping. This framework combines strong TEM-compliant small molecule profiling with automated image tile mosaicking, automated slice-to-slice image registration, and gigabyte-scale image browsing for volume annotation. Specifically we show how ultrathin molecular profiling datasets and their resultant classification maps can be embedded into ssTEM datasets and how scripted acquisition tools (SerialEM, mosaicking and registration (ir-tools, and large slice viewers (MosaicBuilder, Viking can be used to manage terabyte-scale volumes. These methods enable large-scale connectivity analyses of new and legacy data. In well-posed tasks (e.g., complete network mapping in retina, terabyte-scale image volumes that previously would require decades of assembly can now be completed in months. Perhaps more importantly, the fusion of molecular profiling, image acquisition by SerialEM, ir-tools volume assembly, and data viewers/annotators also allow ssTEM to be used as a prospective tool for discovery in nonneural systems and a practical screening methodology for neurogenetics. Finally

  16. A Framework on Collaboration: an Interdisciplinary Project across Multiple Colleges

    Directory of Open Access Journals (Sweden)

    Andis Kwan

    2007-06-01

    Full Text Available The order of complexity in carrying out collaborative research at multiple campuses poses a challenge to standard knowledge management systems. In this paper, we present a collaboration framework in which computer science students work in partnership with computer scientists, mathematicians and physicists on an emerging field of research, quantum information science. We first develop a few heuristic criteria to determine what makes a project successful. We then demonstrate that our knowledge management systems produce publishable results and grant proposals within our framework.

  17. Computer-aided drug design at Boehringer Ingelheim

    Science.gov (United States)

    Muegge, Ingo; Bergner, Andreas; Kriegl, Jan M.

    2017-03-01

    Computer-Aided Drug Design (CADD) is an integral part of the drug discovery endeavor at Boehringer Ingelheim (BI). CADD contributes to the evaluation of new therapeutic concepts, identifies small-molecule starting points for drug discovery, and develops strategies for optimizing hit and lead compounds. The CADD scientists at BI benefit from the global use and development of both software platforms and computational services. A number of computational techniques developed in-house have significantly changed the way early drug discovery is carried out at BI. In particular, virtual screening in vast chemical spaces, which can be accessed by combinatorial chemistry, has added a new option for the identification of hits in many projects. Recently, a new framework has been implemented allowing fast, interactive predictions of relevant on- and off-target endpoints and other optimization parameters. In addition to the introduction of this new framework at BI, CADD has been focusing on enabling medicinal chemists to independently perform an increasing amount of molecular modeling and design work. This is made possible through the deployment of MOE as a global modeling platform, allowing computational and medicinal chemists to freely share ideas and modeling results. Furthermore, a central communication layer called the computational chemistry framework provides broad access to predictive models and other computational services.

  18. IPTV Service Framework Based on Secure Authentication and Lightweight Content Encryption for Screen-Migration in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Aymen Abdullah Alsaffar

    2015-01-01

    Full Text Available These days, the advancing capabilities of smart devices (e.g. smart phones, tablets, PCs, etc.) and the increase in internet bandwidth enable IPTV service providers to extend their services to smart mobile devices. Users can receive their IPTV service on any smart device by accessing the internet via a wireless network from anywhere at any time, which is convenient for users. However, wireless network communication has well-known critical security threats and vulnerabilities affecting user smart devices and IPTV services, such as user identity theft, replay attacks, man-in-the-middle (MIM) attacks, and so forth. A secure authentication mechanism for user devices and a multimedia protection mechanism are necessary to protect both user devices and IPTV services. As a result, we propose a framework for IPTV service based on a secure authentication mechanism and a lightweight content encryption method for screen-migration in Cloud computing. We use a cryptographic nonce combined with the user ID and password to authenticate the user device at any mobile terminal it connects to. In addition, we use lightweight content encryption to protect the content and reduce the decoding overhead at mobile terminals. Our proposed authentication mechanism reduces the computational processing by 30% compared to other authentication mechanisms, and our lightweight content encryption reduces the encryption delay to 0.259 seconds.
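
    The authentication step above combines a cryptographic nonce with the user ID and password. The sketch below shows a typical nonce challenge-response of that kind using only the Python standard library; it illustrates the general technique rather than reproducing the paper's protocol, and the key-derivation choice (PBKDF2 salted with the user ID) is a deliberate simplification.

    ```python
    # Minimal nonce-based challenge-response sketch in the spirit of the
    # authentication step above; standard-library only, illustrative rather
    # than a faithful reproduction of the paper's protocol.
    import hashlib
    import hmac
    import secrets

    def derive_key(user_id: str, password: str) -> bytes:
        """Derive a per-user key from ID and password (PBKDF2; salt simplified)."""
        return hashlib.pbkdf2_hmac("sha256", password.encode(), user_id.encode(), 100_000)

    def server_challenge() -> bytes:
        """Server issues a fresh random nonce for each authentication attempt."""
        return secrets.token_bytes(16)

    def client_response(key: bytes, nonce: bytes) -> bytes:
        """Client proves knowledge of the key by MACing the server's nonce."""
        return hmac.new(key, nonce, hashlib.sha256).digest()

    def server_verify(key: bytes, nonce: bytes, response: bytes) -> bool:
        expected = hmac.new(key, nonce, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    key = derive_key("alice", "correct horse battery staple")
    nonce = server_challenge()
    print(server_verify(key, nonce, client_response(key, nonce)))  # True
    ```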

  19. Design and performance analysis of solid-propellant rocket motors using a simplified computer program

    Science.gov (United States)

    Sforzini, R. H.

    1972-01-01

    An analysis and a computer program are presented which represent a compromise between the more sophisticated programs using precise burning geometric relations and the textbook type of solutions. The program requires approximately 900 computer cards including a set of 20 input data cards required for a typical problem. The computer operating time for a single configuration is approximately 1 minute and 30 seconds on the IBM 360 computer. About 1 minute and 15 seconds of the time is compilation time so that additional configurations input at the same time require approximately 15 seconds each. The program uses approximately 11,000 words on the IBM 360. The program is written in FORTRAN 4 and is readily adaptable for use on a number of different computers: IBM 7044, IBM 7094, and Univac 1108.

  20. NATO Advanced Research Workshop on Exploiting Mental Imagery with Computers in Mathematics Education

    CERN Document Server

    Mason, John

    1995-01-01

    The advent of fast and sophisticated computer graphics has brought dynamic and interactive images under the control of professional mathematicians and mathematics teachers. This volume in the NATO Special Programme on Advanced Educational Technology takes a comprehensive and critical look at how the computer can support the use of visual images in mathematical problem solving. The contributions are written by researchers and teachers from a variety of disciplines including computer science, mathematics, mathematics education, psychology, and design. Some focus on the use of external visual images and others on the development of individual mental imagery. The book is the first collected volume in a research area that is developing rapidly, and the authors pose some challenging new questions.

  1. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  2. A Verification Framework for Agent Communication

    NARCIS (Netherlands)

    Eijk, R.M. van; Boer, F.S. de; Hoek, W. van der; Meyer, J-J.Ch.

    2003-01-01

    In this paper, we introduce a verification method for the correctness of multiagent systems as described in the framework of acpl (Agent Communication Programming Language). The computational model of acpl consists of an integration of the two different paradigms of ccp (Concurrent Constraint

  3. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.
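
    The aggregation step described above must weigh reports by source reliability before combining them. As a strongly simplified illustration of that flavor of aggregation (and not the caBKB formalism itself), the sketch below pools probabilistic reports about a single hypothesis using reliability-weighted log-odds; the reports and weights are invented.

    ```python
    # Toy reliability-weighted aggregation of probabilistic reports about a
    # single hypothesis, illustrating the flavor of the "information
    # aggregation" step above; this is not the caBKB formalism itself.
    import math

    def pool_reports(reports):
        """Combine (probability, reliability) pairs via weighted log-odds pooling."""
        logit = 0.0
        for p, reliability in reports:
            p = min(max(p, 1e-6), 1 - 1e-6)      # keep log-odds finite
            logit += reliability * math.log(p / (1 - p))
        return 1.0 / (1.0 + math.exp(-logit))

    # Three ISR reports on the same hypothesis with differing reliabilities.
    reports = [(0.9, 0.8), (0.7, 0.5), (0.2, 0.1)]
    print(round(pool_reports(reports), 3))
    ```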

  4. Big data analysis framework for healthcare and social sectors in Korea.

    Science.gov (United States)

    Song, Tae-Min; Ryu, Seewon

    2015-01-01

    We reviewed applications of big data analysis of healthcare and social services in developed countries, and subsequently devised a framework for such an analysis in Korea. We reviewed the status of implementing big data analysis of health care and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Developed countries (e.g., the United States, the UK, Singapore, Australia, and even the OECD and EU) are emphasizing the potential of big data, and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on an ICT-based policy of the current government and the strategic goals of the Ministry of Health and Welfare. We suggest separate frameworks of big data analysis for the healthcare and welfare service sectors and assign them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. There are some concerns with the utilization of big data in the healthcare and social welfare sectors. Thus, research on these issues must be conducted so that sophisticated and practical solutions can be reached.

  5. Computing at Belle II

    International Nuclear Information System (INIS)

    Kuhr, Thomas

    2012-01-01

    Belle II, a next-generation B-factory experiment, will search for new physics effects in a data sample about 50 times larger than the one collected by its predecessor, the Belle experiment. To match the advances in accelerator and detector technology, the computing system and the software have to be upgraded as well. The Belle II computing model is presented and an overview of the distributed computing system and the offline software framework is given.

  6. Layered architecture for quantum computing

    OpenAIRE

    Jones, N. Cody; Van Meter, Rodney; Fowler, Austin G.; McMahon, Peter L.; Kim, Jungsang; Ladd, Thaddeus D.; Yamamoto, Yoshihisa

    2010-01-01

    We develop a layered quantum-computer architecture, which is a systematic framework for tackling the individual challenges of developing a quantum computer while constructing a cohesive device design. We discuss many of the prominent techniques for implementing circuit-model quantum computing and introduce several new methods, with an emphasis on employing surface-code quantum error correction. In doing so, we propose a new quantum-computer architecture based on optical control of quantum dot...

  7. SCALING AN URBAN EMERGENCY EVACUATION FRAMEWORK: CHALLENGES AND PRACTICES

    Energy Technology Data Exchange (ETDEWEB)

    Karthik, Rajasekar [ORNL; Lu, Wei [ORNL

    2014-01-01

    Critical infrastructure disruption, caused by severe weather events, natural disasters, terrorist attacks, etc., has significant impacts on urban transportation systems. We built a computational framework to simulate urban transportation systems under critical infrastructure disruption in order to aid real-time emergency evacuation. This framework uses large-scale datasets to provide a scalable tool for emergency planning and management. Our framework, World-Wide Emergency Evacuation (WWEE), integrates population distribution and urban infrastructure networks to model travel demand in emergency situations at the global level. Also, a computational model of agent-based traffic simulation is used to provide an optimal evacuation plan for traffic operation purposes [1]. In addition, our framework provides a web-based, high-resolution visualization tool for emergency evacuation modelers and practitioners. We have successfully tested our framework with scenarios in both the United States (Alexandria, VA) and Europe (Berlin, Germany) [2]. However, there are still some major drawbacks for scaling this framework to handle big data workloads in real time. On our back-end, lack of proper infrastructure limits our ability to process large amounts of data, run the simulation efficiently and quickly, and provide fast retrieval and serving of data. On the front-end, the visualization performance of microscopic evacuation results is still not efficient enough due to high-volume data communication between server and client. We are addressing these drawbacks by using cloud computing and next-generation web technologies, namely Node.js, NoSQL, WebGL, Open Layers 3 and HTML5 technologies. We will briefly describe each one and how we are using and leveraging these technologies to provide an efficient tool for emergency management organizations. Our early experimentation demonstrates that using the above technologies is a promising approach to build a scalable and high-performance urban

  8. Computer simulation boosts automation in the stockyard

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-04-01

    Today's desktop computers and advanced software keep pace with handling equipment to reach new heights of sophistication, with graphic simulation able to show precisely what is happening, and what could happen, in a coal terminal's stockyard. The article describes an innovative coal terminal nearing completion on the Pacific coast at Lazaro Cardenas in Mexico, called the Petracalco terminal. Here coal is unloaded, stored and fed to the nearby power plant of Pdte Plutarco Elias Calles. The R & D department of the Italian company Techint, Italimpianti has developed MHATIS, a sophisticated software system for marine terminal management here, allowing analysis of performance with the use of graphical animation. Strategies can be tested before being put into practice and likely power station demand can be predicted. The design and operation of the MHATIS system is explained. Other integrated coal handling plants described in the article are the one developed by the then PWH (renamed Krupp Foerdertechnik) of Germany for the Israel Electric Corporation and the installation by the same company of a further bucketwheel for a redesigned coal stockyard at the Port of Hamburg operated by Hansaport. 1 fig., 4 photos.

  9. Decomposing dendrophilia. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    Science.gov (United States)

    Honing, Henkjan; Zuidema, Willem

    2014-09-01

    The future of cognitive science will be about bridging neuroscience and behavioral studies, with essential roles played by comparative biology, formal modeling, and the theory of computation. Nowhere will this integration be more strongly needed than in understanding the biological basis of language and music. We thus strongly sympathize with the general framework that Fitch [1] proposes, and welcome the remarkably broad and readable review he presents to support it.

  10. Frameworks Coordinate Scientific Data Management

    Science.gov (United States)

    2012-01-01

    Jet Propulsion Laboratory computer scientists developed a unique software framework to help NASA manage its massive amounts of science data. Through a partnership with the Apache Software Foundation of Forest Hill, Maryland, the technology is now available as an open-source solution and is in use by cancer researchers and pediatric hospitals.

  11. Real Time Animation of Trees Based on BBSC in Computer Games

    Directory of Open Access Journals (Sweden)

    Xuefeng Ao

    2009-01-01

    Full Text Available Researchers in the field of computer games usually find it difficult to simulate the motion of actual 3D tree models because the tree model itself has a very complicated structure and many sophisticated factors need to be considered during the simulation. Though there are some works on simulating 3D trees and their motion, few of them are used in computer games due to the high demand for real-time performance in computer games. In this paper, an approach to animating trees in computer games based on a novel tree model representation, Ball B-Spline Curves (BBSCs), is proposed. By taking advantage of the good features of the BBSC-based model, physical simulation of the motion of leafless trees in wind becomes easier and more efficient. The method can generate realistic 3D tree animation in real time, which meets the high requirement for real-time performance in computer games.

  12. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available Traditional computational models for enterprise software are still, to a great extent, centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Toward the development of a new, complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style, and the actor model of computation. As a result, a new resources-based framework arises which, after its first cases of use, appears to be useful and worthy of further research.
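
    The synthesis above rests on the actor model: each resource owns its state and processes messages one at a time. A minimal asyncio-based sketch of such an "active resource" is shown below; the OrderResource class and its messages are hypothetical illustrations, not part of the framework proposed in the record.

    ```python
    # Minimal actor-style "active resource" sketch using asyncio; the resource
    # name and messages are hypothetical and not taken from the framework above.
    import asyncio

    class OrderResource:
        """An active resource: owns its state and processes messages serially."""

        def __init__(self):
            self._inbox: asyncio.Queue = asyncio.Queue()
            self._state = {"status": "new"}

        async def run(self):
            while True:
                message, reply = await self._inbox.get()
                if message == "approve":
                    self._state["status"] = "approved"
                reply.set_result(dict(self._state))   # reply with a state snapshot

        async def send(self, message):
            reply = asyncio.get_running_loop().create_future()
            await self._inbox.put((message, reply))
            return await reply

    async def main():
        order = OrderResource()
        worker = asyncio.create_task(order.run())
        print(await order.send("approve"))   # {'status': 'approved'}
        worker.cancel()

    asyncio.run(main())
    ```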

  13. Computers in dental radiography: a scenario for the future

    International Nuclear Information System (INIS)

    Webber, R.L.

    1985-01-01

    The recent emergence of cost-effective computing power makes it possible to integrate sophisticated data-sampling and image-interpretation techniques into dental radiography for the first time. A prototype system is being developed to permit clinical information expressed in three dimensions--plus time--to be made visible almost instantly. The associated X-ray dose for a complete three-dimensional survey of a selected dental region is predicted to be less than that required for a single conventional periapical radiograph exposed on D-speed film

  14. Magnetic field computations for ISX using GFUN-3D

    International Nuclear Information System (INIS)

    Cain, W.D.

    1977-01-01

    This paper presents a comparison between measured magnetic fields and the magnetic fields calculated by the three-dimensional computer program GFUN-3D for the Impurity Study Experiment (ISX). Several iron models are considered, ranging in sophistication from 50 to 222 tetrahedral iron elements. The effects of air gaps and the efforts made to simulate the effects of grain orientation and packing factor are detailed. The results obtained are compared with the measured magnetic fields, and explanations are presented to account for the variations which occur.

  15. Computer aided construction of fault tree

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1982-01-01

    Computer code CAT for the automatic construction of the fault tree is briefly described. Code CAT makes possible simple modelling of components using decision tables; it accelerates the fault tree construction process, constructs fault trees of different complexity, and is capable of harmonized co-operation with programs PREP and KITT 1,2 for fault tree analysis. The efficiency of program CAT, and thus the accuracy and completeness of the fault trees constructed, depend significantly on the compilation and sophistication of the decision tables. Currently, program CAT is used in co-operation with programs PREP and KITT 1,2 in reliability analyses of nuclear power plant systems. (B.S.)
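
    Decision tables of the kind mentioned above encode, for each component failure mode, the combinations of causes that produce it, which is exactly the OR/AND structure of a fault-tree branch. The toy sketch below shows that mapping; the pump table is invented and this is not the CAT code itself.

    ```python
    # Toy decision-table expansion of one fault-tree branch, in the spirit of
    # the component modelling described above; the pump table is invented and
    # this is not the CAT code itself.

    # Each entry: output failure mode -> list of cause combinations.  Causes in
    # one combination are ANDed; the combinations themselves are ORed, which is
    # how a decision-table row turns into fault-tree gate logic.
    PUMP_TABLE = {
        "no flow at outlet": [
            ["no flow at inlet"],
            ["pump fails to start"],
            ["loss of electric power", "backup power unavailable"],
        ],
    }

    def expand(event, table, depth=0):
        """Print an OR gate over the cause combinations for a top event."""
        print("  " * depth + f"OR: {event}")
        for causes in table.get(event, []):
            gate = "AND" if len(causes) > 1 else "BASIC"
            print("  " * (depth + 1) + f"{gate}: " + " & ".join(causes))

    expand("no flow at outlet", PUMP_TABLE)
    ```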

  16. CernVM Co-Pilot: a Framework for Orchestrating Virtual Machines Running Applications of LHC Experiments on the Cloud

    International Nuclear Information System (INIS)

    Harutyunyan, A; Sánchez, C Aguado; Blomer, J; Buncic, P

    2011-01-01

    CernVM Co-Pilot is a framework for the delivery and execution of the workload on remote computing resources. It consists of components which are developed to ease the integration of geographically distributed resources (such as commercial or academic computing clouds, or the machines of users participating in volunteer computing projects) into existing computing grid infrastructures. The Co-Pilot framework can also be used to build an ad-hoc computing infrastructure on top of distributed resources. In this paper we present the architecture of the Co-Pilot framework, describe how it is used to execute the jobs of the ALICE and ATLAS experiments, as well as to run the Monte-Carlo simulation application of CERN Theoretical Physics Group.

  17. BaBar computing - From collisions to physics results

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    The BaBar experiment at SLAC studies B-physics at the Upsilon(4S) resonance using the high-luminosity e+e- collider PEP-II at the Stanford Linear Accelerator Center (SLAC). Taking, processing and analyzing the very large data samples is a significant computing challenge. This presentation will describe the entire BaBar computing chain and illustrate the solutions chosen as well as their evolution with the ever-higher luminosity being delivered by PEP-II. This will include data acquisition and software triggering in a high-availability, low-deadtime online environment, a prompt, automated calibration pass through the data at SLAC, and then the full reconstruction of the data that takes place at INFN-Padova within 24 hours. Monte Carlo production takes place in a highly automated fashion in 25+ sites. The resulting real and simulated data is distributed and made available at SLAC and other computing centers. For analysis a much more sophisticated skimming pass has been introduced in the past year, ...

  18. Computer-aided modeling framework – a generic modeling template

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    and test models systematically, efficiently and reliably. In this way, development of products and processes can be made faster, cheaper and more efficient. In this contribution, as part of the framework, a generic modeling template for the systematic derivation of problem specific models is presented....... The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene...

  19. Using polarimetric radar observations and probabilistic inference to develop the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), a novel microphysical parameterization framework

    Science.gov (United States)

    van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.

    2016-12-01

    Microphysical parameterization schemes have reached an impressive level of sophistication: numerous prognostic hydrometeor categories, and either size-resolved (bin) particle size distributions, or multiple prognostic moments of the size distribution. Yet uncertainty in the model representation of microphysical processes, and in the effects of microphysics on numerical simulations of weather, has not decreased commensurately with the advanced sophistication of these schemes. We posit that this may be caused by unconstrained assumptions of these schemes, such as ad-hoc parameter value choices and structural uncertainties (e.g. the choice of a particular form for the size distribution). We present work on the development and observational constraint of a novel microphysical parameterization approach, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), which seeks to address these sources of uncertainty. Our framework avoids unnecessary a priori assumptions, and instead relies on observations to provide probabilistic constraint of the scheme structure and sensitivities to environmental and microphysical conditions. We harness the rich microphysical information content of polarimetric radar observations to develop and constrain BOSS within a Bayesian inference framework using a Markov Chain Monte Carlo sampler (see Kumjian et al., this meeting, for details on the development of an associated polarimetric forward operator). Our work shows how knowledge of microphysical processes is provided by polarimetric radar observations of diverse weather conditions, and which processes remain highly uncertain, even after considering observations.
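
    The observational constraint described above is obtained by sampling the posterior over scheme parameters with MCMC. The following minimal Metropolis-Hastings sketch illustrates the inference loop on a toy one-parameter process model with synthetic observations; it stands in for, and is far simpler than, BOSS and its polarimetric forward operator.

    ```python
    # Minimal Metropolis-Hastings sketch of observationally constraining a
    # single process-rate parameter; the toy forward model and synthetic
    # observations stand in for BOSS and its polarimetric forward operator.
    import numpy as np

    rng = np.random.default_rng(0)

    true_rate = 0.3
    obs = true_rate * np.arange(1.0, 6.0) + rng.normal(0.0, 0.05, size=5)  # synthetic "radar" data
    sigma = 0.05

    def log_posterior(rate):
        if rate <= 0.0 or rate > 1.0:            # flat prior on (0, 1]
            return -np.inf
        predicted = rate * np.arange(1.0, 6.0)   # toy linear forward model
        return -0.5 * np.sum(((obs - predicted) / sigma) ** 2)

    samples, current = [], 0.5
    current_lp = log_posterior(current)
    for _ in range(5000):
        proposal = current + rng.normal(0.0, 0.05)
        proposal_lp = log_posterior(proposal)
        if np.log(rng.uniform()) < proposal_lp - current_lp:   # accept/reject
            current, current_lp = proposal, proposal_lp
        samples.append(current)

    print("posterior mean rate:", round(float(np.mean(samples[1000:])), 3))
    ```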

  20. A Semantic Based Policy Management Framework for Cloud Computing Environments

    Science.gov (United States)

    Takabi, Hassan

    2013-01-01

    Cloud computing paradigm has gained tremendous momentum and generated intensive interest. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force and we need to provide security mechanisms to ensure its secure adoption. In this dissertation, we mainly focus on issues related to policy management and access…