WorldWideScience

Sample records for computer models

  1. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biology.

  2. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    … cell walls and are the source of biofuels and biomaterials. Our modeling investigates their properties. Quantum Mechanical Models: NREL studies chemical and electronic properties and processes to reduce barriers … Computational Modeling: NREL uses computational modeling to increase the …

  3. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  4. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community.

  5. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  6. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  7. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    Models are playing important roles in design and analysis of chemicals based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working … development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines … In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse …

  8. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...

  9. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  10. Overhead Crane Computer Model

    Science.gov (United States)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

    The paper gives a description of a computer model of an overhead crane system. The designed overhead crane system consists of hoisting, trolley and crane mechanisms as well as a two-axis payload system. With the help of the differential equations of motion of the specified mechanisms, derived through the Lagrange equation of the second kind, it is possible to build an overhead crane computer model. The computer model was obtained using Matlab software. Transients of coordinate, linear speed and motor torque of the trolley and crane mechanism systems were simulated. In addition, transients of payload swaying were obtained with respect to the vertical axis. A trajectory of the trolley mechanism with simultaneous operation of the crane mechanism is presented in the paper, as well as a two-axis trajectory of the payload. The designed computer model of an overhead crane is a useful means for studying positioning control and anti-sway control systems.

  11. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo Lee

    2013-12-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naïveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
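
    A note on the sequence-based temporal features mentioned above: the sketch below shows one simple way to turn a stream of nonverbal cues into temporal features by counting ordered cue pairs (bigrams). The cue labels and the choice of bigram counts are illustrative assumptions for this listing, not the features used in the study.

      # Illustrative sequence-based temporal features from a nonverbal-cue stream.
      # Cue labels and the bigram-count feature are assumptions for demonstration.
      from collections import Counter
      from itertools import pairwise  # Python 3.10+

      interaction = ["lean_back", "face_touch", "cross_arms", "face_touch", "lean_back"]

      def bigram_features(cues):
          """Count ordered cue pairs; these act as simple temporal features."""
          return Counter(pairwise(cues))

      features = bigram_features(interaction)
      print(features.most_common(3))  # which cue transitions dominate the interaction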

  12. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the models.

  13. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  14. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for developing asynchronous and concurrent programs compositionally, which frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and be developed compositionally. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for controlling and operating can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have three terminals and four ordered rules at most, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can briefly express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it most applicable to bit-level computation compositionally, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our research may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.

  15. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India, during December 17–19, 2015. The book is enriched with innovations in broad areas of research such as computational modeling, computational intelligence and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing design, analysis and modeling of the aforementioned key areas.

  16. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab]

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  17. Modeling Computer Virus and Its Dynamics

    Directory of Open Access Journals (Sweden)

    Mei Peng

    2013-01-01

    Based on the fact that a computer can be infected by infected and exposed computers, and that some of the computers which are in susceptible or exposed status can gain immunity through antivirus ability, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for the spreading of the computer virus on the internet, is determined. Second, this model has a virus-free equilibrium P0, which means that the infected part of the computers disappears and the virus dies out, and P0 is a globally asymptotically stable equilibrium if R0 < 1; if R0 > 1, then this model has only one viral equilibrium P*, which means that the computer virus persists at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
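
    As a rough illustration of the kind of compartmental virus model described above, the sketch below integrates a susceptible/exposed/infected system with a simple antivirus (cure) term. The equations, parameter values, and the stated R0 expression are assumptions for demonstration only; they are not taken from the paper.

      # Illustrative SEI-style computer-virus model with an antivirus (cure) term.
      import numpy as np

      def simulate(beta=0.5, sigma=0.2, gamma=0.1, mu=0.05, days=200, dt=0.1):
          """Forward-Euler integration of a toy susceptible/exposed/infected model."""
          S, E, I = 0.97, 0.02, 0.01      # fractions of the computer population
          history = []
          for _ in np.arange(0, days, dt):
              new_exposed  = beta * S * I        # susceptible machines contacting infected ones
              new_infected = sigma * E           # exposed machines becoming actively infected
              cleaned      = gamma * I + mu * E  # antivirus cleans infected and exposed machines
              S += dt * (-new_exposed + cleaned)
              E += dt * (new_exposed - new_infected - mu * E)
              I += dt * (new_infected - gamma * I)
              history.append((S, E, I))
          return np.array(history)

      # For this toy model the threshold is R0 = beta * sigma / ((sigma + mu) * gamma):
      # the infection dies out when R0 < 1 and persists at an endemic level when R0 > 1.
      print(simulate()[-1])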

  18. COMPUTATIONAL MODELS FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar

    2011-01-01

    Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...

  19. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

    The study of biological systems demands computational support. If targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding on potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more so when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to decide on the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
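
    To make the retrieval-and-ranking idea above concrete, here is a minimal sketch that ranks models against a query by TF-IDF cosine-style scoring over their annotation text. The toy corpus, tokenization, and scoring details are assumptions for illustration; they do not reflect the actual BioModels Database implementation.

      # Toy ranked retrieval over model annotation text (illustrative only).
      import math
      from collections import Counter

      models = {
          "MODEL_A": "calcium oscillation signalling hepatocyte",
          "MODEL_B": "glycolysis yeast oscillation metabolic pathway",
          "MODEL_C": "MAPK cascade signalling phosphorylation",
      }

      def tfidf_vectors(docs):
          df = Counter(t for text in docs.values() for t in set(text.split()))
          n = len(docs)
          return {name: {t: tf * math.log(n / df[t])   # weight is zero if a term occurs in every doc
                         for t, tf in Counter(text.split()).items()}
                  for name, text in docs.items()}

      def rank(query, docs):
          vecs = tfidf_vectors(docs)
          q = Counter(query.split())
          def score(v):
              dot = sum(w * q[t] for t, w in v.items())
              norm = math.sqrt(sum(w * w for w in v.values())) or 1.0
              return dot / norm
          return sorted(docs, key=lambda name: score(vecs[name]), reverse=True)

      print(rank("oscillation signalling", models))   # model identifiers in relevance order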

  20. CMS computing model evolution

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  1. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  2. The IceCube Computing Infrastructure Model

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Besides the big LHC experiments, a number of mid-size experiments are coming online which need to define new computing models to meet the processing and storage requirements of those experiments. We present the hybrid computing model of IceCube, which leverages Grid models with a more flexible direct user model, as an example of a possible solution. In IceCube a central datacenter at UW-Madison serves as Tier-0, with a single Tier-1 datacenter at DESY Zeuthen. We describe the setup of the IceCube computing infrastructure and report on our experience in successfully provisioning the IceCube computing needs.

  3. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that the current trends of strengthening the general educational and worldview functions of computer science define the necessity of additional research of the…

  4. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  5. Modelling computer networks

    International Nuclear Information System (INIS)

    Max, G

    2011-01-01

    Traffic models in computer networks can be described as a complicated system. These systems show non-linear features, and simulating the behaviour of these systems is also difficult. Before implementing network equipment, users want to know the capability of their computer network. They do not want the servers to be overloaded during temporary traffic peaks when more requests arrive than the server is designed for. As a starting point for our study, a non-linear system model of network traffic is established to examine the behaviour of the planned network. The paper presents the setting up of a non-linear simulation model that helps us to observe dataflow problems of the networks. This simple model captures the relationship between the competing traffic and the input and output dataflow. In this paper, we also focus on measuring the bottleneck of the network, which was defined as the difference between the link capacity and the competing traffic volume on the link that limits end-to-end throughput. We validate the model using measurements on a working network. The results show that the initial model estimates the main behaviours and critical parameters of the network well. Based on this study, we propose to develop a new algorithm which experimentally determines and predicts the available parameters of the modelled network.
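
    As a small worked example of the bottleneck definition used above (available bandwidth on a link is its capacity minus the competing traffic, and the bottleneck is the link that limits end-to-end throughput), the sketch below computes it for a hypothetical three-link path; all names and numbers are invented.

      # Bottleneck estimation for a hypothetical path (illustrative values only).
      links = [
          {"name": "access",     "capacity_mbps": 100.0,  "competing_mbps": 20.0},
          {"name": "backbone",   "capacity_mbps": 1000.0, "competing_mbps": 850.0},
          {"name": "server_lan", "capacity_mbps": 100.0,  "competing_mbps": 60.0},
      ]

      # Available bandwidth per link = capacity - competing traffic.
      available = {l["name"]: l["capacity_mbps"] - l["competing_mbps"] for l in links}
      bottleneck = min(available, key=available.get)

      print(available)
      print("bottleneck:", bottleneck, "limits end-to-end throughput to ~",
            available[bottleneck], "Mbit/s")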

  6. Mathematical Modeling and Computational Thinking

    Science.gov (United States)

    Sanford, John F.; Naidu, Jaideep T.

    2017-01-01

    The paper argues that mathematical modeling is the essence of computational thinking. Learning a computer language is a valuable assistance in learning logical thinking but of less assistance when learning problem-solving skills. The paper is the third in a series and presents some examples of mathematical modeling using spreadsheets at an advanced…

  7. LHCb computing model

    CERN Document Server

    Frank, M; Pacheco, Andreu

    1998-01-01

    This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10^6 MIPS. This will be installed at the experiment and will be reused during non data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10^6 MIPS) and physics analysis (0.5 × 10^6 MIPS): CPU resources may either be located at the physicist's home lab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu...

  8. 40 CFR 194.23 - Models and computer codes.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  9. Trust Models in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2008-01-01

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  10. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students the possibility to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students the possibility to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes, taking advantage of basic scientific computation methods and tools.

  11. Computer models for economic and silvicultural decisions

    Science.gov (United States)

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  12. Quantum vertex model for reversible classical computing.

    Science.gov (United States)

    Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C

    2017-05-12

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching a solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.
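
    The idea of encoding a computation's result in a ground state and reaching it by thermal annealing can be illustrated on a much smaller scale than the vertex model itself. The sketch below assigns an energy penalty to every violated logic-gate constraint in a two-gate circuit and anneals the free spins with a Metropolis rule; the circuit, penalty form, and cooling schedule are invented for illustration and are far simpler than the construction in the paper.

      # Toy "computation as ground state" demo: anneal spins until every gate
      # constraint is satisfied (energy 0).  Illustrative only.
      import math
      import random

      def energy(s):
          # Gate constraints as penalties: w2 = NOT w0, w3 = w1 AND w2.
          e  = int(s["w2"] != 1 - s["w0"])
          e += int(s["w3"] != (s["w1"] & s["w2"]))
          return e

      def anneal(inputs, steps=5000):
          s = {"w0": inputs["w0"], "w1": inputs["w1"],
               "w2": random.randint(0, 1), "w3": random.randint(0, 1)}
          free = ["w2", "w3"]                    # input spins stay clamped
          for k in range(steps):
              T = max(0.01, 1.0 - k / steps)     # linear cooling schedule
              w = random.choice(free)
              old = energy(s)
              s[w] ^= 1                          # propose a spin flip
              delta = energy(s) - old
              if delta > 0 and random.random() > math.exp(-delta / T):
                  s[w] ^= 1                      # reject: flip back
          return s

      result = anneal({"w0": 1, "w1": 1})
      print(result, "energy:", energy(result))   # ground state encodes w3 = w1 AND (NOT w0)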

  13. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    Science.gov (United States)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, Remove-Compute-Restore (RCR) has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models, by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents a package called GRAVTool, based on MATLAB software, to compute local geoid models by the RCR technique, and its application in a study area. The study area comprises the Federal District of Brazil, with ~6000 km², wavy relief, heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example on the study area show the local geoid model computed by the GRAVTool package, using 1377 terrestrial gravity data, SRTM data with 3 arc-second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ± 0.071 m, RMS = 0.069 m, maximum = 0.178 m and minimum = -0.123 m) matches the uncertainty (σ = ± 0.073 m) of 21 randomly spaced points where the geoid was computed by the geometrical levelling technique supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ± 0.099 m, RMS = 0.208 m, maximum = 0.419 m and minimum = -0.040 m).
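
    For orientation, the remove-compute-restore decomposition referred to above is commonly written in the following textbook form, with the global geopotential model (GGM), residual, and terrain (RTM) contributions handled separately; this is the generic formulation, not necessarily the exact one implemented in GRAVTool.

      % Remove step: residual gravity anomalies after removing the global-model
      % and terrain contributions.  Restore step: the geoid height is recomposed
      % from the corresponding long-, medium- and short-wavelength parts.
      \begin{align}
        \Delta g_{\mathrm{res}} &= \Delta g_{\mathrm{obs}} - \Delta g_{\mathrm{GGM}} - \Delta g_{\mathrm{RTM}} \\
        N &= N_{\mathrm{GGM}} + N_{\mathrm{res}} + N_{\mathrm{RTM}}
      \end{align}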

  14. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden, Renier; Pieterse, Heloise; Irwin, Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance. The...

  15. Models of parallel computation :a survey and classification

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yunquan; CHEN Guoliang; SUN Guangzhong; MIAO Qiankun

    2007-01-01

    In this paper, the state-of-the-art parallel computational model research is reviewed. We will introduce various models that were developed during the past decades. According to their targeted architecture features, especially memory organization, we classify these parallel computational models into three generations. These models and their characteristics are discussed based on this three-generation classification. We believe that with the ever-increasing speed gap between the CPU and memory systems, incorporating non-uniform memory hierarchy into computational models will become unavoidable. With the emergence of multi-core CPUs, the parallelism hierarchy of current computing platforms becomes more and more complicated. Describing this complicated parallelism hierarchy in future computational models becomes more and more important. A semi-automatic toolkit that can extract model parameters and their values on real computers can reduce the model analysis complexity, thus allowing more complicated models with more parameters to be adopted. Hierarchical memory and hierarchical parallelism will be two very important features that should be considered in future model design and research.

  16. Computer model for ductile fracture

    International Nuclear Information System (INIS)

    Moran, B.; Reaugh, J. E.

    1979-01-01

    A computer model is described for predicting ductile fracture initiation and propagation. The computer fracture model is calibrated by simple and notched round-bar tension tests and a precracked compact tension test. The model is used to predict fracture initiation and propagation in a Charpy specimen, and the results are compared with experiments. The calibrated model provides a correlation between Charpy V-notch (CVN) fracture energy and any measure of fracture toughness, such as J_Ic. A second, simpler empirical correlation was obtained using the energy to initiate fracture in the Charpy specimen rather than the total CVN energy, and the results were compared with the empirical correlation of Rolfe and Novak.

  17. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Affective computing has great significance for fulfilling intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it is efficient in handling simple emotions.

  18. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  19. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  20. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and their performance characteristics of such a hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics, using the identified objectives of computing, which can be used on any platform and any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  1. A physicist's model of computation

    International Nuclear Information System (INIS)

    Fredkin, E.

    1991-01-01

    An attempt is presented to make a statement about what a computer is and how it works from the perspective of physics. The single observation that computation can be a reversible process allows for the same kind of insight into computing as was obtained by Carnot's discovery that heat engines could be modelled as reversible processes. It allows us to bring computation into the realm of physics, where the power of physics allows us to ask and answer questions that seemed intractable from the viewpoint of computer science. Strangely enough, this effort makes it clear why computers get cheaper every year. (author) 14 refs., 4 figs

  2. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided … methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task. … The methodology has been implemented into a computer-aided modeling framework, which combines expert skills, tools, and database connections that are required for the different steps of the model development work-flow with the goal to increase the efficiency of the modeling process. The framework has two main...

  4. Elements of matrix modeling and computing with Matlab

    CERN Document Server

    White, Robert E

    2006-01-01

    As discrete models and computing have become more common, there is a need to study matrix computation and numerical linear algebra. Encompassing a diverse mathematical core, Elements of Matrix Modeling and Computing with MATLAB examines a variety of applications and their modeling processes, showing you how to develop matrix models and solve algebraic systems. Emphasizing practical skills, it creates a bridge from problems with two and three variables to more realistic problems that have additional variables. Elements of Matrix Modeling and Computing with MATLAB focuses on seven basic applicat

  5. Vehicle - Bridge interaction, comparison of two computing models

    Science.gov (United States)

    Melcer, Jozef; Kuchárová, Daniela

    2017-07-01

    The paper presents the calculation of the bridge response to the effect of a vehicle moving along the bridge at various velocities. A multi-body plane computing model of the vehicle is adopted. The bridge computing models are created in two variants. One computing model represents the bridge as a Bernoulli-Euler beam with continuously distributed mass, and the second one represents the bridge as a lumped mass model with 1 degree of freedom. The mid-span bridge dynamic deflections are calculated for both computing models. The results are mutually compared and quantitatively evaluated.
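
    A minimal sketch of the second (lumped-mass, single-degree-of-freedom) bridge computing model mentioned above, driven by a constant force moving across the span; the mass, damping, stiffness and load values are invented for illustration and are not the paper's data.

      # One-degree-of-freedom lumped-mass bridge model under a moving constant force.
      import math

      m, c, k = 2.0e5, 1.0e4, 8.0e7   # modal mass [kg], damping [N*s/m], stiffness [N/m]
      P, L, v = 1.0e5, 30.0, 20.0     # load [N], span [m], vehicle speed [m/s]
      dt = 1.0e-3
      u, du = 0.0, 0.0                # mid-span deflection [m] and velocity [m/s]

      t, history = 0.0, []
      while t * v <= L:               # until the load leaves the span
          x = v * t
          f = P * math.sin(math.pi * x / L)   # modal load for the first mode of a simple beam
          ddu = (f - c * du - k * u) / m      # m*u'' + c*u' + k*u = f(t)
          du += dt * ddu                      # semi-implicit Euler step
          u  += dt * du
          history.append(u)
          t  += dt

      print("max mid-span deflection [m]:", max(history))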

  6. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  7. Creation of 'Ukrytie' objects computer model

    International Nuclear Information System (INIS)

    Mazur, A.B.; Kotlyarov, V.T.; Ermolenko, A.I.; Podbereznyj, S.S.; Postil, S.D.; Shaptala, D.V.

    1999-01-01

    A partial computer model of the 'Ukrytie' object was created with the use of geoinformation technologies. The computer model makes it possible to provide information support for the works related to the stabilization of the 'Ukrytie' object and its conversion into an ecologically safe system, and for analyzing, forecasting and controlling the processes occurring in the 'Ukrytie' object. Elements and structures of the 'Ukrytie' object were designed and input into the model.

  8. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics are reviewed in this paper that have a direct bearing on the model input process and reasons are given for using probabilities-based modeling with the inputs. The authors also present ways to model distributions for individual inputs and multivariate input structures when dependence and other constraints may be present

  9. Computational modelling in fluid mechanics

    International Nuclear Information System (INIS)

    Hauguel, A.

    1985-01-01

    The modelling of most environmental or industrial flow problems gives rise to very similar types of equations. The considerable increase in computing capacity over the last ten years has consequently allowed numerical models of growing complexity to be processed. The varied group of computer codes presented is now a tool complementary to experimental facilities for studies in the field of fluid mechanics. Several codes applied in the nuclear field (reactors, cooling towers, exchangers, plumes...) are presented among others [fr]

  10. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets and increased time for analysis are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  11. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based … personal computers (PC) and low cost software packages and tools. They can serve as useful learning experience through student projects. Models are … Let us consider a numerical example: to calculate the velocity of a trainer aircraft …

  12. Computational multiscale modeling of intergranular cracking

    International Nuclear Information System (INIS)

    Simonovski, Igor; Cizelj, Leon

    2011-01-01

    A novel computational approach for simulation of intergranular cracks in a polycrystalline aggregate is proposed in this paper. The computational model includes a topological model of the experimentally determined microstructure of a 400 μm diameter stainless steel wire and automatic finite element discretization of the grains and grain boundaries. The microstructure was spatially characterized by X-ray diffraction contrast tomography and contains 362 grains and some 1600 grain boundaries. Available constitutive models currently include isotropic elasticity for the grain interior and cohesive behavior with damage for the grain boundaries. The experimentally determined lattice orientations are employed to distinguish between resistant low energy and susceptible high energy grain boundaries in the model. The feasibility and performance of the proposed computational approach is demonstrated by simulating the onset and propagation of intergranular cracking. The preliminary numerical results are outlined and discussed.

  13. Quantum Vertex Model for Reversible Classical Computing

    Science.gov (United States)

    Chamon, Claudio; Mucciolo, Eduardo; Ruckenstein, Andrei; Yang, Zhicheng

    We present a planar vertex model that encodes the result of a universal reversible classical computation in its ground state. The approach involves Boolean variables (spins) placed on links of a two-dimensional lattice, with vertices representing logic gates. Large short-ranged interactions between at most two spins implement the operation of each gate. The lattice is anisotropic with one direction corresponding to computational time, and with transverse boundaries storing the computation's input and output. The model displays no finite temperature phase transitions, including no glass transitions, independent of circuit. The computational complexity is encoded in the scaling of the relaxation rate into the ground state with the system size. We use thermal annealing and a novel and more efficient heuristic, 'annealing with learning', to study various computational problems. To explore faster relaxation routes, we construct an explicit mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating a novel approach to reversible classical computation based on quantum annealing.

  14. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers to the network and the removing of old computers from the network are considered. Meanwhile, the computers are equipped with antivirus software on the computer network. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis to control the spread of computer virus.

  15. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.
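
    To make the aspect-based notion of similarity concrete, the sketch below combines two of the aspects listed above (references to biological entities and network structure) into a single weighted score. The aspects chosen, the Jaccard measure, the example annotations, and the weights are illustrative assumptions rather than a proposed standard measure.

      # Illustrative aspect-based similarity between two computational models.
      def jaccard(a, b):
          a, b = set(a), set(b)
          return len(a & b) / len(a | b) if a | b else 1.0

      model_a = {
          "entities": {"GO:0006096", "CHEBI:17234", "CHEBI:15422"},  # glycolysis, glucose, ATP
          "edges": {("glucose", "g6p"), ("g6p", "f6p")},             # reaction-graph structure
      }
      model_b = {
          "entities": {"GO:0006096", "CHEBI:15422"},
          "edges": {("glucose", "g6p"), ("f6p", "fbp")},
      }

      def similarity(m1, m2, w_entities=0.6, w_structure=0.4):
          s_entities  = jaccard(m1["entities"], m2["entities"])
          s_structure = jaccard(m1["edges"], m2["edges"])
          return w_entities * s_entities + w_structure * s_structure

      print(round(similarity(model_a, model_b), 3))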

  17. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar; Henkel, Ron; Hoehndorf, Robert; Kacprowski, Tim; Knuepfer, Christian; Liebermeister, Wolfram

    2016-01-01

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.

  18. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in the current research into the etiology of biomedical problems and potential treatment strategies.  Computational modeling allows to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However these biomedical problems are inherently complex with a myriad of influencing factors, which strongly complicates the model building and validation process.  This book wants to address four main issues related to the building and validation of computational models of biomedical processes: Modeling establishment under uncertainty Model selection and parameter fitting Sensitivity analysis and model adaptation Model predictions under uncertainty In each of the abovementioned areas, the book discusses a number of key-techniques by means of a general theoretical description followed by one or more practical examples.  This book is intended for graduate stude...

  19. Computational models of airway branching morphogenesis.

    Science.gov (United States)

    Varner, Victor D; Nelson, Celeste M

    2017-07-01

    The bronchial network of the mammalian lung consists of millions of dichotomous branches arranged in a highly complex, space-filling tree. Recent computational models of branching morphogenesis in the lung have helped uncover the biological mechanisms that construct this ramified architecture. In this review, we focus on three different theoretical approaches - geometric modeling, reaction-diffusion modeling, and continuum mechanical modeling - and discuss how, taken together, these models have identified the geometric principles necessary to build an efficient bronchial network, as well as the patterning mechanisms that specify airway geometry in the developing embryo. We emphasize models that are integrated with biological experiments and suggest how recent progress in computational modeling has advanced our understanding of airway branching morphogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.
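
    The reaction-diffusion approach mentioned in this record rests on Turing-type patterning: a slowly diffusing, self-enhancing species coupled to a fast-diffusing inhibitor or depleted substrate spontaneously produces regularly spaced peaks that can mark future branch sites. As a generic illustration only, the sketch below runs a classic one-dimensional Gray-Scott activator-substrate system with standard demonstration parameters; it is not taken from any specific lung model.

      # 1D Gray-Scott reaction-diffusion sketch; classic demo parameters.
      import numpy as np

      n, dt = 256, 1.0
      Du, Dv, F, k = 0.16, 0.08, 0.035, 0.065

      u = np.ones(n)                       # substrate
      v = np.zeros(n)                      # self-enhancing species
      u[n//2 - 5 : n//2 + 5] = 0.5         # local seed perturbation
      v[n//2 - 5 : n//2 + 5] = 0.5

      def lap(x):                          # 1D Laplacian, periodic boundaries
          return np.roll(x, 1) + np.roll(x, -1) - 2 * x

      for _ in range(10000):
          uvv = u * v * v
          u += dt * (Du * lap(u) - uvv + F * (1 - u))
          v += dt * (Dv * lap(v) + uvv - (F + k) * v)

      # Peaks of v mark regularly spaced sites where, in a lung model,
      # new branch tips would be specified.
      print(np.round(v[::16], 2))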

  20. Model to Implement Virtual Computing Labs via Cloud Computing Services

    Directory of Open Access Journals (Sweden)

    Washington Luna Encalada

    2017-07-01

    Full Text Available In recent years, we have seen a significant number of new technological ideas appearing in literature discussing the future of education. For example, E-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs, and bring your own device (BYOD are all new concepts of immersive and global education that have emerged in educational literature. One of the greatest challenges presented to e-learning solutions is the reproduction of the benefits of an educational institution’s physical laboratory. For a university without a computing lab, to obtain hands-on IT training with software, operating systems, networks, servers, storage, and cloud computing similar to that which could be received on a university campus computing lab, it is necessary to use a combination of technological tools. Such teaching tools must promote the transmission of knowledge, encourage interaction and collaboration, and ensure students obtain valuable hands-on experience. That, in turn, allows the universities to focus more on teaching and research activities than on the implementation and configuration of complex physical systems. In this article, we present a model for implementing ecosystems which allow universities to teach practical Information Technology (IT skills. The model utilizes what is called a “social cloud”, which utilizes all cloud computing services, such as Software as a Service (SaaS, Platform as a Service (PaaS, and Infrastructure as a Service (IaaS. Additionally, it integrates the cloud learning aspects of a MOOC and several aspects of social networking and support. Social clouds have striking benefits such as centrality, ease of use, scalability, and ubiquity, providing a superior learning environment when compared to that of a simple physical lab. The proposed model allows students to foster all the educational pillars such as learning to know, learning to be, learning

  1. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with application in several fields of engineering, like automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, on a multidisciplinary approach. Authors from five countries and 16 different research centers contribute with their expertise in both the fundamentals and real problems applications based upon their strong background on modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  2. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and examples relevant to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
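
    Of the four approaches listed, discrete-event simulation is the one most commonly used for patient-flow questions. The fragment below is a minimal, self-contained discrete-event sketch of an emergency department treated as a single queue with a fixed number of beds; the arrival rate, service rate and bed count are hypothetical and are not taken from the article.

      # Minimal discrete-event simulation of a single-queue emergency department.
      import heapq, random

      random.seed(1)
      ARRIVAL_RATE = 4.0            # patients per hour
      SERVICE_RATE = 0.5            # patients per hour per bed
      BEDS, HORIZON = 10, 24 * 30   # simulate 30 days (in hours)

      events = [(random.expovariate(ARRIVAL_RATE), 'arrival')]
      queue, busy, waits = [], 0, []

      while events:
          t, kind = heapq.heappop(events)
          if t > HORIZON:
              break
          if kind == 'arrival':
              heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), 'arrival'))
              if busy < BEDS:       # a bed is free: start treatment now
                  busy += 1
                  waits.append(0.0)
                  heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), 'departure'))
              else:                 # otherwise wait for a bed
                  queue.append(t)
          else:                     # a departure frees a bed
              if queue:
                  waits.append(t - queue.pop(0))
                  heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), 'departure'))
              else:
                  busy -= 1

      print(f"mean wait for a bed: {sum(waits)/len(waits):.2f} h over {len(waits)} patients")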

  3. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  4. Do's and Don'ts of Computer Models for Planning

    Science.gov (United States)

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  5. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  6. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  7. Towards The Deep Model : Understanding Visual Recognition Through Computational Models

    OpenAIRE

    Wang, Panqu

    2017-01-01

    Understanding how visual recognition is achieved in the human brain is one of the most fundamental questions in vision research. In this thesis I seek to tackle this problem from a neurocomputational modeling perspective. More specifically, I build machine learning-based models to simulate and explain cognitive phenomena related to human visual recognition, and I improve computational models using brain-inspired principles to excel at computer vision tasks.I first describe how a neurocomputat...

  8. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  9. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  10. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

    In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role as theories of human cognition. Many computational models have simulated the results of controlled psychological experiments successfully. However, there have been only a few attempts to apply the models to complex realistic phenomena. We call such a situation an "open-ended situation". In this study, MAC/FAC ("many are called, but few are chosen"), proposed by [Forbus 95], which models two stages of analogical reasoning, was applied to our open-ended psychological experiment. In our experiment, subjects were presented with a cue story, and retrieved cases that had been learned in their everyday life. Following this, they rated the inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support MAC/FAC's theoretical assumption that different similarities are involved in the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognition.
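
    The MAC stage referred to above ranks items in memory with a cheap, structure-blind comparison of content vectors before the more expensive structural (FAC) stage is run. The sketch below illustrates that first stage only, using a dot product over predicate counts; the stories, predicate names and representation are invented for illustration and greatly simplify the published algorithm.

      # Sketch of the MAC ("many are called") stage: rank stored cases by the
      # dot product of predicate-count content vectors. Everything here is
      # a toy illustration, not the full MAC/FAC implementation.
      from collections import Counter

      def content_vector(predicates):
          return Counter(predicates)

      def dot(cv1, cv2):
          return sum(cv1[p] * cv2[p] for p in cv1 if p in cv2)

      cue = content_vector(['flow', 'pressure', 'connect', 'greater'])
      memory = {
          'heat-flow case': content_vector(['flow', 'temperature', 'connect', 'greater']),
          'shopping case':  content_vector(['buy', 'pay', 'greater']),
      }

      ranked = sorted(memory.items(), key=lambda kv: dot(cue, kv[1]), reverse=True)
      for name, cv in ranked:
          print(name, dot(cue, cv))   # the best few would be passed on to the FAC stage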

  11. Modeling multimodal human-computer interaction

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2004-01-01

    Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range of software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze

  12. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 6, Issue 3. Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Volume 6, Issue 3, March 2001, pp. 46-54.

  13. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  14. Editorial: Modelling and computational challenges in granular materials

    OpenAIRE

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss the current progress and latest advancements in the field of advanced numerical methods and modelling of granular materials. The focus will be on computational methods, improved algorithms and the m...

  15. SmartShadow models and methods for pervasive computing

    CERN Document Server

    Wu, Zhaohui

    2013-01-01

    SmartShadow: Models and Methods for Pervasive Computing offers a new perspective on pervasive computing with SmartShadow, which is designed to model a user as a personality "shadow" and to model pervasive computing environments as user-centric dynamic virtual personal spaces. Just like human beings' shadows in the physical world, it follows people wherever they go, providing them with pervasive services. The model, methods, and software infrastructure for SmartShadow are presented and an application for smart cars is also introduced. The book can serve as a valuable reference work for resea

  16. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  17. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Full Text Available Abstract Background Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results The workshop, "ESF Exploratory Workshop on Computational disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  18. A computational model of selection by consequences.

    OpenAIRE

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...

  19. A Categorisation of Cloud Computing Business Models

    OpenAIRE

    Chang, Victor; Bacigalupo, David; Wills, Gary; De Roure, David

    2010-01-01

    This paper reviews current cloud computing business models and presents proposals on how organisations can achieve sustainability by adopting appropriate models. We classify cloud computing business models into eight types: (1) Service Provider and Service Orientation; (2) Support and Services Contracts; (3) In-House Private Clouds; (4) All-In-One Enterprise Cloud; (5) One-Stop Resources and Services; (6) Government funding; (7) Venture Capitals; and (8) Entertainment and Social Networking. U...

  20. Chaos Modelling with Computers

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 1, Issue 5. Chaos Modelling with Computers: Unpredictable Behaviour of Deterministic Systems. Balakrishnan Ramasamy and T S K V Iyer. General Article, Volume 1, Issue 5, May 1996, pp. 29-39 ...

  1. A computer simulation model to compute the radiation transfer of mountainous regions

    Science.gov (United States)

    Li, Yuguang; Zhao, Feng; Song, Rui

    2011-11-01

    In mountainous regions, the radiometric signal recorded at the sensor depends on a number of factors such as sun angle, atmospheric conditions, surface cover type, and topography. In this paper, a computer simulation model of radiation transfer is designed and evaluated. This model implements Monte Carlo ray-tracing techniques and is specifically dedicated to the study of light propagation in mountainous regions. The radiative processes between sunlight and the objects within the mountainous region are realized by using forward Monte Carlo ray-tracing methods. The performance of the model is evaluated through detailed comparisons with the well-established 3D computer simulation model RGM (Radiosity-Graphics combined Model), based on the same scenes and identical spectral parameters, which show good agreement between the two models' results. By using the newly developed computer model, a series of typical mountainous scenes is generated to analyze the physical mechanism of mountainous radiation transfer. The results show that the effects of the adjacent slopes are important for deep valleys and they particularly affect shadowed pixels, and the topographic effect needs to be considered in mountainous terrain before accurate inferences from remotely sensed data can be made.
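
    A large part of the topographic effect described in this record comes from terrain shadowing of the direct solar beam. The sketch below shows only that ingredient: for each cell of a small synthetic DEM it marches a ray toward the sun and flags the cell as shadowed if the terrain rises above the ray. The DEM, sun position and grid spacing are made up, and the atmospheric scattering and adjacent-slope reflections handled by the full Monte Carlo model are omitted.

      # Terrain shadow test on a synthetic DEM (direct beam only).
      import numpy as np

      n, dx = 100, 10.0                      # 100 x 100 grid, 10 m spacing
      x, y = np.meshgrid(np.arange(n) * dx, np.arange(n) * dx)
      dem = 300.0 * np.exp(-((x - 500)**2 + (y - 500)**2) / 200.0**2)   # one hill

      sun_az, sun_el = np.radians(135.0), np.radians(20.0)
      d = np.array([np.sin(sun_az), np.cos(sun_az)])   # horizontal direction toward the sun
      tan_el = np.tan(sun_el)

      shadow = np.zeros((n, n), dtype=bool)
      for i in range(n):
          for j in range(n):
              px, py, pz = x[i, j], y[i, j], dem[i, j]
              for s in np.arange(dx, n * dx, dx):      # march toward the sun
                  qx, qy = px + d[0] * s, py + d[1] * s
                  ii, jj = int(round(qy / dx)), int(round(qx / dx))
                  if not (0 <= ii < n and 0 <= jj < n):
                      break
                  if dem[ii, jj] > pz + s * tan_el:    # terrain blocks the ray
                      shadow[i, j] = True
                      break

      print(f"{shadow.mean():.1%} of cells are in terrain shadow")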

  2. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and maintenance of our broad-based program in basic plasma physics and computer modeling.

  3. Trust models in ubiquitous computing.

    Science.gov (United States)

    Krukow, Karl; Nielsen, Mogens; Sassone, Vladimiro

    2008-10-28

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  4. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  5. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  6. Blackboard architecture and qualitative model in a computer aided assistant designed to define computers for HEP computing

    International Nuclear Information System (INIS)

    Nodarse, F.F.; Ivanov, V.G.

    1991-01-01

    Using a blackboard architecture and a qualitative model, an expert system was developed to assist the user in defining computers for High Energy Physics computing. The COMEX system requires an IBM AT personal computer or compatible with at least 640 Kb RAM and a hard disk. 5 refs.; 9 figs.

  7. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principal makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  8. Computational models in physics teaching: a framework

    Directory of Open Access Journals (Sweden)

    Marco Antonio Moreira

    2012-08-01

    Full Text Available The purpose of the present paper is to present a theoretical framework to promote and assist meaningful physics learning through computational models. Our proposal is based on the use of a tool, the AVM diagram, to design educational activities involving modeling and computer simulations. The idea is to provide a starting point for the construction and implementation of didactical approaches grounded in a coherent epistemological view about scientific modeling.

  9. A model for calculating the optimal replacement interval of computer systems

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1981-08-01

    A mathematical model for calculating the optimal replacement interval of computer systems is described. The model estimates the most economical replacement interval when the computing demand, and the cost and performance of the computer, are known. The computing demand is assumed to increase monotonically every year. Four kinds of models are described. In model 1, a computer system is represented by only a central processing unit (CPU) and all the computing demand must be processed on the present computer until the next replacement. In model 2, by contrast, excess demand is admitted and may be transferred to another computing center and processed there at a cost. In model 3, the computer system is represented by a CPU, memories (MEM) and input/output devices (I/O) and it must process all the demand. Model 4 is the same as model 3, but excess demand is admitted and may be processed in another center. (1) The computing demand at JAERI, (2) the conformity of Grosch's law for recent computers, and (3) the replacement cost of computer systems are also described. (author)
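
    As a hedged worked example of the trade-off behind model 1 (not the model or data of the report), the fragment below amortizes the purchase price of a machine sized for the demand expected at the end of the replacement interval, adds a fixed yearly maintenance cost, and searches for the interval that minimizes the average annual cost. All growth rates, prices and the Grosch-style scaling exponent are invented numbers.

      # Illustrative search for an optimal replacement interval T (in years).
      def average_annual_cost(T, demand_growth=0.25, base_price=1.0e6,
                              grosch_exponent=0.5, maintenance=0.05e6):
          capacity = (1 + demand_growth) ** T                   # demand the machine must cover at year T
          purchase = base_price * capacity ** grosch_exponent   # economy of scale in price
          return purchase / T + maintenance

      costs = {T: average_annual_cost(T) for T in range(1, 11)}
      for T, c in costs.items():
          print(f"T = {T:2d} years -> {c / 1e6:.3f} M per year")
      print("optimal replacement interval:", min(costs, key=costs.get), "years")

    With these invented numbers the average annual cost first falls as the purchase price is spread over more years and then rises again as the machine must be heavily over-sized, giving an interior optimum of about nine years.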

  10. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  11. Computer models for kinetic equations of magnetically confined plasmas

    International Nuclear Information System (INIS)

    Killeen, J.; Kerbel, G.D.; McCoy, M.G.; Mirin, A.A.; Horowitz, E.J.; Shumaker, D.E.

    1987-01-01

    This paper presents four working computer models developed by the computational physics group of the National Magnetic Fusion Energy Computer Center. All of the models employ a kinetic description of plasma species. Three of the models are collisional, i.e., they include the solution of the Fokker-Planck equation in velocity space. The fourth model is collisionless and treats the plasma ions by a fully three-dimensional particle-in-cell method

  12. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
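
    The second-order maximum noise entropy model described above is a logistic function of a quadratic form in the stimulus. The short sketch below simply evaluates such a response for a two-dimensional stimulus; the parameter values are arbitrary illustrations and are not fitted to the retinal or thalamic data analysed in the paper.

      # Second-order logistic (maximum noise entropy) response model:
      # P(spike = 1 | s) = 1 / (1 + exp(-(a + b.s + s^T J s)))
      import numpy as np

      def p_spike(s, a, b, J):
          z = a + b @ s + s @ J @ s
          return 1.0 / (1.0 + np.exp(-z))

      a = -1.0                                  # sets the overall firing rate
      b = np.array([0.8, -0.3])                 # first-order (linear) sensitivity
      J = np.array([[0.5, 0.1], [0.1, -0.2]])   # second-order sensitivity

      s = np.random.default_rng(0).standard_normal(2)   # a stimulus in the 2D relevant subspace
      print(f"stimulus {np.round(s, 2)} -> spike probability {p_spike(s, a, b, J):.3f}")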

  13. A response-modeling alternative to surrogate models for support in computational analyses

    International Nuclear Information System (INIS)

    Rutherford, Brian

    2006-01-01

    Often, the objectives in a computational analysis involve characterization of system performance based on some function of the computed response. In general, this characterization includes (at least) an estimate or prediction for some performance measure and an estimate of the associated uncertainty. Surrogate models can be used to approximate the response in regions where simulations were not performed. For most surrogate modeling approaches, however (1) estimates are based on smoothing of available data and (2) uncertainty in the response is specified in a point-wise (in the input space) fashion. These aspects of the surrogate model construction might limit their capabilities. One alternative is to construct a probability measure, G(r), for the computer response, r, based on available data. This 'response-modeling' approach will permit probability estimation for an arbitrary event, E(r), based on the computer response. In this general setting, event probabilities can be computed: prob(E) = ∫ I(E(r)) dG(r), where I is the indicator function. Furthermore, one can use G(r) to calculate an induced distribution on a performance measure, pm. For prediction problems where the performance measure is a scalar, its distribution F_pm is determined by: F_pm(z) = ∫ I(pm(r) ≤ z) dG(r). We introduce response models for scalar computer output and then generalize the approach to more complicated responses that utilize multiple response models
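
    In practice the measure G(r) is often represented by a finite sample of computed responses, in which case the two integrals above reduce to Monte Carlo averages of indicator functions. The fragment below illustrates this with a made-up response distribution and a made-up scalar performance measure; it is a sketch of the general recipe, not of the authors' construction of G.

      # Monte Carlo evaluation of prob(E) and the induced CDF F_pm(z)
      # for an invented response distribution G and performance measure pm.
      import numpy as np

      rng = np.random.default_rng(42)
      r = rng.normal(loc=100.0, scale=15.0, size=(100_000, 3))   # sampled responses

      def pm(r):                       # hypothetical scalar performance measure
          return r.max(axis=1)         # e.g. the peak value over three output locations

      prob_E = np.mean(pm(r) > 140.0)  # prob(E) = integral of I(E(r)) dG(r)
      for z in (110.0, 125.0, 140.0):  # F_pm(z) = integral of I(pm(r) <= z) dG(r)
          print(f"F_pm({z}) = {np.mean(pm(r) <= z):.3f}")
      print(f"prob(E) = {prob_E:.4f}")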

  14. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  15. The Architectural Designs of a Nanoscale Computing Model

    Directory of Open Access Journals (Sweden)

    Mary M. Eshaghian-Wilner

    2004-08-01

    Full Text Available A generic nanoscale computing model is presented in this paper. The model consists of a collection of fully interconnected nanoscale computing modules, where each module is a cube of cells made out of quantum dots, spins, or molecules. The cells dynamically switch between two states by quantum interactions among their neighbors in all three dimensions. This paper includes a brief introduction to the field of nanotechnology from a computing point of view and presents a set of preliminary architectural designs for fabricating the nanoscale model studied.

  16. Geometric and computer-aided spline hob modeling

    Science.gov (United States)

    Brailov, I. G.; Myasoedova, T. M.; Panchuk, K. L.; Krysova, I. V.; Rogoza, YU A.

    2018-03-01

    The paper considers acquiring the geometric model of a spline hob. The objective of the research is the development of a mathematical model of a spline hob for spline shaft machining. The structure of the spline hob is described taking into consideration the motion parameters of the machine tool system for cutting edge positioning and orientation. A computer-aided study is performed with the use of CAD and on the basis of 3D modeling methods. Vector representation of cutting edge geometry is adopted as the principal method of spline hob mathematical model development. The paper defines the correlations described by parametric vector functions representing helical cutting edges designed for spline shaft machining, with consideration for helical movement in two dimensions. An application for acquiring the 3D model of a spline hob is developed on the basis of AutoLISP for the AutoCAD environment. The application makes it possible to use the acquired model for milling process simulation. An example of evaluation, analytical representation and computer modeling of the proposed geometrical model is reviewed. In this example, a calculation of key spline hob parameters assuring the capability of hobbing a spline shaft of standard design is performed. The polygonal and solid spline hob 3D models are acquired through simulation-based computer modeling.
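
    For orientation, the simplest generic form of the parametric vector functions referred to above is a helix on the hob's pitch cylinder. The expression below is a textbook parametrization given only for illustration; it is not the specific correlations derived in the paper, which add the edge profile and the machine-tool positioning and orientation parameters on top of such a curve.

      \mathbf{r}(t) = \left( R\cos t,\; R\sin t,\; \frac{p}{2\pi}\, t \right), \qquad 0 \le t \le 2\pi n_t ,

    where R is the pitch-cylinder radius, p is the lead (axial advance per turn) of the helical cutting edge, and n_t is the number of turns.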

  17. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    Full Text Available While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  18. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  19. Structure, function, and behaviour of computational models in systems biology.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  20. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of numerical modeling of dynamic problems are summed up in the article. These problems are characteristic of various areas of human activity, in particular for problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of performances of gas streams in gas cleaning equipment, and modeling of biogas formation processes.

  1. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems...

  2. Computer Modeling of Direct Metal Laser Sintering

    Science.gov (United States)

    Cross, Matthew

    2014-01-01

    A computational approach to modeling direct metal laser sintering (DMLS) additive manufacturing process is presented. The primary application of the model is for determining the temperature history of parts fabricated using DMLS to evaluate residual stresses found in finished pieces and to assess manufacturing process strategies to reduce part slumping. The model utilizes MSC SINDA as a heat transfer solver with imbedded FORTRAN computer code to direct laser motion, apply laser heating as a boundary condition, and simulate the addition of metal powder layers during part fabrication. Model results are compared to available data collected during in situ DMLS part manufacture.

  3. Climate Ocean Modeling on Parallel Computers

    Science.gov (United States)

    Wang, P.; Cheng, B. N.; Chao, Y.

    1998-01-01

    Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.

  4. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  5. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres and providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  6. Ch. 33 Modeling: Computational Thermodynamics

    International Nuclear Information System (INIS)

    Besmann, Theodore M.

    2012-01-01

    This chapter considers methods and techniques for computational modeling for nuclear materials with a focus on fuels. The basic concepts for chemical thermodynamics are described and various current models for complex crystalline and liquid phases are illustrated. Also included are descriptions of available databases for use in chemical thermodynamic studies and commercial codes for performing complex equilibrium calculations.

  7. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    Science.gov (United States)

    When the news media talks about models they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  8. Deployment Models: Towards Eliminating Security Concerns From Cloud Computing

    OpenAIRE

    Zhao, Gansen; Chunming, Rong; Jaatun, Martin Gilje; Sandnes, Frode Eika

    2010-01-01

    Cloud computing has become a popular choice as an alternative to investing new IT systems. When making decisions on adopting cloud computing related solutions, security has always been a major concern. This article summarizes security concerns in cloud computing and proposes five service deployment models to ease these concerns. The proposed models provide different security related features to address different requirements and scenarios and can serve as reference models for deployment. D...

  9. The emerging role of cloud computing in molecular modelling.

    Science.gov (United States)

    Ebejer, Jean-Paul; Fulle, Simone; Morris, Garrett M; Finn, Paul W

    2013-07-01

    There is a growing recognition of the importance of cloud computing for large-scale and data-intensive applications. The distinguishing features of cloud computing and their relationship to other distributed computing paradigms are described, as are the strengths and weaknesses of the approach. We review the use made to date of cloud computing for molecular modelling projects and the availability of front ends for molecular modelling applications. Although the use of cloud computing technologies for molecular modelling is still in its infancy, we demonstrate its potential by presenting several case studies. Rapid growth can be expected as more applications become available and costs continue to fall; cloud computing can make a major contribution not just in terms of the availability of on-demand computing power, but could also spur innovation in the development of novel approaches that utilize that capacity in more effective ways. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  11. Life system modeling and intelligent computing. Pt. I. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen' s Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation

    2010-07-01

    This book is part I of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010 and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The 55 papers in this volume are organized in topical sections on intelligent modeling, monitoring, and control of complex nonlinear systems; autonomy-oriented computing and intelligent agents; advanced theory and methodology in fuzzy systems and soft computing; computational intelligence in utilization of clean and renewable energy resources; intelligent modeling, control and supervision for energy saving and pollution reduction; intelligent methods in developing vehicles, engines and equipments; computational methods and intelligence in modeling genetic and biochemical networks and regulation. (orig.)

  12. Computational Models of Rock Failure

    Science.gov (United States)

    May, Dave A.; Spiegelman, Marc

    2017-04-01

    Practitioners in computational geodynamics, as in many other branches of applied science, typically do not analyse the underlying PDEs being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or simply have not been proven. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDEs, we attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure-dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers/risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of
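
    The pressure dependence at the heart of the problem described above can be made concrete in a few lines. The sketch below (parameter values are illustrative only, not taken from the abstract) evaluates a Drucker-Prager yield stress and the capped "effective viscosity" commonly used to enforce yielding in incompressible viscous flow solvers.

# Illustrative Drucker-Prager viscoplastic effective viscosity.
# Assumed, not from the paper: cohesion, friction angle, background viscosity.
import numpy as np

C, phi = 20e6, np.radians(30)        # cohesion (Pa) and friction angle
eta_background = 1e23                # linear (background) viscosity (Pa s)

def effective_viscosity(pressure, strain_rate_II):
    # Yield stress grows with pressure: tau_y = C*cos(phi) + p*sin(phi)
    tau_yield = C * np.cos(phi) + pressure * np.sin(phi)
    # Cap the viscosity so the stress stays on the yield surface.
    eta_plastic = tau_yield / (2.0 * strain_rate_II)
    return min(eta_background, eta_plastic)

print(effective_viscosity(pressure=1e8, strain_rate_II=1e-14))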

  13. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to the modeling of tree crown development: experimental (i.e. regression-based), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The common assumption of these approaches is that a tree can be regarded as a fractal object, that is, a collection of self-similar objects that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and of light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model against experimental data. Different stages of the above-mentioned approaches are described. Experimental data for spruce, a description of the computer modeling system and a variant of the computer model are presented. (author). 9 refs, 4 figs

  14. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    Science.gov (United States)

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  15. Computational modeling of neural activities for statistical inference

    CERN Document Server

    Kolossa, Antonio

    2016-01-01

    This authored monograph supplies empirical evidence for the Bayesian brain hypothesis by modeling event-related potentials (ERP) of the human electroencephalogram (EEG) during successive trials in cognitive tasks. The employed observer models are useful to compute probability distributions over observable events and hidden states, depending on which are present in the respective tasks. Bayesian model selection is then used to choose the model which best explains the ERP amplitude fluctuations. Thus, this book constitutes a decisive step towards a better understanding of the neural coding and computing of probabilities following Bayesian rules. The target audience primarily comprises research experts in the field of computational neurosciences, but the book may also be beneficial for graduate students who want to specialize in this field.

  16. An integrative computational modelling of music structure apprehension

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2014-01-01

    An objectivization of music analysis requires a detailed formalization of the underlying principles and methods. The formalization of the most elementary structural processes is hindered by the complexity of music, both in terms of the profusion of entities (such as notes) and of the tight interactions between a large number of dimensions. Computational modeling would enable systematic and exhaustive tests on sizeable pieces of music, yet current research covers particular musical dimensions with limited success. The aim of this research is to conceive a computational modeling of music analysis…; the computational model, by virtue of its generality, extensiveness and operationality, is suggested as a blueprint for the establishment of a cognitively validated model of music structure apprehension. Available as a Matlab module, it can be used for practical musicological uses.

  17. Computational algebraic geometry of epidemic models

    Science.gov (United States)

    Rodríguez Vega, Martín.

    2014-06-01

    Computational Algebraic Geometry is applied to the analysis of various epidemic models for Schistosomiasis and Dengue, both for the case without control measures and for the case where control measures are applied. The models were analyzed using the mathematical software Maple. Explicitly, the analysis is performed using Groebner bases, Hilbert dimension and Hilbert polynomials. These computational tools are included automatically in Maple. Each of these models is represented by a system of ordinary differential equations, and for each model the basic reproductive number (R0) is calculated. The effects of the control measures are observed through the changes in the algebraic structure of R0, the changes in Groebner bases, the changes in Hilbert dimension, and the changes in Hilbert polynomials. It is hoped that the results obtained in this paper will prove useful for designing control measures against the epidemic diseases described. For future research, the use of algebraic epidemiology is proposed to analyze models for airborne and waterborne diseases.
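
    As an illustration of the kind of computation described above, the following sketch uses SymPy as a stand-in for the Maple workflow mentioned in the abstract, computing a lexicographic Groebner basis for the steady state of a toy SIR-type model. The equations and the symbol r0 are simplified placeholders, not the models analyzed in the paper.

# Illustrative only: Groebner basis for the steady state of a toy SIR-type model.
from sympy import symbols, groebner

s, i, r0 = symbols('s i r0', positive=True)

# Toy steady-state equations (normalized SIR with vital dynamics, mu = 1):
#   1 - s - r0*s*i = 0   and   r0*s*i - i = 0
polys = [1 - s - r0*s*i, r0*s*i - i]

# Lexicographic ordering s > i > r0 eliminates variables in that order,
# exposing how the endemic equilibrium depends on the reproductive number r0.
gb = groebner(polys, s, i, r0, order='lex')
print(gb)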

  18. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of a plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and finally errors of commission in nuclear power plants. Quantitative or prescriptive models that predict an operator's situation assessment and its results provide many benefits, such as HSI design solutions, human performance data, and human reliability estimates. Unfortunately, only a few computational situation assessment models for NPP operators have been proposed, and those insufficiently embed human cognitive characteristics. Thus we proposed a new computational situation assessment model of nuclear power plant operators. The proposed model incorporates significant cognitive factors and uses a Bayesian belief network (BBN) as its architecture. It is believed that communication between nuclear power plant operators affects operators' situation assessment and its result, situation awareness. We tried to verify that the proposed model represents the effects of communication on situation assessment. As a result, the proposed model succeeded in representing the operators' behavior; this paper presents the details
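
    The core computation in such a model is Bayesian belief updating. The following minimal sketch is not the authors' BBN; the plant states, indications and probabilities are invented for illustration. It updates an operator's belief over plant states given two observed indications.

# A minimal sketch of Bayesian situation assessment (illustrative values only).
import numpy as np

states = ['normal', 'small LOCA', 'SGTR']     # hypothetical plant states
prior  = np.array([0.90, 0.05, 0.05])         # belief before new evidence

# P(indication | state) for two binary indications, assumed conditionally independent.
p_low_pressure = np.array([0.02, 0.80, 0.60])
p_high_rad     = np.array([0.01, 0.10, 0.90])

likelihood = p_low_pressure * p_high_rad      # both indications observed
posterior = prior * likelihood
posterior /= posterior.sum()                  # normalize to a probability distribution

for s, p in zip(states, posterior):
    print(f'{s:12s} {p:.3f}')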

  19. Computer models of vocal tract evolution: an overview and critique

    NARCIS (Netherlands)

    de Boer, B.; Fitch, W. T.

    2010-01-01

    Human speech has been investigated with computer models since the invention of digital computers, and models of the evolution of speech first appeared in the late 1960s and early 1970s. Speech science and computer models have a long shared history because speech is a physical signal and can be

  20. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments

  1. COMPUTATIONAL MODELING OF AIRFLOW IN NONREGULAR SHAPED CHANNELS

    Directory of Open Access Journals (Sweden)

    A. A. Voronin

    2013-05-01

    Full Text Available The basic approaches to computational modeling of airflow in the human nasal cavity are analyzed. Different models of turbulent flow which may be used in order to calculate air velocity and pressure are discussed. Experimental measurement results of airflow temperature are illustrated. Geometrical model of human nasal cavity reconstructed from computer-aided tomography scans and numerical simulation results of airflow inside this model are also given. Spatial distributions of velocity and temperature for inhaled and exhaled air are shown.

  2. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  3. Editorial: Modelling and computational challenges in granular materials

    NARCIS (Netherlands)

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss

  4. Generating Computational Models for Serious Gaming

    NARCIS (Netherlands)

    Westera, Wim

    2018-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  5. Airfoil Computations using the γ - Reθ Model

    DEFF Research Database (Denmark)

    Sørensen, Niels N.

    computations. Based on this, an estimate of the error in the computations is determined to be approximately one percent in the attached region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64- 018, NACA64-218, NACA64-418 and NACA64-618 and the results...

  6. A System Computational Model of Implicit Emotional Learning.

    Science.gov (United States)

    Puviani, Luca; Rama, Sidita

    2016-01-01

    Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unfortunately, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of the extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders and in panic attacks). In this manuscript, starting from the experimental results available from the literature, a computational model of implicit emotional learning based both on prediction errors computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance-to-extinction of the traumatic emotional responses, (c) the mathematical relation between classical conditioning and unconditioned stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for resistant-to-extinction emotional reactions and novel methodologies of emotions modulation.
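
    A minimal example of the prediction-error ingredient mentioned above is a Rescorla-Wagner style update, sketched below with assumed parameter values; the full model in the paper additionally performs statistical inference.

# Illustrative prediction-error (Rescorla-Wagner style) learning of a response strength V.
import numpy as np

alpha = 0.2                   # learning rate (assumed)
n_trials = 40
us = np.ones(n_trials)        # acquisition: unconditioned stimulus present every trial
us[20:] = 0.0                 # extinction: stimulus absent afterwards

V = 0.0
trace = []
for r in us:
    delta = r - V             # prediction error
    V += alpha * delta        # update the learned emotional response strength
    trace.append(V)

print(['%.2f' % v for v in trace[::5]])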

  7. Category-theoretic models of algebraic computer systems

    Science.gov (United States)

    Kovalyov, S. P.

    2016-01-01

    A computer system is said to be algebraic if it contains nodes that implement unconventional computation paradigms based on universal algebra. A category-based approach to modeling such systems that provides a theoretical basis for mapping tasks to these systems' architecture is proposed. The construction of algebraic models of general-purpose computations involving conditional statements and overflow control is formally described by a reflector in an appropriate category of algebras. It is proved that this reflector takes the modulo ring whose operations are implemented in the conventional arithmetic processors to the Łukasiewicz logic matrix. Enrichments of the set of ring operations that form bases in the Łukasiewicz logic matrix are found.

  8. Computational Methods for Modeling Aptamers and Designing Riboswitches

    Directory of Open Access Journals (Sweden)

    Sha Gong

    2017-11-01

    Full Text Available Riboswitches, which are located within certain noncoding RNA regions, function as genetic “switches”, regulating when and where genes are expressed in response to certain ligands. Understanding the numerous functions of riboswitches requires computational models to predict structures and structural changes of the aptamer domains. Although aptamers often form complex structures, computational approaches, such as RNAComposer and Rosetta, have already been applied to model the tertiary (three-dimensional, 3D) structure of several aptamers. As structural changes in aptamers must be achieved within a certain time window for effective regulation, kinetics is another key point for understanding aptamer function in riboswitch-mediated gene regulation. The coarse-grained self-organized polymer (SOP) model using Langevin dynamics simulation has been successfully developed to investigate the folding kinetics of aptamers, while their co-transcriptional folding kinetics can be modeled by the helix-based computational method and the BarMap approach. Based on the known aptamers, the web server Riboswitch Calculator and other theoretical methods provide new tools to design synthetic riboswitches. This review presents an overview of these computational methods for modeling the structure and kinetics of riboswitch aptamers and for designing riboswitches.
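
    To illustrate the simulation ingredient mentioned above, the sketch below integrates a single coordinate with overdamped Langevin dynamics, the scheme underlying SOP-type folding simulations; the harmonic potential and the reduced units are placeholders, not an aptamer force field.

# Illustrative overdamped Langevin (Euler-Maruyama) integration of one coordinate.
import numpy as np

rng = np.random.default_rng(0)
k, gamma, kT, dt = 1.0, 1.0, 0.6, 1e-3   # assumed reduced units
x = 2.0                                   # initial value of the coordinate

def force(x):
    return -k * x                         # placeholder potential U = k*x**2/2

for step in range(10000):
    noise = np.sqrt(2.0 * kT * dt / gamma) * rng.normal()
    x += (dt / gamma) * force(x) + noise  # drift toward the minimum plus thermal noise

print('final coordinate:', round(x, 3))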

  9. Cosmic logic: a computational model

    International Nuclear Information System (INIS)

    Vanchurin, Vitaly

    2016-01-01

    We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal machines, which halt in finite time, and immortal machines, which run forever. In the context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps

  10. Computational modelling of the impact of AIDS on business.

    Science.gov (United States)

    Matthews, Alan P

    2007-07-01

    An overview of computational modelling of the impact of AIDS on business in South Africa, with a detailed description of the AIDS Projection Model (APM) for companies, developed by the author, and suggestions for further work. Computational modelling of the impact of AIDS on business in South Africa requires modelling of the epidemic as a whole, and of its impact on a company. This paper gives an overview of epidemiological modelling, with an introduction to the Actuarial Society of South Africa (ASSA) model, the most widely used such model for South Africa. The APM produces projections of HIV prevalence, new infections, and AIDS mortality on a company, based on the anonymous HIV testing of company employees, and projections from the ASSA model. A smoothed statistical model of the prevalence test data is computed, and then the ASSA model projection for each category of employees is adjusted so that it matches the measured prevalence in the year of testing. FURTHER WORK: Further techniques that could be developed are microsimulation (representing individuals in the computer), scenario planning for testing strategies, and models for the business environment, such as models of entire sectors, and mapping of HIV prevalence in time and space, based on workplace and community data.
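
    The adjustment step described above can be illustrated schematically: a projected prevalence curve for one employee category is rescaled so that it matches the prevalence measured in the year of testing. All numbers below are invented for illustration, and the real APM uses a smoothed statistical model of the test data rather than a single scaling factor.

# Illustrative rescaling of a projected prevalence curve to match a measured value.
years = list(range(2005, 2016))
projected = [0.10, 0.11, 0.12, 0.13, 0.14, 0.15, 0.155, 0.16, 0.165, 0.17, 0.17]

test_year = 2008
measured_prevalence = 0.16               # from anonymous workplace testing (hypothetical)

scale = measured_prevalence / projected[years.index(test_year)]
adjusted = [min(p * scale, 1.0) for p in projected]   # keep prevalence in [0, 1]

for y, p in zip(years, adjusted):
    print(y, round(p, 3))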

  11. Computational neurorehabilitation: modeling plasticity and learning to predict recovery.

    Science.gov (United States)

    Reinkensmeyer, David J; Burdet, Etienne; Casadio, Maura; Krakauer, John W; Kwakkel, Gert; Lang, Catherine E; Swinnen, Stephan P; Ward, Nick S; Schweighofer, Nicolas

    2016-04-30

    Despite progress in using computational approaches to inform medicine and neuroscience in the last 30 years, there have been few attempts to model the mechanisms underlying sensorimotor rehabilitation. We argue that a fundamental understanding of neurologic recovery, and as a result accurate predictions at the individual level, will be facilitated by developing computational models of the salient neural processes, including plasticity and learning systems of the brain, and integrating them into a context specific to rehabilitation. Here, we therefore discuss Computational Neurorehabilitation, a newly emerging field aimed at modeling plasticity and motor learning to understand and improve movement recovery of individuals with neurologic impairment. We first explain how the emergence of robotics and wearable sensors for rehabilitation is providing data that make development and testing of such models increasingly feasible. We then review key aspects of plasticity and motor learning that such models will incorporate. We proceed by discussing how computational neurorehabilitation models relate to the current benchmark in rehabilitation modeling - regression-based, prognostic modeling. We then critically discuss the first computational neurorehabilitation models, which have primarily focused on modeling rehabilitation of the upper extremity after stroke, and show how even simple models have produced novel ideas for future investigation. Finally, we conclude with key directions for future research, anticipating that soon we will see the emergence of mechanistic models of motor recovery that are informed by clinical imaging results and driven by the actual movement content of rehabilitation therapy as well as wearable sensor-based records of daily activity.

  12. Computational modeling and engineering in pediatric and congenital heart disease.

    Science.gov (United States)

    Marsden, Alison L; Feinstein, Jeffrey A

    2015-10-01

    Recent methodological advances in computational simulations are enabling increasingly realistic simulations of hemodynamics and physiology, driving increased clinical utility. We review recent developments in the use of computational simulations in pediatric and congenital heart disease, describe the clinical impact in modeling in single-ventricle patients, and provide an overview of emerging areas. Multiscale modeling combining patient-specific hemodynamics with reduced order (i.e., mathematically and computationally simplified) circulatory models has become the de-facto standard for modeling local hemodynamics and 'global' circulatory physiology. We review recent advances that have enabled faster solutions, discuss new methods (e.g., fluid structure interaction and uncertainty quantification), which lend realism both computationally and clinically to results, highlight novel computationally derived surgical methods for single-ventricle patients, and discuss areas in which modeling has begun to exert its influence including Kawasaki disease, fetal circulation, tetralogy of Fallot (and pulmonary tree), and circulatory support. Computational modeling is emerging as a crucial tool for clinical decision-making and evaluation of novel surgical methods and interventions in pediatric cardiology and beyond. Continued development of modeling methods, with an eye towards clinical needs, will enable clinical adoption in a wide range of pediatric and congenital heart diseases.

  13. Climate models on massively parallel computers

    International Nuclear Information System (INIS)

    Vitart, F.; Rouvillois, P.

    1993-01-01

    First results obtained on massively parallel computers (Multiple Instruction Multiple Data and Single Instruction Multiple Data) make it possible to consider building coupled models with high resolution. This would allow simulation of the thermohaline circulation and other interaction phenomena between atmosphere and ocean. The increase in computing power, and the resulting improvement in resolution, will lead us to revise our approximations. The hydrostatic approximation (in ocean circulation) will no longer be valid when the grid mesh is smaller than a few kilometers; we shall have to find other models. The expertise gained in numerical analysis at the Center of Limeil-Valenton (CEL-V) will be reused to design global models taking into account atmosphere, ocean, ice floe and biosphere, allowing climate simulation down to the regional scale

  14. Rough – Granular Computing knowledge discovery models

    Directory of Open Access Journals (Sweden)

    Mohammed M. Eissa

    2016-11-01

    Full Text Available The medical domain has become one of the most important areas of research because of the huge amounts of medical information about the symptoms of diseases and about how to distinguish between them in order to diagnose correctly. Knowledge discovery models play a vital role in the refinement and mining of medical indicators to help medical experts settle treatment decisions. This paper introduces four hybrid Rough – Granular Computing knowledge discovery models based on Rough Sets Theory, Artificial Neural Networks, Genetic Algorithms and Rough Mereology Theory. A comparative analysis of knowledge discovery models that use different techniques for data pre-processing, reduction, and data mining helps medical experts to extract the main medical indicators, to reduce misdiagnosis rates and to improve decision-making for medical diagnosis and treatment. The proposed models utilized two medical datasets: a Coronary Heart Disease dataset and a Hepatitis C Virus dataset. The main purpose of this paper was to explore and evaluate the proposed models, based on the Granular Computing methodology, for knowledge extraction according to different evaluation criteria for the classification of medical datasets. Another purpose was to enhance the framework of KDD processes for supervised learning using the Granular Computing methodology.
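
    The rough-set building block behind such models can be shown in a few lines: objects that are indiscernible on the chosen attributes are grouped, and a target concept is bracketed by its lower and upper approximations. The decision table below is invented for illustration.

# Illustrative rough-set lower and upper approximations of a target concept.
from collections import defaultdict

# object id -> (condition attributes, decision); hypothetical data
table = {
    1: (('high', 'yes'), 'sick'),
    2: (('high', 'yes'), 'sick'),
    3: (('high', 'no'),  'sick'),
    4: (('high', 'no'),  'healthy'),
    5: (('low',  'no'),  'healthy'),
}

blocks = defaultdict(set)                        # indiscernibility classes
for obj, (attrs, _) in table.items():
    blocks[attrs].add(obj)

target = {o for o, (_, d) in table.items() if d == 'sick'}
lower = set().union(*[b for b in blocks.values() if b <= target])      # certainly sick
upper = set().union(*[b for b in blocks.values() if b & target])       # possibly sick

print('lower approximation:', sorted(lower))
print('upper approximation:', sorted(upper))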

  15. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
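
    The following sketch illustrates the general idea of sensitivity-based uncertainty propagation with a toy response function; unlike GRESS/ADGEN, which differentiate the model code itself, it uses finite differences, and all parameter values are assumed.

# Illustrative sensitivity analysis and first-order uncertainty propagation.
import numpy as np

def model(p):
    # Toy response; a real application would call the full computer model.
    k, q = p
    return q * np.exp(-k) + 0.5 * k * q

p0    = np.array([0.8, 2.0])     # nominal parameter values (assumed)
sigma = np.array([0.05, 0.20])   # parameter standard deviations (assumed)

# Sensitivities dR/dp_i by central differences.
h = 1e-6
sens = np.array([
    (model(p0 + h * e) - model(p0 - h * e)) / (2 * h)
    for e in np.eye(len(p0))
])

# First-order propagation assuming independent parameters: var(R) = sum (dR/dp_i * sigma_i)^2
var_R = np.sum((sens * sigma) ** 2)
print('nominal result :', model(p0))
print('sensitivities  :', sens)
print('result std dev :', np.sqrt(var_R))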

  16. On turbulence models for rod bundle flow computations

    International Nuclear Information System (INIS)

    Hazi, Gabor

    2005-01-01

    In commercial computational fluid dynamics codes there is more than one turbulence model built in. It is the user's responsibility to choose one of those models suitable for the problem studied. In the last decade, several computations were presented using computational fluid dynamics for the simulation of various problems of the nuclear industry. A common feature in a number of those simulations is that they were performed using the standard k-ε turbulence model without justifying the choice of the model. The simulation results were rarely satisfactory. In this paper, we shall consider the flow in a fuel rod bundle as a case study and discuss why the application of the standard k-ε model fails to give reasonable results in this situation. We also show that a turbulence model based on the Reynolds stress transport equations can provide qualitatively correct results. Generally, our aim is pedagogical: we would like to call the readers' attention to the fact that turbulence models have to be selected based on theoretical considerations and/or adequate information obtained from measurements

  17. Assessment of weld thickness loss in offshore pipelines using computed radiography and computational modeling

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Oliveira, D.F.; Silva, A.X.; Lopes, R.T.; Marinho, C.; Camerini, C.S.

    2009-01-01

    In order to guarantee the structural integrity of oil plants it is crucial to monitor the amount of weld thickness loss in offshore pipelines. However, in spite of its relevance, this parameter is very difficult to determine, due to both the large diameter of most pipes and the complexity of the multi-variable system involved. In this study, a computational modeling based on Monte Carlo MCNPX code is combined with computed radiography to estimate the weld thickness loss in large-diameter offshore pipelines. Results show that computational modeling is a powerful tool to estimate intensity variations in radiographic images generated by weld thickness variations, and it can be combined with computed radiography to assess weld thickness loss in offshore and subsea pipelines.

  18. Introduction to computation and modeling for differential equations

    CERN Document Server

    Edsberg, Lennart

    2008-01-01

    An introduction to scientific computing for differential equations. Introduction to Computation and Modeling for Differential Equations provides a unified and integrated view of numerical analysis, mathematical modeling in applications, and programming to solve differential equations, which is essential in problem-solving across many disciplines, such as engineering, physics, and economics. This book successfully introduces readers to the subject through a unique "Five-M" approach: Modeling, Mathematics, Methods, MATLAB, and Multiphysics. This approach facilitates a thorough understanding of h

  19. A novel patient-specific model to compute coronary fractional flow reserve.

    Science.gov (United States)

    Kwon, Soon-Sung; Chung, Eui-Chul; Park, Jin-Seo; Kim, Gook-Tae; Kim, Jun-Woo; Kim, Keun-Hong; Shin, Eun-Seok; Shim, Eun Bo

    2014-09-01

    The fractional flow reserve (FFR) is a widely used clinical index to evaluate the functional severity of coronary stenosis. A computer simulation method based on patients' computed tomography (CT) data is a plausible non-invasive approach for computing the FFR. This method can provide a detailed solution for the stenosed coronary hemodynamics by coupling computational fluid dynamics (CFD) with the lumped parameter model (LPM) of the cardiovascular system. In this work, we have implemented a simple computational method to compute the FFR. As this method uses only coronary arteries for the CFD model and includes only the LPM of the coronary vascular system, it provides simpler boundary conditions for the coronary geometry and is computationally more efficient than existing approaches. To test the efficacy of this method, we simulated a three-dimensional straight vessel using CFD coupled with the LPM. The computed results were compared with those of the LPM. To validate this method in terms of clinically realistic geometry, a patient-specific model of stenosed coronary arteries was constructed from CT images, and the computed FFR was compared with clinically measured results. We evaluated the effect of a model aorta on the computed FFR and compared this with a model without the aorta. Computationally, the model without the aorta was more efficient than that with the aorta, reducing the CPU time required for computing a cardiac cycle to 43.4%. Copyright © 2014. Published by Elsevier Ltd.
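
    For orientation, the index itself is simple: FFR is the ratio of cycle-averaged distal coronary pressure to aortic pressure under hyperemia. The sketch below computes it from synthetic pressure traces; it is not the authors' CFD-LPM pipeline, and the 0.80 cut-off quoted in the comment is only the commonly used clinical threshold.

# Illustrative FFR computation from synthetic pressure traces.
import numpy as np

t = np.linspace(0.0, 1.0, 1000)                        # one cardiac cycle (s)
p_aorta  = 93 + 20 * np.sin(2 * np.pi * t)             # synthetic aortic pressure (mmHg)
p_distal = 0.78 * p_aorta + 2 * np.sin(4 * np.pi * t)  # synthetic pressure distal to the stenosis

ffr = p_distal.mean() / p_aorta.mean()                 # ratio of cycle-averaged pressures
print('FFR ~', round(ffr, 2), '(values below ~0.80 commonly indicate significant stenosis)')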

  20. Toward a computational model of hemostasis

    Science.gov (United States)

    Leiderman, Karin; Danes, Nicholas; Schoeman, Rogier; Neeves, Keith

    2017-11-01

    Hemostasis is the process by which a blood clot forms to prevent bleeding at a site of injury. The formation time, size and structure of a clot depends on the local hemodynamics and the nature of the injury. Our group has previously developed computational models to study intravascular clot formation, a process confined to the interior of a single vessel. Here we present the first stage of an experimentally-validated, computational model of extravascular clot formation (hemostasis) in which blood through a single vessel initially escapes through a hole in the vessel wall and out a separate injury channel. This stage of the model consists of a system of partial differential equations that describe platelet aggregation and hemodynamics, solved via the finite element method. We also present results from the analogous, in vitro, microfluidic model. In both models, formation of a blood clot occludes the injury channel and stops flow from escaping while blood in the main vessel retains its fluidity. We discuss the different biochemical and hemodynamic effects on clot formation using distinct geometries representing intra- and extravascular injuries.

  1. Computer-Aided Modeling of Lipid Processing Technology

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel

    2011-01-01

    increase along with growing interest in biofuels, the oleochemical industry faces in the upcoming years major challenges in terms of design and development of better products and more sustainable processes to make them. Computer-aided methods and tools for process synthesis, modeling and simulation...... are widely used for design, analysis, and optimization of processes in the chemical and petrochemical industries. These computer-aided tools have helped the chemical industry to evolve beyond commodities toward specialty chemicals and ‘consumer oriented chemicals based products’. Unfortunately...... to develop systematic computer-aided methods (property models) and tools (database) related to the prediction of the necessary physical properties suitable for design and analysis of processes employing lipid technologies. The methods and tools include: the development of a lipid-database (CAPEC...

  2. Computational model for dosimetric purposes in dental procedures

    International Nuclear Information System (INIS)

    Kawamoto, Renato H.; Campos, Tarcisio R.

    2013-01-01

    This study aims to develop a computational model for dosimetric purposes the oral region, based on computational tools SISCODES and MCNP-5, to predict deterministic effects and minimize stochastic effects caused by ionizing radiation by radiodiagnosis. Based on a set of digital information provided by computed tomography, three-dimensional voxel model was created, and its tissues represented. The model was exported to the MCNP code. In association with SICODES, we used the Monte Carlo N-Particle Transport Code (MCNP-5) method to play the corresponding interaction of nuclear particles with human tissues statistical process. The study will serve as a source of data for dosimetric studies in the oral region, providing deterministic effect and minimize the stochastic effect of ionizing radiation

  3. On the Bayesian calibration of computer model mixtures through experimental data, and the design of predictive models

    Science.gov (United States)

    Karagiannis, Georgios; Lin, Guang

    2017-08-01

    For many real systems, several computer models may exist with different physics and predictive abilities. To achieve more accurate simulations/predictions, it is desirable for these models to be properly combined and calibrated. We propose the Bayesian calibration of computer model mixtures method, which relies on the idea of representing the real system output as a mixture of the available computer model outputs with unknown input-dependent weight functions. The method builds a fully Bayesian predictive model as an emulator for the real system output by combining, weighting, and calibrating the available models in the Bayesian framework. Moreover, it fits a mixture of calibrated computer models that can be used by the domain scientist as a means to combine the available computer models, in a flexible and principled manner, and perform reliable simulations. It can address realistic cases where one model may be more accurate than the others at different input values because the mixture weights, indicating the contribution of each model, are functions of the input. Inference on the calibration parameters can consider multiple computer models associated with different physics. The method does not require knowledge of the fidelity order of the models. We provide a technique able to mitigate the computational overhead due to the consideration of multiple computer models that is suitable for the mixture model framework. We implement the proposed method in a real-world application involving the Weather Research and Forecasting large-scale climate model.
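
    The central idea, representing the real system output as an input-dependent mixture of model outputs, can be sketched with a simple least-squares fit of a logistic weight function. This is only a point-estimate stand-in for the fully Bayesian treatment described above, and both "models" and the data are invented.

# Illustrative input-dependent mixture of two hypothetical models fitted to observations.
import numpy as np
from scipy.optimize import curve_fit

def model1(x):
    return np.sin(x)            # hypothetical low-fidelity physics

def model2(x):
    return 0.8 * x              # hypothetical alternative physics

def mixture(x, a, b):
    w = 1.0 / (1.0 + np.exp(-(a + b * x)))          # logistic weight function w(x)
    return w * model1(x) + (1.0 - w) * model2(x)

rng = np.random.default_rng(1)
x_obs = np.linspace(0, 3, 40)
y_obs = np.where(x_obs < 1.5, np.sin(x_obs), 0.8 * x_obs) + 0.05 * rng.normal(size=40)

(a_hat, b_hat), _ = curve_fit(mixture, x_obs, y_obs, p0=[0.0, 0.0])
print('fitted weight parameters:', a_hat, b_hat)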

  4. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state-of-art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under the mechanical stimulus up to optimizing the performance of sports equipment, through the patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease.   The book will be of interest to researchers, Ph.D students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, experimental analysis.

  5. Getting computer models to communicate

    International Nuclear Information System (INIS)

    Caremoli, Ch.; Erhard, P.

    1999-01-01

    Today's computers have the processing power to deliver detailed and global simulations of complex industrial processes such as the operation of a nuclear reactor core. So should we be producing new, global numerical models to take full advantage of this new-found power? If so, it would be a long-term job. There is, however, another solution; to couple the existing validated numerical models together so that they work as one. (authors)

  6. Cloud Computing Adoption Business Model Factors: Does Enterprise Size Matter?

    OpenAIRE

    Bogataj Habjan, Kristina; Pucihar, Andreja

    2017-01-01

    This paper presents the results of research investigating the impact of business model factors on cloud computing adoption. The introduced research model consists of 40 cloud computing business model factors, grouped into eight factor groups. Their impact and importance for cloud computing adoption were investigated among enterprises in Slovenia. Furthermore, differences in opinion according to enterprise size were investigated. Research results show no statistically significant impacts of in...

  7. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented...

  8. Computational Design Modelling : Proceedings of the Design Modelling Symposium

    CERN Document Server

    Kilian, Axel; Palz, Norbert; Scheurer, Fabian

    2012-01-01

    This book publishes the peer-reviewed proceedings of the third Design Modeling Symposium Berlin. The conference constitutes a platform for dialogue on experimental practice and research within the field of computationally informed architectural design. More than 60 leading experts discuss the computational processes within this field, with the aim of developing a broader and less exotic building practice that bears more subtle but powerful traces of the complex tool set and approaches developed and studied over recent years. The outcome is a set of new strategies for a reasonable and innovative implementation of digital potential in truly innovative and radical design, guided by responsibility towards both processes and the consequences they initiate.

  9. Description of mathematical models and computer programs

    International Nuclear Information System (INIS)

    1977-01-01

    The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives

  10. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. The computer models are Flash and NetLogo environments that make three domains of chemistry simultaneously available: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  11. Computational challenges in modeling gene regulatory events.

    Science.gov (United States)

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  12. Shadow Replication: An Energy-Aware, Fault-Tolerant Computational Model for Green Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xiaolong Cui

    2014-08-01

    Full Text Available As the demand for cloud computing continues to increase, cloud service providers face the daunting challenge of meeting negotiated SLAs, in terms of reliability and timely performance, while achieving cost-effectiveness. This challenge is further compounded by the increasing likelihood of failure in large-scale clouds and the rising impact of energy consumption and CO2 emissions on the environment. This paper proposes Shadow Replication, a novel fault-tolerance model for cloud computing, which seamlessly addresses failure at scale, while minimizing energy consumption and reducing its impact on the environment. The basic tenet of the model is to associate a suite of shadow processes that execute concurrently with the main process, but initially at a much reduced execution speed, to overcome failures as they occur. Two computationally-feasible schemes are proposed to achieve Shadow Replication. A performance evaluation framework is developed to analyze these schemes and compare their performance to traditional replication-based fault tolerance methods, focusing on the inherent tradeoff between fault tolerance, the specified SLA and profit maximization. The results show that Shadow Replication leads to significant energy reduction, and is better suited for compute-intensive execution models, where up to 30% more profit can be achieved due to reduced energy consumption.
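
    A back-of-the-envelope sketch of the energy argument: if dynamic power scales roughly with the cube of execution speed (an assumption for illustration, not a figure from the paper), running the shadow at half speed uses far less energy than a second full-speed replica as long as the main process does not fail.

# Illustrative energy comparison: traditional replication vs. a slow shadow process.
W = 1.0                      # normalized amount of work (cycles)
f_main, f_shadow = 1.0, 0.5  # normalized execution speeds (assumed)

def energy(f, work):
    time = work / f
    return (f ** 3) * time   # E = P * t with P ~ f^3, i.e. E ~ f^2 * work

e_traditional = 2 * energy(f_main, W)                    # two full-speed replicas
e_shadow      = energy(f_main, W) + energy(f_shadow, W)  # main plus slow shadow (failure-free case)

print('traditional replication energy:', e_traditional)
print('shadow replication energy     :', round(e_shadow, 2))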

  13. Efficiency using computer simulation of Reverse Threshold Model Theory on assessing a “One Laptop Per Child” computer versus desktop computer

    Directory of Open Access Journals (Sweden)

    Supat Faarungsang

    2017-04-01

    Full Text Available The Reverse Threshold Model Theory (RTMT) model was introduced based on limiting factor concepts, but its efficiency compared to the Conventional Model (CM) has not been published. This investigation assessed the efficiency of RTMT compared to CM using computer simulation on the “One Laptop Per Child” computer and a desktop computer. Based on probability values, it was found that RTMT was more efficient than CM among eight treatment combinations, and an earlier study verified that RTMT gives complete elimination of random error. Furthermore, RTMT has several advantages over CM and is therefore proposed to be applied to most research data.

  14. The Next Generation ARC Middleware and ATLAS Computing Model

    CERN Document Server

    Filipcic, A; The ATLAS collaboration; Smirnova, O; Konstantinov, A; Karpenko, D

    2012-01-01

    The distributed NDGF Tier-1 and associated Nordugrid clusters are well integrated into the ATLAS computing model but follow a slightly different paradigm than other ATLAS resources. The current strategy does not divide the sites as in the commonly used hierarchical model, but rather treats them as a single storage endpoint and a pool of distributed computing nodes. The next generation ARC middleware with its several new technologies provides new possibilities in development of the ATLAS computing model, such as pilot jobs with pre-cached input files, automatic job migration between the sites, integration of remote sites without connected storage elements, and automatic brokering for jobs with non-standard resource requirements. ARC's data transfer model provides an automatic way for the computing sites to participate in ATLAS' global task management system without requiring centralised brokering or data transfer services. The powerful API combined with Python and Java bindings can easily be used to build new ...

  15. Lattice Boltzmann model capable of mesoscopic vorticity computation

    Science.gov (United States)

    Peng, Cheng; Guo, Zhaoli; Wang, Lian-Ping

    2017-11-01

    It is well known that standard lattice Boltzmann (LB) models allow the strain-rate components to be computed mesoscopically (i.e., through the local particle distributions) and as such possess a second-order accuracy in strain rate. This is one of the appealing features of the lattice Boltzmann method (LBM) which is of only second-order accuracy in hydrodynamic velocity itself. However, no known LB model can provide the same quality for vorticity and pressure gradients. In this paper, we design a multiple-relaxation time LB model on a three-dimensional 27-discrete-velocity (D3Q27) lattice. A detailed Chapman-Enskog analysis is presented to illustrate all the necessary constraints in reproducing the isothermal Navier-Stokes equations. The remaining degrees of freedom are carefully analyzed to derive a model that accommodates mesoscopic computation of all the velocity and pressure gradients from the nonequilibrium moments. This way of vorticity calculation naturally ensures a second-order accuracy, which is also proven through an asymptotic analysis. We thus show, with enough degrees of freedom and appropriate modifications, the mesoscopic vorticity computation can be achieved in LBM. The resulting model is then validated in simulations of a three-dimensional decaying Taylor-Green flow, a lid-driven cavity flow, and a uniform flow passing a fixed sphere. Furthermore, it is shown that the mesoscopic vorticity computation can be realized even with single relaxation parameter.
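
    As a small illustration of mesoscopic gradient recovery, the sketch below evaluates the strain-rate tensor at a single node from the non-equilibrium part of the distributions on a D2Q9 lattice (the paper works with D3Q27 and additionally recovers vorticity and pressure gradients); the relaxation time and node state are arbitrary lattice-unit values chosen for illustration.

# Illustrative local (mesoscopic) strain-rate recovery on a D2Q9 lattice node.
import numpy as np

c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])          # D2Q9 discrete velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)                    # lattice weights

def equilibrium(rho, u):
    cu = c @ u
    return w * rho * (1.0 + 3.0*cu + 4.5*cu**2 - 1.5*(u @ u))

tau = 0.8                                   # BGK relaxation time (lattice units, assumed)
rng = np.random.default_rng(2)
f = equilibrium(1.0, np.array([0.05, 0.02])) + 1e-4 * rng.normal(size=9)  # toy node state

rho = f.sum()                               # local density from the distributions
u = (f[:, None] * c).sum(axis=0) / rho      # local velocity from the distributions
f_neq = f - equilibrium(rho, u)             # non-equilibrium part

# S_ab ~ -3/(2*rho*tau) * sum_i f_i^neq c_ia c_ib  (c_s^2 = 1/3, dt = 1)
S = -3.0 / (2.0 * rho * tau) * np.einsum('i,ia,ib->ab', f_neq, c, c)
print(S)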

  16. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking.

    Science.gov (United States)

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour

  17. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    Science.gov (United States)

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adult's computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, that encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  18. Predictive modeling of liquid-sodium thermal–hydraulics experiments and computations

    International Nuclear Information System (INIS)

    Arslan, Erkan; Cacuci, Dan G.

    2014-01-01

    Highlights: • We applied the predictive modeling method of Cacuci and Ionescu-Bujor (2010). • We assimilated data from sodium flow experiments. • We used computational fluid dynamics simulations of sodium experiments. • The predictive modeling method greatly reduced uncertainties in predicted results. - Abstract: This work applies the predictive modeling procedure formulated by Cacuci and Ionescu-Bujor (2010) to assimilate data from liquid-sodium thermal–hydraulics experiments in order to reduce systematically the uncertainties in the predictions of computational fluid dynamics (CFD) simulations. The predicted CFD-results for the best-estimate model parameters and results describing sodium-flow velocities and temperature distributions are shown to be significantly more precise than the original computations and experiments, in that the predicted uncertainties for the best-estimate results and model parameters are significantly smaller than both the originally computed and the experimental uncertainties
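
    The essence of the approach, reducing predicted uncertainties below both the computed and the experimental ones, can be illustrated with a one-dimensional inverse-variance combination. The sketch below is only a toy illustration of that idea in Python, not the Cacuci and Ionescu-Bujor (2010) formalism applied in the paper, and the temperature values and uncertainties are invented.

      # Minimal sketch of the idea behind predictive modeling / data assimilation:
      # combine a computed result and a measured result (each with its uncertainty)
      # into a best estimate whose uncertainty is smaller than both. This is a
      # one-dimensional inverse-variance illustration, not the full predictive
      # modeling formalism used in the paper.

      def assimilate(x_comp, var_comp, x_meas, var_meas):
          """Inverse-variance weighted best estimate and its (reduced) variance."""
          w_comp = 1.0 / var_comp
          w_meas = 1.0 / var_meas
          x_best = (w_comp * x_comp + w_meas * x_meas) / (w_comp + w_meas)
          var_best = 1.0 / (w_comp + w_meas)
          return x_best, var_best

      # Illustrative sodium outlet temperature (K): CFD prediction vs. experiment.
      x_best, var_best = assimilate(x_comp=820.0, var_comp=25.0,
                                    x_meas=812.0, var_meas=16.0)
      print(f"best estimate = {x_best:.1f} K, std dev = {var_best ** 0.5:.2f} K")
      # The best-estimate standard deviation (~3.1 K) is smaller than either the
      # computational (5 K) or the experimental (4 K) uncertainty.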

  19. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  20. Computational and Statistical Models: A Comparison for Policy Modeling of Childhood Obesity

    Science.gov (United States)

    Mabry, Patricia L.; Hammond, Ross; Ip, Edward Hak-Sing; Huang, Terry T.-K.

    As systems science methodologies have begun to emerge as a set of innovative approaches to address complex problems in behavioral, social science, and public health research, some apparent conflicts with traditional statistical methodologies for public health have arisen. Computational modeling is an approach set in context that integrates diverse sources of data to test the plausibility of working hypotheses and to elicit novel ones. Statistical models are reductionist approaches geared towards testing the null hypothesis. While these two approaches may seem contrary to each other, we propose that they are in fact complementary and can be used jointly to advance solutions to complex problems. Outputs from statistical models can be fed into computational models, and outputs from computational models can lead to further empirical data collection and statistical models. Together, this presents an iterative process that refines the models and contributes to a greater understanding of the problem and its potential solutions. The purpose of this panel is to foster communication and understanding between statistical and computational modelers. Our goal is to shed light on the differences between the approaches and convey what kinds of research inquiries each one is best for addressing and how they can serve complementary (and synergistic) roles in the research process, to mutual benefit. For each approach the panel will cover the relevant "assumptions" and how the differences in what is assumed can foster misunderstandings. The interpretations of the results from each approach will be compared and contrasted and the limitations for each approach will be delineated. We will use illustrative examples from CompMod, the Comparative Modeling Network for Childhood Obesity Policy. The panel will also incorporate interactive discussions with the audience on the issues raised here.

  1. Computational Models for Calcium-Mediated Astrocyte Functions

    Directory of Open Access Journals (Sweden)

    Tiina Manninen

    2018-04-01

    Full Text Available The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop

  2. Computational Models for Calcium-Mediated Astrocyte Functions.

    Science.gov (United States)

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro , but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. Thus
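
    A minimal sketch of the kind of biophysical single-astrocyte calcium model that the review categorizes is shown below: cytosolic calcium exchanging with the endoplasmic reticulum through an IP3-gated release term, a SERCA uptake pump and a passive leak. The equations and all parameter values are illustrative only and are not taken from any specific published astrocyte model.

      # Minimal sketch of a single-astrocyte calcium model: cytosolic Ca2+
      # exchanging with the ER through an IP3-gated release term, a SERCA uptake
      # pump and a leak. Parameter values are illustrative only.
      import numpy as np
      from scipy.integrate import solve_ivp

      def astro_calcium(t, y, ip3=0.4):
          ca_cyt, ca_er = y                      # cytosolic and ER Ca2+ (uM)
          release = 2.0 * (ip3 / (ip3 + 0.3)) * (ca_cyt / (ca_cyt + 0.2)) * (ca_er - ca_cyt)
          leak = 0.05 * (ca_er - ca_cyt)         # passive leak from the ER
          serca = 1.0 * ca_cyt**2 / (ca_cyt**2 + 0.1**2)  # SERCA pump uptake
          dca_cyt = release + leak - serca
          dca_er = -(release + leak - serca) / 0.2        # ER/cytosol volume ratio
          return [dca_cyt, dca_er]

      sol = solve_ivp(astro_calcium, (0.0, 100.0), y0=[0.1, 2.0], max_step=0.1)
      print("final cytosolic Ca2+ (uM):", round(sol.y[0, -1], 3))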

  3. The use of conduction model in laser weld profile computation

    Science.gov (United States)

    Grabas, Bogusław

    2007-02-01

    Profiles of joints resulting from deep penetration laser beam welding of a flat workpiece of carbon steel were computed. A semi-analytical conduction model solved with the Green's function method was used in the computations. In the model, the moving heat source was attenuated exponentially in accordance with the Beer-Lambert law. Computational results were compared with experimental ones.
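
    The ingredients of such a conduction model can be sketched as follows: a moving heat source whose absorbed power is attenuated with depth according to the Beer-Lambert law, superposed as point sources using the classical Rosenthal moving-source conduction solution. This is a simplified stand-in for, not a reproduction of, the semi-analytical Green's function model used in the paper, and the material parameters are generic carbon-steel-like values.

      # Minimal sketch: classical Rosenthal moving point-source conduction solution,
      # with the absorbed laser power distributed over depth according to the
      # Beer-Lambert law and superposed as a stack of point sources. All parameter
      # values below are generic and purely illustrative.
      import numpy as np

      K, ALPHA = 40.0, 1.0e-5        # conductivity (W/m/K), diffusivity (m^2/s)
      V, Q, MU = 0.01, 2000.0, 800.0 # travel speed (m/s), absorbed power (W), attenuation (1/m)
      T0 = 293.0                     # ambient temperature (K)

      def temperature(xi, y, z, n_layers=50, depth=5e-3):
          """Temperature at (xi, y, z) in the frame moving with the beam."""
          zs = np.linspace(0.0, depth, n_layers)
          dq = Q * MU * np.exp(-MU * zs) * (depth / n_layers)  # Beer-Lambert power split
          T = T0
          for zi, qi in zip(zs, dq):
              R = np.sqrt(xi**2 + y**2 + (z - zi)**2) + 1e-6   # avoid the singularity at R = 0
              T += qi / (2.0 * np.pi * K * R) * np.exp(-V * (xi + R) / (2.0 * ALPHA))
          return T

      print("T 5 mm behind the beam at 0.5 mm depth:",
            round(temperature(xi=-5e-3, y=0.0, z=0.5e-3), 1), "K")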

  4. Computational Psychometrics for Modeling System Dynamics during Stressful Disasters

    Directory of Open Access Journals (Sweden)

    Pietro Cipresso

    2017-08-01

    Full Text Available Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.

  5. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes with scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol's indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times, which need a preliminary metamodeling step before the sensitivity analysis can be performed. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows the sensitivity indices of each scalar model input to be estimated, while the 'dispersion model' allows the total sensitivity index of the functional model inputs to be derived. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
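
    For the scalar-input case, the variance-based (Sobol) indices discussed here can be estimated with a plain pick-freeze Monte Carlo scheme, illustrated below on the standard Ishigami test function. The sketch covers only basic index estimation; it does not include the joint GLM/GAM metamodelling step or the treatment of functional inputs proposed in the paper.

      # Minimal sketch of variance-based (Sobol) sensitivity analysis for scalar
      # inputs, using a simple pick-freeze Monte Carlo estimator on the Ishigami
      # test function.
      import numpy as np

      def ishigami(x, a=7.0, b=0.1):
          return np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2 + b * x[:, 2]**4 * np.sin(x[:, 0])

      rng = np.random.default_rng(0)
      n, d = 100_000, 3
      A = rng.uniform(-np.pi, np.pi, size=(n, d))
      B = rng.uniform(-np.pi, np.pi, size=(n, d))
      fA, fB = ishigami(A), ishigami(B)
      var = np.var(np.concatenate([fA, fB]))

      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]            # freeze all inputs except the i-th
          fABi = ishigami(ABi)
          # First-order index: Var(E[f | x_i]) / Var(f), pick-freeze estimator
          S_i = np.mean(fB * (fABi - fA)) / var
          print(f"S_{i + 1} ~= {S_i:.3f}")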

  6. Computer modeling of road bridge for simulation moving load

    Directory of Open Access Journals (Sweden)

    Miličić Ilija M.

    2016-01-01

    Full Text Available This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders made up of 1D finite elements, using the CAA applications Tower and Bridge Designer 2016 (2nd Edition). Computer simulations were conducted to compare the effects of the moving load according to the recommendations of two standards, SRPS and AASHTO. The bridge structure modelled in Bridge Designer 2016 (2nd Edition) was therefore also modelled identically in the Tower environment. An important consideration for the selection of a computer application is that Bridge Designer 2016 (2nd Edition) is unable to treat the moving-load model prescribed by the national standard V600.
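
    The core of any moving-load comparison is the influence line of the studied effect. The sketch below computes, for a generic simply supported span, the bending moment at a chosen section as a unit axle travels across; it is a textbook illustration only, not the Tower or Bridge Designer 2016 workflow and not the SRPS, AASHTO or V600 load models, and the span and section values are invented.

      # Minimal sketch (generic): influence line of the bending moment at a chosen
      # section of a simply supported span under a single moving unit axle.
      def moment_at_section(a, c, span):
          """Bending moment at section c due to a unit load at position a (0 <= a <= span)."""
          if a <= c:
              return a * (span - c) / span
          return c * (span - a) / span

      span, c = 30.0, 15.0                        # span and section of interest (m), illustrative
      positions = [i * span / 100 for i in range(101)]
      influence = [moment_at_section(a, c, span) for a in positions]
      print("max midspan moment per unit load:", max(influence), "kNm per kN")
      # For a real axle train, the moments from each axle position are superposed
      # and the load placement giving the extreme effect is retained.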

  7. Epistemic Gameplay and Discovery in Computational Model-Based Inquiry Activities

    Science.gov (United States)

    Wilkerson, Michelle Hoda; Shareff, Rebecca; Laina, Vasiliki; Gravel, Brian

    2018-01-01

    In computational modeling activities, learners are expected to discover the inner workings of scientific and mathematical systems: First elaborating their understandings of a given system through constructing a computer model, then "debugging" that knowledge by testing and refining the model. While such activities have been shown to…

  8. Computer-aided modeling for efficient and innovative product-process engineering

    DEFF Research Database (Denmark)

    Heitzig, Martina

    Model-based computer-aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy and water. This trend is set to continue due to the substantial benefits computer-aided methods provide. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms and application modes. The development of the models required for the systems under investigation tends to be a challenging, time-consuming and therefore cost... ... in chemical and biochemical engineering have been solved to illustrate the application of the generic modelling methodology, the computer-aided modelling framework and the developed software tool.

  9. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  10. Light reflection models for computer graphics.

    Science.gov (United States)

    Greenberg, D P

    1989-04-14

    During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.
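
    A minimal example of the direct-lighting reflection models that the article describes as the starting point of this development is a Lambertian diffuse term combined with a Blinn-Phong style specular term, sketched below for a single point light. Global-illumination effects (interreflections, shadows, radiosity) are deliberately omitted, and the coefficients are arbitrary.

      # Minimal sketch of a local (direct-lighting) reflection model: Lambertian
      # diffuse plus a Blinn-Phong style specular term for a single point light.
      import numpy as np

      def normalize(v):
          v = np.asarray(v, dtype=float)
          return v / np.linalg.norm(v)

      def shade(normal, light_dir, view_dir, kd=0.7, ks=0.3, shininess=32.0):
          """Reflected intensity at a surface point for unit light intensity."""
          n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
          diffuse = kd * max(np.dot(n, l), 0.0)          # Lambert's cosine law
          h = normalize(l + v)                           # half-vector
          specular = ks * max(np.dot(n, h), 0.0) ** shininess
          return diffuse + specular

      print(round(shade(normal=[0, 0, 1], light_dir=[1, 1, 2], view_dir=[0, 0, 1]), 3))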

  11. Python for Scientific Computing Education: Modeling of Queueing Systems

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2014-01-01

    Full Text Available In this paper, we present the methodology for the introduction to scientific computing based on model-centered learning. We propose multiphase queueing systems as a basis for learning objects. We use Python and parallel programming for implementing the models and present the computer code and results of stochastic simulations.
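
    In the same model-centred spirit, the single-phase core of such learning objects can be written in a few lines of Python: an M/M/1 queue simulated by stochastic event generation and checked against the analytical mean sojourn time. This sketch is only an illustration of the idea; the paper itself treats multiphase systems and parallel execution.

      # Minimal sketch: M/M/1 queue simulated by stochastic event generation,
      # estimating the mean time a customer spends in the system.
      import random

      def mm1_mean_sojourn(arrival_rate, service_rate, n_customers=100_000, seed=1):
          rng = random.Random(seed)
          arrival, server_free, total_time = 0.0, 0.0, 0.0
          for _ in range(n_customers):
              arrival += rng.expovariate(arrival_rate)        # next arrival time
              start = max(arrival, server_free)               # wait if server busy
              service = rng.expovariate(service_rate)
              server_free = start + service
              total_time += server_free - arrival             # waiting + service
          return total_time / n_customers

      sim = mm1_mean_sojourn(arrival_rate=0.8, service_rate=1.0)
      theory = 1.0 / (1.0 - 0.8)                               # W = 1 / (mu - lambda)
      print(f"simulated mean sojourn: {sim:.2f}, theoretical: {theory:.2f}")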

  12. Computational Intelligence. Mortality Models for the Actuary

    NARCIS (Netherlands)

    Willemse, W.J.

    2001-01-01

    This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, this thesis deals with life insurance where mortality modelling is important. Actuaries use ancient models (mortality laws) from the nineteenth century, for example Gompertz' and Makeham's
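
    The "ancient" mortality laws referred to can be written down directly; the sketch below evaluates the Gompertz-Makeham force of mortality and the implied one-year death probability. The parameter values are purely illustrative and are not taken from the thesis.

      # Minimal sketch of the Gompertz-Makeham mortality law: force of mortality
      # mu(x) = A + B * c**x and the implied one-year death probability q_x.
      # Parameter values are purely illustrative.
      import math

      def makeham_mu(x, A=0.0007, B=0.00005, c=1.10):
          """Force of mortality at exact age x under the Gompertz-Makeham law."""
          return A + B * c**x

      def one_year_death_prob(x, A=0.0007, B=0.00005, c=1.10):
          """q_x = 1 - exp(-integral of mu from x to x+1), evaluated in closed form."""
          integral = A + B * c**x * (c - 1.0) / math.log(c)
          return 1.0 - math.exp(-integral)

      for age in (30, 50, 70):
          print(age, round(one_year_death_prob(age), 4))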

  13. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.

  14. Life system modeling and intelligent computing. Pt. II. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen's Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science]; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation]

    2010-07-01

    This book is part II of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010 and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The 56 papers in this volume are organized in topical sections on advanced evolutionary computing theory and algorithms; advanced neural network and fuzzy system theory and algorithms; modeling and simulation of societies and collective behavior; biomedical signal processing, imaging, and visualization; intelligent computing and control in distributed power generation systems; intelligent methods in power and energy infrastructure development; intelligent modeling, monitoring, and control of complex nonlinear systems. (orig.)

  15. Global Stability of an Epidemic Model of Computer Virus

    Directory of Open Access Journals (Sweden)

    Xiaofan Yang

    2014-01-01

    Full Text Available With the rapid popularization of the Internet, computers can enter or leave the Internet increasingly frequently. In fact, no antivirus software can detect and remove all sorts of computer viruses. This implies that viruses would persist on the Internet. To better understand the spread of computer viruses in these situations, a new propagation model is established and analyzed. The unique equilibrium of the model is globally asymptotically stable, in accordance with reality. A parameter analysis of the equilibrium is also conducted.
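
    A generic illustration of this modelling style is a classical SIS (susceptible-infected-susceptible) system with nodes joining and leaving the network, solved as an ODE. The sketch below is not the specific propagation model analysed in the paper, and its rates are invented, but it shows the kind of endemic equilibrium whose global stability such papers establish.

      # Minimal sketch of an epidemic-style computer virus model: a classical SIS
      # system with computers entering and leaving the network, solved as an ODE.
      import numpy as np
      from scipy.integrate import solve_ivp

      def sis(t, y, beta=0.4, gamma=0.2, mu=0.05):
          S, I = y
          new_nodes = mu * (S + I)                 # computers entering the Internet
          dS = new_nodes - beta * S * I / (S + I) + gamma * I - mu * S
          dI = beta * S * I / (S + I) - gamma * I - mu * I
          return [dS, dI]

      sol = solve_ivp(sis, (0.0, 200.0), y0=[0.99, 0.01], max_step=0.5)
      print("endemic infected fraction ~", round(sol.y[1, -1] / sol.y.sum(axis=0)[-1], 3))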

  16. Simulation models for computational plasma physics: Concluding report

    International Nuclear Information System (INIS)

    Hewett, D.W.

    1994-01-01

    In this project, the authors enhanced their ability to numerically simulate bounded plasmas that are dominated by low-frequency electric and magnetic fields. They moved towards this goal in several ways; they are now in a position to play significant roles in the modeling of low-frequency electromagnetic plasmas in several new industrial applications. They have significantly increased their facility with the computational methods invented to solve the low frequency limit of Maxwell's equations (DiPeso, Hewett, accepted, J. Comp. Phys., 1993). This low-frequency model, the Streamlined Darwin Field (SDF) model (Hewett, Larson, and Doss, J. Comp. Phys., 1992), has now been implemented in the fully non-neutral SDF code BEAGLE (Larson, Ph.D. dissertation, 1993) and has been further extended to the quasi-neutral limit (DiPeso, Hewett, Comp. Phys. Comm., 1993). In addition, they have resurrected the quasi-neutral, zero-electron-inertia model (ZMR) and begun the task of incorporating into this model internal boundary conditions that have the flexibility of those in GYMNOS, a magnetostatic code now used in ion source work (Hewett, Chen, ICF Quarterly Report, July--September, 1993). Finally, near the end of this project, they invented a new type of banded matrix solver that can be implemented on a massively parallel computer -- thus opening the door for the use of all their ADI schemes on these new computer architectures (Mattor, Williams, Hewett, submitted to Parallel Computing, 1993)

  17. Pulse cleaning flow models and numerical computation of candle ceramic filters.

    Science.gov (United States)

    Tian, Gui-shan; Ma, Zhen-ji; Zhang, Xin-yi; Xu, Ting-xiang

    2002-04-01

    Analytical and numerical models are developed for the reverse pulse cleaning system of candle ceramic filters. A standard turbulence model is shown to be suitable for the design computation of the reverse pulse cleaning system, based on experimental and one-dimensional computational results. The computed results can be used to guide the design of the reverse pulse cleaning system, in particular the optimum Venturi geometry. From the computed results, general conclusions and design methods are obtained.

  18. The role of computer modelling in participatory integrated assessments

    International Nuclear Information System (INIS)

    Siebenhuener, Bernd; Barth, Volker

    2005-01-01

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists to develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experience gained in three projects with the use of computer models, from a participatory and a risk management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes, and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models best serve the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes

  19. Distributed parallel computing in stochastic modeling of groundwater systems.

    Science.gov (United States)

    Dong, Yanhui; Li, Guomin; Xu, Haizhen

    2013-03-01

    Stochastic modeling is a rapidly evolving, popular approach to the study of the uncertainty and heterogeneity of groundwater systems. However, the use of Monte Carlo-type simulations to solve practical groundwater problems often encounters computational bottlenecks that hinder the acquisition of meaningful results. To improve the computational efficiency, a system that combines stochastic model generation with MODFLOW-related programs and distributed parallel processing is investigated. The distributed computing framework, called the Java Parallel Processing Framework, is integrated into the system to allow the batch processing of stochastic models in distributed and parallel systems. As an example, the system is applied to the stochastic delineation of well capture zones in the Pinggu Basin in Beijing. Through the use of 50 processing threads on a cluster with 10 multicore nodes, the execution times of 500 realizations are reduced to 3% compared with those of a serial execution. Through this application, the system demonstrates its potential in solving difficult computational problems in practical stochastic modeling. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.

  20. PETRI NET MODELING OF COMPUTER VIRUS LIFE CYCLE

    African Journals Online (AJOL)

    Dr Obe

    Dynamic system analysis is applied to model the virus life cycle. Simulation of the derived model ... by creating Matlab files for five different computer ... Keywords: virus life cycle, Petri nets, modeling, simulation.

  1. Airfoil computations using the gamma-Retheta model; Wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, Niels N.

    2009-05-15

    The present work addresses the validation of the implementation of the Menter, Langtry et al. gamma-theta correlation-based transition model [1, 2, 3] in the EllipSys2D code. Firstly, the second-order accuracy of the code is verified using a grid refinement study for laminar, turbulent and transitional computations. Based on this, an estimate of the error in the computations is determined to be approximately one percent in the attached region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64-018, NACA64-218, NACA64-418 and NACA64-618, and the results are compared to measurements [4] and computations using the Xfoil code by Drela et al. [5]. In the linear pre-stall region good agreement is observed for both lift and drag, while differences from both the measurements and the Xfoil computations are observed in stalled conditions. (au)

  2. On the usage of ultrasound computational models for decision making under ambiguity

    Science.gov (United States)

    Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron

    2018-04-01

    Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantifies the differences using maximum amplitude and power spectral density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.

  3. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book. -Hsun-Hsien Chang, Computing Reviews, March 2012 … My favorite chapters were on dynamic linear models and vector AR and vector ARMA models. -William Seaver, Technometrics, August 2011 … a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  4. Computer Modelling «Smart Building»

    Directory of Open Access Journals (Sweden)

    O. Yu. Maryasin

    2016-01-01

    Full Text Available Currently, "smart building" or "smart house" technology is developing actively in industrialized countries. The main idea of a "smart building" or "smart house" is to have a system that is able to identify definite situations occurring in the house and respond to them accordingly. The automated house management system provides automated control and management and also organizes the interaction between separate engineering subsystems. This system includes the automation subsystems of the individual engineering equipment as separate components. In order to study the different operating modes of the engineering subsystems and of the whole system, mathematical and computer modeling needs to be used. From a mathematical point of view, a "smart building" is described as a continuous-discrete, or hybrid, system consisting of interacting elements of different nature, whose behavior is described by continuous and discrete processes. In the article the authors present a computer model "Smart building" which allows the operation of the main engineering subsystems and the control algorithms to be modeled. The model is created in the MATLAB Simulink environment with the Simscape "physical modeling" library and the Stateflow library. The peculiarity of this model is the use of specialized management and control algorithms that provide coordinated interaction of the subsystems and optimize power consumption.

  5. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  6. A scalable approach to modeling groundwater flow on massively parallel computers

    International Nuclear Information System (INIS)

    Ashby, S.F.; Falgout, R.D.; Tompson, A.F.B.

    1995-12-01

    We describe a fully scalable approach to the simulation of groundwater flow on a hierarchy of computing platforms, ranging from workstations to massively parallel computers. Specifically, we advocate the use of scalable conceptual models in which the subsurface model is defined independently of the computational grid on which the simulation takes place. We also describe a scalable multigrid algorithm for computing the groundwater flow velocities. We are thus able to leverage both the engineer's time spent developing the conceptual model and the computing resources used in the numerical simulation. We have successfully employed this approach at the LLNL site, where we have run simulations ranging in size from just a few thousand spatial zones (on workstations) to more than eight million spatial zones (on the CRAY T3D), all using the same conceptual model

  7. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV. OF UTAH

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the largest losses due to natural disasters both worldwide and in the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments. Because two-dimensional models based on the shallow water equations have a significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computational time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated by completing computations only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting tool, engineering design tool, or planning tool. Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al

  8. ADGEN: ADjoint GENerator for computer models

    Energy Technology Data Exchange (ETDEWEB)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated at a cost of a factor of 6.8 times that of the reference model for each response of interest, compared with a factor of roughly 3000 for determining these derivatives by parameter perturbations for a single response. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs.
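
    The computational advantage that motivates ADGEN can be illustrated with a small linear example: for a model A u = p with a scalar response R = c·u, one adjoint solve yields the derivatives of R with respect to every parameter, instead of one perturbed model run per parameter. The NumPy sketch below is only an illustration of the adjoint idea; ADGEN itself instruments FORTRAN models via computer calculus.

      # Minimal sketch (not ADGEN itself): why the adjoint method is attractive when
      # a model has many parameters. For a linear model A u = p with a scalar
      # response R = c.u, all derivatives dR/dp_i follow from one adjoint solve
      # A^T lam = c, instead of one perturbed model run per parameter.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 300                                   # number of parameters / unknowns
      A = np.eye(n) + 0.01 * rng.standard_normal((n, n))
      c = rng.standard_normal(n)
      p = rng.standard_normal(n)

      # Adjoint approach: one extra linear solve gives the full gradient dR/dp.
      lam = np.linalg.solve(A.T, c)

      # Check one component against a finite-difference (perturbation) run.
      def response(params):
          return c @ np.linalg.solve(A, params)

      eps, i = 1e-6, 7
      dp = np.zeros(n); dp[i] = eps
      fd = (response(p + dp) - response(p)) / eps
      print("adjoint dR/dp_7 =", round(lam[i], 6), " finite difference =", round(fd, 6))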

  9. ADGEN: ADjoint GENerator for computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated at a cost of a factor of 6.8 times that of the reference model for each response of interest, compared with a factor of roughly 3000 for determining these derivatives by parameter perturbations for a single response. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs

  10. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services. PMID:28112020

  11. Establishing a Cloud Computing Success Model for Hospitals in Taiwan.

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  12. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Directory of Open Access Journals (Sweden)

    Jiunn-Woei Lian PhD

    2017-01-01

    Full Text Available The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  13. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services introduced in this paper will slowly replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses for large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  14. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    Science.gov (United States)

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software has first become a utilitarian interest, and now, it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains, also motivates sharing of modeling resources as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate

  15. A composite computational model of liver glucose homeostasis. I. Building the composite model.

    Science.gov (United States)

    Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A

    2012-04-07

    A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.

  16. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
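
    The article's worked examples are given in MATLAB and R; the Python sketch below shows the same embarrassingly parallel pattern with multiprocessing: independent simulation replications farmed out to worker processes and aggregated afterwards. The toy loss model and the replication count are invented for the illustration.

      # Sketch of the embarrassingly parallel pattern the tutorial describes:
      # independent simulation replications run in parallel worker processes,
      # then aggregated to summarize uncertainty.
      import random
      from multiprocessing import Pool

      def one_replication(seed):
          """One independent simulation run: toy annual loss as a sum of random shocks."""
          rng = random.Random(seed)
          return sum(rng.expovariate(1.0) for _ in range(1000))

      if __name__ == "__main__":
          seeds = range(10_000)
          with Pool() as pool:                       # one worker per available core
              losses = pool.map(one_replication, seeds)
          losses.sort()
          print("mean loss:", sum(losses) / len(losses))
          print("95th percentile loss:", losses[int(0.95 * len(losses))])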

  17. Attacker Modelling in Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Papini, Davide

    ... in with our everyday life. This future is visible to everyone nowadays: terms like smartphone, cloud, sensor, network etc. are widely known and used in our everyday life. But what about the security of such systems? Ubiquitous computing devices can be limited in terms of energy, computing power and memory...... attacker remain somehow undefined and still under extensive investigation. This Thesis explores the nature of the ubiquitous attacker with a focus on how she interacts with the physical world and it defines a model that captures the abilities of the attacker. Furthermore a quantitative implementation...

  18. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    Science.gov (United States)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  19. Methods for teaching geometric modelling and computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Rotkov, S.I.; Faitel'son, Yu. Ts.

    1992-05-01

    This paper considers methods for teaching the methods and algorithms of geometric modelling and computer graphics to programmers, designers and users of CAD and computer-aided research systems. There is a bibliography that can be used to prepare lectures and practical classes. 37 refs., 1 tab.

  20. Computational Modeling of Teaching and Learning through Application of Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Richard Lamb

    2015-09-01

    Full Text Available Within the mind, there are a myriad of ideas that make sense within the bounds of everyday experience, but are not reflective of how the world actually exists; this is particularly true in the domain of science. Classroom learning with teacher explanation is a bridge through which these naive understandings can be brought in line with scientific reality. The purpose of this paper is to examine how the application of a Multiobjective Evolutionary Algorithm (MOEA) can work in concert with an existing computational model to effectively model critical thinking in the science classroom. An evolutionary algorithm is an algorithm that iteratively optimizes machine-learning-based computational models. The research question is: does the application of an evolutionary algorithm provide a means to optimize the Student Task and Cognition Model (STAC-M), and does the optimized model sufficiently represent and predict teaching and learning outcomes in the science classroom? Within this computational study, the authors outline and simulate the effect of teaching on the ability of a “virtual” student to solve a Piagetian task. Using the Student Task and Cognition Model (STAC-M), a computational model of student cognitive processing in science class developed in 2013, the authors complete a computational experiment which examines the role of cognitive retraining on student learning. Comparison of the STAC-M and the STAC-M with inclusion of the Multiobjective Evolutionary Algorithm shows greater success in solving the Piagetian science tasks after cognitive retraining with the Multiobjective Evolutionary Algorithm. This illustrates the potential uses of cognitive and neuropsychological computational modeling in educational research. The authors also outline the limitations and assumptions of computational modeling.
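
    The iterative optimisation loop that an evolutionary algorithm performs can be sketched in a few lines; the toy below is a single-objective example with an invented fitness function (fitting a parameter vector to a target), not the authors' multiobjective algorithm or the STAC-M model.

      # Minimal sketch of an evolutionary algorithm: truncation selection plus
      # Gaussian mutation, iteratively improving a population of candidate
      # parameter vectors against an invented fitness function.
      import random

      TARGET = [0.2, -1.3, 0.8, 2.0]

      def fitness(candidate):
          """Higher is better: negative squared error to the (hypothetical) target."""
          return -sum((c - t) ** 2 for c, t in zip(candidate, TARGET))

      def evolve(pop_size=40, generations=200, sigma=0.1, seed=0):
          rng = random.Random(seed)
          population = [[rng.uniform(-3, 3) for _ in TARGET] for _ in range(pop_size)]
          for _ in range(generations):
              population.sort(key=fitness, reverse=True)
              parents = population[: pop_size // 4]            # truncation selection
              children = [[g + rng.gauss(0.0, sigma) for g in rng.choice(parents)]
                          for _ in range(pop_size - len(parents))]   # mutation
              population = parents + children
          return max(population, key=fitness)

      best = evolve()
      print("best candidate:", [round(g, 2) for g in best])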

  1. Finite difference computing with exponential decay models

    CERN Document Server

    Langtangen, Hans Petter

    2016-01-01

    This text provides a very simple, initial introduction to the complete scientific computing pipeline: models, discretization, algorithms, programming, verification, and visualization. The pedagogical strategy is to use one case study – an ordinary differential equation describing exponential decay processes – to illustrate fundamental concepts in mathematics and computer science. The book is easy to read and only requires a command of one-variable calculus and some very basic knowledge about computer programming. Contrary to similar texts on numerical methods and programming, this text has a much stronger focus on implementation and teaches testing and software engineering in particular.
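
    In the spirit of the book's single case study, the sketch below solves the exponential decay equation u' = -a*u with the theta-rule finite difference scheme and compares the discrete solution with the exact one. The particular parameter values are arbitrary and chosen only for the illustration.

      # Minimal sketch: the exponential decay ODE u' = -a*u, u(0) = I, solved with
      # the theta-rule finite difference scheme (theta = 0 forward Euler,
      # 0.5 Crank-Nicolson, 1 backward Euler), checked against the exact solution.
      import math

      def theta_rule(I, a, T, dt, theta=0.5):
          """Return mesh times and the discrete solution of u' = -a*u, u(0) = I."""
          n = int(round(T / dt))
          u, t = [I], [0.0]
          for k in range(n):
              u_next = u[-1] * (1 - (1 - theta) * a * dt) / (1 + theta * a * dt)
              u.append(u_next)
              t.append((k + 1) * dt)
          return t, u

      t, u = theta_rule(I=1.0, a=2.0, T=4.0, dt=0.25)
      exact = [math.exp(-2.0 * tk) for tk in t]
      max_err = max(abs(uk - ek) for uk, ek in zip(u, exact))
      print(f"max error with dt=0.25 (Crank-Nicolson): {max_err:.4f}")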

  2. Computational modelling of thermo-mechanical and transport properties of carbon nanotubes

    International Nuclear Information System (INIS)

    Rafii-Tabar, H.

    2004-01-01

    Over the recent years, numerical modelling and computer-based simulation of the properties of carbon nanotubes have become the focal points of research in computational nano-science and its associated fields of computational condensed matter physics and materials modelling. Modelling of the mechanical, thermal and transport properties of nanotubes via numerical simulations forms the central part of this research, concerned with the nano-scale mechanics and nano-scale thermodynamics of nanotubes, and nano-scale adsorption, storage and flow properties in nanotubes. A review of these properties, obtained via computational modelling studies, is presented here. We first introduce the physics of carbon nanotubes, and then present the computational simulation tools that are appropriate for conducting a modelling study at the nano-scales. These include the molecular dynamics (MD), the Monte Carlo (MC), and the ab initio MD simulation methods. A complete range of inter-atomic potentials, of two-body and many-body varieties, that underlie all the modelling studies considered in this review is also given. Mechanical models from continuum-based elasticity theory that have been extensively employed in computing the energetics of nanotubes, or to interpret the results from atomistic modelling, are presented and discussed. These include models based on the continuum theory of curved plates, shells, vibrating rods and bending beams. The validity of these continuum-based models has also been examined and the conditions under which they are applicable to nanotube modelling have been listed. Pertinent concepts from continuum theories of stress analysis are included, and the relevant methods for conducting the computation of the stress tensor, elastic constants and elastic moduli at the atomic level are also given. We then survey a comprehensive range of modelling studies concerned with the adsorption and storage of gases, and flow of fluids, in carbon nanotubes of various types. This

  3. Computational modelling of thermo-mechanical and transport properties of carbon nanotubes

    Energy Technology Data Exchange (ETDEWEB)

    Rafii-Tabar, H

    2004-02-01

    Over recent years, numerical modelling and computer-based simulation of the properties of carbon nanotubes have become the focal points of research in computational nano-science and its associated fields of computational condensed matter physics and materials modelling. Modelling of the mechanical, thermal and transport properties of nanotubes via numerical simulations forms the central part of this research, concerned with the nano-scale mechanics and nano-scale thermodynamics of nanotubes, and nano-scale adsorption, storage and flow properties in nanotubes. A review of these properties, obtained via computational modelling studies, is presented here. We first introduce the physics of carbon nanotubes, and then present the computational simulation tools that are appropriate for conducting a modelling study at the nano-scales. These include the molecular dynamics (MD), the Monte Carlo (MC), and the ab initio MD simulation methods. A complete range of inter-atomic potentials, of two-body and many-body varieties, that underlie all the modelling studies considered in this review is also given. Mechanical models from continuum-based elasticity theory that have been extensively employed in computing the energetics of nanotubes, or to interpret the results from atomistic modelling, are presented and discussed. These include models based on the continuum theory of curved plates, shells, vibrating rods and bending beams. The validity of these continuum-based models has also been examined and the conditions under which they are applicable to nanotube modelling have been listed. Pertinent concepts from continuum theories of stress analysis are included, and the relevant methods for conducting the computation of the stress tensor, elastic constants and elastic moduli at the atomic level are also given. We then survey a comprehensive range of modelling studies concerned with the adsorption and storage of gases, and flow of fluids, in carbon nanotubes of various types. This

  4. Ocean Modeling and Visualization on Massively Parallel Computer

    Science.gov (United States)

    Chao, Yi; Li, P. Peggy; Wang, Ping; Katz, Daniel S.; Cheng, Benny N.

    1997-01-01

    Climate modeling is one of the grand challenges of computational science, and ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change.

  5. Computer modeling of flow induced in-reactor vibrations

    International Nuclear Information System (INIS)

    Turula, P.; Mulcahy, T.M.

    1977-01-01

    An assessment of the reliability of finite element method computer models, as applied to the computation of flow induced vibration response of components used in nuclear reactors, is presented. The prototype under consideration was the Fast Flux Test Facility reactor being constructed for US-ERDA. Data were available from an extensive test program which used a scale model simulating the hydraulic and structural characteristics of the prototype components, subjected to scaled prototypic flow conditions as well as to laboratory shaker excitations. Corresponding analytical solutions of the component vibration problems were obtained using the NASTRAN computer code. Modal analyses and response analyses were performed. The effect of the surrounding fluid was accounted for. Several possible forcing function definitions were considered. Results indicate that modal computations agree well with experimental data. Response amplitude comparisons are good only under conditions favorable to a clear definition of the structural and hydraulic properties affecting the component motion. 20 refs

  6. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    Science.gov (United States)

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer assisted design/computer assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)

  7. Infrastructure and Server Platform Management Model Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Mulki Indana Zulfa

    2017-11-01

    Full Text Available Cloud computing is a new technology that is still growing very rapidly. This technology makes the Internet the main medium for the remote management of data and applications. Cloud computing allows users to run an application without having to think about the infrastructure and its platforms. Other technical aspects, such as memory, storage, and backup and restore, can be handled very easily. This research is intended to model the infrastructure and management of a computing platform in the computer network of the Faculty of Engineering, University of Jenderal Soedirman. The first stage in this research is a literature study to identify implementation models from previous research. The results are then combined with a new approach to the existing resources and implemented directly on the existing server network. The results show that the implementation of cloud computing technology is able to replace the existing platform network.

  8. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  9. The Mechanics of Embodiment: A Dialog on Embodiment and Computational Modeling

    Science.gov (United States)

    Pezzulo, Giovanni; Barsalou, Lawrence W.; Cangelosi, Angelo; Fischer, Martin H.; McRae, Ken; Spivey, Michael J.

    2011-01-01

    Embodied theories are increasingly challenging traditional views of cognition by arguing that conceptual representations that constitute our knowledge are grounded in sensory and motor experiences, and processed at this sensorimotor level, rather than being represented and processed abstractly in an amodal conceptual system. Given the established empirical foundation, and the relatively underspecified theories to date, many researchers are extremely interested in embodied cognition but are clamoring for more mechanistic implementations. What is needed at this stage is a push toward explicit computational models that implement sensorimotor grounding as intrinsic to cognitive processes. In this article, six authors from varying backgrounds and approaches address issues concerning the construction of embodied computational models, and illustrate what they view as the critical current and next steps toward mechanistic theories of embodiment. The first part has the form of a dialog between two fictional characters: Ernest, the “experimenter,” and Mary, the “computational modeler.” The dialog consists of an interactive sequence of questions, requests for clarification, challenges, and (tentative) answers, and touches the most important aspects of grounded theories that should inform computational modeling and, conversely, the impact that computational modeling could have on embodied theories. The second part of the article discusses the most important open challenges for embodied computational modeling. PMID:21713184

  10. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  11. Computer Modelling of Photochemical Smog Formation

    Science.gov (United States)

    Huebert, Barry J.

    1974-01-01

    Discusses a computer program that has been used in environmental chemistry courses as an example of modelling as a vehicle for teaching chemical dynamics, and as a demonstration of some of the factors which affect the production of smog. (Author/GS)

  12. Hybrid computer modelling in plasma physics

    International Nuclear Information System (INIS)

    Hromadka, J; Ibehej, T; Hrach, R

    2016-01-01

    Our contribution is devoted to the development of hybrid modelling techniques. We investigate sheath structures in the vicinity of solids immersed in low-temperature argon plasma at different pressures by means of particle and fluid computer models. We discuss the differences in results obtained by these methods and propose a way to improve the results of fluid models in the low-pressure regime. The Chapman-Enskog method can be employed to find appropriate closure relations for the fluid equations when the particle distribution function is not Maxwellian. We follow this approach to enhance the fluid model and to use it further in a hybrid plasma model. (paper)

  13. A Computational Model of Selection by Consequences

    Science.gov (United States)

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…

  14. Process-Based Development of Competence Models to Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  15. Security Management Model in Cloud Computing Environment

    OpenAIRE

    Ahmadpanah, Seyed Hossein

    2016-01-01

    In the cloud computing environment, the number of cloud virtual machines (VMs) keeps growing, and VM security and management face enormous challenges. In order to address security issues in the cloud computing virtualization environment, this paper presents a VM security management model based on efficient and dynamic deployment, state migration and scheduling; it studies the virtual machine security architecture and, based on AHP (Analytic Hierarchy Process), virtual machine de...

  16. Validation of Computer Models for Homeland Security Purposes

    International Nuclear Information System (INIS)

    Schweppe, John E.; Ely, James; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-01-01

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources

  17. Computer-animated model of accommodation and presbyopia.

    Science.gov (United States)

    Goldberg, Daniel B

    2015-02-01

    Purpose: To understand, demonstrate, and further research the mechanisms of accommodation and presbyopia. Setting: Private practice, Little Silver, New Jersey, USA. Design: Experimental study. The CAMA 2.0 computer-animated model of accommodation and presbyopia was produced in collaboration with an experienced medical animator using Autodesk Maya animation software and Adobe After Effects. The computer-animated model demonstrates the configuration and synchronous movements of all accommodative elements. A new classification of the zonular apparatus based on structure and function is proposed. There are 3 divisions of zonular fibers; that is, anterior, crossing, and posterior. The crossing zonular fibers form a scaffolding to support the lens; the anterior and posterior zonular fibers work reciprocally to achieve focused vision. The model demonstrates the important support function of Weiger ligament. Dynamic movement of the ora serrata demonstrates that the forces of ciliary muscle contraction store energy for disaccommodation in the elastic choroid. The flow of aqueous and vitreous provides strong evidence for our understanding of the hydrodynamic interactions during the accommodative cycle. The interaction may result from the elastic stretch in the choroid transmitted to the vitreous rather than from vitreous pressure. The model supports the concept that presbyopia results from loss of elasticity and increasing ocular rigidity in both the lenticular and extralenticular structures. The computer-animated model demonstrates the structures of accommodation moving in synchrony and might enhance understanding of the mechanisms of accommodation and presbyopia. Dr. Goldberg is a consultant to Acevision, Inc., and Bausch & Lomb.

  18. Computer-based modeling in exact sciences research - III ...

    African Journals Online (AJOL)

    Molecular modeling techniques have been of great applicability in the study of the biological sciences and other exact science fields like agriculture, mathematics, computer science and the like. In this write up, a list of computer programs for predicting, for instance, the structure of proteins has been provided. Discussions on ...

  19. Application of computer-aided multi-scale modelling framework – Aerosol case study

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Glarborg, Peter

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy and water. This trend is set to continue due to the substantial benefits computer-aided methods provide. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task involving numerous steps, expert skills and different modelling tools. This motivates the development of a computer-aided modelling framework that supports the user during model development, documentation, analysis, identification, application and re-use with the goal to increase the efficiency of the modelling

  20. Computer Aided Multi-Data Fusion Dismount Modeling

    Science.gov (United States)

    2012-03-22

    dependent on a particular environmental condition. They are costly, cumbersome, and involve dedicated software practices and particular knowledge to operate... allow manipulation of 2D matrices, like Microsoft Excel or LibreOffice. The second alternative is to modify an already created model (MEM). The model... software. Therefore, with the described computer-aided multi-data dismount model the researcher will be able to attach signatures to any desired

  1. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system. Where traditional research methodologies in the human cardiovascular system are challenging due to their invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  2. One-dimensional computational modeling on nuclear reactor problems

    International Nuclear Information System (INIS)

    Alves Filho, Hermes; Baptista, Josue Costa; Trindade, Luiz Fernando Santos; Heringer, Juan Diego dos Santos

    2013-01-01

    In this article, we present a computational model, which gives us a dynamic view of some applications of Nuclear Engineering, specifically the power distribution and the effective multiplication factor (keff) calculations. We work with one-dimensional problems of deterministic neutron transport theory, using the time-independent linearized Boltzmann equation in the discrete ordinates (SN) formulation with isotropic scattering, and have built a software simulator for modeling the computational problems used in typical calculations. The program used in the implementation of the simulator was Matlab, version 7.0. (author)

  3. Visual Attention Modeling for Stereoscopic Video: A Benchmark and Computational Model.

    Science.gov (United States)

    Fang, Yuming; Zhang, Chi; Li, Jing; Lei, Jianjun; Perreira Da Silva, Matthieu; Le Callet, Patrick

    2017-10-01

    In this paper, we investigate the visual attention modeling for stereoscopic video from the following two aspects. First, we build one large-scale eye tracking database as the benchmark of visual attention modeling for stereoscopic video. The database includes 47 video sequences and their corresponding eye fixation data. Second, we propose a novel computational model of visual attention for stereoscopic video based on Gestalt theory. In the proposed model, we extract the low-level features, including luminance, color, texture, and depth, from discrete cosine transform coefficients, which are used to calculate feature contrast for the spatial saliency computation. The temporal saliency is calculated by the motion contrast from the planar and depth motion features in the stereoscopic video sequences. The final saliency is estimated by fusing the spatial and temporal saliency with uncertainty weighting, which is estimated by the laws of proximity, continuity, and common fate in Gestalt theory. Experimental results show that the proposed method outperforms the state-of-the-art stereoscopic video saliency detection models on our built large-scale eye tracking database and one other database (DML-ITRACK-3D).

  4. Computer modeling of commercial refrigerated warehouse facilities

    International Nuclear Information System (INIS)

    Nicoulin, C.V.; Jacobs, P.C.; Tory, S.

    1997-01-01

    The use of computer models to simulate the energy performance of large commercial refrigeration systems typically found in food processing facilities is an area of engineering practice that has seen little development to date. Current techniques employed in predicting energy consumption by such systems have focused on temperature bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems. The HVAC and Refrigeration system performance models in these simulation tools model equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to these building types. The applicability of traditional building energy simulation tools to model refrigerated warehouse performance and analyze energy-saving options is limited. The paper will present the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-stage and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper will be on the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, will also be presented

  5. SPLAI: Computational Finite Element Model for Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ruzana Ishak

    2006-01-01

    Full Text Available A wireless sensor network refers to a group of sensors linked by a wireless medium to perform a distributed sensing task. The primary interest is their capability in monitoring the physical environment through the deployment of numerous tiny, intelligent, wireless networked sensor nodes. Our interest is a sensor network that includes a few specialized nodes, called processing elements, which have some limited computational capabilities. In this paper, we propose a model called SPLAI that allows the network to compute a finite element problem in which the processing elements are modeled as the nodes of the linear triangular approximation problem. Our model also considers the case of some sensor failures. A simulation model to visualize this network has been developed using C++ in the Windows environment.

  6. Computer models for fading channels with applications to digital transmission

    Science.gov (United States)

    Loo, Chun; Secord, Norman

    1991-11-01

    The authors describe computer models for Rayleigh, Rician, log-normal, and land-mobile-satellite fading channels. All computer models for the fading channels are based on the manipulation of a white Gaussian random process. This process is approximated by a sum of sinusoids with random phase angle. These models compare very well with analytical models in terms of their probability distribution of envelope and phase of the fading signal. For the land mobile satellite fading channel, results of level crossing rate and average fade duration are given. These results show that the computer models can provide a good coarse estimate of the time statistic of the faded signal. Also, for the land-mobile-satellite fading channel, the results show that a 3-pole Butterworth shaping filter should be used with the model. An example of the application of the land-mobile-satellite fading-channel model to predict the performance of a differential phase-shift keying signal is described.
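
    The record states that the fading models are built from a white Gaussian process approximated by a sum of sinusoids with random phase angles. As a minimal, generic sketch of that sum-of-sinusoids idea for a Rayleigh envelope (not the authors' exact formulation; the number of sinusoids, the normalisation and the parameter names are illustrative):

        import numpy as np

        def rayleigh_envelope(n_samples, fd, fs, n_sinusoids=16, seed=0):
            """Sum-of-sinusoids approximation of a Rayleigh fading envelope.
            fd: maximum Doppler frequency [Hz]; fs: sampling rate [Hz]."""
            rng = np.random.default_rng(seed)
            t = np.arange(n_samples) / fs
            i_part = np.zeros(n_samples)
            q_part = np.zeros(n_samples)
            for _ in range(n_sinusoids):
                theta = rng.uniform(0.0, 2.0 * np.pi)   # random angle of arrival
                phi_i = rng.uniform(0.0, 2.0 * np.pi)   # random phases
                phi_q = rng.uniform(0.0, 2.0 * np.pi)
                doppler = fd * np.cos(theta)
                i_part += np.cos(2.0 * np.pi * doppler * t + phi_i)
                q_part += np.sin(2.0 * np.pi * doppler * t + phi_q)
            gain = (i_part + 1j * q_part) / np.sqrt(n_sinusoids)  # unit mean power
            return np.abs(gain)                                   # Rayleigh-distributed envelope

        if __name__ == "__main__":
            env = rayleigh_envelope(10000, fd=100.0, fs=10000.0)
            print(env.mean(), (env ** 2).mean())  # mean power should be close to 1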

  7. Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views

    Science.gov (United States)

    Aksela, Maija; Lundell, Jan

    2008-01-01

    Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…

  8. Towards a Computational Model of the Self-attribution of Agency

    NARCIS (Netherlands)

    Hindriks, K.V.; Wiggers, P.; Jonker, C.M.; Haselager, W.F.G.; Mehrotra, K.G.; Mohan, C.K.; Oh, J.C.; Varshney, P.K.; Ali, M.

    2011-01-01

    In this paper, a first step towards a computational model of the self-attribution of agency is presented, based on Wegner’s theory of apparent mental causation. A model to compute a feeling of doing based on first-order Bayesian network theory is introduced that incorporates the main contributing

  9. Towards a computational model of the self-attribution of agency

    NARCIS (Netherlands)

    Hindriks, K.V.; Wiggers, P.; Jonker, C.M.; Haselager, W.F.G.; Olivier, P.; Kray, C.

    2007-01-01

    In this paper, a first step towards a computational model of the self-attribution of agency is presented, based on Wegner’s theory of apparent mental causation. A model to compute a feeling of doing based on first-order Bayesian network theory is introduced that incorporates the main contributing

  10. Ewe: a computer model for ultrasonic inspection

    International Nuclear Information System (INIS)

    Douglas, S.R.; Chaplin, K.R.

    1991-11-01

    The computer program EWE simulates the propagation of elastic waves in solids and liquids. It has been applied to ultrasonic testing to study the echoes generated by cracks and other types of defects. A discussion of the elastic wave equations is given, including the first-order formulation, shear and compression waves, surface waves and boundaries, numerical method of solution, models for cracks and slot defects, input wave generation, returning echo construction, and general computer issues

  11. Soft Tissue Biomechanical Modeling for Computer Assisted Surgery

    CERN Document Server

    2012-01-01

      This volume focuses on the biomechanical modeling of biological tissues in the context of Computer Assisted Surgery (CAS). More specifically, deformable soft tissues are addressed since they are the subject of the most recent developments in this field. The pioneering works on this CAS topic date from the 1980's, with applications in orthopaedics and biomechanical models of bones. More recently, however, biomechanical models of soft tissues have been proposed since most of the human body is made of soft organs that can be deformed by the surgical gesture. Such models are much more complicated to handle since the tissues can be subject to large deformations (non-linear geometrical framework) as well as complex stress/strain relationships (non-linear mechanical framework). Part 1 of the volume presents biomechanical models that have been developed in a CAS context and used during surgery. This is particularly new since most of the soft tissues models already proposed concern Computer Assisted Planning, with ...

  12. An approximate fractional Gaussian noise model with computational cost

    KAUST Repository

    Sørbye, Sigrunn H.

    2017-09-18

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood-based approach is $\mathcal{O}(n^{2})$, exploiting the Toeplitz structure of the covariance matrix. In most realistic cases, we do not observe the fGn process directly but only through indirect Gaussian observations, so the Toeplitz structure is easily lost and the computational cost increases to $\mathcal{O}(n^{3})$. This paper presents an approximate fGn model of $\mathcal{O}(n)$ computational cost, both with direct or indirect Gaussian observations, with or without conditioning. This is achieved by approximating fGn with a weighted sum of independent first-order autoregressive processes, fitting the parameters of the approximation to match the autocorrelation function of the fGn model. The resulting approximation is stationary despite being Markov and gives a remarkably accurate fit using only four components. The performance of the approximate fGn model is demonstrated in simulations and two real data examples.
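
    The key idea named in this record, approximating fGn by a weighted sum of independent AR(1) components whose combined autocorrelation matches the fGn autocorrelation, can be illustrated with a small least-squares fit. This sketch is not the authors' method (they fit the approximation within a different framework); the fgn_acf and fit_ar1_mixture names and the candidate AR(1) parameters are mine:

        import numpy as np

        def fgn_acf(n_lags, H):
            """Autocorrelation function of fractional Gaussian noise with Hurst index H."""
            k = np.arange(n_lags + 1, dtype=float)
            return 0.5 * (np.abs(k + 1) ** (2 * H) - 2 * k ** (2 * H) + np.abs(k - 1) ** (2 * H))

        def fit_ar1_mixture(H, phis=(0.30, 0.70, 0.90, 0.99), n_lags=200):
            """Least-squares weights so that a sum of AR(1) autocorrelations
            (phi**k for each candidate phi) matches the fGn autocorrelation."""
            target = fgn_acf(n_lags, H)
            basis = np.column_stack([np.asarray(phi) ** np.arange(n_lags + 1) for phi in phis])
            weights, *_ = np.linalg.lstsq(basis, target, rcond=None)
            max_err = np.max(np.abs(basis @ weights - target))
            return weights, max_err

        if __name__ == "__main__":
            w, err = fit_ar1_mixture(H=0.8)
            print(w, err)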

  13. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    Science.gov (United States)

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  14. Validation of a phytoremediation computer model

    Energy Technology Data Exchange (ETDEWEB)

    Corapcioglu, M Y; Sung, K; Rhykerd, R L; Munster, C; Drew, M [Texas A and M Univ., College Station, TX (United States)

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹

  15. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  16. Depth-Averaged Non-Hydrostatic Hydrodynamic Model Using a New Multithreading Parallel Computing Method

    Directory of Open Access Journals (Sweden)

    Ling Kang

    2017-03-01

    Full Text Available Compared to the hydrostatic hydrodynamic model, the non-hydrostatic hydrodynamic model can accurately simulate flows that feature vertical accelerations. However, the model’s low computational efficiency severely restricts its wider application. This paper proposes a non-hydrostatic hydrodynamic model based on a multithreading parallel computing method. The horizontal momentum equation is obtained by integrating the Navier–Stokes equations from the bottom to the free surface. The vertical momentum equation is approximated by the Keller-box scheme. A two-step method is used to solve the model equations. A parallel strategy based on block decomposition computation is utilized. The original computational domain is subdivided into two subdomains that are physically connected via a virtual boundary technique. Two sub-threads are created and tasked with the computation of the two subdomains. The producer–consumer model and the thread lock technique are used to achieve synchronous communication between sub-threads. The validity of the model was verified by solitary wave propagation experiments over a flat bottom and slope, followed by two sinusoidal wave propagation experiments over a submerged breakwater. The parallel computing method proposed here was found to effectively enhance computational efficiency and save 20%–40% computation time compared to serial computing. The parallel acceleration rate and acceleration efficiency are approximately 1.45 and 72%, respectively. The parallel computing method makes a contribution to the popularization of non-hydrostatic models.
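
    The abstract describes the parallel strategy (block decomposition into two subdomains, a virtual boundary, two sub-threads and lock-based synchronisation) but not the code itself. The following Python sketch only illustrates that synchronisation pattern on a toy one-dimensional smoothing update; it is not the authors' non-hydrostatic solver, and Python threads alone would not deliver the reported speedup:

        import threading
        import numpy as np

        def run_decomposed(n_cells=100, n_steps=50):
            """Two-subdomain block decomposition of a toy 1-D smoothing update.
            Each thread updates its own half of a shared array; barriers keep the
            reads and writes of the two halves in step, so the cells at the
            subdomain interface act as the 'virtual boundary'."""
            field = np.linspace(0.0, 1.0, n_cells)
            barrier = threading.Barrier(2)
            halves = [(1, n_cells // 2), (n_cells // 2, n_cells - 1)]

            def worker(lo, hi):
                for _ in range(n_steps):
                    barrier.wait()                                  # both halves see the same old state
                    new = 0.5 * (field[lo - 1:hi - 1] + field[lo + 1:hi + 1])
                    barrier.wait()                                  # both halves have finished reading
                    field[lo:hi] = new                              # write back the subdomain interior

            threads = [threading.Thread(target=worker, args=h) for h in halves]
            for t in threads:
                t.start()
            for t in threads:
                t.join()
            return field

        if __name__ == "__main__":
            print(run_decomposed()[:5])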

  17. Computer modeling of ORNL storage tank sludge mobilization and mixing

    International Nuclear Information System (INIS)

    Terrones, G.; Eyler, L.L.

    1993-09-01

    This report presents and analyzes the results of the computer modeling of mixing and mobilization of sludge in horizontal, cylindrical storage tanks using submerged liquid jets. The computer modeling uses the TEMPEST computational fluid dynamics computer program. The horizontal, cylindrical storage tank configuration is similar to the Melton Valley Storage Tanks (MVST) at Oak Ridge National Laboratory (ORNL). The MVST tank contents exhibit non-homogeneous, non-Newtonian rheology characteristics. The eventual goals of the simulations are to determine under what conditions sludge mobilization using submerged liquid jets is feasible in tanks of this configuration, and to estimate mixing times required to approach homogeneity of the contents of the tanks

  18. Computational implementation of the multi-mechanism deformation coupled fracture model for salt

    International Nuclear Information System (INIS)

    Koteras, J.R.; Munson, D.E.

    1996-01-01

    The Multi-Mechanism Deformation (M-D) model for creep in rock salt has been used in three-dimensional computations for the Waste Isolation Pilot Plant (WIPP), a potential waste repository. These computational studies are relied upon to make key predictions about long-term behavior of the repository. Recently, the M-D model was extended to include creep-induced damage. The extended model, the Multi-Mechanism Deformation Coupled Fracture (MDCF) model, is considerably more complicated than the M-D model and required a different technology from that of the M-D model for a computational implementation

  19. Computational modeling of plasma-flow switched foil implosions

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1985-01-01

    A "plasma-flow", or "commutator", switch has been proposed as a means of achieving high dI/dt in a radially imploding metallic foil plasma. In this concept, an axially moving foil provides the initial coaxial gun discharge path for the prime power source and provides an "integral" inductive storage of magnetic energy. As the axially moving foil reaches the end of the coaxial gun, a radially imploding load foil is switched into the circuit. The authors have begun two-dimensional computer modeling of the two-foil implosion system. They use a magnetohydrodynamic (MHD) model which includes tabulated state and transport properties of the metallic foil material. Moving numerical grids are used to achieve adequate resolution of the moving foils. A variety of radiation models are used to compute the radiation generated when the imploding load foil converges on axis. These computations are attempting to examine the interaction of the switching foil with the load foil. In particular, they examine the relationship between foil placement and implosion quality

  20. Comparing the performance of SIMD computers by running large air pollution models

    DEFF Research Database (Denmark)

    Brown, J.; Hansen, Per Christian; Wasniewski, J.

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on these computers. Using a realistic large-scale model, we gained detailed insight about the performance of the computers involved when used to solve large-scale scientific problems that involve several types of numerical computations. The computers used in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216

  1. Basic definitions for discrete modeling of computer worms epidemics

    Directory of Open Access Journals (Sweden)

    Pedro Guevara López

    2015-01-01

    Full Text Available Information technologies have evolved in such a way that communication between computers or hosts has become common, so much so that worldwide organizations (governments and corporations) depend on it; what could happen if these computers stopped working for a long time would be catastrophic. Unfortunately, networks are attacked by malware such as viruses and worms that could collapse the system. This has served as motivation for the formal study of computer worms and epidemics in order to develop strategies for prevention and protection; this is why in this paper, before analyzing epidemiological models, a set of formal definitions based on set theory and functions is proposed for describing 21 concepts used in the study of worms. These definitions provide a basis for future qualitative research on the behavior of computer worms, and for quantitative study of their epidemiological models.
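
    The record proposes formal definitions rather than a particular epidemic model, so the following is background only: a generic discrete-time susceptible-infected (SI) update of the kind commonly used as a starting point in worm-propagation studies. All parameter names and values are illustrative, not taken from the paper:

        def simulate_si(n_hosts=10000, beta=0.3, i0=1, steps=50):
            """Discrete-time susceptible-infected worm spread with homogeneous mixing.
            beta is the per-step infection rate; returns the infected count per step."""
            infected = float(i0)
            history = [infected]
            for _ in range(steps):
                susceptible = n_hosts - infected
                new_infections = beta * infected * susceptible / n_hosts
                infected = min(float(n_hosts), infected + new_infections)
                history.append(infected)
            return history

        if __name__ == "__main__":
            print([round(x) for x in simulate_si()[:10]])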

  2. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt- and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t had begun more than 20 years ago. Since that time significant advancements took place in the requirements for safety assessment as well as for computer hardware development. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures which requires basically the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/ whose development had begun in the early nineteen-nineties. However, UG had recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the

  3. The Effect of Computer Models as Formative Assessment on Student Understanding of the Nature of Models

    Science.gov (United States)

    Park, Mihwa; Liu, Xiufeng; Smith, Erica; Waight, Noemi

    2017-01-01

    This study reports the effect of computer models as formative assessment on high school students' understanding of the nature of models. Nine high school teachers integrated computer models and associated formative assessments into their yearlong high school chemistry course. A pre-test and post-test of students' understanding of the nature of…

  4. Undergraduate students’ challenges with computational modelling in physics

    Directory of Open Access Journals (Sweden)

    Simen A. Sørby

    2012-12-01

    Full Text Available In recent years, computational perspectives have become essential parts of several of the University of Oslo’s natural science studies. In this paper we discuss some main findings from a qualitative study of the computational perspectives’ impact on the students’ work with their first course in physics – mechanics – and their learning and meaning making of its contents. Discussions of the students’ learning of physics are based on sociocultural theory, which originates in Vygotsky and Bakhtin, and subsequent physics education research. Results imply that the greatest challenge for students when working with computational assignments is to combine knowledge from previously known, but separate, contexts. Integrating knowledge of informatics, numerical and analytical mathematics, and conceptual understanding of physics appears as a clear challenge for the students. We also observe a lack of awareness concerning the limitations of physical modelling. The students need help with identifying the appropriate knowledge system or “tool set” for the different tasks at hand; they need help to create a plan for their modelling and to become aware of its limits. In light of this, we propose that an instructive and dialogic text as the basis for the exercises, in which the emphasis is on specification, clarification and elaboration, would potentially be of great aid for students who are new to computational modelling.

  5. Handbook of nature-inspired and innovative computing integrating classical models with emerging technologies

    CERN Document Server

    2006-01-01

    As computing devices proliferate, demand increases for an understanding of emerging computing paradigms and models based on natural phenomena. This handbook explores the connection between nature-inspired and traditional computational paradigms. It presents computing paradigms and models based on natural phenomena.

  6. Computational comparison of quantum-mechanical models for multistep direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1993-01-01

    We have carried out a computational comparison of all existing quantum-mechanical models for multistep direct (MSD) reactions. The various MSD models, including the so-called Feshbach-Kerman-Koonin, Tamura-Udagawa-Lenske and Nishioka-Yoshida-Weidenmueller models, have been implemented in a single computer system. All model calculations thus use the same set of parameters and the same numerical techniques; only one adjustable parameter is employed. The computational results have been compared with experimental energy spectra and angular distributions for several nuclear reactions, namely, 90 Zr(p,p') at 80 MeV, 209 Bi(p,p') at 62 MeV, and 93 Nb(n,n') at 25.7 MeV. In addition, the results have been compared with the Kalbach systematics and with semiclassical exciton model calculations. All quantum MSD models provide a good fit to the experimental data. In addition, they reproduce the systematics very well and are clearly better than semiclassical model calculations. We furthermore show that the calculated predictions do not differ very strongly between the various quantum MSD models, leading to the conclusion that the simplest MSD model (the Feshbach-Kerman-Koonin model) is adequate for the analysis of experimental data

  7. An integrated introduction to computer graphics and geometric modeling

    CERN Document Server

    Goldman, Ronald

    2009-01-01

    … this book may be the first book on geometric modelling that also covers computer graphics. In addition, it may be the first book on computer graphics that integrates a thorough introduction to 'free-form' curves and surfaces and to the mathematical foundations for computer graphics. … the book is well suited for an undergraduate course. … The entire book is very well presented and obviously written by a distinguished and creative researcher and educator. It certainly is a textbook I would recommend. …-Computer-Aided Design, 42, 2010… Many books concentrate on computer programming and soon beco

  8. A security model for saas in cloud computing

    International Nuclear Information System (INIS)

    Abbas, R.; Farooq, A.

    2016-01-01

    Cloud computing is a type of computing that relies on sharing computing resources rather than having local servers or personal devices handle applications. It has many service models, such as Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS). In the SaaS model, service providers install and activate the applications in the cloud, and cloud customers access the software from the cloud. So, the user does not need to purchase and install particular software on his/her machine. While using the SaaS model, users face multiple security issues and problems, such as data security, data breaches, network security, authentication and authorization, data integrity, availability, web application security and backup. Many researchers have worked hard to minimize these security problems, and a large amount of work has been done to resolve them, but many issues persist and still need to be overcome. In this research work, we have developed a security model that improves the security of data according to the desires of the end user. The proposed model provides different data security options and can help increase data security, allowing the trade-off between functionality and security to be optimized for private and public data. (author)

  9. Computer modeling of liquid crystals

    International Nuclear Information System (INIS)

    Al-Barwani, M.S.

    1999-01-01

    In this thesis, we investigate several aspects of the behaviour of liquid crystal molecules near interfaces using computer simulation. We briefly discuss experimental, theoretical and computer simulation studies of some of the liquid crystal interfaces. We then describe three essentially independent research topics. The first of these concerns extensive simulations of a liquid crystal formed by long flexible molecules. We examined the bulk behaviour of the model and its structure. Studies of a film of smectic liquid crystal surrounded by vapour were also carried out. Extensive simulations were also done for a long-molecule/short-molecule mixture, studies were then carried out to investigate the liquid-vapour interface of the mixture. Next, we report the results of large scale simulations of soft-spherocylinders of two different lengths. We examined the bulk coexistence of the nematic and isotropic phases of the model. Once the bulk coexistence behaviour was known, properties of the nematic-isotropic interface were investigated. This was done by fitting order parameter and density profiles to appropriate mathematical functions and calculating the biaxial order parameter. We briefly discuss the ordering at the interfaces and make attempts to calculate the surface tension. Finally, in our third project, we study the effects of different surface topographies on creating bistable nematic liquid crystal devices. This was carried out using a model based on the discretisation of the free energy on a lattice. We use simulation to find the lowest energy states and investigate if they are degenerate in energy. We also test our model by studying the Frederiks transition and comparing with analytical and other simulation results. (author)

  10. 1st European-Middle Asian Conference on Computer Modelling 2015

    CERN Document Server

    Kolosov, Dmitrii; Snášel, Václav; Karakeyev, Taalaybek; Abraham, Ajith

    2016-01-01

    This volume of Advances in Intelligent Systems and Computing contains papers presented at the 1st European-Middle Asian Conference on Computer Modelling, EMACOM 2015. This international conference was conceived as a brand new scientific and social event of mutual collaboration between the VSB - Technical University of Ostrava (Ostrava, Czech Republic) and the Kyrgyz National University named after J. Balasagyn (Bishkek, Kyrgyz Republic). The aim of EMACOM 2015 was to present the latest development in the field of computer-aided modelling as an essential aspect of research and development of innovative systems and their applications. The conference showed that together with simulations, various modeling techniques, enabled and encouraged by the rapid development of high-performance computing platforms, are crucial for cost-efficient design, verification, and prototyping of solutions in many diverse industrial fields spanning the whole range from manufacturing, mining, machinery, and automotive industries to in...

  11. Visual and Computational Modelling of Minority Games

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2017-02-01

    Full Text Available The paper analyses the Minority Game and focuses on the analysis and computational modelling of several variants (variable payoff, coalition-based and ternary voting) of the Minority Game using the UAREI (User-Action-Rule-Entities-Interface) model. UAREI is a model for the formal specification of software gamification, and the UAREI visual modelling language is a language used for the graphical representation of game mechanics. The UAREI model also provides an embedded executable modelling framework to evaluate how the rules of the game will work for the players in practice. We demonstrate the flexibility of the UAREI model for modelling different variants of Minority Game rules for game design.
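
    For orientation, the basic (binary, non-coalition) Minority Game that the record's variants build on can be simulated in a few lines. This sketch is not part of the UAREI specification discussed in the paper; the agent count, memory length and scoring rule are illustrative defaults:

        import random

        def minority_game(n_agents=101, n_rounds=200, memory=3, n_strategies=2, seed=1):
            """Basic Minority Game: an odd number of agents repeatedly choose 0 or 1,
            and agents on the minority side win the round. Each agent holds a few
            random lookup-table strategies keyed by the recent history of winning
            sides and always plays its best-scoring one."""
            rng = random.Random(seed)
            n_keys = 2 ** memory
            strategies = [[[rng.randint(0, 1) for _ in range(n_keys)]
                           for _ in range(n_strategies)] for _ in range(n_agents)]
            scores = [[0] * n_strategies for _ in range(n_agents)]
            history = rng.randint(0, n_keys - 1)
            minority_sizes = []
            for _ in range(n_rounds):
                actions = []
                for a in range(n_agents):
                    best = max(range(n_strategies), key=lambda s: scores[a][s])
                    actions.append(strategies[a][best][history])
                ones = sum(actions)
                minority = 1 if ones < n_agents - ones else 0
                minority_sizes.append(min(ones, n_agents - ones))
                for a in range(n_agents):       # reward strategies that picked the minority side
                    for s in range(n_strategies):
                        if strategies[a][s][history] == minority:
                            scores[a][s] += 1
                history = ((history << 1) | minority) % n_keys   # keep the last `memory` outcomes
            return minority_sizes

        if __name__ == "__main__":
            sizes = minority_game()
            print(sum(sizes) / len(sizes))   # average size of the winning (minority) side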

  12. Computational quench model applicable to the SMES/CICC

    Science.gov (United States)

    Luongo, Cesar A.; Chang, Chih-Lien; Partain, Kenneth D.

    1994-07-01

    A computational quench model accounting for the hydraulic peculiarities of the 200 kA SMES cable-in-conduit conductor has been developed. The model is presented and used to simulate the quench on the SMES-ETM. Conclusions are drawn concerning quench detection and protection. A plan for quench model validation is presented.

  13. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    International Nuclear Information System (INIS)

    Foster, C.

    2001-01-01

    The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With widespread availability of computers and cost effective simulation software it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system allowing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computer and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be worked early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The Lathe process operation is indicative of

  14. Computer model for harmonic ultrasound imaging.

    Science.gov (United States)

    Li, Y; Zagzebski, J A

    2000-01-01

    Harmonic ultrasound imaging has received great attention from ultrasound scanner manufacturers and researchers. In this paper, we present a computer model that can generate realistic harmonic images. In this model, the incident ultrasound is modeled after the "KZK" equation, and the echo signal is modeled using linear propagation theory because the echo signal is much weaker than the incident pulse. Both time domain and frequency domain numerical solutions to the "KZK" equation were studied. Realistic harmonic images of spherical lesion phantoms were generated for scans by a circular transducer. This model can be a very useful tool for studying the harmonic buildup and dissipation processes in a nonlinear medium, and it can be used to investigate a wide variety of topics related to B-mode harmonic imaging.

  15. A novel low-parameter computational model to aid in-silico glycoengineering

    DEFF Research Database (Denmark)

    Spahn, Philipp N.; Hansen, Anders Holmgaard; Hansen, Henning Gram

    2015-01-01

    benefit from computational models that would better meet the requirements for industrial utilization. Here, we introduce a novel approach combining constraints-based and stochastic techniques to derive a computational model that can predict the effects of gene knockouts on protein glycoprofiles while...... it does not follow any direct equivalent of a genetic code. Instead, its complex biogenesis in the Golgi apparatus (Figure 1A) integrates a variety of influencing factors most of which are only incompletely understood. Various attempts have been undertaken so far to computationally model the process...

  16. English Writing Teaching Model Dependent on Computer Network Corpus Drive Model

    Directory of Open Access Journals (Sweden)

    Shi Lei

    2018-03-01

    Full Text Available At present, mainstream lexicalized English writing methods take only the corpus dependence between words into consideration, without introducing corpus collocation and other issues. “Drive” is a relatively essential feature of words: once the drive structure of a word is determined, it is relatively clear what kinds of words it collocates with, and hence the structure of the sentence can be derived relatively directly. In this paper, an English writing teaching model that relies on a computer network corpus drive model is put forward. In this model, a rich English corpus is introduced into the decomposition of the rules and the calculation of the probabilities, which includes not only the corpus dependence information but also the drive structure and other corpus collocation information. The improved computer network corpus drive model is used to carry out the English writing teaching experiment. The experimental results show that the precision and the recall rate are 88.76% and 87.43%, respectively. The F value of the comprehensive index is improved by 6.65% compared with the Collins headword-driven English writing model.
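
    The abstract reports precision and recall but does not state how the combined F value is computed. Assuming the standard balanced F-measure (an assumption on my part, not stated in the record), a quick consistency check of the reported figures looks like this:

        def f_measure(precision, recall, beta=1.0):
            """Standard F-beta measure; beta = 1 weighs precision and recall equally."""
            b2 = beta ** 2
            return (1 + b2) * precision * recall / (b2 * precision + recall)

        if __name__ == "__main__":
            # Precision and recall reported in the abstract, as fractions.
            print(round(f_measure(0.8876, 0.8743), 4))  # approximately 0.8809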

  17. Computational fluid-dynamic model of laser-induced breakdown in air

    International Nuclear Information System (INIS)

    Dors, Ivan G.; Parigger, Christian G.

    2003-01-01

    Temperature and pressure profiles are computed by the use of a two-dimensional, axially symmetric, time-accurate computational fluid-dynamic model for nominal 10-ns optical breakdown laser pulses. The computational model includes a kinetics mechanism that implements plasma equilibrium kinetics in ionized regions and nonequilibrium, multistep, finite-rate reactions in nonionized regions. Fluid-physics phenomena following laser-induced breakdown are recorded with high-speed shadowgraph techniques. The predicted fluid phenomena are shown by direct comparison with experimental records to agree with the flow patterns that are characteristic of laser spark decay

  18. ANS main control complex three-dimensional computer model development

    International Nuclear Information System (INIS)

    Cleaves, J.E.; Fletcher, W.M.

    1993-01-01

    A three-dimensional (3-D) computer model of the Advanced Neutron Source (ANS) main control complex is being developed. The main control complex includes the main control room, the technical support center, the materials irradiation control room, computer equipment rooms, communications equipment rooms, cable-spreading rooms, and some support offices and breakroom facilities. The model will be used to provide facility designers and operations personnel with capabilities for fit-up/interference analysis, visual "walk-throughs" for optimizing maintainability, and human factors and operability analyses. It will be used to determine performance design characteristics, to generate construction drawings, and to integrate control room layout, equipment mounting, grounding equipment, electrical cabling, and utility services into ANS building designs. This paper describes the development of the initial phase of the 3-D computer model for the ANS main control complex and plans for its development and use.

  19. An ODP computational model of a cooperative binding object

    Science.gov (United States)

    Logé, Christophe; Najm, Elie; Chen, Ken

    1997-12-01

    The next generation of systems will have to manage several geographically distributed users simultaneously. These systems belong to the class of computer-supported cooperative work systems (CSCW). The development of such complex systems requires rigorous development methods and flexible open architectures. Open distributed processing (ODP) is a standardization effort that aims at providing such architectures. ODP features appropriate abstraction levels and a clear articulation between requirements, programming and infrastructure support. ODP advocates the use of formal methods for the specification of systems and components. The computational model, an object-based model, one of the abstraction levels identified within ODP, plays a central role in the global architecture. In this model, basic objects can be composed with communication and distribution abstractions (called binding objects) to form a computational specification of distributed systems, or applications. Computational specifications can then be mapped (in a mechanism akin to compilation) onto an engineering solution. We use an ODP-inspired method to computationally specify a cooperative system. We start from a general purpose component that we progressively refine into a collection of basic and binding objects. We focus on two issues of a co-authoring application, namely, dynamic reconfiguration and multiview synchronization. We discuss solutions for these issues and formalize them using the MT-LOTOS specification language that is currently studied in the ISO standardization formal description techniques group.

  20. Phantoms and computational models in therapy, diagnosis and protection

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    The development of realistic body phantoms and computational models is strongly dependent on the availability of comprehensive human anatomical data. This information is often missing, incomplete or not easily available. Therefore, emphasis is given in the Report to organ and body masses and geometries. The influence of age, sex and ethnic origins in human anatomy is considered. Suggestions are given on how suitable anatomical data can be either extracted from published information or obtained from measurements on the local population. Existing types of phantoms and computational models used with photons, electrons, protons and neutrons are reviewed in this Report. Specifications of those considered important to the maintenance and development of reliable radiation dosimetry and measurement are given. The information provided includes a description of the phantom or model, together with diagrams or photographs and physical dimensions. The tissues within body sections are identified and the tissue substitutes used or recommended are listed. The uses of the phantom or model in radiation dosimetry and measurement are outlined. The Report deals predominantly with phantom and computational models representing the human anatomy, with a short Section devoted to animal phantoms in radiobiology

  1. A Novel Computer Virus Propagation Model under Security Classification

    Directory of Open Access Journals (Sweden)

    Qingyi Zhu

    2017-01-01

    Full Text Available In reality, some computers have a specific security classification. For the sake of safety and cost, the security level of computers will be upgraded as threats in the network increase. Here we assume that there exists a threshold value which determines when countermeasures should be taken to level up the security of a fraction of computers with a low security level. And in some specific realistic environments the propagation network can be regarded as fully interconnected. Inspired by these facts, this paper presents a novel computer virus dynamics model considering the impact brought by security classification in a fully interconnected network. By using the theory of dynamic stability, the existence of equilibria and the stability conditions are analysed and proved. The above optimal threshold value is given analytically. Then, some numerical experiments are made to justify the model. Besides, some discussions and antivirus measures are given.
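    A minimal compartmental sketch of the kind of dynamics described above: low- and high-security susceptible computers and an infected class, with upgrades triggered once infections cross a threshold. The equations and coefficients are illustrative assumptions, not the model analysed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical compartmental sketch of virus spread under security classification:
# S_L = susceptible low-security, S_H = susceptible high-security, I = infected.
# Low-security machines are upgraded to high security only while infections
# exceed a threshold theta (the "countermeasure" trigger described above).
beta_L, beta_H = 0.5, 0.05   # infection rates for low/high security (assumed)
gamma = 0.2                  # cure/reinstall rate (assumed)
u = 0.3                      # upgrade rate once the threshold is crossed (assumed)
theta = 0.05                 # infection threshold triggering upgrades (assumed)

def rhs(t, y):
    S_L, S_H, I = y
    upgrade = u * S_L if I > theta else 0.0
    # Cured machines return half to each susceptible class (modelling assumption),
    # so the three fractions always sum to one.
    dS_L = -beta_L * S_L * I - upgrade + 0.5 * gamma * I
    dS_H = -beta_H * S_H * I + upgrade + 0.5 * gamma * I
    dI = (beta_L * S_L + beta_H * S_H) * I - gamma * I
    return [dS_L, dS_H, dI]

sol = solve_ivp(rhs, (0.0, 100.0), [0.7, 0.29, 0.01], max_step=0.1)
print("final infected fraction:", sol.y[2, -1])
```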

  2. Computer-controlled mechanical lung model for application in pulmonary function studies

    NARCIS (Netherlands)

    A.F.M. Verbraak (Anton); J.E.W. Beneken; J.M. Bogaard (Jan); A. Versprille (Adrian)

    1995-01-01

    textabstractA computer controlled mechanical lung model has been developed for testing lung function equipment, validation of computer programs and simulation of impaired pulmonary mechanics. The construction, function and some applications are described. The physical model is constructed from two

  3. The Next Generation ARC Middleware and ATLAS Computing Model

    International Nuclear Information System (INIS)

    Filipčič, Andrej; Cameron, David; Konstantinov, Aleksandr; Karpenko, Dmytro; Smirnova, Oxana

    2012-01-01

    The distributed NDGF Tier-1 and associated NorduGrid clusters are well integrated into the ATLAS computing environment but follow a slightly different paradigm than other ATLAS resources. The current paradigm does not divide the sites as in the commonly used hierarchical model, but rather treats them as a single storage endpoint and a pool of distributed computing nodes. The next generation ARC middleware with its several new technologies provides new possibilities in development of the ATLAS computing model, such as pilot jobs with pre-cached input files, automatic job migration between the sites, integration of remote sites without connected storage elements, and automatic brokering for jobs with non-standard resource requirements. ARC's data transfer model provides an automatic way for the computing sites to participate in ATLAS’ global task management system without requiring centralised brokering or data transfer services. The powerful API combined with Python and Java bindings can easily be used to build new services for job control and data transfer. Integration of the ARC core into the EMI middleware provides a natural way to implement the new services using the ARC components

  4. A Computer Model for Analyzing Volatile Removal Assembly

    Science.gov (United States)

    Guo, Boyun

    2010-01-01

    A computer model simulates reactive gas/liquid two-phase flow processes in porous media. A typical process is the oxygen/wastewater flow in the Volatile Removal Assembly (VRA) in the Closed Environment Life Support System (CELSS) installed in the International Space Station (ISS). The volatile organics in the wastewater are combusted by oxygen gas to form clean water and carbon dioxide, which dissolves in the water phase. The model predicts the oxygen gas concentration profile in the reactor, which is an indicator of reactor performance. In this innovation, a mathematical model is included in the computer model for calculating the mass transfer from the gas phase to the liquid phase. The amount of mass transfer depends on several factors, including gas-phase concentration, distribution, and reaction rate. For a given reactor dimension, these factors depend on pressure and temperature in the reactor and composition and flow rate of the influent.
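    The following plug-flow sketch illustrates the kind of gas-to-liquid mass-transfer balance described above, with a first-order sink for dissolved oxygen; all coefficients are hypothetical placeholders rather than VRA design values.

```python
import numpy as np

# Minimal plug-flow sketch of gas-phase O2 depletion along a reactor of length L,
# with gas-to-liquid transfer driven by the concentration difference (two-film idea)
# and first-order consumption of dissolved O2 by the oxidation reaction.
# All coefficients below are illustrative placeholders, not VRA design values.
L, n = 1.0, 500                  # reactor length [m], grid points
kLa = 0.5                        # volumetric mass-transfer coefficient [1/s]
H = 30.0                         # dimensionless partition coefficient (Cg/Cl at equilibrium)
k_rxn = 0.2                      # first-order reaction rate of dissolved O2 [1/s]
u_g, u_l = 0.05, 0.01            # superficial gas / liquid velocities [m/s]

z = np.linspace(0.0, L, n)
dz = z[1] - z[0]
Cg = np.empty(n); Cl = np.empty(n)
Cg[0], Cl[0] = 8.0, 0.0          # inlet concentrations [mol/m^3]

for i in range(n - 1):
    transfer = kLa * (Cg[i] / H - Cl[i])          # gas -> liquid flux term
    Cg[i + 1] = Cg[i] + dz * (-transfer / u_g)
    Cl[i + 1] = Cl[i] + dz * (transfer - k_rxn * Cl[i]) / u_l

print("outlet gas-phase O2:", Cg[-1])
```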

  5. A stylized computational model of the head for the reference Japanese male

    International Nuclear Information System (INIS)

    Yamauchi, M.; Ishikawa, M.; Hoshi, M.

    2005-01-01

    Computational models of human anatomy, along with Monte Carlo radiation transport simulations, have been used by Snyder et al. [MIRD Pamphlet No. 5, revised (The Society of Nuclear Medicine, New York, 1978)], Cristy and Eckerman [ORNL/TM-8381/VI, Oak Ridge National Laboratory, Oak Ridge, TN (1987)] and Zubal et al. [Med. Phys. 21, 299-302 (1994)] to estimate internal organ doses from internal and external radiation sources. These were created using physiological data from Caucasoid subjects but not from other races. There is a need for research to determine whether the obvious differences from the Caucasoid anatomy make these models unsuitable for estimating the absorbed dose in other races such as the Mongoloid. We used the cranial region of the adult Japanese male to represent the Mongoloid race. This region contains organs that are highly sensitive to radiation. The cranial region of a physical phantom produced by KYOTO KAGAKU Co., LTD. using numerical data from a Japanese Reference Man [Tanaka, Nippon Acta. Radiol. 48, 509-513 (1988)] was used to supply the data for the geometry of a stylized computational model. Our computational model was constructed with equations rather than voxel-based, in order to deal with as small a number of parameters as possible in the computer simulation experiment. The accuracy of our computational model was checked by comparing simulated experimental results obtained with MCNP4C with actual doses measured with thermoluminescence dosimeters (TLDs) inside the physical phantom from which our computational model was constructed. The TLDs, whose margin of error is less than ±10%, were arranged at six positions. Co-60 was used as the radiation source. The irradiated dose was 2 Gy in terms of air kerma. In the computer simulation experiments, we used our computational model and Cristy's computational model, whose component data are those of the tissue substitute materials and of the human body as published in ICRU Report 46. The

  6. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  7. Universal quantum computation by scattering in the Fermi–Hubbard model

    International Nuclear Information System (INIS)

    Bao, Ning; Hayden, Patrick; Salton, Grant; Thomas, Nathaniel

    2015-01-01

    The Hubbard model may be the simplest model of particles interacting on a lattice, but simulation of its dynamics remains beyond the reach of current numerical methods. In this article, we show that general quantum computations can be encoded into the physics of wave packets propagating through a planar graph, with scattering interactions governed by the fermionic Hubbard model. Therefore, simulating the model on planar graphs is as hard as simulating quantum computation. We give two different arguments, demonstrating that the simulation is difficult both for wave packets prepared as excitations of the fermionic vacuum, and for hole wave packets at filling fraction one-half in the limit of strong coupling. In the latter case, which is described by the t-J model, there is only reflection and no transmission in the scattering events, as would be the case for classical hard spheres. In that sense, the construction provides a quantum mechanical analog of the Fredkin–Toffoli billiard ball computer. (paper)

  8. Two-parametric model of electron beam in computational dosimetry for radiation processing

    International Nuclear Information System (INIS)

    Lazurik, V.M.; Lazurik, V.T.; Popov, G.; Zimek, Z.

    2016-01-01

    Computer simulation of the irradiation of various materials with an electron beam (EB) can be applied to correct and control the performance of radiation processing installations. Electron beam energy measurement methods are described in the international standards, and the measurement results can be extended by computational dosimetry. The authors have developed a computational method for determining the EB energy based on two-parametric fitting of a semi-empirical model for the depth-dose distribution produced by a mono-energetic electron beam. Analysis of a number of experiments shows that the described method can effectively account for random displacements arising from the use of an aluminum wedge with a continuous strip of dosimetric film, and can minimize the uncertainty of the electron energy evaluated from the experimental data. The two-parametric fitting method is proposed for determining the electron beam model parameters: E_0, the energy of a mono-energetic and mono-directional electron source, and X_0, the thickness of an aluminum layer located in front of the irradiated object. This provides baseline data on the characteristics of the electron beam, which can later be applied for computer modeling of the irradiation process. Model parameters defined in the international standards (such as E_p, the most probable energy, and R_p, the practical range) can be linked with the characteristics of the two-parametric model (E_0, X_0), which allows the electron irradiation process to be simulated. The data obtained from the semi-empirical model were checked against a set of experimental results. The proposed two-parametric model for electron beam energy evaluation and the estimation of accuracy for computational dosimetry methods based on the developed model are discussed. - Highlights: • Experimental and computational methods of electron energy evaluation. • Development
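    A minimal sketch of the two-parametric fitting idea: least-squares estimation of (E_0, X_0) from a measured depth-dose curve. The Gaussian-like dose shape below is only a stand-in for the authors' semi-empirical model, and all numbers are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of two-parametric fitting of a depth-dose curve D(x; E0, X0).
# The Gaussian-like shape used here is only a stand-in for the authors'
# semi-empirical model; the point is the two-parameter least-squares fit.
def depth_dose(x, E0, X0):
    x_eff = x + X0                      # X0: aluminum layer in front of the object
    r = 0.5 * E0                        # crude range scaling with energy (placeholder)
    return np.exp(-((x_eff - 0.4 * r) ** 2) / (0.15 * r ** 2 + 1e-12))

# Synthetic "measured" film readings along depth x (arbitrary units)
x_data = np.linspace(0.0, 5.0, 40)
true = depth_dose(x_data, 10.0, 0.3)
rng = np.random.default_rng(0)
y_data = true + 0.02 * rng.standard_normal(x_data.size)

(E0_fit, X0_fit), _ = curve_fit(depth_dose, x_data, y_data, p0=(8.0, 0.1))
print(f"fitted E0 = {E0_fit:.2f}, X0 = {X0_fit:.2f}")
```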

  9. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box, " you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white.This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  10. Dynamical Trust and Reputation Computation Model for B2C E-Commerce

    Directory of Open Access Journals (Sweden)

    Bo Tian

    2015-10-01

    Full Text Available Trust is one of the most important factors that influence the successful application of network service environments, such as e-commerce, wireless sensor networks, and online social networks. Computational models of trust and reputation have received special attention in both the computing and service science communities in recent years. In this paper, a dynamical computation model of reputation for B2C e-commerce is proposed. Firstly, concepts associated with trust and reputation are introduced, and the mathematical formula of trust for B2C e-commerce is given. Then a dynamical computation model of reputation is further proposed based on the concept of trust and the relationship between trust and reputation. In the proposed model, typical varying processes of reputation in B2C e-commerce are discussed. Furthermore, the iterative trust and reputation computation models are formulated via a set of difference equations based on a closed-loop feedback mechanism. Finally, a group of numerical simulation experiments are performed to illustrate the proposed model of trust and reputation. Experimental results show that the proposed model is effective in simulating the dynamical processes of trust and reputation for B2C e-commerce.
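    A hypothetical difference-equation sketch of a closed-loop reputation update, to illustrate the iterative style of model described above; the update rule and coefficients are assumptions, not the paper's equations.

```python
import numpy as np

# Hypothetical sketch of an iterative, closed-loop reputation update for a B2C seller.
# R[t] is public reputation, T is the trust signal computed from the latest rating;
# the feedback term pulls reputation toward current trust. The update rule and the
# coefficients are illustrative, not the paper's exact difference equations.
alpha = 0.2          # weight of new evidence (assumed)
decay = 0.01         # slow decay of reputation without fresh transactions (assumed)

rng = np.random.default_rng(1)
ratings = rng.uniform(0.6, 1.0, size=200)   # simulated per-transaction ratings in [0, 1]

R = np.empty(ratings.size + 1)
R[0] = 0.5                                   # neutral initial reputation
for t, rating in enumerate(ratings):
    T = 0.7 * rating + 0.3 * R[t]            # trust blends the new rating with prior reputation
    R[t + 1] = (1 - alpha - decay) * R[t] + alpha * T

print("final reputation:", round(R[-1], 3))
```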

  11. Getting computer models to communicate; Faire communiquer les modeles numeriques

    Energy Technology Data Exchange (ETDEWEB)

    Caremoli, Ch. [Electricite de France (EDF), 75 - Paris (France). Dept. Mecanique et Modeles Numeriques; Erhard, P. [Electricite de France (EDF), 75 - Paris (France). Dept. Physique des Reacteurs

    1999-07-01

    Today's computers have the processing power to deliver detailed and global simulations of complex industrial processes such as the operation of a nuclear reactor core. So should we be producing new, global numerical models to take full advantage of this new-found power? If so, it would be a long-term job. There is, however, another solution: to couple the existing validated numerical models together so that they work as one. (authors)

  12. Computer model for economic study of unbleached kraft paperboard production

    Science.gov (United States)

    Peter J. Ince

    1984-01-01

    Unbleached kraft paperboard is produced from wood fiber in an industrial papermaking process. A highly specific and detailed model of the process is presented. The model is also presented as a working computer program. A user of the computer program will provide data on physical parameters of the process and on prices of material inputs and outputs. The program is then...

  13. A Situative Space Model for Mobile Mixed-Reality Computing

    DEFF Research Database (Denmark)

    Pederson, Thomas; Janlert, Lars-Erik; Surie, Dipak

    2011-01-01

    This article proposes a situative space model that links the physical and virtual realms and sets the stage for complex human-computer interaction defined by what a human agent can see, hear, and touch, at any given point in time.

  14. Computer-Aided Multiscale Modelling for Chemical Process Engineering

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul

    2007-01-01

    Chemical processes are generally modeled through monoscale approaches, which, while not always adequate, serve a useful role in product-process design. In this case, use of a multi-dimensional and multi-scale model-based approach is important in product-process development. A computer-aided framework...

  15. A Computational Model of Fraction Arithmetic

    Science.gov (United States)

    Braithwaite, David W.; Pyke, Aryn A.; Siegler, Robert S.

    2017-01-01

    Many children fail to master fraction arithmetic even after years of instruction, a failure that hinders their learning of more advanced mathematics as well as their occupational success. To test hypotheses about why children have so many difficulties in this area, we created a computational model of fraction arithmetic learning and presented it…

  16. Advanced computational modelling for drying processes – A review

    International Nuclear Information System (INIS)

    Defraeye, Thijs

    2014-01-01

    Highlights: • Understanding the product dehydration process is a key aspect in drying technology. • Advanced modelling thereof plays an increasingly important role for developing next-generation drying technology. • Dehydration modelling should be more energy-oriented. • An integrated “nexus” modelling approach is needed to produce more energy-smart products. • Multi-objective process optimisation requires development of more complete multiphysics models. - Abstract: Drying is one of the most complex and energy-consuming chemical unit operations. R and D efforts in drying technology have skyrocketed in the past decades, as new drivers emerged in this industry next to procuring prime product quality and high throughput, namely reduction of energy consumption and carbon footprint as well as improving food safety and security. Solutions are sought in optimising existing technologies or developing new ones which increase energy and resource efficiency, use renewable energy, recuperate waste heat and reduce product loss, thus also the embodied energy therein. Novel tools are required to push such technological innovations and their subsequent implementation. Particularly computer-aided drying process engineering has a large potential to develop next-generation drying technology, including more energy-smart and environmentally-friendly products and dryers systems. This review paper deals with rapidly emerging advanced computational methods for modelling dehydration of porous materials, particularly for foods. Drying is approached as a combined multiphysics, multiscale and multiphase problem. These advanced methods include computational fluid dynamics, several multiphysics modelling methods (e.g. conjugate modelling), multiscale modelling and modelling of material properties and the associated propagation of material property variability. Apart from the current challenges for each of these, future perspectives should be directed towards material property

  17. Computer modeling for optimal placement of gloveboxes

    International Nuclear Information System (INIS)

    Hench, K.W.; Olivas, J.D.; Finch, P.R.

    1997-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.
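    A toy sketch of the first stage described above: a swap-based evolutionary-style search for a layout that minimizes flow-weighted material movement. The flows, grid locations and heuristic are illustrative assumptions, not the actual facility data or algorithm.

```python
import random

# Sketch of an evolutionary-style search for a glovebox arrangement that minimizes
# weighted material movement between process steps. Locations, flow weights, and the
# simple swap heuristic are illustrative, not the actual facility data or algorithm.
random.seed(0)
n = 6                                            # number of processes / candidate locations
flow = [[0, 5, 2, 0, 1, 0],                      # flow[i][j]: material moves between processes i and j
        [5, 0, 3, 0, 0, 2],
        [2, 3, 0, 4, 0, 0],
        [0, 0, 4, 0, 6, 1],
        [1, 0, 0, 6, 0, 3],
        [0, 2, 0, 1, 3, 0]]
coords = [(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (2, 1)]   # candidate locations on a grid

def cost(assign):
    """Total flow-weighted rectilinear distance for a process -> location assignment."""
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            (xi, yi), (xj, yj) = coords[assign[i]], coords[assign[j]]
            total += flow[i][j] * (abs(xi - xj) + abs(yi - yj))
    return total

best = list(range(n))
best_cost = cost(best)
for _ in range(5000):                            # random-swap improvement loop
    cand = best[:]
    a, b = random.sample(range(n), 2)
    cand[a], cand[b] = cand[b], cand[a]
    c = cost(cand)
    if c < best_cost:
        best, best_cost = cand, c

print("best layout:", best, "cost:", best_cost)
```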

  18. Computer modeling for optimal placement of gloveboxes

    Energy Technology Data Exchange (ETDEWEB)

    Hench, K.W.; Olivas, J.D. [Los Alamos National Lab., NM (United States); Finch, P.R. [New Mexico State Univ., Las Cruces, NM (United States)

    1997-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.

  19. Computer Simulation (Microcultures): An Effective Model for Multicultural Education.

    Science.gov (United States)

    Nelson, Jorge O.

    This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…

  20. Cloud Computing, Tieto Cloud Server Model

    OpenAIRE

    Suikkanen, Saara

    2013-01-01

    The purpose of this study is to find out what cloud computing is. To be able to make wise decisions when moving to the cloud or considering it, companies need to understand what the cloud consists of: which model suits their company best, what should be taken into account before moving to the cloud, what the role of the cloud broker is, and what a SWOT analysis of the cloud looks like. To be able to answer customer requirements and business demands, IT companies should develop and produce new service models. IT house T...

  1. Computing Models of M-type Host Stars and their Panchromatic Spectral Output

    Science.gov (United States)

    Linsky, Jeffrey; Tilipman, Dennis; France, Kevin

    2018-06-01

    We have begun a program of computing state-of-the-art model atmospheres from the photospheres to the coronae of M stars that are the host stars of known exoplanets. For each model we are computing the emergent radiation at all wavelengths that are critical for assessing photochemistry and mass-loss from exoplanet atmospheres. In particular, we are computing the stellar extreme ultraviolet radiation that drives hydrodynamic mass loss from exoplanet atmospheres and is essential for determining whether an exoplanet is habitable. The model atmospheres are computed with the SSRPM radiative transfer/statistical equilibrium code developed by Dr. Juan Fontenla. The code solves for the non-LTE statistical equilibrium populations of 18,538 levels of 52 atomic and ion species and computes the radiation from all species (435,986 spectral lines) and about 20,000,000 spectral lines of 20 diatomic species. The first model computed in this program was for the modestly active M1.5 V star GJ 832 by Fontenla et al. (ApJ 830, 152 (2016)). We will report on a preliminary model for the more active M5 V star GJ 876 and compare this model and its emergent spectrum with GJ 832. In the future, we will compute and intercompare semi-empirical models and spectra for all of the stars observed with the HST MUSCLES Treasury Survey, the Mega-MUSCLES Treasury Survey, and additional stars including Proxima Cen and Trappist-1. This multiyear theory program is supported by a grant from the Space Telescope Science Institute.

  2. How computational models can help unlock biological systems.

    Science.gov (United States)

    Brodland, G Wayne

    2015-12-01

    With computational models playing an ever-increasing role in the advancement of science, it is important that researchers understand what it means to model something; recognize the implications of the conceptual, mathematical and algorithmic steps of model construction; and comprehend what models can and cannot do. Here, we use examples to show that models can serve a wide variety of roles, including hypothesis testing, generating new insights, deepening understanding, suggesting and interpreting experiments, tracing chains of causation, doing sensitivity analyses, integrating knowledge, and inspiring new approaches. We show that models can bring together information of different kinds and do so across a range of length scales, as they do in multi-scale, multi-faceted embryogenesis models, some of which connect gene expression, the cytoskeleton, cell properties, tissue mechanics, morphogenetic movements and phenotypes. Models cannot replace experiments nor can they prove that particular mechanisms are at work in a given situation. But they can demonstrate whether or not a proposed mechanism is sufficient to produce an observed phenomenon. Although the examples in this article are taken primarily from the field of embryo mechanics, most of the arguments and discussion are applicable to any form of computational modelling. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  3. Computational Software to Fit Seismic Data Using Epidemic-Type Aftershock Sequence Models and Modeling Performance Comparisons

    Science.gov (United States)

    Chu, A.

    2016-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work implements three of the homogeneous ETAS models described in Ogata (1998). With a model's log-likelihood function, my software finds the Maximum-Likelihood Estimates (MLEs) of the model's parameters to estimate the homogeneous background rate and the temporal and spatial parameters that govern triggering effects. The EM algorithm is employed for its advantages of stability and robustness (Veen and Schoenberg, 2008). My work also presents comparisons among the three models in robustness, convergence speed, and implementations from theory to computing practice. Up-to-date regional seismic data of seismically active areas such as Southern California and Japan are used to demonstrate the comparisons. Data analysis has been done using the computer languages Java and R. Java has the advantages of strong typing and easy control of memory resources, while R has the advantage of numerous available functions for statistical computing. Comparisons are also made between the two programming languages in convergence and stability, computational speed, and ease of implementation. Issues that may affect convergence such as spatial shapes are discussed.
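    For reference, a purely temporal ETAS sketch (the models above are spatio-temporal): the standard conditional intensity and its log-likelihood, evaluated on a tiny synthetic catalogue.

```python
import numpy as np

# Simplified *temporal* ETAS sketch (the models referenced above are spatio-temporal).
# Conditional intensity: lambda(t) = mu + sum_{t_i < t} K * exp(a*(m_i - m0)) * (t - t_i + c)**(-p)
# Log-likelihood over [0, T]: sum_j log(lambda(t_j)) - integral_0^T lambda(t) dt.
def etas_loglik(params, times, mags, m0, T):
    mu, K, a, c, p = params
    ll = 0.0
    for j, tj in enumerate(times):
        dt = tj - times[:j]
        lam = mu + np.sum(K * np.exp(a * (mags[:j] - m0)) * (dt + c) ** (-p))
        ll += np.log(lam)
    # Closed-form integral of each Omori-type kernel (valid for p != 1)
    integ = mu * T + np.sum(K * np.exp(a * (mags - m0)) *
                            (c ** (1 - p) - (T - times + c) ** (1 - p)) / (p - 1))
    return ll - integ

# Tiny synthetic catalogue (times in days, magnitudes) just to exercise the function
times = np.array([1.0, 1.2, 1.3, 5.0, 5.1, 9.7])
mags = np.array([4.5, 3.1, 3.0, 4.0, 3.2, 3.5])
print(etas_loglik((0.2, 0.05, 1.0, 0.01, 1.2), times, mags, m0=3.0, T=10.0))
```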

  4. Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments

    Science.gov (United States)

    Lane, Peter C. R.; Gobet, Fernand

    2013-03-01

    Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the `speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
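    A minimal sketch of the non-dominated sorting step that such algorithms build on: extracting the first Pareto front from candidate parameter sets scored on several datasets. The speciation and genetic operators of the full algorithm are omitted, and the scores here are synthetic.

```python
import numpy as np

# Minimal sketch of the non-dominated sorting step used by NSGA-style algorithms:
# each candidate parameter set has one error score per experimental dataset, and we
# extract the first Pareto front (candidates not dominated by any other candidate).
def dominates(a, b):
    """a dominates b if it is no worse on every objective and better on at least one."""
    return np.all(a <= b) and np.any(a < b)

def first_pareto_front(scores):
    """scores: (n_candidates, n_objectives) array of errors to minimize."""
    front = []
    for i in range(scores.shape[0]):
        if not any(dominates(scores[j], scores[i]) for j in range(scores.shape[0]) if j != i):
            front.append(i)
    return front

rng = np.random.default_rng(2)
errors = rng.random((50, 3))     # 50 candidate parameter sets scored on 3 datasets (synthetic)
print("non-dominated candidates:", first_pareto_front(errors))
```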

  5. GRAVTool, Advances on the Package to Compute Geoid Model path by the Remove-Compute-Restore Technique, Following Helmert's Condensation Method

    Science.gov (United States)

    Marotta, G. S.

    2017-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astrogeodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived, respectively, from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and a Global Geopotential Model (GGM). In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by the integration of the different wavelengths, and that adjust these models to a local vertical datum. This research presents advances in the GRAVTool package for computing geoid models by the RCR technique, following Helmert's condensation method, and its application to a study area. The study area comprises the Federal District of Brazil, about 6,000 km² of undulating relief with heights varying from 600 m to 1,340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The numerical example shows that the geoid model computed by the GRAVTool package, after analysis of the density, DTM and GGM values, agrees well with the reference values in the study area. The accuracy of the computed model (σ = ±0.058 m, RMS = 0.067 m, maximum = 0.124 m and minimum = -0.155 m), using a density value of 2.702 ±0.024 g/cm³, the SRTM Void Filled 3 arc-second DTM and the GGM EIGEN-6C4 up to degree and order 250, matches the uncertainty (σ = ±0.073 m) of 26 randomly spaced points where the geoid was determined by geometric levelling supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ±0.076 m, RMS = 0.098 m, maximum = 0.320 m and minimum = -0.061 m).
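    For orientation, the standard Remove-Compute-Restore decomposition behind the workflow described above, in conventional notation (the generic terrain terms stand in for whichever reduction, here Helmert's condensation, is applied):

```latex
% Schematic Remove-Compute-Restore decomposition (conventional notation, assumed):
% remove the long-wavelength (GGM) and short-wavelength (terrain) parts from the
% observed anomalies, compute a residual geoid, then restore the removed parts.
\begin{align*}
  \Delta g_{\mathrm{res}} &= \Delta g_{\mathrm{obs}} - \Delta g_{\mathrm{GGM}} - \Delta g_{\mathrm{terr}},\\
  N_{\mathrm{res}} &= \frac{R}{4\pi\gamma}\iint_{\sigma} \Delta g_{\mathrm{res}}\, S(\psi)\, \mathrm{d}\sigma,\\
  N &= N_{\mathrm{GGM}} + N_{\mathrm{res}} + \delta N_{\mathrm{ind}},
\end{align*}
```

    where S(ψ) is Stokes' function, γ is normal gravity, and δN_ind is the indirect effect of the condensation reduction.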

  6. Computationally-optimized bone mechanical modeling from high-resolution structural images.

    Directory of Open Access Journals (Sweden)

    Jeremy F Magland

    Full Text Available Image-based mechanical modeling of the complex micro-structure of human bone has shown promise as a non-invasive method for characterizing bone strength and fracture risk in vivo. In particular, elastic moduli obtained from image-derived micro-finite element (μFE) simulations have been shown to correlate well with results obtained by mechanical testing of cadaveric bone. However, most existing large-scale finite-element simulation programs require significant computing resources, which hamper their use in common laboratory and clinical environments. In this work, we theoretically derive and computationally evaluate the resources needed to perform such simulations (in terms of computer memory and computation time), which are dependent on the number of finite elements in the image-derived bone model. A detailed description of our approach is provided, which is specifically optimized for μFE modeling of the complex three-dimensional architecture of trabecular bone. Our implementation includes domain decomposition for parallel computing, a novel stopping criterion, and a system for speeding up convergence by pre-iterating on coarser grids. The performance of the system is demonstrated on a machine with dual quad-core Xeon 3.16 GHz CPUs equipped with 40 GB of RAM. Models of the distal tibia derived from 3D in-vivo MR images of a patient, comprising 200,000 elements, required less than 30 seconds to converge (and 40 MB of RAM). To illustrate the system's potential for large-scale μFE simulations, axial stiffness was estimated from high-resolution micro-CT images of a voxel array of 90 million elements comprising the human proximal femur in seven hours of CPU time. In conclusion, the system described should enable image-based finite-element bone simulations in practical computation times on high-end desktop computers, with applications to laboratory studies and clinical imaging.
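    The sketch below illustrates the "pre-iterating on coarser grids" idea in isolation: solve a small coarse system, interpolate its solution to the fine grid, and use it as the conjugate-gradient starting guess. A 1D Poisson matrix stands in for the much larger image-derived stiffness matrix.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, spsolve

# Sketch of "pre-iterating on a coarser grid" to speed up conjugate-gradient (CG)
# convergence: solve a small coarse system, interpolate its solution to the fine
# grid, and use that as the CG starting guess. A 1D Poisson matrix stands in for
# the (much larger) image-derived micro-FE stiffness matrix.
def poisson_matrix(n):
    # -u'' discretized on n interior points of (0, 1), including the 1/h^2 scaling
    return (n + 1) ** 2 * diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")

n_fine, n_coarse = 1023, 255
A_fine, A_coarse = poisson_matrix(n_fine), poisson_matrix(n_coarse)
b_fine, b_coarse = np.ones(n_fine), np.ones(n_coarse)

# Coarse solve, then linear interpolation onto the fine grid as the warm start
x_coarse = spsolve(A_coarse, b_coarse)
xs_coarse = np.linspace(0.0, 1.0, n_coarse + 2)[1:-1]
xs_fine = np.linspace(0.0, 1.0, n_fine + 2)[1:-1]
x0 = np.interp(xs_fine, xs_coarse, x_coarse)

iters = {"cold": 0, "warm": 0}
def counter(key):
    def cb(_):
        iters[key] += 1
    return cb

cg(A_fine, b_fine, callback=counter("cold"))           # zero initial guess
cg(A_fine, b_fine, x0=x0, callback=counter("warm"))    # coarse-grid initial guess
print(iters)
```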

  7. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1990-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems
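    GRESS works by transforming existing FORTRAN source; purely as a language-level illustration of the forward derivative propagation that such instrumentation performs, here is a tiny dual-number sketch in Python (the model function is arbitrary and not from any reactor physics code).

```python
# Tiny forward-mode automatic-differentiation sketch using dual numbers, to
# illustrate the kind of derivative propagation that GRESS-style instrumentation
# adds to a code. This is only a language-level illustration in Python; GRESS
# itself works by transforming existing FORTRAN source.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def model(k, t):
    """A stand-in 'physics model' result: y = k*t + 3*k*k (arbitrary)."""
    return k * t + 3.0 * k * k

k = Dual(2.0, 1.0)          # seed the derivative d/dk = 1
y = model(k, 5.0)
print("y =", y.val, " dy/dk =", y.der)   # dy/dk = t + 6k = 5 + 12 = 17
```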

  8. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab

  9. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  10. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects, are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks....

  11. A Generative Computer Model for Preliminary Design of Mass Housing

    Directory of Open Access Journals (Sweden)

    Ahmet Emre DİNÇER

    2014-05-01

    Full Text Available Today, we live in what we call the “Information Age”, an age in which information technologies are constantly being renewed and developed. Out of this has emerged a new approach called “Computational Design” or “Digital Design”. In addition to significantly influencing all fields of engineering, this approach has come to play a similar role in all stages of the design process in the architectural field. In providing solutions for analytical problems in design such as cost estimation, circulation system evaluation and environmental effects, which are similar to engineering problems, this approach is being used in the evaluation, representation and presentation of traditionally designed buildings. With developments in software and hardware technology, it has also evolved toward studies in which architectural products are designed and produced with digital tools from the preliminary design stages onward. This paper presents a digital model which may be used in the preliminary stage of mass housing design with Cellular Automata, one of the generative design systems based on computational design approaches. This computational model, developed with scripts in 3ds Max software, has been applied to the site plan design of mass housing, to floor plan organizations derived from user preferences, and to facade designs. By using the developed computer model, many alternative housing types could be rapidly produced. The interactive design tool of this computational model allows the user to transfer dimensional and functional housing preferences by means of the interface prepared for the model. The results of the study are discussed in the light of innovative architectural approaches.
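    A hypothetical cellular-automaton sketch of a generative site-plan step, in the spirit of the approach described above; the grid rule and parameters are invented for illustration and are not the 3ds Max scripts of the study.

```python
import numpy as np

# Hypothetical cellular-automaton sketch of a generative site-plan step: cells on a
# grid are either built (1) or open (0), and a simple neighbour-count rule densifies
# clusters while preventing isolated cells from surviving. The rule and parameters
# are illustrative only.
rng = np.random.default_rng(3)
grid = (rng.random((20, 30)) < 0.35).astype(int)   # random seed layout

def step(g):
    # Count the 8 neighbours of every cell using wrap-around shifts.
    nb = sum(np.roll(np.roll(g, dy, 0), dx, 1)
             for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    born = (g == 0) & (nb >= 5)        # open cells surrounded by building become built
    keep = (g == 1) & (nb >= 2)        # built cells need at least 2 built neighbours
    return (born | keep).astype(int)

for _ in range(10):
    grid = step(grid)

print("built cells:", int(grid.sum()), "of", grid.size)
```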

  12. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body

  13. A Perspective on Computational Human Performance Models as Design Tools

    Science.gov (United States)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  14. Computational Modelling of Piston Ring Dynamics in 3D

    Directory of Open Access Journals (Sweden)

    Dlugoš Jozef

    2014-12-01

    Full Text Available Advanced computational models of a piston assembly at the level of virtual prototypes require a detailed description of piston ring behaviour. Under these requirements, the piston rings operate in regimes that cannot, in general, be simplified into an axisymmetric model. The piston and the cylinder liner do not have a perfect round shape, mainly due to machining tolerances and external thermo-mechanical loads. If the ring cannot follow the liner deformations, a local loss of contact occurs, resulting in blow-by and increased consumption of lubricant oil in the engine. Current computational models are unable to capture such effects. The paper focuses on the development of a flexible 3D piston ring model based on the Timoshenko beam theory using the multibody system (MBS) approach. The MBS model is compared to the finite element method (FEM) solution.

  15. Computationally efficient statistical differential equation modeling using homogenization

    Science.gov (United States)

    Hooten, Mevin B.; Garlick, Martha J.; Powell, James A.

    2013-01-01

    Statistical models using partial differential equations (PDEs) to describe dynamically evolving natural systems are appearing in the scientific literature with some regularity in recent years. Often such studies seek to characterize the dynamics of temporal or spatio-temporal phenomena such as invasive species, consumer-resource interactions, community evolution, and resource selection. Specifically, in the spatial setting, data are often available at varying spatial and temporal scales. Additionally, the necessary numerical integration of a PDE may be computationally infeasible over the spatial support of interest. We present an approach to impose computationally advantageous changes of support in statistical implementations of PDE models and demonstrate its utility through simulation using a form of PDE known as “ecological diffusion.” We also apply a statistical ecological diffusion model to a data set involving the spread of mountain pine beetle (Dendroctonus ponderosae) in Idaho, USA.

  16. The deterministic computational modelling of radioactivity

    International Nuclear Information System (INIS)

    Damasceno, Ralf M.; Barros, Ricardo C.

    2009-01-01

    This paper describes a computational application (software) that models simple radioactive decay, decay to stable nuclei, and directly coupled decay chains of up to thirteen radioactive decays. An internal data bank holds the decay constants of the various decays, considerably facilitating the use of the program by people who are not connected to the nuclear area; this makes the program accessible to users without specialist knowledge of the field. The paper presents numerical results for typical model problems
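    A sketch of how such a directly coupled decay chain can be evaluated with the classical Bateman solution; the decay constants below are placeholders rather than values from the program's data bank.

```python
import numpy as np

# Sketch of a directly coupled decay chain A -> B -> C -> ... computed with the
# Bateman solution. Decay constants here are illustrative placeholders; a real tool
# like the one described would pull them from its internal data bank.
def bateman(N1_0, lambdas, t):
    """Number of atoms of each member of a linear chain at time t,
    starting from N1_0 atoms of the first member and none of the others.
    Assumes all decay constants are distinct."""
    lam = np.asarray(lambdas, dtype=float)
    n = lam.size
    N = np.zeros(n)
    for k in range(n):                       # member k of the chain (0-based)
        prod_lam = np.prod(lam[:k])          # empty product = 1 for the first member
        s = 0.0
        for i in range(k + 1):
            denom = np.prod([lam[j] - lam[i] for j in range(k + 1) if j != i])
            s += np.exp(-lam[i] * t) / denom
        N[k] = N1_0 * prod_lam * s
    return N

lambdas = [0.1, 0.05, 0.01]                  # [1/h], hypothetical three-member chain
print(bateman(1.0e6, lambdas, t=24.0))       # populations after 24 hours
```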

  17. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in.Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti
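    As a generic example of the style of model such a book covers (not an excerpt from it), the classic M/M/1 queue relates arrival and service rates to utilization and mean response time:

```python
# Classic M/M/1 queue, the textbook example of an analytical performance model:
# requests arrive at rate lam, are served at rate mu, and (for lam < mu) the
# mean response time follows directly from the utilization.
def mm1(lam, mu):
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = lam / mu                 # utilization
    R = 1.0 / (mu - lam)           # mean response time (waiting + service)
    N = rho / (1.0 - rho)          # mean number in the system (Little's law: N = lam * R)
    return rho, R, N

for lam in (10.0, 50.0, 90.0):     # arrivals per second against a 100 req/s server
    rho, R, N = mm1(lam, 100.0)
    print(f"lam={lam:5.1f}  util={rho:.2f}  response={R*1000:.1f} ms  in-system={N:.2f}")
```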

  18. Computational fluid dynamics modelling of displacement natural ventilation.

    OpenAIRE

    Ji, Yingchun

    2005-01-01

    Natural ventilation is widely recognised as contributing towards low-energy building design. The requirement to reduce energy usage in new buildings has rejuvenated interest in natural ventilation. This thesis deals with computer modelling of natural displacement ventilation driven either by buoyancy or buoyancy combined with wind forces. Two benchmarks have been developed using computational fluid dynamics (CFD) in order to evaluate the accuracy with which CFD is able to mo...

  19. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  20. Computer-aided modelling template: Concept and application

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2015-01-01

    decomposition technique which identifies generic steps and workflow involved, the computer-aided template concept has been developed. This concept is implemented as a software tool, which provides a user-friendly interface for following the workflow steps and guidance through the steps providing additional......Modelling is an important enabling technology in modern chemical engineering applications. A template-based approach is presented in this work to facilitate the construction and documentation of the models and enable their maintenance for reuse in a wider application range. Based on a model...

  1. Computational Modeling of Ablation on an Irradiated Target

    Science.gov (United States)

    Mehmedagic, Igbal; Thangam, Siva

    2017-11-01

    Computational modeling of pulsed nanosecond laser interaction with an irradiated metallic target is presented. The model formulation involves ablation of the metallic target irradiated by a pulsed high-intensity laser at normal atmospheric conditions. Computational findings based on effective representation and prediction of the heat transfer, melting and vaporization of the target material, as well as plume formation and expansion, are presented along with their relevance for the development of protective shields. In this context, the available results for a representative irradiation by a 1064 nm laser pulse are used to analyze various ablation mechanisms, variable thermo-physical and optical properties, plume expansion and surface geometry. Funded in part by U. S. Army ARDEC, Picatinny Arsenal, NJ.

  2. Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling

    Science.gov (United States)

    Ormsbee, L.; Tufail, M.

    2005-12-01

    The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems. These range from the development of data-driven models to optimal control strategies that assist in a more informed and intelligent decision-making process. Availability of data is critical to such applications, and scarce data may lead to models that do not represent the response function over the entire domain. At the same time, too much data has a tendency to lead to over-constraining of the problem. This paper will describe the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict the water quality of key constituents (such as dissolved oxygen) and their use in an optimization framework for optimal load reductions. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.

  3. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science outstanding results are yielded by advanced simulation methods, based on state of the art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extractions from big data, better understanding and predicting of social behaviour and modelling health and environment changes.

  4. Computational fluid dynamic modelling of cavitation

    Science.gov (United States)

    Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    Models of sheet cavitation in cryogenic fluids are developed for use in Euler and Navier-Stokes codes. The models are based upon earlier potential-flow models but enable the cavity inception point, length, and shape to be determined as part of the computation. In the present paper, numerical solutions are compared with experimental measurements for both pressure distribution and cavity length. Comparisons between models are also presented. The CFD model provides a relatively simple modification to an existing code to enable cavitation performance predictions to be included. The analysis also has the added ability of incorporating thermodynamic effects of cryogenic fluids into the analysis. Extensions of the current two-dimensional steady-state analysis to three dimensions and/or time-dependent flows are, in principle, straightforward although geometrical issues become more complicated. Linearized models, however, offer promise of providing effective cavitation modeling in three dimensions. This analysis presents good potential for improved understanding of many phenomena associated with cavity flows.

  5. Computer modeling of inelastic wave propagation in porous rock

    International Nuclear Information System (INIS)

    Cheney, J.A.; Schatz, J.F.; Snell, C.

    1979-01-01

    Computer modeling of wave propagation in porous rock has several important applications. Among them are prediction of fragmentation and permeability changes to be caused by chemical explosions used for in situ resource recovery, and the understanding of nuclear explosion effects such as seismic wave generation, containment, and site hardness. Of interest in all these applications are the distance from the source to which inelastic effects persist and the amount of porosity change within the inelastic region. In order to study phenomena related to these applications, the Cam Clay family of models developed at Cambridge University was used to develop a similar model that is applicable to wave propagation in porous rock. That model was incorporated into a finite-difference wave propagation computer code SOC. 10 figures, 1 table

  6. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort for ER process optimization is made by using computer models. A number of models have been developed for this purpose, but as validation of these models is incomplete and often problematic, the simulation results from these models are inherently uncertain.

  7. Computational Fluid Dynamics Modeling of Bacillus anthracis ...

    Science.gov (United States)

    Journal Article Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived from computed tomography (CT) or µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation-exhalation breathing conditions using average species-specific minute volumes. Four different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Despite the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the upper conducting airways of the human at the same air concentration of anthrax spores. This greater deposition of spores in the upper airways in the human resulted in lower penetration and deposition in the tracheobronchial airways and the deep lung than that predict

  8. Impact of changing computer technology on hydrologic and water resource modeling

    OpenAIRE

    Loucks, D.P.; Fedra, K.

    1987-01-01

    The increasing availability of substantial computer power at relatively low costs and the increasing ease of using computer graphics, of communicating with other computers and data bases, and of programming using high-level problem-oriented computer languages, is providing new opportunities and challenges for those developing and using hydrologic and water resources models. This paper reviews some of the progress made towards the development and application of computer support systems designe...

  9. Computer-Aided Model Based Analysis for Design and Operation of a Copolymerization Process

    DEFF Research Database (Denmark)

    Lopez-Arenas, Maria Teresa; Sales-Cruz, Alfonso Mauricio; Gani, Rafiqul

    2006-01-01

    The advances in computer science and computational algorithms for process modelling, process simulation, numerical methods and design/synthesis algorithms make it advantageous and helpful to employ computer-aided modelling systems and tools for integrated process analysis. This is illustrated in this work, where, through the computer-aided modelling system ICAS-MoT, two first-principles models have been investigated with respect to design and operational issues for solution copolymerization reactors in general, and for the methyl methacrylate/vinyl acetate system in particular. Model 1 is taken from the literature and is commonly used for the low-conversion region, while Model 2 has … This will allow analysis of the process behaviour, contribute to a better understanding of the polymerization process, help to avoid unsafe conditions of operation, and support the development of operational and optimizing control strategies.

  10. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    Science.gov (United States)

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure is still a challenge within a clinical setting for each case anew. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and able to predict global airflow quantities, as well as local tissue aeration and strains, for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing the simulation of EIT images based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model. Thus, they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results with the patient measurements. First promising results

  11. Computational Models for Nonlinear Aeroelastic Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  12. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹ TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs

  13. Computer models for optimizing radiation therapy

    International Nuclear Information System (INIS)

    Duechting, W.

    1998-01-01

    The aim of this contribution is to outline how methods of system analysis, control theory and modelling can be applied to simulate normal and malignant cell growth and to optimize cancer treatment such as radiation therapy. Based on biological observations and cell kinetic data, several types of models have been developed describing the growth of tumor spheroids and the cell renewal of normal tissue. The irradiation model is represented by the so-called linear-quadratic model describing the survival fraction as a function of the dose. Based thereon, numerous simulation runs for different treatment schemes can be performed. Thus, it is possible to study the radiation effect on tumor and normal tissue separately. Finally, this method enables a computer-assisted recommendation for an optimal patient-specific treatment schedule prior to clinical therapy. (orig.) [de
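
    For illustration, the linear-quadratic survival fraction S = exp(-(αd + βd²)) per fraction can be evaluated as in the sketch below; the α and β values are placeholders chosen only for the example, not clinical parameters.

        # Illustrative use of the linear-quadratic survival model mentioned above.
        import math

        def surviving_fraction(dose_per_fraction, n_fractions, alpha=0.3, beta=0.03):
            """Overall surviving fraction after n equal fractions of size d (Gy)."""
            per_fraction = math.exp(-(alpha * dose_per_fraction
                                      + beta * dose_per_fraction ** 2))
            return per_fraction ** n_fractions

        # Example: 30 fractions of 2 Gy with placeholder alpha/beta values
        print(surviving_fraction(2.0, 30))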

  14. Quantum computing with photons: introduction to the circuit model, the one-way quantum computer, and the fundamental principles of photonic experiments

    International Nuclear Information System (INIS)

    Barz, Stefanie

    2015-01-01

    Quantum physics has revolutionized our understanding of information processing and enables computational speed-ups that are unattainable using classical computers. This tutorial reviews the fundamental tools of photonic quantum information processing. The basics of theoretical quantum computing are presented and the quantum circuit model as well as measurement-based models of quantum computing are introduced. Furthermore, it is shown how these concepts can be implemented experimentally using photonic qubits, where information is encoded in the photons’ polarization. (tutorial)

  15. Dynamical Models for Computer Viruses Propagation

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Full Text Available Nowadays, digital computer systems and networks are the main engineering tools, being used in planning, design, operation, and control of all sizes of building, transportation, machinery, business, and life-maintaining devices. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to a decrease in the reliability of vital activities. A lot of antivirus programs have been developed, but they are limited to detecting and removing infections, based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, a trial on modeling computer virus propagation dynamics relates it to other notable events occurring in the network, permitting preventive policies to be established in network management. Data from three different viruses are collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, are applied, showing that it is possible to forecast the dynamics of a new virus propagation by using data collected from other viruses that formerly infected the network.
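
    The autoregressive identification idea can be sketched as follows, using an ordinary least-squares AR(p) fit on invented daily infection counts rather than the virus data collected in the paper.

        # Sketch: fit an AR(p) model to a series of infection counts and forecast
        # one step ahead. The counts below are made-up illustrative data.
        import numpy as np

        def fit_ar(series, p=3):
            """Least-squares fit of AR(p) coefficients (lag 1 first)."""
            X = np.column_stack([series[p - k - 1: len(series) - k - 1] for k in range(p)])
            y = series[p:]
            coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coeffs

        counts = np.array([12, 15, 22, 30, 41, 55, 70, 88, 104, 117], dtype=float)
        coeffs = fit_ar(counts, p=3)
        forecast = coeffs @ counts[-1:-4:-1]   # next-step prediction from last 3 values
        print(coeffs, forecast)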

  16. Computational Modeling of Photonic Crystal Microcavity Single-Photon Emitters

    Science.gov (United States)

    Saulnier, Nicole A.

    Conventional cryptography is based on algorithms that are mathematically complex and difficult to solve, such as factoring large numbers. The advent of a quantum computer would render these schemes useless. As scientists work to develop a quantum computer, cryptographers are developing new schemes for unconditionally secure cryptography. Quantum key distribution has emerged as one of the potential replacements of classical cryptography. It relies on the fact that measurement of a quantum bit changes the state of the bit and undetected eavesdropping is impossible. Single polarized photons can be used as the quantum bits, such that a quantum system would in some ways mirror the classical communication scheme. The quantum key distribution system would include components that create, transmit and detect single polarized photons. The focus of this work is on the development of an efficient single-photon source. This source consists of a single quantum dot inside of a photonic crystal microcavity. To better understand the physics behind the device, a computational model is developed. The model uses Finite-Difference Time-Domain methods to analyze the electromagnetic field distribution in photonic crystal microcavities. It uses an 8-band k·p perturbation theory to compute the energy band structure of the epitaxially grown quantum dots. We discuss a method that combines the results of these two calculations for determining the spontaneous emission lifetime of a quantum dot in bulk material or in a microcavity. The computational models developed in this thesis are used to identify and characterize microcavities for potential use in a single-photon source. The computational tools developed are also used to investigate novel photonic crystal microcavities that incorporate 1D distributed Bragg reflectors for vertical confinement. It is found that the spontaneous emission enhancement in the quasi-3D cavities can be significantly greater than in traditional suspended slab

  17. A climatological model for risk computations incorporating site- specific dry deposition influences

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.

    1991-07-01

    A gradient-flux dry deposition module was developed for use in a climatological atmospheric transport model, the Multimedia Environmental Pollutant Assessment System (MEPAS). The atmospheric pathway model computes long-term average contaminant air concentration and surface deposition patterns surrounding a potential release site, incorporating location-specific dry deposition influences. Gradient-flux formulations are used to incorporate site and regional data in the dry deposition module for this atmospheric sector-average climatological model. Application of these formulations provides an effective means of accounting for local surface roughness in deposition computations. Linkage to a risk computation module resulted in a need for separate regional and specific surface deposition computations. 13 refs., 4 figs., 2 tabs

  18. Computer-Aided Construction of Chemical Kinetic Models

    Energy Technology Data Exchange (ETDEWEB)

    Green, William H. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2014-12-31

    The combustion chemistry of even simple fuels can be extremely complex, involving hundreds or thousands of kinetically significant species. The most reasonable way to deal with this complexity is to use a computer not only to numerically solve the kinetic model, but also to construct the kinetic model in the first place. Because these large models contain so many numerical parameters (e.g. rate coefficients, thermochemistry) one never has sufficient data to uniquely determine them all experimentally. Instead one must work in “predictive” mode, using theoretical rather than experimental values for many of the numbers in the model, and as appropriate refining the most sensitive numbers through experiments. Predictive chemical kinetics is exactly what is needed for computer-aided design of combustion systems based on proposed alternative fuels, particularly for early assessment of the value and viability of proposed new fuels before those fuels are commercially available. This project was aimed at making accurate predictive chemical kinetics practical; this is a challenging goal which requires a range of science advances. The project spanned a wide range from quantum chemical calculations on individual molecules and elementary-step reactions, through the development of improved rate/thermo calculation procedures, the creation of algorithms and software for constructing and solving kinetic simulations, the invention of methods for model-reduction while maintaining error control, and finally comparisons with experiment. Many of the parameters in the models were derived from quantum chemistry calculations, and the models were compared with experimental data measured in our lab or in collaboration with others.
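
    As a small illustration of the kind of numerical parameters such kinetic models contain, a modified-Arrhenius rate coefficient k(T) = A * T^n * exp(-Ea / (R*T)) can be evaluated as follows; the A, n and Ea values are placeholders, not recommended data from this project.

        # Modified Arrhenius rate coefficient with placeholder parameters.
        import math

        R = 8.314  # J/(mol K)

        def rate_coefficient(T, A=1.0e13, n=0.0, Ea=150e3):
            """Ea in J/mol, T in K; returns k(T) in the units implied by A."""
            return A * T ** n * math.exp(-Ea / (R * T))

        for T in (800.0, 1200.0, 1600.0):
            print(T, rate_coefficient(T))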

  19. New weighted sum of gray gases model applicable to Computational Fluid Dynamics (CFD) modeling of oxy-fuel combustion

    DEFF Research Database (Denmark)

    Yin, Chungen; Johansen, Lars Christian Riis; Rosendahl, Lasse

    2010-01-01

    A new weighted sum of gray gases model (WSGGM) is derived, which is applicable to computational fluid dynamics (CFD) modeling of both air-fuel and oxy-fuel combustion. First, a computer code is developed to evaluate the emissivity of any gas mixture at any condition by using the exponential wide band model (EWBM)…
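
    As a purely illustrative aside, the generic WSGG form the abstract refers to, eps = sum_i a_i(T) * (1 - exp(-k_i * p * L)), can be evaluated as in the following sketch; the polynomial weights and absorption coefficients are placeholders, not the coefficients derived in the paper.

        # Generic weighted-sum-of-gray-gases emissivity evaluation (illustrative only).
        import math

        def wsgg_emissivity(T, pL, a_coeffs, k_coeffs):
            """T in K, pL = partial-pressure path length in atm*m."""
            eps = 0.0
            for a_poly, k in zip(a_coeffs, k_coeffs):
                # temperature-dependent weight expressed as a polynomial in T
                a = sum(c * T ** j for j, c in enumerate(a_poly))
                eps += a * (1.0 - math.exp(-k * pL))
            return eps

        # Three illustrative gray gases with placeholder weights and k values
        a_coeffs = [(0.4, 1.0e-4), (0.3, -5.0e-5), (0.2, 2.0e-5)]
        k_coeffs = [0.5, 5.0, 50.0]
        print(wsgg_emissivity(1500.0, 0.3, a_coeffs, k_coeffs))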

  20. Computational Modeling of Micrometastatic Breast Cancer Radiation Dose Response

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Daniel L.; Debeb, Bisrat G. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Morgan Welch Inflammatory Breast Cancer Research Program and Clinic, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Thames, Howard D. [Department of Biostatistics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Woodward, Wendy A., E-mail: wwoodward@mdanderson.org [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Morgan Welch Inflammatory Breast Cancer Research Program and Clinic, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States)

    2016-09-01

    Purpose: Prophylactic cranial irradiation (PCI) involves giving radiation to the entire brain with the goals of reducing the incidence of brain metastasis and improving overall survival. Experimentally, we have demonstrated that PCI prevents brain metastases in a breast cancer mouse model. We developed a computational model to expand on and aid in the interpretation of our experimental results. Methods and Materials: MATLAB was used to develop a computational model of brain metastasis and PCI in mice. Model input parameters were optimized such that the model output would match the experimental number of metastases per mouse from the unirradiated group. An independent in vivo limiting-dilution experiment was performed to validate the model. The effect of whole brain irradiation at different measurement points after tumor cells were injected was evaluated in terms of the incidence, number of metastases, and tumor burden and was then compared with the corresponding experimental data. Results: In the optimized model, the correlation between the number of metastases per mouse and the experimental fits was >95%. Our attempt to validate the model with a limiting dilution assay produced 99.9% correlation with respect to the incidence of metastases. The model accurately predicted the effect of whole-brain irradiation given 3 weeks after cell injection but substantially underestimated its effect when delivered 5 days after cell injection. The model further demonstrated that delaying whole-brain irradiation until the development of gross disease introduces a dose threshold that must be reached before a reduction in incidence can be realized. Conclusions: Our computational model of mouse brain metastasis and PCI correlated strongly with our experiments with unirradiated mice. The results further suggest that early treatment of subclinical disease is more effective than irradiating established disease.

  1. Mathematical model of accelerator output characteristics and their calculation on a computer

    International Nuclear Information System (INIS)

    Mishulina, O.A.; Ul'yanina, M.N.; Kornilova, T.V.

    1975-01-01

    A mathematical model of the output characteristics of a linear accelerator is described. The model is a system of differential equations. The presence of phase limitations is a specific feature of the problem formulation, which makes it possible to ensure higher simulation accuracy and to determine a capture coefficient. An algorithm for computing the output characteristics is elaborated based upon the suggested mathematical model. The output characteristics of the accelerator are the capture coefficient, the expectation of the coordinate characterizing the average phase of the beam particles, the expectation of the coordinate characterizing the average reverse relative velocity of the beam particles, and the dispersions of these coordinates. The calculation methods for the accelerator output characteristics are described in detail. The computations have been performed on the BESM-6 computer, the characteristics computing time being 2 min 20 sec. The relative error of parameter computation averages 10⁻².

  2. Computational neural network regression model for Host based Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Gautam

    2016-09-01

    Full Text Available The current scenario of information gathering and storing in a secure system is a challenging task due to increasing cyber-attacks. There exist computational neural network techniques designed for intrusion detection systems, which provide security to a single machine and to the machines of an entire network. In this paper, we have used two types of computational neural network models, namely, the Generalized Regression Neural Network (GRNN) model and the Multilayer Perceptron Neural Network (MPNN) model, for a Host based Intrusion Detection System using log files that are generated by a single personal computer. The simulation results show the correctly classified percentage of the normal and abnormal (intrusion) classes using a confusion matrix. On the basis of the results and discussion, we found that the Host based Intrusion Systems Model (HISM) significantly improved the detection accuracy while retaining a minimum false alarm rate.
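
    A GRNN is essentially Nadaraya-Watson kernel regression over the training set; the sketch below shows the idea on invented log-derived features, not the data or configuration used in the paper.

        # Minimal sketch of a Generalized Regression Neural Network (GRNN) prediction.
        import numpy as np

        def grnn_predict(X_train, y_train, x, sigma=0.5):
            # Gaussian kernel weights based on distance to each training pattern
            d2 = np.sum((X_train - x) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * sigma ** 2))
            return float(np.dot(w, y_train) / (np.sum(w) + 1e-12))

        # Toy log-derived features with labels 0 = normal, 1 = intrusion
        X_train = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
        y_train = np.array([0.0, 0.0, 1.0, 1.0])
        score = grnn_predict(X_train, y_train, np.array([0.85, 0.75]))
        print("intrusion score:", score, "-> class:", int(score > 0.5))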

  3. Defining epidemics in computer simulation models: How do definitions influence conclusions?

    Directory of Open Access Journals (Sweden)

    Carolyn Orbann

    2017-06-01

    Full Text Available Computer models have proven to be useful tools in studying epidemic disease in human populations. Such models are being used by a broader base of researchers, and it has become more important to ensure that descriptions of model construction and data analyses are clear and communicate important features of model structure. Papers describing computer models of infectious disease often lack a clear description of how the data are aggregated and whether or not non-epidemic runs are excluded from analyses. Given that there is no concrete quantitative definition of what constitutes an epidemic within the public health literature, each modeler must decide on a strategy for identifying epidemics during simulation runs. Here, an SEIR model was used to test how varying the cutoff for considering a run an epidemic changes potential interpretations of simulation outcomes. Varying the cutoff from 0% to 15% of the model population ever infected with the illness generated significant differences in the number of deaths and in timing variables. These results are important for those who use models to inform public health policy, in which questions of timing or implementation of interventions might be answered using findings from computer simulation models.
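
    The cutoff idea can be sketched as follows, assuming a simple stochastic discrete-time SEIR model with illustrative parameter values (not those of the paper): each run is labeled an epidemic only if the fraction of the population ever infected exceeds the chosen threshold.

        # Run a toy stochastic SEIR model repeatedly and apply an "epidemic" cutoff.
        import random

        def run_seir(N=1000, beta=0.3, incubation=3.0, infectious=5.0, days=200):
            S, E, I, R = N - 1, 0, 1, 0
            ever_infected = 1
            for _ in range(days):
                new_e = sum(1 for _ in range(S) if random.random() < beta * I / N)
                new_i = sum(1 for _ in range(E) if random.random() < 1.0 / incubation)
                new_r = sum(1 for _ in range(I) if random.random() < 1.0 / infectious)
                S -= new_e
                E += new_e - new_i
                I += new_i - new_r
                R += new_r
                ever_infected += new_e
            return ever_infected / N

        cutoff = 0.15  # runs below this attack fraction are not counted as epidemics
        runs = [run_seir() for _ in range(20)]
        print(sum(frac >= cutoff for frac in runs), "of", len(runs), "runs were epidemics")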

  4. Computational Models for Nonlinear Aeroelastic Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate a new and efficient computational method of modeling nonlinear aeroelastic systems. The...

  5. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural network (ANN) and fuzzy logic approaches have proven to be efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining both these approaches, and as a result, neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of this neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro-fuzzy inference system (ANFIS) to hydrologic time series modeling, illustrated by an application to model the river flow of the Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most time series modeling techniques. The results showed that the ANFIS forecasted flow series preserves the statistical properties of the original flow series. The model showed good performance in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, and peak flow estimation. It was observed that the ANFIS model preserves the potential of the ANN approach fully, and eases the model building process.

  6. Grid computing in large pharmaceutical molecular modeling.

    Science.gov (United States)

    Claus, Brian L; Johnson, Stephen R

    2008-07-01

    Most major pharmaceutical companies have employed grid computing to expand their compute resources with the intention of minimizing additional financial expenditure. Historically, one of the issues restricting widespread utilization of the grid resources in molecular modeling is the limited set of suitable applications amenable to coarse-grained parallelization. Recent advances in grid infrastructure technology coupled with advances in application research and redesign will enable fine-grained parallel problems, such as quantum mechanics and molecular dynamics, which were previously inaccessible to the grid environment. This will enable new science as well as increase resource flexibility to load balance and schedule existing workloads.

  7. Human performance models for computer-aided engineering

    Science.gov (United States)

    Elkind, Jerome I. (Editor); Card, Stuart K. (Editor); Hochberg, Julian (Editor); Huey, Beverly Messick (Editor)

    1989-01-01

    This report discusses a topic important to the field of computational human factors: models of human performance and their use in computer-based engineering facilities for the design of complex systems. It focuses on a particular human factors design problem -- the design of cockpit systems for advanced helicopters -- and on a particular aspect of human performance -- vision and related cognitive functions. By focusing in this way, the authors were able to address the selected topics in some depth and develop findings and recommendations that they believe have application to many other aspects of human performance and to other design domains.

  8. A Parallel and Distributed Surrogate Model Implementation for Computational Steering

    KAUST Repository

    Butnaru, Daniel; Buse, Gerrit; Pfluger, Dirk

    2012-01-01

    … of the input parameters. Such an exploration process is, however, not possible if the simulation is computationally too expensive. For these cases we present in this paper a scalable computational steering approach utilizing a fast surrogate model as a substitute…

  9. Computational Aerodynamic Modeling of Small Quadcopter Vehicles

    Science.gov (United States)

    Yoon, Seokkwan; Ventura Diaz, Patricia; Boyd, D. Douglas; Chan, William M.; Theodore, Colin R.

    2017-01-01

    High-fidelity computational simulations have been performed which focus on rotor-fuselage and rotor-rotor aerodynamic interactions of small quad-rotor vehicle systems. The three-dimensional unsteady Navier-Stokes equations are solved on overset grids using high-order accurate schemes, dual-time stepping, low Mach number preconditioning, and hybrid turbulence modeling. Computational results for isolated rotors are shown to compare well with available experimental data. Computational results in hover reveal the differences between a conventional configuration where the rotors are mounted above the fuselage and an unconventional configuration where the rotors are mounted below the fuselage. Complex flow physics in forward flight is investigated. The goal of this work is to demonstrate that understanding of interactional aerodynamics can be an important factor in design decisions regarding rotor and fuselage placement for next-generation multi-rotor drones.

  10. Computational fluid dynamics modelling in cardiovascular medicine.

    Science.gov (United States)

    Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2016-01-01

    This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges.

  11. What can be learned from computer modeling? Comparing expository and modeling approaches to teaching dynamic systems behavior

    NARCIS (Netherlands)

    van Borkulo, S.P.; van Joolingen, W.R.; Savelsbergh, E.R.; de Jong, T.

    2012-01-01

    Computer modeling has been widely promoted as a means to attain higher order learning outcomes. Substantiating these benefits, however, has been problematic due to a lack of proper assessment tools. In this study, we compared computer modeling with expository instruction, using a tailored assessment

  12. Ex Vivo Methods for Informing Computational Models of the Mitral Valve

    OpenAIRE

    Bloodworth, Charles H.; Pierce, Eric L.; Easley, Thomas F.; Drach, Andrew; Khalighi, Amir H.; Toma, Milan; Jensen, Morten O.; Sacks, Michael S.; Yoganathan, Ajit P.

    2016-01-01

    Computational modeling of the mitral valve (MV) has potential applications for determining optimal MV repair techniques and risk of recurrent mitral regurgitation. Two key concerns for informing these models are (1) sensitivity of model performance to the accuracy of the input geometry, and, (2) acquisition of comprehensive data sets against which the simulation can be validated across clinically relevant geometries. Addressing the first concern, ex vivo micro-computed tomography (microCT) wa...

  13. FCJ-131 Pervasive Computing and Prosopopoietic Modelling – Notes on computed function and creative action

    Directory of Open Access Journals (Sweden)

    Anders Michelsen

    2011-12-01

    Full Text Available This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever-increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT that have spread vertiginously since Mark Weiser coined the term ‘pervasive’, e.g., digitalised sensoring, monitoring, effectuation, intelligence, and display. Whereas Weiser’s original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown’s (1997) terms, ‘invisible’, on the horizon, ’calm’, it also points to a much more important and slightly different perspective: that of creative action upon novel forms of artifice. Most importantly for this article, ubiquity and pervasive computing is seen to point to the continuous existence throughout the computational heritage since the mid-20th century of a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one into the other. It also indicates a generative creation that itself points to important issues of ontology with methodological implications for the design of computing. In this article these implications will be conceptualised as prosopopoietic modeling on the basis of Bernward Joerges’ introduction of the classical rhetoric term of ’prosopopoeia’ into the debate on large technological systems. First, the paper introduces the paradoxical distinction/complicity by debating Gilbert Simondon’s notion of a ‘margin of indeterminacy’ vis-a-vis computing. Second, it debates the idea of prosopopoietic modeling, pointing to a principal role of the paradoxical distinction/complicity within the computational heritage in three cases: a. Prosopopoietic

  14. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  15. Predictive Behavior of a Computational Foot/Ankle Model through Artificial Neural Networks.

    Science.gov (United States)

    Chande, Ruchi D; Hargraves, Rosalyn Hobson; Ortiz-Robinson, Norma; Wayne, Jennifer S

    2017-01-01

    Computational models are useful tools to study the biomechanics of human joints. Their predictive performance is heavily dependent on bony anatomy and soft tissue properties. Imaging data provides anatomical requirements while approximate tissue properties are implemented from literature data, when available. We sought to improve the predictive capability of a computational foot/ankle model by optimizing its ligament stiffness inputs using feedforward and radial basis function neural networks. While the former demonstrated better performance than the latter per mean square error, both networks provided reasonable stiffness predictions for implementation into the computational model.

  16. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns…

  17. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Science.gov (United States)

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

    Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computation models to obtain general conclusions that can provide useful guidance to construct more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have previously been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent, and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that the final formulas of these algorithms could all be converted to a linear form; based on this finding, we propose a new algorithm, called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
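
    For concreteness, the Jaccard coefficient mentioned above can be computed over hypothetical drug-ADR profiles as follows; the drug names and ADR labels are invented for the example.

        # Jaccard similarity between two drugs' known ADR profiles (sets of labels).
        def jaccard(a, b):
            a, b = set(a), set(b)
            if not a and not b:
                return 0.0
            return len(a & b) / len(a | b)

        drug_x = {"nausea", "headache", "rash"}
        drug_y = {"nausea", "rash", "dizziness", "fatigue"}
        print(jaccard(drug_x, drug_y))  # similarity used to transfer ADR associations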

  18. Phenomenological optical potentials and optical model computer codes

    International Nuclear Information System (INIS)

    Prince, A.

    1980-01-01

    An introduction to the Optical Model is presented. Starting with the purpose and nature of the physical problems to be analyzed, a general formulation and the various phenomenological methods of solution are discussed. This includes the calculation of observables based on assumed potentials, both local and non-local, and their forms, e.g. Woods-Saxon, folded model, etc. Also discussed are the various calculational methods and model codes employed to describe nuclear reactions in the spherical and deformed regions (e.g. coupled-channel analysis). An examination of the numerical solutions and minimization techniques associated with the various codes is briefly touched upon. Several computer programs are described for carrying out the calculations. The preparation of input (formats and options), determination of model parameters and analysis of output are described. The class is given a series of problems to carry out using the available computer. Interpretation and evaluation of the samples include the effects of varying parameters, and comparison of calculations with the experimental data. Also included is an intercomparison of the results from the various model codes, along with their advantages and limitations. (author)
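
    As a side note, the Woods-Saxon form mentioned above is commonly written V(r) = -V0 / (1 + exp((r - R)/a)) with R = r0 * A^(1/3); the sketch below evaluates it with typical textbook-style placeholder parameters, not values from any particular analysis.

        # Central Woods-Saxon potential (illustrative parameter values).
        import math

        def woods_saxon(r, V0=50.0, r0=1.25, a=0.65, A=56):
            """Potential depth in MeV; r, r0 and a in fm; A is the mass number."""
            R = r0 * A ** (1.0 / 3.0)
            return -V0 / (1.0 + math.exp((r - R) / a))

        for r in (0.0, 2.0, 4.0, 6.0, 8.0):
            print(r, round(woods_saxon(r), 2))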

  19. Computational Models and Emergent Properties of Respiratory Neural Networks

    Science.gov (United States)

    Lindsey, Bruce G.; Rybak, Ilya A.; Smith, Jeffrey C.

    2012-01-01

    Computational models of the neural control system for breathing in mammals provide a theoretical and computational framework bringing together experimental data obtained from different animal preparations under various experimental conditions. Many of these models were developed in parallel and iteratively with experimental studies and provided predictions guiding new experiments. This data-driven modeling approach has advanced our understanding of respiratory network architecture and neural mechanisms underlying generation of the respiratory rhythm and pattern, including their functional reorganization under different physiological conditions. Models reviewed here vary in neurobiological details and computational complexity and span multiple spatiotemporal scales of respiratory control mechanisms. Recent models describe interacting populations of respiratory neurons spatially distributed within the Bötzinger and pre-Bötzinger complexes and rostral ventrolateral medulla that contain core circuits of the respiratory central pattern generator (CPG). Network interactions within these circuits along with intrinsic rhythmogenic properties of neurons form a hierarchy of multiple rhythm generation mechanisms. The functional expression of these mechanisms is controlled by input drives from other brainstem components, including the retrotrapezoid nucleus and pons, which regulate the dynamic behavior of the core circuitry. The emerging view is that the brainstem respiratory network has rhythmogenic capabilities at multiple levels of circuit organization. This allows flexible, state-dependent expression of different neural pattern-generation mechanisms under various physiological conditions, enabling a wide repertoire of respiratory behaviors. Some models consider control of the respiratory CPG by pulmonary feedback and network reconfiguration during defensive behaviors such as cough. Future directions in modeling of the respiratory CPG are considered. PMID:23687564

  20. Security Issues Model on Cloud Computing: A Case of Malaysia

    OpenAIRE

    Komeil Raisian; Jamaiah Yahaya

    2015-01-01

    With the development of cloud computing, many people's views of infrastructure architectures, software distribution and development models have changed significantly. Cloud computing is associated with a pioneering deployment architecture that can be realized through grid computing, utility computing and autonomic computing. The rapid transition towards it has increased concerns regarding critical issues for the effective adoption of cloud computing. From the security v...

  1. Computer Modeling of the Effects of Atmospheric Conditions on Sound Signatures

    Science.gov (United States)

    2016-02-01

    [Garbled report front matter; recoverable details: ARL-TR-7602, FEB 2016, US Army Research Laboratory, "Computer Modeling of the Effects of Atmospheric Conditions on Sound Signatures," by Sarah Wagner, Science and Engineering Apprentice. Cited reference fragment: Attenborough K. Sound propagation in the atmosphere. In: Rossing TD, editor. Springer handbook of…]

  2. An approximate fractional Gaussian noise model with computational cost

    KAUST Repository

    Sørbye, Sigrunn H.; Myrvoll-Nilsen, Eirik; Rue, Haavard

    2017-01-01

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood
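
    For reference, the autocovariance function that defines fGn with Hurst index H is gamma(k) = (sigma^2 / 2) * (|k+1|^(2H) - 2|k|^(2H) + |k-1|^(2H)); the short sketch below simply evaluates it and is not taken from the paper.

        # Autocovariance of fractional Gaussian noise with Hurst index H.
        def fgn_autocovariance(k, H=0.8, sigma2=1.0):
            return 0.5 * sigma2 * (abs(k + 1) ** (2 * H)
                                   - 2 * abs(k) ** (2 * H)
                                   + abs(k - 1) ** (2 * H))

        print([round(fgn_autocovariance(k), 4) for k in range(5)])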

  3. Approach to Computer Implementation of Mathematical Model of 3-Phase Induction Motor

    Science.gov (United States)

    Pustovetov, M. Yu

    2018-03-01

    This article discusses the development of a computer model of an induction motor based on a mathematical model in a three-phase stator reference frame. It uses an approach that allows two methods to be combined during preparation of the computer model: visual circuit programming (in the form of electrical schematics) and logical programming (in the form of block diagrams). The approach enables easy integration of the induction motor model as part of more complex models of electrical complexes and systems. The developed computer model gives the user access to the beginning and the end of the winding of each of the three phases of the stator and rotor. This property is particularly important when considering asymmetric modes of operation or when the motor is powered by the special circuitry of semiconductor converters.

  4. Computer Models and Automata Theory in Biology and Medicine

    CERN Document Server

    Baianu, I C

    2004-01-01

    The applications of computers to biological and biomedical problem solving go back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout...

  5. Computational model of a whole tree combustor

    Energy Technology Data Exchange (ETDEWEB)

    Bryden, K.M.; Ragland, K.W. [Univ. of Wisconsin, Madison, WI (United States)

    1993-12-31

    A preliminary computational model has been developed for the whole tree combustor and compared to test results. In the simulation model presented, hardwood logs 15 cm in diameter are burned in a 4 m deep fuel bed. Solid and gas temperature, solid and gas velocity, and CO, CO2, H2O, HC and O2 profiles are calculated. This deep, fixed-bed combustor obtains high energy release rates per unit area due to the high inlet air velocity and extended reaction zone. The lowest portion of the overall bed is an oxidizing region and the remainder of the bed acts as a gasification and drying region. The overfire air region completes the combustion. Approximately 40% of the energy is released in the lower oxidizing region. The wood consumption rate obtained from the computational model is 4,110 kg/m²-hr, which matches well the consumption rate of 3,770 kg/m²-hr observed during the peak test period of the Aurora, MN test. The predicted heat release rate is 16 MW/m² (5.0×10⁶ Btu/hr-ft²).

  6. Computational 3-D Model of the Human Respiratory System

    Science.gov (United States)

    We are developing a comprehensive, morphologically-realistic computational model of the human respiratory system that can be used to study the inhalation, deposition, and clearance of contaminants, while being adaptable for age, race, gender, and health/disease status. The model ...

  7. Interactive computer graphics for bio-stereochemical modelling

    Indian Academy of Sciences (India)

    Proc. Indian Acad. Sci., Vol. 87 A (Chem. Sci.), No. 4, April 1978, pp. 95-113. Printed in India. Interactive computer graphics for bio-stereochemical modelling. Robert Rein, Shlomo Nir, Karen Haydock and Robert D. MacElroy. Department of Experimental Pathology, Roswell Park Memorial Institute, 666 Elm …

  8. Simulation model of load balancing in distributed computing systems

    Science.gov (United States)

    Botygin, I. A.; Popov, V. N.; Frolov, S. G.

    2017-02-01

    The availability of high-performance computing, high-speed data transfer over the network and the widespread use of software for design and pre-production in mechanical engineering have led large industrial enterprises and small engineering companies alike to implement complex computer systems for the efficient solution of production and management tasks. Such systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key subjects of research, but the system-wide problems of efficient distribution (balancing) of the computational load and of accommodating input, intermediate and output databases are no less important. The main tasks of such a balancing system are load and condition monitoring of compute nodes and the selection of a node to which a user's request is forwarded in accordance with a predetermined algorithm. Load balancing is one of the most widely used methods of increasing the productivity of distributed computing systems through the optimal allocation of tasks among the computer system nodes. Therefore, the development of methods and algorithms for computing optimal schedules in a distributed system that dynamically changes its infrastructure is an important task.
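
    To make the node-selection step concrete, here is a minimal, hedged sketch of one common balancing strategy (dispatch each request to the least-loaded node); the node names and task costs are invented, and the predetermined algorithm referred to in the abstract may differ.

        # Least-loaded node selection using a heap of (current_load, node_name) pairs.
        import heapq
        import random

        class LeastLoadedBalancer:
            def __init__(self, node_names):
                self.heap = [(0.0, name) for name in node_names]
                heapq.heapify(self.heap)

            def dispatch(self, task_cost):
                load, name = heapq.heappop(self.heap)     # currently least-loaded node
                heapq.heappush(self.heap, (load + task_cost, name))
                return name

        balancer = LeastLoadedBalancer(["node-1", "node-2", "node-3"])
        for _ in range(6):
            cost = random.uniform(0.5, 2.0)
            print(balancer.dispatch(cost))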

  9. Computational brain models: Advances from system biology and future challenges

    Directory of Open Access Journals (Sweden)

    George E. Barreto

    2015-02-01

    Full Text Available Computational brain models focused on the interactions between neurons and astrocytes, modeled via metabolic reconstructions, are reviewed. The large source of experimental data provided by the -omics techniques and the advance/application of computational and data-management tools have been fundamental, for instance, in the understanding of the crosstalk between these cells, the key neuroprotective mechanisms mediated by astrocytes in specific metabolic scenarios (1), and the identification of biomarkers for neurodegenerative diseases (2,3). However, the modeling of these interactions demands a clear view of the metabolic and signaling pathways implicated, but most of them are controversial and are still under evaluation (4). Hence, to gain insight into the complexity of these interactions, a current view of the main pathways implicated in the neuron-astrocyte communication processes has been made from recent experimental reports and reviews. Furthermore, target problems, limitations and main conclusions have been identified from metabolic models of the brain reported from 2010. Finally, key aspects to take into account in the development of a computational model of the brain and topics that could be approached from a systems biology perspective in future research are highlighted.

  10. Some Comparisons of Complexity in Dictionary-Based and Linear Computational Models

    Czech Academy of Sciences Publication Activity Database

    Gnecco, G.; Kůrková, Věra; Sanguineti, M.

    2011-01-01

    Roč. 24, č. 2 (2011), s. 171-182 ISSN 0893-6080 R&D Projects: GA ČR GA201/08/1744 Grant - others: CNR - AV ČR project 2010-2012(XE) Complexity of Neural-Network and Kernel Computational Models Institutional research plan: CEZ:AV0Z10300504 Keywords: linear approximation schemes * variable-basis approximation schemes * model complexity * worst-case errors * neural networks * kernel models Subject RIV: IN - Informatics, Computer Science Impact factor: 2.182, year: 2011

  11. Method of generating a computer readable model

    DEFF Research Database (Denmark)

    2008-01-01

    A method of generating a computer readable model of a geometrical object constructed from a plurality of interconnectable construction elements, wherein each construction element has a number of connection elements for connecting the construction element with another construction element. The method comprises encoding a first and a second one of the construction elements as corresponding data structures, each representing the connection elements of the corresponding construction element, and each of the connection elements having associated with it a predetermined connection type. The method further comprises determining a first connection element of the first construction element and a second connection element of the second construction element located in a predetermined proximity of each other; and retrieving connectivity information of the corresponding connection types of the first…

  12. A survey on computational intelligence approaches for predictive modeling in prostate cancer

    OpenAIRE

    Cosma, G; Brown, D; Archer, M; Khan, M; Pockley, AG

    2017-01-01

    Predictive modeling in medicine involves the development of computational models which are capable of analysing large amounts of data in order to predict healthcare outcomes for individual patients. Computational intelligence approaches are suitable when the data to be modelled are too complex for conventional statistical techniques to process quickly and efficiently. These advanced approaches are based on mathematical models that have been especially developed for dealing with the uncertainty an...

  13. "Let's get physical": advantages of a physical model over 3D computer models and textbooks in learning imaging anatomy.

    Science.gov (United States)

    Preece, Daniel; Williams, Sarah B; Lam, Richard; Weller, Renate

    2013-01-01

    Three-dimensional (3D) information plays an important part in medical and veterinary education. Appreciating complex 3D spatial relationships requires a strong foundational understanding of anatomy and mental 3D visualization skills. Novel learning resources have been introduced to anatomy training to achieve this. Objective evaluation of their comparative efficacies remains scarce in the literature. This study developed and evaluated the use of a physical model in demonstrating the complex spatial relationships of the equine foot. It was hypothesized that the newly developed physical model would be more effective for students to learn magnetic resonance imaging (MRI) anatomy of the foot than textbooks or computer-based 3D models. Third year veterinary medicine students were randomly assigned to one of three teaching aid groups (physical model; textbooks; 3D computer model). The comparative efficacies of the three teaching aids were assessed through students' abilities to identify anatomical structures on MR images. Overall mean MRI assessment scores were significantly higher in students utilizing the physical model (86.39%) compared with students using textbooks (62.61%) and the 3D computer model (63.68%) (P < 0.001), with no significant difference between the textbook and 3D computer model groups (P = 0.685). Student feedback was also more positive in the physical model group compared with both the textbook and 3D computer model groups. Our results suggest that physical models may hold a significant advantage over alternative learning resources in enhancing visuospatial and 3D understanding of complex anatomical architecture, and that 3D computer models have significant limitations with regards to 3D learning. © 2013 American Association of Anatomists.

  14. Multi-objective reverse logistics model for integrated computer waste management.

    Science.gov (United States)

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrated example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.
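
    A toy sketch of the kind of cost/risk trade-off the model explores, using a weighted-sum objective with Monte Carlo-sampled waste quantities; the facilities, costs and risk scores below are invented and far simpler than the paper's integer linear program:

      # Toy weighted-sum cost/risk trade-off with Monte Carlo-sampled waste amounts.
      # Facilities, unit costs and risk scores are invented for illustration only.
      import random

      facilities = {"recycler": {"cost": 2.0, "risk": 1.0},   # per tonne
                    "landfill": {"cost": 0.5, "risk": 4.0}}

      def expected_objective(frac_to_recycler, weight_cost=0.5, n_samples=1000):
          """Weighted sum of expected cost and risk for a given allocation fraction."""
          total = 0.0
          for _ in range(n_samples):
              waste = random.gauss(100.0, 15.0)               # uncertain waste quantity (t)
              to_rec = frac_to_recycler * waste
              to_lf = waste - to_rec
              cost = to_rec * facilities["recycler"]["cost"] + to_lf * facilities["landfill"]["cost"]
              risk = to_rec * facilities["recycler"]["risk"] + to_lf * facilities["landfill"]["risk"]
              total += weight_cost * cost + (1.0 - weight_cost) * risk
          return total / n_samples

      # Trace the trade-off by sweeping the allocation fraction.
      for frac in (0.0, 0.25, 0.5, 0.75, 1.0):
          print(frac, round(expected_objective(frac), 1))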

  15. Computational models of the pulmonary circulation: Insights and the move towards clinically directed studies

    Science.gov (United States)

    Tawhai, Merryn H.; Clark, Alys R.; Burrowes, Kelly S.

    2011-01-01

    Biophysically-based computational models provide a tool for integrating and explaining experimental data, observations, and hypotheses. Computational models of the pulmonary circulation have evolved from minimal and efficient constructs that have been used to study individual mechanisms that contribute to lung perfusion, to sophisticated multi-scale and -physics structure-based models that predict integrated structure-function relationships within a heterogeneous organ. This review considers the utility of computational models in providing new insights into the function of the pulmonary circulation, and their application in clinically motivated studies. We review mathematical and computational models of the pulmonary circulation based on their application; we begin with models that seek to answer questions in basic science and physiology and progress to models that aim to have clinical application. In looking forward, we discuss the relative merits and clinical relevance of computational models: what important features are still lacking; and how these models may ultimately be applied to further increasing our understanding of the mechanisms occurring in disease of the pulmonary circulation. PMID:22034608

  16. International Nuclear Model personal computer (PCINM): Model documentation

    International Nuclear Information System (INIS)

    1992-08-01

    The International Nuclear Model (INM) was developed to assist the Energy Information Administration (EIA), U.S. Department of Energy (DOE) in producing worldwide projections of electricity generation, fuel cycle requirements, capacities, and spent fuel discharges from commercial nuclear reactors. The original INM was developed, maintained, and operated on a mainframe computer system. In spring 1992, a streamlined version of INM was created for use on a microcomputer utilizing CLIPPER and PCSAS software. This new version is known as PCINM. This documentation is based on the new PCINM version. This document is designed to satisfy the requirements of several categories of users of the PCINM system including technical analysts, theoretical modelers, and industry observers. This document assumes the reader is familiar with the nuclear fuel cycle and each of its components. This model documentation contains four chapters and seven appendices. Chapter Two presents the model overview containing the PCINM structure and process flow, the areas for which projections are made, and input data and output reports. Chapter Three presents the model technical specifications showing all model equations, algorithms, and units of measure. Chapter Four presents an overview of all parameters, variables, and assumptions used in PCINM. The appendices present the following detailed information: variable and parameter listings, variable and equation cross reference tables, source code listings, file layouts, sample report outputs, and model run procedures. 2 figs

  17. Mathematical modeling and computational prediction of cancer drug resistance.

    Science.gov (United States)

    Sun, Xiaoqiang; Hu, Bin

    2017-06-23

    Diverse forms of resistance to anticancer drugs can lead to the failure of chemotherapy. Drug resistance is one of the most intractable issues for successfully treating cancer in current clinical practice. Effective clinical approaches that could counter drug resistance by restoring the sensitivity of tumors to the targeted agents are urgently needed. As numerous experimental results on resistance mechanisms have been obtained and a mass of high-throughput data has been accumulated, mathematical modeling and computational predictions using systematic and quantitative approaches have become increasingly important, as they can potentially provide deeper insights into resistance mechanisms, generate novel hypotheses or suggest promising treatment strategies for future testing. In this review, we first briefly summarize the current progress of experimentally revealed resistance mechanisms of targeted therapy, including genetic mechanisms, epigenetic mechanisms, posttranslational mechanisms, cellular mechanisms, microenvironmental mechanisms and pharmacokinetic mechanisms. Subsequently, we list several currently available databases and Web-based tools related to drug sensitivity and resistance. Then, we focus primarily on introducing some state-of-the-art computational methods used in drug resistance studies, including mechanism-based mathematical modeling approaches (e.g. molecular dynamics simulation, kinetic model of molecular networks, ordinary differential equation model of cellular dynamics, stochastic model, partial differential equation model, agent-based model, pharmacokinetic-pharmacodynamic model, etc.) and data-driven prediction methods (e.g. omics data-based conventional screening approach for node biomarkers, static network approach for edge biomarkers and module biomarkers, dynamic network approach for dynamic network biomarkers and dynamic module network biomarkers, etc.). Finally, we discuss several further questions and future directions for the use of
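
    As one concrete instance of the "ordinary differential equation model of cellular dynamics" listed above, a minimal two-population sketch (drug-sensitive versus resistant tumour cells under a constant kill rate; all parameter values are arbitrary and not drawn from the review):

      # Minimal ODE sketch: logistic growth of drug-sensitive (S) and resistant (R)
      # tumour cells, with the drug killing only S. Parameters are illustrative.
      def simulate(days=60, dt=0.01, r_s=0.30, r_r=0.25, K=1e9, kill=0.35, mut=1e-6):
          S, R = 1e6, 1e2
          for _ in range(int(days / dt)):
              N = S + R
              dS = r_s * S * (1 - N / K) - kill * S - mut * S   # growth - drug kill - mutation
              dR = r_r * R * (1 - N / K) + mut * S              # growth + mutation inflow
              S, R = S + dS * dt, R + dR * dt
          return S, R

      S, R = simulate()
      print(f"sensitive: {S:.3g}, resistant: {R:.3g}")          # resistant clone outgrows S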

  18. The European computer model for optronic system performance prediction (ECOMOS)

    NARCIS (Netherlands)

    Kessler, S.; Bijl, P.; Labarre, L.; Repasi, E.; Wittenstein, W.; Bürsing, H.

    2017-01-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The

  19. Two-Language, Two-Paradigm Introductory Computing Curriculum Model and its Implementation

    OpenAIRE

    Zanev, Vladimir; Radenski, Atanas

    2011-01-01

    This paper analyzes difficulties with the introduction of object-oriented concepts in introductory computing education and then proposes a two-language, two-paradigm curriculum model that alleviates such difficulties. Our two-language, two-paradigm curriculum model begins with teaching imperative programming using Python programming language, continues with teaching object-oriented computing using Java, and concludes with teaching object-oriented data structures with Java.

  20. Computational and Game-Theoretic Approaches for Modeling Bounded Rationality

    NARCIS (Netherlands)

    L. Waltman (Ludo)

    2011-01-01

    This thesis studies various computational and game-theoretic approaches to economic modeling. Unlike traditional approaches to economic modeling, the approaches studied in this thesis do not rely on the assumption that economic agents behave in a fully rational way. Instead, economic

  1. Computational modeling for fluid flow and interfacial transport

    CERN Document Server

    Shyy, Wei

    2006-01-01

    Practical applications and examples highlight this treatment of computational modeling for handling complex flowfields. A reference for researchers and graduate students of many different backgrounds, it also functions as a text for learning essential computation elements.Drawing upon his own research, the author addresses both macroscopic and microscopic features. He begins his three-part treatment with a survey of the basic concepts of finite difference schemes for solving parabolic, elliptic, and hyperbolic partial differential equations. The second part concerns issues related to computati

  2. Dynamical Trust and Reputation Computation Model for B2C E-Commerce

    OpenAIRE

    Bo Tian; Kecheng Liu; Yuanzhong Chen

    2015-01-01

    Trust is one of the most important factors influencing the successful application of network service environments, such as e-commerce, wireless sensor networks, and online social networks. Computation models of trust and reputation have received special attention in both the computing and service science communities in recent years. In this paper, a dynamical computation model of reputation for B2C e-commerce is proposed. Firstly, conceptions associated with trust and reputation are...

  3. Computer Modeling of Platinum Reforming Reactors | Momoh ...

    African Journals Online (AJOL)

    This paper, instead of using a theoretical approach, has considered a computer model as a means of assessing the reformate composition for three-stage fixed bed reactors in a platforming unit. This is done by identifying many possible hydrocarbon transformation reactions that are peculiar to the process unit, identify the ...

  4. Multi-resolution voxel phantom modeling: a high-resolution eye model for computational dosimetry.

    Science.gov (United States)

    Caracappa, Peter F; Rhodes, Ashley; Fiedler, Derek

    2014-09-21

    Voxel models of the human body are commonly used for simulating radiation dose with a Monte Carlo radiation transport code. Due to memory limitations, the voxel resolution of these computational phantoms is typically too large to accurately represent the dimensions of small features such as the eye. The recently reduced recommended dose limits for the lens of the eye, a radiosensitive tissue with a significant concern for cataract formation, have lent increased importance to understanding the dose to this tissue. A high-resolution eye model is constructed using physiological data for the dimensions of radiosensitive tissues, and combined with an existing set of whole-body models to form a multi-resolution voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of the radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole-body model are developed. The accuracy and performance of each method are compared against existing computational phantoms.

  5. Predictive Behavior of a Computational Foot/Ankle Model through Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Ruchi D. Chande

    2017-01-01

    Full Text Available Computational models are useful tools to study the biomechanics of human joints. Their predictive performance is heavily dependent on bony anatomy and soft tissue properties. Imaging data provides anatomical requirements while approximate tissue properties are implemented from literature data, when available. We sought to improve the predictive capability of a computational foot/ankle model by optimizing its ligament stiffness inputs using feedforward and radial basis function neural networks. While the former demonstrated better performance than the latter per mean square error, both networks provided reasonable stiffness predictions for implementation into the computational model.
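
    A minimal feedforward-regression sketch of the idea, with scikit-learn standing in for the authors' networks and random surrogate data in place of the foot/ankle model outputs:

      # Hypothetical sketch: train a small feedforward network to map simulated
      # joint responses to ligament stiffness values (random surrogate data).
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 1, size=(200, 4))                  # e.g. measured joint displacements
      true_w = rng.uniform(10, 50, size=(4, 2))
      y = X @ true_w + rng.normal(0, 0.5, size=(200, 2))    # e.g. two ligament stiffnesses

      net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
      net.fit(X[:150], y[:150])
      pred = net.predict(X[150:])
      print("test MSE:", mean_squared_error(y[150:], pred))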

  6. Improved object optimal synthetic description, modeling, learning, and discrimination by GEOGINE computational kernel

    Science.gov (United States)

    Fiorini, Rodolfo A.; Dacquino, Gianfranco

    2005-03-01

    GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-Dimensional shape/texture optimal synthetic representation, description and learning, was presented in previous conferences elsewhere recently. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space and a demo application is presented. Progressive model automatic generation is discussed. GEOGINE can be used as an efficient computational kernel for fast reliable application development and delivery in advanced biomedical engineering, biometric, intelligent computing, target recognition, content image retrieval, data mining technological areas mainly. Ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" to reliably compute optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-Dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape invariant characteristics. As a matter of fact, any highly sophisticated application needing effective, robust object geometric/colour invariant attribute capture and parameterization features, for reliable automated object learning and discrimination can deeply benefit from GEOGINE progressive automated model generation computational kernel performance. Main operational advantages over previous

  7. Positioning graphical objects on computer screens: a three-phase model.

    Science.gov (United States)

    Pastel, Robert

    2011-02-01

    This experiment identifies and models phases during the positioning of graphical objects (called cursors in this article) on computer displays. The human computer-interaction community has traditionally used Fitts' law to model selection in graphical user interfaces, whereas human factors experiments have found the single-component Fitts' law inadequate to model positioning of real objects. Participants (N=145) repeatedly positioned variably sized square cursors within variably sized rectangular targets using computer mice. The times for the cursor to just touch the target, for the cursor to enter the target, and for participants to indicate positioning completion were observed. The positioning tolerances were varied from very precise and difficult to imprecise and easy. The time for the cursor to touch the target was proportional to the initial cursor-target distance. The time for the cursor to completely enter the target after touching was proportional to the logarithms of cursor size divided by target tolerances. The time for participants to indicate positioning after entering was inversely proportional to the tolerance. A three-phase model defined by regions--distant, proximate, and inside the target--was proposed and could model the positioning tasks. The three-phase model provides a framework for ergonomists to evaluate new positioning techniques and can explain their deficiencies. The model provides a means to analyze tasks and enhance interaction during positioning.
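
    Read as a formula, the three-phase description above suggests a total positioning time of roughly t = a1*D + a2*log(S/T) + a3/T, where D is the initial cursor-target distance, S the cursor size and T the tolerance; the coefficients below are invented for illustration and are not the study's fitted values:

      # Illustrative total-time estimate from the three-phase description above.
      import math

      def positioning_time(distance, cursor_size, tolerance,
                           a1=0.002, a2=0.15, a3=0.05):
          t_touch = a1 * distance                                        # distant phase
          t_enter = a2 * max(math.log(cursor_size / tolerance), 0.0)     # proximate phase
          t_settle = a3 / tolerance                                      # inside-target phase
          return t_touch + t_enter + t_settle

      print(positioning_time(distance=400, cursor_size=20, tolerance=2))   # seconds (toy units)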

  8. Modeling Techniques for a Computational Efficient Dynamic Turbofan Engine Model

    Directory of Open Access Journals (Sweden)

    Rory A. Roberts

    2014-01-01

    Full Text Available A transient two-stream engine model has been developed. Individual component models developed exclusively in MATLAB/Simulink including the fan, high pressure compressor, combustor, high pressure turbine, low pressure turbine, plenum volumes, and exit nozzle have been combined to investigate the behavior of a turbofan two-stream engine. Special attention has been paid to the development of transient capabilities throughout the model, increasing the physics captured by the model, eliminating algebraic constraints, and reducing simulation time through enabling the use of advanced numerical solvers. Reducing computation time is paramount for conducting future aircraft system-level design trade studies and optimization. The new engine model is simulated for a fuel perturbation and a specified mission while tracking critical parameters. These results, as well as the simulation times, are presented. The new approach significantly reduces the simulation time.

  9. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Directory of Open Access Journals (Sweden)

    Qifan Kuang

    Full Text Available Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computation models to obtain general conclusions that can provide useful guidance to construct more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have been earlier used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, the final formulas of these algorithms could all be converted to a linear form; based on this finding, we propose a new algorithm called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
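
    A rough sketch of the kind of Jaccard-weighted, similarity-based scoring the review compares (not the authors' exact general weighted profile method; the drugs and ADR labels are invented):

      # Toy drug-ADR prediction by Jaccard-weighted profiles of similar drugs.
      known = {
          "drugA": {"nausea", "rash"},
          "drugB": {"nausea", "headache"},
          "drugC": {"rash", "dizziness"},
      }

      def jaccard(a, b):
          return len(a & b) / len(a | b) if a | b else 0.0

      def score_adr(query_profile, adr):
          """Weight each known drug's vote for `adr` by its Jaccard similarity."""
          num = sum(jaccard(query_profile, p) for p in known.values() if adr in p)
          den = sum(jaccard(query_profile, p) for p in known.values()) or 1.0
          return num / den

      query = {"nausea"}                          # partial profile of a new drug
      for adr in ("rash", "headache", "dizziness"):
          print(adr, round(score_adr(query, adr), 2))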

  10. An efficient hysteresis modeling methodology and its implementation in field computation applications

    Energy Technology Data Exchange (ETDEWEB)

    Adly, A.A., E-mail: adlyamr@gmail.com [Electrical Power and Machines Dept., Faculty of Engineering, Cairo University, Giza 12613 (Egypt); Abd-El-Hafiz, S.K. [Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza 12613 (Egypt)

    2017-07-15

    Highlights: • An approach to simulate hysteresis while taking shape anisotropy into consideration. • Utilizing an ensemble of triangular sub-region hysteresis models in field computation. • A novel tool capable of carrying out field computation while keeping track of hysteresis losses. • The approach may be extended to 3D tetrahedral sub-volumes. - Abstract: Field computation in media exhibiting hysteresis is crucial to a variety of applications such as magnetic recording processes and accurate determination of core losses in power devices. Recently, Hopfield neural networks (HNN) have been successfully configured to construct scalar and vector hysteresis models. This paper presents an efficient hysteresis modeling methodology and its implementation in field computation applications. The methodology is based on the application of the integral equation approach on discretized triangular magnetic sub-regions. Within every triangular sub-region, hysteresis properties are realized using a 3-node HNN. Details of the approach and sample computation results are given in the paper.

  11. Petri Net Modeling of Computer Virus Life Cycle | Ikekonwu ...

    African Journals Online (AJOL)

    Virus life cycle, which refers to the stages of development of a computer virus, is presented as a suitable area for the application of Petri nets. Petri nets a powerful modeling tool in the field of dynamic system analysis is applied to model the virus life cycle. Simulation of the derived model is also presented. The intention of ...

  12. Navier-Stokes Computations With One-Equation Turbulence Model for Flows Along Concave Wall Surfaces

    Science.gov (United States)

    Wang, Chi R.

    2005-01-01

    This report presents the use of a time-marching three-dimensional compressible Navier-Stokes equation numerical solver with a one-equation turbulence model to simulate the flow fields developed along concave wall surfaces without and with a downstream extension flat wall surface. The 3-D Navier- Stokes numerical solver came from the NASA Glenn-HT code. The one-equation turbulence model was derived from the Spalart and Allmaras model. The computational approach was first calibrated with the computations of the velocity and Reynolds shear stress profiles of a steady flat plate boundary layer flow. The computational approach was then used to simulate developing boundary layer flows along concave wall surfaces without and with a downstream extension wall. The author investigated the computational results of surface friction factors, near surface velocity components, near wall temperatures, and a turbulent shear stress component in terms of turbulence modeling, computational mesh configurations, inlet turbulence level, and time iteration step. The computational results were compared with existing measurements of skin friction factors, velocity components, and shear stresses of the developing boundary layer flows. With a fine computational mesh and a one-equation model, the computational approach could predict accurately the skin friction factors, near surface velocity and temperature, and shear stress within the flows. The computed velocity components and shear stresses also showed the vortices effect on the velocity variations over a concave wall. The computed eddy viscosities at the near wall locations were also compared with the results from a two equation turbulence modeling technique. The inlet turbulence length scale was found to have little effect on the eddy viscosities at locations near the concave wall surface. The eddy viscosities, from the one-equation and two-equation modeling, were comparable at most stream-wise stations. The present one

  13. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy.

    Science.gov (United States)

    Morimoto, Satoshi; Remijn, Gerard B; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord.

  14. Computer-Aided Template for Model Reuse, Development and Maintenance

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2014-01-01

    A template-based approach for model development is presented in this work. Based on a model decomposition technique, the computer-aided template concept has been developed. This concept is implemented as a software tool, which provides a user-friendly interface for following the workflow steps...

  15. Urbancontext: A Management Model For Pervasive Environments In User-Oriented Urban Computing

    Directory of Open Access Journals (Sweden)

    Claudia L. Zuniga-Canon

    2014-01-01

    Full Text Available Urban computing has gained considerable interest as a way of guiding the evolution of cities into intelligent environments. These environments shape individuals' interactions and change their behaviors, and such changes require new approaches for understanding how urban computing systems should be modeled. In this work we present UrbanContext, a new model for the design of urban computing platforms that applies the theory of roles to manage the individual's context in urban environments. The theory of roles helps to understand an individual's behavior within a social environment, allowing urban computing systems to be modeled so that they adapt to individuals' states and needs. UrbanContext collects data in urban settings and classifies individuals' behaviors according to their changes of role, in order to optimize social interaction and offer secure services. Likewise, UrbanContext serves as a generic model that provides interoperability and facilitates the design, implementation and expansion of urban computing systems.

  16. Fast parallel algorithm for three-dimensional distance-driven model in iterative computed tomography reconstruction

    International Nuclear Information System (INIS)

    Chen Jian-Lin; Li Lei; Wang Lin-Yuan; Cai Ai-Long; Xi Xiao-Qi; Zhang Han-Ming; Li Jian-Xin; Yan Bin

    2015-01-01

    The projection matrix model is used to describe the physical relationship between reconstructed object and projection. Such a model has a strong influence on projection and backprojection, two vital operations in iterative computed tomographic reconstruction. The distance-driven model (DDM) is a state-of-the-art technology that simulates forward and back projections. This model has a low computational complexity and a relatively high spatial resolution; however, it includes only a few methods in a parallel operation with a matched model scheme. This study introduces a fast and parallelizable algorithm to improve the traditional DDM for computing the parallel projection and backprojection operations. Our proposed model has been implemented on a GPU (graphic processing unit) platform and has achieved satisfactory computational efficiency with no approximation. The runtime for the projection and backprojection operations with our model is approximately 4.5 s and 10.5 s per loop, respectively, with an image size of 256×256×256 and 360 projections with a size of 512×512. We compare several general algorithms that have been proposed for maximizing GPU efficiency by using the unmatched projection/backprojection models in a parallel computation. The imaging resolution is not sacrificed and remains accurate during computed tomographic reconstruction. (paper)

  17. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

    Full text of the publication follows. Evaluating the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, as it helps to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a more realistic computational phantom. However, manual description and verification of the models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created with MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in a Treatment Planning System (TPS), as well as radiation exposure for the human body in radiation protection. (authors)

  18. Case studies in Gaussian process modelling of computer codes

    International Nuclear Information System (INIS)

    Kennedy, Marc C.; Anderson, Clive W.; Conti, Stefano; O'Hagan, Anthony

    2006-01-01

    In this paper we present a number of recent applications in which an emulator of a computer code is created using a Gaussian process model. Tools are then applied to the emulator to perform sensitivity analysis and uncertainty analysis. Sensitivity analysis is used both as an aid to model improvement and as a guide to how much the output uncertainty might be reduced by learning about specific inputs. Uncertainty analysis allows us to reflect output uncertainty due to unknown input parameters, when the finished code is used for prediction. The computer codes themselves are currently being developed within the UK Centre for Terrestrial Carbon Dynamics
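
    A minimal emulator sketch of this workflow, assuming scikit-learn as a stand-in for the tools actually used and an invented placeholder function in place of the real computer code:

      # Emulate an expensive "computer code" with a Gaussian process, then use the
      # cheap emulator for prediction; the target function here is just a stand-in.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def expensive_code(x):                           # placeholder for the real simulator
          return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

      rng = np.random.default_rng(1)
      X_train = rng.uniform(0, 1, size=(30, 2))        # a small design of simulator runs
      y_train = expensive_code(X_train)

      gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
      gp.fit(X_train, y_train)

      X_new = rng.uniform(0, 1, size=(5, 2))
      mean, std = gp.predict(X_new, return_std=True)   # emulator mean and uncertainty
      print(np.c_[mean, std])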

  19. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  20. Computer modelling for better diagnosis and therapy of patients by cardiac resynchronisation therapy

    NARCIS (Netherlands)

    Pluijmert, Marieke; Lumens, Joost; Potse, Mark; Delhaas, Tammo; Auricchio, Angelo; Prinzen, Frits W

    2015-01-01

    Mathematical or computer models have become increasingly popular in biomedical science. Although they are a simplification of reality, computer models are able to link a multitude of processes to each other. In the fields of cardiac physiology and cardiology, models can be used to describe the

  1. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    OpenAIRE

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality...

  2. Plasticity modeling & computation

    CERN Document Server

    Borja, Ronaldo I

    2013-01-01

    There have been many excellent books written on the subject of plastic deformation in solids, but rarely can one find a textbook on this subject. “Plasticity Modeling & Computation” is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids. It adopts a simple narrative style that is not mathematically overbearing, and has been written to emulate a professor giving a lecture on this subject inside a classroom. Each section is written to provide a balance between the relevant equations and the explanations behind them. Where relevant, sections end with one or more exercises designed to reinforce the understanding of the “lecture.” Color figures enhance the presentation and make the book very pleasant to read. For professors planning to use this textbook for their classes, the contents are sufficient for Parts A and B that can be taught in sequence over a period of two semesters or quarters.

  3. Computer Hardware, Advanced Mathematics and Model Physics pilot project final report

    International Nuclear Information System (INIS)

    1992-05-01

    The Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) Program was launched in January, 1990. A principal objective of the program has been to utilize the emerging capabilities of massively parallel scientific computers in the challenge of regional scale predictions of decade-to-century climate change. CHAMMP has already demonstrated the feasibility of achieving a 10,000 fold increase in computational throughput for climate modeling in this decade. What we have also recognized, however, is the need for new algorithms and computer software to capitalize on the radically new computing architectures. This report describes the pilot CHAMMP projects at the DOE National Laboratories and the National Center for Atmospheric Research (NCAR). The pilot projects were selected to identify the principal challenges to CHAMMP and to entrain new scientific computing expertise. The success of some of these projects has aided in the definition of the CHAMMP scientific plan. Many of the papers in this report have been or will be submitted for publication in the open literature. Readers are urged to consult with the authors directly for questions or comments about their papers

  4. LMFBR models for the ORIGEN2 computer code

    International Nuclear Information System (INIS)

    Croff, A.G.; McAdoo, J.W.; Bjerke, M.A.

    1981-10-01

    Reactor physics calculations have led to the development of nine liquid-metal fast breeder reactor (LMFBR) models for the ORIGEN2 computer code. Four of the models are based on the U-Pu fuel cycle, two are based on the Th-U-Pu fuel cycle, and three are based on the Th-238U fuel cycle. The reactor models are based on cross sections taken directly from the reactor physics codes. Descriptions of the reactor models as well as values for the ORIGEN2 flux parameters THERM, RES, and FAST are given.

  5. High-throughput landslide modelling using computational grids

    Science.gov (United States)

    Wallace, M.; Metson, S.; Holcombe, L.; Anderson, M.; Newbold, D.; Brook, N.

    2012-04-01

    Landslides are an increasing problem in developing countries. Multiple landslides can be triggered by heavy rainfall resulting in loss of life, homes and critical infrastructure. Through computer simulation of individual slopes it is possible to predict the causes, timing and magnitude of landslides and estimate the potential physical impact. Geographical scientists at the University of Bristol have developed software that integrates a physically-based slope hydrology and stability model (CHASM) with an econometric model (QUESTA) in order to predict landslide risk over time. These models allow multiple scenarios to be evaluated for each slope, accounting for data uncertainties, different engineering interventions, risk management approaches and rainfall patterns. Individual scenarios can be computationally intensive, however each scenario is independent and so multiple scenarios can be executed in parallel. As more simulations are carried out the overhead involved in managing input and output data becomes significant. This is a greater problem if multiple slopes are considered concurrently, as is required both for landslide research and for effective disaster planning at national levels. There are two critical factors in this context: generated data volumes can be in the order of tens of terabytes, and greater numbers of simulations result in long total runtimes. Users of such models, in both the research community and in developing countries, need to develop a means for handling the generation and submission of landside modelling experiments, and the storage and analysis of the resulting datasets. Additionally, governments in developing countries typically lack the necessary computing resources and infrastructure. Consequently, knowledge that could be gained by aggregating simulation results from many different scenarios across many different slopes remains hidden within the data. To address these data and workload management issues, University of Bristol particle
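
    Because each slope/rainfall scenario is independent, the workload is embarrassingly parallel; a schematic sketch follows, using Python multiprocessing as a stand-in for a grid submission system and a dummy scenario function in place of the CHASM/QUESTA codes:

      # Schematic embarrassingly-parallel scenario sweep; run_scenario is a dummy
      # stand-in for a slope hydrology/stability simulation.
      from multiprocessing import Pool
      import random

      def run_scenario(params):
          slope_id, rainfall_mm = params
          random.seed(hash(params))
          factor_of_safety = 1.5 - 0.004 * rainfall_mm + random.uniform(-0.05, 0.05)
          return slope_id, rainfall_mm, round(factor_of_safety, 2)

      if __name__ == "__main__":
          scenarios = [(s, r) for s in range(10) for r in (50, 100, 150, 200)]
          with Pool() as pool:                         # one worker per CPU core
              for slope, rain, fos in pool.map(run_scenario, scenarios):
                  if fos < 1.0:                        # flag potentially unstable cases
                      print(f"slope {slope}: FoS {fos} at {rain} mm rainfall")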

  7. LION: A dynamic computer model for the low-latitude ionosphere

    Directory of Open Access Journals (Sweden)

    J. A. Bittencourt

    2007-11-01

    Full Text Available A realistic fully time-dependent computer model, denominated LION (Low-latitude Ionospheric model), that simulates the dynamic behavior of the low-latitude ionosphere is presented. The time evolution and spatial distribution of the ionospheric particle densities and velocities are computed by numerically solving the time-dependent, coupled, nonlinear system of continuity and momentum equations for the ions O+, O2+, NO+, N2+ and N+, taking into account photoionization of the atmospheric species by the solar extreme ultraviolet radiation, chemical and ionic production and loss reactions, and plasma transport processes, including the ionospheric effects of thermospheric neutral winds, plasma diffusion and electromagnetic E×B plasma drifts. The Earth's magnetic field is represented by a tilted centered magnetic dipole. This set of coupled nonlinear equations is solved along a given magnetic field line in a Lagrangian frame of reference moving vertically, in the magnetic meridian plane, with the electromagnetic E×B plasma drift velocity. The spatial and time distribution of the thermospheric neutral wind velocities and the pattern of the electromagnetic drifts are taken as known quantities, given through specified analytical or empirical models. The model simulation results are presented in the form of computer-generated color maps and reproduce the typical ionization distribution and time evolution normally observed in the low-latitude ionosphere, including details of the equatorial Appleton anomaly dynamics. The specific effects on the ionosphere due to changes in the thermospheric neutral winds and the electromagnetic plasma drifts can be investigated using different wind and drift models, including the important longitudinal effects associated with magnetic declination dependence and latitudinal separation between geographic and geomagnetic equators. The model runs in a normal personal computer (PC) and generates color maps illustrating the

  8. Computer models and output, Spartan REM: Appendix B

    Science.gov (United States)

    Marlowe, D. S.; West, E. J.

    1984-01-01

    A computer model of the Spartan Release Engagement Mechanism (REM) is presented in a series of numerical charts and engineering drawings. A crack growth analysis code is used to predict the fracture mechanics of critical components.

  9. The origins of computer weather prediction and climate modeling

    International Nuclear Information System (INIS)

    Lynch, Peter

    2008-01-01

    Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed

  10. Computer-assisted modeling: Contributions of computational approaches to elucidating macromolecular structure and function: Final report

    International Nuclear Information System (INIS)

    Walton, S.

    1987-01-01

    The Committee, asked to provide an assessment of computer-assisted modeling of molecular structure, has highlighted the signal successes and the significant limitations for a broad panoply of technologies and has projected plausible paths of development over the next decade. As with any assessment of such scope, differing opinions about present or future prospects were expressed. The conclusions and recommendations, however, represent a consensus of our views of the present status of computational efforts in this field

  11. A Computational Model of Cellular Engraftment on Lung Scaffolds.

    Science.gov (United States)

    Pothen, Joshua J; Rajendran, Vignesh; Wagner, Darcy; Weiss, Daniel J; Smith, Bradford J; Ma, Baoshun; Bates, Jason H T

    2016-01-01

    The possibility that stem cells might be used to regenerate tissue is now being investigated for a variety of organs, but these investigations are still essentially exploratory and have few predictive tools available to guide experimentation. We propose, in this study, that the field of lung tissue regeneration might be better served by predictive tools that treat stem cells as agents that obey certain rules of behavior governed by both their phenotype and their environment. Sufficient knowledge of these rules of behavior would then, in principle, allow lung tissue development to be simulated computationally. Toward this end, we developed a simple agent-based computational model to simulate geographic patterns of cells seeded onto a lung scaffold. Comparison of the simulated patterns to those observed experimentally supports the hypothesis that mesenchymal stem cells proliferate preferentially toward the scaffold boundary, whereas alveolar epithelial cells do not. This demonstrates that a computational model of this type has the potential to assist in the discovery of rules of cellular behavior.
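
    A minimal agent-based sketch of the rule hypothesised above, with cells dividing into free neighbouring grid sites more readily near the scaffold boundary; the grid size and probabilities are arbitrary and not taken from the paper:

      # Toy agent-based model: cells on a square "scaffold" grid divide into a free
      # neighbour, with division more likely nearer the boundary.
      import random

      N = 21
      occupied = {(N // 2, N // 2)}                    # seed one cell at the centre

      def boundary_bias(cell):
          x, y = cell
          dist = min(x, y, N - 1 - x, N - 1 - y)       # distance to nearest edge
          return 0.9 - 0.8 * dist / (N // 2)           # higher division prob. near edge

      def step():
          for cell in list(occupied):
              if random.random() < boundary_bias(cell):
                  x, y = cell
                  nbrs = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                          if 0 <= x + dx < N and 0 <= y + dy < N
                          and (x + dx, y + dy) not in occupied]
                  if nbrs:
                      occupied.add(random.choice(nbrs))

      for _ in range(40):
          step()
      print(len(occupied), "cells; the pattern skews toward the scaffold boundary")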

  12. Computer Modeling of Radiation Portal Monitors for Homeland Security Applications

    International Nuclear Information System (INIS)

    Pagh, Richard T.; Kouzes, Richard T.; McConn, Ronald J.; Robinson, Sean M.; Schweppe, John E.; Siciliano, Edward R.

    2005-01-01

    Radiation Portal Monitors (RPMs) are currently being used at our nation's borders to detect potential nuclear threats. At the Pacific Northwest National Laboratory (PNNL), realistic computer models of RPMs are being developed to simulate the screening of vehicles and cargo. Detailed models of the detection equipment, vehicles, cargo containers, cargos, and radioactive sources are being used to determine the optimal configuration of detectors. These models can also be used to support work to optimize alarming algorithms so that they maximize sensitivity for items of interest while minimizing nuisance alarms triggered by legitimate radioactive material in the commerce stream. Proposed next-generation equipment is also being modeled to quantify performance and capability improvements to detect potential nuclear threats. A discussion of the methodology used to perform computer modeling for RPMs will be provided. In addition, the efforts to validate models used to perform these scenario analyses will be described. Finally, areas where improved modeling capability is needed will be discussed as a guide to future development efforts

  13. Modelling Emission from Building Materials with Computational Fluid Dynamics

    DEFF Research Database (Denmark)

    Topp, Claus; Nielsen, Peter V.; Heiselberg, Per

    This paper presents a numerical model that by means of computational fluid dynamics (CFD) is capable of dealing with both pollutant transport across the boundary layer and internal diffusion in the source without prior knowledge of which is the limiting process. The model provides the concentration...

  14. Single-server blind quantum computation with quantum circuit model

    Science.gov (United States)

    Zhang, Xiaoqian; Weng, Jian; Li, Xiaochun; Luo, Weiqi; Tan, Xiaoqing; Song, Tingting

    2018-06-01

    Blind quantum computation (BQC) enables a client, who has few quantum technologies, to delegate her quantum computation to a server, who has strong quantum computational power but learns nothing about the client's quantum inputs, outputs and algorithms. In this article, we propose a single-server BQC protocol with the quantum circuit model by replacing any quantum gate with a combination of rotation operators. Trap quantum circuits are introduced, together with the combination of rotation operators, such that the server learns nothing about the quantum algorithms. The client only needs to perform operations X and Z, while the server honestly performs rotation operators.
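
    The protocol replaces quantum gates with combinations of rotation operators; as a small, standard illustration of that idea (a textbook decomposition of the Hadamard gate into rotations up to a global phase, not the paper's construction):

      # Numerical check that the Hadamard gate equals a product of rotation
      # operators up to a global phase: H = exp(i*pi/2) * Ry(pi/2) * Rz(pi).
      import numpy as np

      def Rz(t):
          return np.array([[np.exp(-1j * t / 2), 0], [0, np.exp(1j * t / 2)]])

      def Ry(t):
          return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                           [np.sin(t / 2),  np.cos(t / 2)]])

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
      reconstructed = np.exp(1j * np.pi / 2) * Ry(np.pi / 2) @ Rz(np.pi)
      print(np.allclose(H, reconstructed))   # True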

  15. A distributed multiscale computation of a tightly coupled model using the Multiscale Modeling Language

    NARCIS (Netherlands)

    Borgdorff, J.; Bona-Casas, C.; Mamonski, M.; Kurowski, K.; Piontek, T.; Bosak, B.; Rycerz, K.; Ciepiela, E.; Gubala, T.; Harezlak, D.; Bubak, M.; Lorenz, E.; Hoekstra, A.G.

    2012-01-01

    Nature is observed at all scales; with multiscale modeling, scientists bring together several scales for a holistic analysis of a phenomenon. The models on these different scales may require significant but also heterogeneous computational resources, creating the need for distributed multiscale

  16. The Effects of Computer-assisted and Distance Learning of Geometric Modeling

    Directory of Open Access Journals (Sweden)

    Omer Faruk Sozcu

    2013-01-01

    Full Text Available The effects of computer-assisted and distance learning of geometric modeling and computer aided geometric design are studied. It is shown that computer algebra systems and dynamic geometry environments can be considered excellent tools for teaching the mathematical concepts of these areas, and that distance education technologies are indispensable for consolidating topics that have already been covered successfully.

  17. How Computer Music Modeling and Retrieval Interface with Music-And-Meaning Studies

    DEFF Research Database (Denmark)

    Grund, Cynthia M.

    2007-01-01

    Inspired by the interest generated as the result of a panel discussion dealing with cross-disciplinarity and computer music modeling and retrieval (CMMR) at CMMR 2005 - "Play!" - in Pisa, a panel discussion on current issues of a cross-disciplinary character has been organized for ICMC07/CMMR 2007. Eight panelists will be dealing with two questions: a) What are current issues within music-and-meaning studies, the examination of which mandates development of new techniques within computer music modeling and retrieval? and b) Do current techniques within computer music modeling and retrieval give rise to new questions within music-and-meaning studies?

  18. Computational model of surface ablation from tokamak disruptions

    International Nuclear Information System (INIS)

    Ehst, D.; Hassanein, A.

    1993-10-01

    Energy transfer to material surfaces is dominated by photon radiation through low temperature plasma vapors if tokamak disruptions are due to low kinetic energy particles (<100 eV). Simple models of radiation transport are derived and incorporated into a fast-running computer routine to model this process. The results of simulations are in fair agreement with plasma gun erosion tests on several metal targets.

  19. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)

  20. A quasi-particle model for computational nuclei

    International Nuclear Information System (INIS)

    Boal, D.H.; Glosli, J.N.

    1988-03-01

    A model Hamiltonian is derived which provides a computationally efficient means of representing nuclei. The Hamiltonian includes both Coulomb and isospin dependent terms, and incorporates antisymmetrization effects through a momentum dependent potential. Unlike many other classical or semiclassical models, the nuclei of this simulation have a well-defined ground state with a non-vanishing mean square momentum ⟨p²⟩. It is shown that the binding energies per nucleon and r.m.s. radii of these ground states are close to the measured values over a wide mass range.

  1. Nested Incremental Modeling in the Development of Computational Theories: The CDP+ Model of Reading Aloud

    Science.gov (United States)

    Perry, Conrad; Ziegler, Johannes C.; Zorzi, Marco

    2007-01-01

    At least 3 different types of computational model have been shown to account for various facets of both normal and impaired single word reading: (a) the connectionist triangle model, (b) the dual-route cascaded model, and (c) the connectionist dual process model. Major strengths and weaknesses of these models are identified. In the spirit of…

  2. Design Of Computer Based Test Using The Unified Modeling Language

    Science.gov (United States)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis through the interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN) and the independent route (UM-Polbeng) has been conducted using paper-based tests (PBT). The paper-based test model has several weaknesses: it wastes paper, questions can leak to the public, and test results can be manipulated. This research aimed to create a Computer-Based Test (CBT) model using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, particular attention was paid to protecting the test questions before they are shown, through an encryption and decryption process based on the RSA cryptography algorithm. The questions drawn from the question banks were then randomized using the Fisher-Yates shuffle method. The network architecture of the Computer-Based Test application is a client-server model over a Local Area Network (LAN). The result of the design is a Computer-Based Test application for admission selection at Politeknik Negeri Bengkalis.
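
    The abstract mentions the Fisher-Yates shuffle for randomising question-bank items; a short reference implementation of that algorithm (independent of the application described) is:

      # Reference Fisher-Yates (Knuth) shuffle for randomising question-bank items.
      import random

      def fisher_yates(items):
          items = list(items)
          for i in range(len(items) - 1, 0, -1):
              j = random.randint(0, i)          # pick an index from the unshuffled portion 0..i
              items[i], items[j] = items[j], items[i]
          return items

      question_ids = list(range(1, 11))
      print(fisher_yates(question_ids))         # e.g. [4, 9, 1, ...]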

  3. Computational model of collagen turnover in carotid arteries during hypertension.

    Science.gov (United States)

    Sáez, P; Peña, E; Tarbell, J M; Martínez, M A

    2015-02-01

    It is well known that biological tissues adapt their properties because of different mechanical and chemical stimuli. The goal of this work is to study the collagen turnover in the arterial tissue of hypertensive patients through a coupled computational mechano-chemical model. Although collagen turnover has been widely studied experimentally, computational models taking a coupled mechano-chemical approach remain scarce. The present approach can be extended easily to study other aspects of bone remodeling or collagen degradation in heart diseases. The model can be divided into three different stages. First, we study the smooth muscle cell synthesis of different biological substances due to over-stretching during hypertension. Next, we study the mass transport of these substances along the arterial wall. The last step is to compute the turnover of collagen based on the amounts of these substances in the arterial wall, which interact with each other to modify the turnover rate of collagen. We simulate this process in a finite element model of a real human carotid artery. The final results show the well-known stiffening of the arterial wall due to the increase in the collagen content. Copyright © 2015 John Wiley & Sons, Ltd.

  4. Effects of Task Performance and Task Complexity on the Validity of Computational Models of Attention

    NARCIS (Netherlands)

    Koning, L. de; Maanen, P.P. van; Dongen, K. van

    2008-01-01

    Computational models of attention can be used as a component of decision support systems. For accurate support, a computational model of attention has to be valid and robust. The effects of task performance and task complexity on the validity of three different computational models of attention were

  5. Cardioplegia heat exchanger design modelling using computational fluid dynamics.

    Science.gov (United States)

    van Driel, M R

    2000-11-01

    A new cardioplegia heat exchanger has been developed by Sorin Biomedica. A three-dimensional computer-aided design (CAD) model was optimized using computational fluid dynamics (CFD) modelling. CFD optimization techniques have commonly been applied to velocity flow field analysis, but CFD analysis was also used in this study to predict the heat exchange performance of the design before prototype fabrication. The iterative results of the optimization and the actual heat exchange performance of the final configuration are presented in this paper. Based on the behaviour of this model, both the water and blood fluid flow paths of the heat exchanger were optimized. The simulation predicted superior heat exchange performance using an optimal amount of energy exchange surface area, reducing the total contact surface area, the device priming volume and the material costs. Experimental results confirm the performance predicted by the CFD analysis.

  6. Computer model for large-scale offshore wind-power systems

    Energy Technology Data Exchange (ETDEWEB)

    Dambolena, I G [Bucknell Univ., Lewisburg, PA; Rikkers, R F; Kaminsky, F C

    1977-01-01

    A computer-based planning model has been developed to evaluate the cost and simulate the performance of offshore wind-power systems. In these systems, the electricity produced by wind generators either directly satisfies demand or produces hydrogen by water electrolysis. The hydrogen is stored and later used to produce electricity in fuel cells. Using as inputs basic characteristics of the system and historical or computer-generated time series for wind speed and electricity demand, the model simulates system performance over time. A history of the energy produced and the discounted annual cost of the system are used to evaluate alternatives. The output also contains information which is useful in pointing towards more favorable design alternatives. Use of the model to analyze a specific wind-power system for New England indicates that electric energy could perhaps be generated at a competitive cost.
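
    A minimal sketch of the hourly energy balance such a planning model steps through is given below; the conversion efficiencies and the split between the electrolysis and fuel-cell paths are assumed values for illustration, not parameters from the original model.

    ```python
    # Toy hourly energy balance for a wind / hydrogen-storage system of the kind
    # described above. Efficiencies and time series are illustrative placeholders.
    ETA_ELECTROLYZER = 0.7   # assumed electrolysis efficiency
    ETA_FUEL_CELL = 0.5      # assumed fuel-cell efficiency

    def simulate(wind_power_kw, demand_kw, dt_h=1.0):
        """Step through paired time series of wind power and demand (kW)."""
        storage_kwh, unmet_kwh = 0.0, 0.0
        for wind, demand in zip(wind_power_kw, demand_kw):
            if wind >= demand:
                # surplus wind energy is converted to hydrogen and stored
                storage_kwh += (wind - demand) * dt_h * ETA_ELECTROLYZER
            else:
                # the shortfall is covered by fuel cells running on stored hydrogen
                needed_kwh = (demand - wind) * dt_h / ETA_FUEL_CELL
                drawn_kwh = min(needed_kwh, storage_kwh)
                storage_kwh -= drawn_kwh
                unmet_kwh += (needed_kwh - drawn_kwh) * ETA_FUEL_CELL
        return storage_kwh, unmet_kwh

    # Example with made-up hourly series (kW)
    wind = [900, 1200, 300, 0, 150, 1100]
    demand = [500, 500, 600, 700, 650, 500]
    print(simulate(wind, demand))
    ```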

  7. The contribution of high-performance computing and modelling for industrial development

    CSIR Research Space (South Africa)

    Sithole, Happy

    2017-10-01

    Full Text Available High-Performance Computing and Modelling for Industrial Development, Dr Happy Sithole and Dr Onno Ubbink. Strategic context: high-performance computing (HPC) combined with machine learning and artificial intelligence presents opportunities to non...

  8. Qudit quantum computation in the Jaynes-Cummings model

    DEFF Research Database (Denmark)

    Mischuck, Brian; Mølmer, Klaus

    2013-01-01

    We have developed methods for performing qudit quantum computation in the Jaynes-Cummings model with the qudits residing in a finite subspace of individual harmonic oscillator modes, resonantly coupled to a spin-1/2 system. The first method determines analytical control sequences for the one- and two-qudit gates necessary for universal quantum computation by breaking down the desired unitary transformations into a series of state preparations implemented with the Law-Eberly scheme [Law and Eberly, Phys. Rev. Lett. 76, 1055 (1996)]. The second method replaces some of the analytical pulse...

  9. Using Computational and Mechanical Models to Study Animal Locomotion

    OpenAIRE

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locom...

  10. Actors: A Model of Concurrent Computation in Distributed Systems.

    Science.gov (United States)

    1985-06-01

    Actors: A Model of Concurrent Computation in Distributed Systems. Gul A. Agha, MIT Artificial Intelligence Laboratory, Cambridge (report AD-A157 917). Support for the laboratory's artificial intelligence research is ... This document has been approved for public release and sale; its distribution is unlimited.

  11. Computational model of gamma irradiation room at ININ

    Science.gov (United States)

    Rodríguez-Romo, Suemi; Patlan-Cardoso, Fernando; Ibáñez-Orozco, Oscar; Vergara Martínez, Francisco Javier

    2018-03-01

    In this paper, we present a model of the gamma irradiation room at the National Institute of Nuclear Research (ININ is its acronym in Spanish) in Mexico to improve the use of physics in dosimetry for human protection. We deal with air-filled ionization chambers and scientific computing made in house and framed in both the GEANT4 scheme and our analytical approach to characterize the irradiation room. This room is the only secondary dosimetry facility in Mexico. Our aim is to optimize its experimental designs, facilities, and industrial applications of physical radiation. The computational results provided by our model are supported by all the known experimental data regarding the performance of the ININ gamma irradiation room and allow us to predict the values of the main variables related to this fully enclosed space to within an acceptable margin of error.

  12. The BaBar experiment's distributed computing model

    International Nuclear Information System (INIS)

    Boutigny, D.

    2001-01-01

    In order to face the expected increase in statistics between now and 2005, the BaBar experiment at SLAC is evolving its computing model toward a distributed multitier system. It is foreseen that data will be spread among Tier-A centers and deleted from the SLAC center. A uniform computing environment is being deployed in the centers, the network bandwidth is continuously increased and data distribution tools have been designed in order to reach a transfer rate of ∼100 TB of data per year. In parallel, smaller Tier-B and C sites receive subsets of data, presently in Kanga-ROOT format and later in Objectivity format. GRID tools will be used for remote job submission.

  13. The BaBar Experiment's Distributed Computing Model

    International Nuclear Information System (INIS)

    Gowdy, Stephen J.

    2002-01-01

    In order to face the expected increase in statistics between now and 2005, the BaBar experiment at SLAC is evolving its computing model toward a distributed multi-tier system. It is foreseen that data will be spread among Tier-A centers and deleted from the SLAC center. A uniform computing environment is being deployed in the centers, the network bandwidth is continuously increased and data distribution tools have been designed in order to reach a transfer rate of ∼100 TB of data per year. In parallel, smaller Tier-B and C sites receive subsets of data, presently in Kanga-ROOT[1] format and later in Objectivity[2] format. GRID tools will be used for remote job submission.

  14. Correlation Educational Model in Primary Education Curriculum of Mathematics and Computer Science

    Science.gov (United States)

    Macinko Kovac, Maja; Eret, Lidija

    2012-01-01

    This article gives insight into methodical correlation model of teaching mathematics and computer science. The model shows the way in which the related areas of computer science and mathematics can be supplemented, if it transforms the way of teaching and creates a "joint" lessons. Various didactic materials are designed, in which all…

  15. Developing a computationally efficient dynamic multilevel hybrid optimization scheme using multifidelity model interactions.

    Energy Technology Data Exchange (ETDEWEB)

    Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Castro, Joseph Pete Jr.; Giunta, Anthony Andrew

    2006-01-01

    Many engineering application problems use optimization algorithms in conjunction with numerical simulators to search for solutions. The formulation of relevant objective functions and constraints dictates possible optimization algorithms. Often, a gradient-based approach is not possible since objective functions and constraints can be nonlinear, nonconvex, non-differentiable, or even discontinuous and the simulations involved can be computationally expensive. Moreover, computational efficiency and accuracy are desirable and also influence the choice of solution method. With the advent and increasing availability of massively parallel computers, computational speed has increased tremendously. Unfortunately, the numerical and model complexities of many problems still demand significant computational resources. Moreover, in optimization, these expenses can be a limiting factor since obtaining solutions often requires the completion of numerous computationally intensive simulations. Therefore, we propose a multifidelity optimization algorithm (MFO) designed to improve the computational efficiency of an optimization method for a wide range of applications. In developing the MFO algorithm, we take advantage of the interactions between multifidelity models to develop a dynamic, computation-time-saving optimization algorithm. First, a direct search method is applied to the high-fidelity model over a reduced design space. In conjunction with this search, a specialized oracle is employed to map the design space of this high-fidelity model to that of a computationally cheaper low-fidelity model using space mapping techniques. Then, in the low-fidelity space, an optimum is obtained using gradient-based or non-gradient-based optimization, and it is mapped back to the high-fidelity space. In this paper, we describe the theory and implementation details of our MFO algorithm. We also demonstrate our MFO method on some example problems and on two applications: earth penetrators and
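
    The sketch below illustrates one simple flavour of the multifidelity idea: a cheap low-fidelity model is corrected to first order at the current iterate and then optimized in place of the expensive high-fidelity model. The test functions, the correction form and the bounded step are assumptions for illustration; the published MFO algorithm uses direct search and space-mapping oracles rather than this exact scheme.

    ```python
    # Schematic first-order corrected multifidelity loop (illustrative only,
    # not the MFO implementation described above).
    import numpy as np
    from scipy.optimize import minimize_scalar

    def high_fidelity(x):                 # expensive "truth" model (stand-in)
        return (x - 1.2) ** 2 + 0.1 * np.sin(5 * x)

    def low_fidelity(x):                  # cheap approximation of the same response
        return (x - 1.0) ** 2

    def fd_grad(f, x, h=1e-5):            # finite-difference derivative
        return (f(x + h) - f(x - h)) / (2 * h)

    x = 0.0
    for _ in range(10):
        d0 = high_fidelity(x) - low_fidelity(x)                    # value mismatch at x
        d1 = fd_grad(high_fidelity, x) - fd_grad(low_fidelity, x)  # slope mismatch at x
        corrected = lambda z: low_fidelity(z) + d0 + d1 * (z - x)  # corrected cheap model
        x_new = minimize_scalar(corrected, bounds=(x - 0.5, x + 0.5),
                                method="bounded").x                # trust-region-like step
        if abs(x_new - x) < 1e-6:
            break
        x = x_new
    print("approximate high-fidelity optimum:", x)
    ```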

  16. Improving science and mathematics education with computational modelling in interactive engagement environments

    Science.gov (United States)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  17. Development of computational small animal models and their applications in preclinical imaging and therapy research.

    Science.gov (United States)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, covering the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  18. Development of computational small animal models and their applications in preclinical imaging and therapy research

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Tianwu [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva 4 CH-1211 (Switzerland); Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva 4 CH-1211 (Switzerland); Geneva Neuroscience Center, Geneva University, Geneva CH-1205 (Switzerland); Department of Nuclear Medicine and Molecular Imaging, University of Groningen, University Medical Center Groningen, Groningen 9700 RB (Netherlands)

    2016-01-15

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, covering the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  19. Development of computational small animal models and their applications in preclinical imaging and therapy research

    International Nuclear Information System (INIS)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, covering the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  20. Mathematical and computational modeling with applications in natural and social sciences, engineering, and the arts

    CERN Document Server

    Melnik, Roderick

    2015-01-01

    Illustrates the application of mathematical and computational modeling in a variety of disciplines With an emphasis on the interdisciplinary nature of mathematical and computational modeling, Mathematical and Computational Modeling: With Applications in the Natural and Social Sciences, Engineering, and the Arts features chapters written by well-known, international experts in these fields and presents readers with a host of state-of-the-art achievements in the development of mathematical modeling and computational experiment methodology. The book is a valuable guide to the methods, ideas,

  1. Computational modeling of intraocular gas dynamics

    International Nuclear Information System (INIS)

    Noohi, P; Abdekhodaie, M J; Cheng, Y L

    2015-01-01

    The purpose of this study was to develop a computational model to simulate the dynamics of intraocular gas behavior in the pneumatic retinopexy (PR) procedure. The presented model predicted intraocular gas volume at any time and determined the tolerance angle within which a patient can maneuver while the gas still completely covers the tear(s). Computational fluid dynamics calculations were conducted to describe the PR procedure. The geometrical model was constructed based on the rabbit and human eye dimensions. Both pure SF_6 and SF_6 diluted with air were considered as the injected gas. The presented results indicated that the composition of the injected gas affected the gas absorption rate and gas volume. After injection of pure SF_6, the bubble expanded to 2.3 times its initial volume during the first 23 h, but when diluted SF_6 was used, no significant expansion was observed. Also, head positioning for the treatment of a retinal tear influenced the rate of gas absorption. Moreover, the determined tolerance angle depended on the bubble and tear size. Greater bubble expansion and a smaller retinal tear resulted in a larger tolerance angle. For example, after 23 h, for a tear size of 2 mm the tolerance angle when using pure SF_6 is 1.4 times larger than when using SF_6 diluted with 80% air. The composition of the injected gas and the conditions of the tear in PR may dramatically affect the gas absorption rate and gas volume. Quantifying these effects helps to predict the tolerance angle and improve treatment efficiency. (paper)

  2. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    Science.gov (United States)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  3. CSDMS2.0: Computational Infrastructure for Community Surface Dynamics Modeling

    Science.gov (United States)

    Syvitski, J. P.; Hutton, E.; Peckham, S. D.; Overeem, I.; Kettner, A.

    2012-12-01

    The Community Surface Dynamic Modeling System (CSDMS) is an NSF-supported, international and community-driven program that seeks to transform the science and practice of earth-surface dynamics modeling. CSDMS integrates a diverse community of more than 850 geoscientists representing 360 international institutions (academic, government, industry) from 60 countries and is supported by a CSDMS Interagency Committee (22 Federal agencies), and a CSDMS Industrial Consortia (18 companies). CSDMS presently distributes more than 200 Open Source models and modeling tools, access to high performance computing clusters in support of developing and running models, and a suite of products for education and knowledge transfer. CSDMS software architecture employs frameworks and services that convert stand-alone models into flexible "plug-and-play" components to be assembled into larger applications. CSDMS2.0 will support model applications within a web browser, on a wider variety of computational platforms, and on other high performance computing clusters to ensure robustness and sustainability of the framework. Conversion of stand-alone models into "plug-and-play" components will employ automated wrapping tools. Methods for quantifying model uncertainty are being adapted as part of the modeling framework. Benchmarking data is being incorporated into the CSDMS modeling framework to support model inter-comparison. Finally, a robust mechanism for ingesting and utilizing semantic mediation databases is being developed within the Modeling Framework. Six new community initiatives are being pursued: 1) an earth-ecosystem modeling initiative to capture ecosystem dynamics and ensuing interactions with landscapes, 2) a geodynamics initiative to investigate the interplay among climate, geomorphology, and tectonic processes, 3) an Anthropocene modeling initiative, to incorporate mechanistic models of human influences, 4) a coastal vulnerability modeling initiative, with emphasis on deltas and

  4. The European computer model for optronic system performance prediction (ECOMOS)

    Science.gov (United States)

    Keßler, Stefan; Bijl, Piet; Labarre, Luc; Repasi, Endre; Wittenstein, Wolfgang; Bürsing, Helge

    2017-10-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The project involves close co-operation of defence and security industry and public research institutes from France, Germany, Italy, The Netherlands and Sweden. ECOMOS uses and combines well-accepted existing European tools to build up a strong competitive position. This includes two TA models: the analytical TRM4 model and the image-based TOD model. In addition, it uses the atmosphere model MATISSE. In this paper, the central idea of ECOMOS is presented. The overall software structure and the underlying models are shown and elucidated. The status of the project development is given as well as a short discussion of validation tests and an outlook on the future potential of simulation for sensor assessment.

  5. Modelling the WWER-type reactor dynamics using a hybrid computer. Part 1

    International Nuclear Information System (INIS)

    Karpeta, C.

    Results of simulation studies into reactor and steam generator dynamics of a WWER type power plant are presented. Spatial kinetics of the reactor core is described by a nodal approximation to diffusion equations, xenon poisoning equations and heat transfer equations. The simulation of the reactor model dynamics was performed on a hybrid computer. Models of both a horizontal and a vertical steam generator were developed. The dynamics was investigated over a large range of power by computing the transients on a digital computer. (author)
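
    As a toy illustration of the xenon poisoning equations mentioned in this record (not the nodal WWER model itself), the iodine-135/xenon-135 balance can be integrated stepwise as follows; the flux and fission-rate values are placeholders.

    ```python
    # Toy stepwise integration of the I-135 / Xe-135 poisoning balance;
    # all numerical values are illustrative placeholders, not WWER data.
    LAM_I, LAM_X = 2.87e-5, 2.09e-5   # decay constants of I-135 and Xe-135 (1/s)
    GAM_I, GAM_X = 0.064, 0.002       # effective fission yields
    SIG_X = 2.6e-18                   # Xe-135 microscopic absorption cross section (cm^2)
    PHI = 3.0e13                      # neutron flux (n/cm^2/s), held constant here
    FISSION_RATE = 1.0e13             # Sigma_f * phi (fissions/cm^3/s), placeholder

    def step(iodine, xenon, dt):
        """One explicit Euler step of the poisoning balance."""
        dI = GAM_I * FISSION_RATE - LAM_I * iodine
        dX = GAM_X * FISSION_RATE + LAM_I * iodine - (LAM_X + SIG_X * PHI) * xenon
        return iodine + dt * dI, xenon + dt * dX

    iodine, xenon = 0.0, 0.0
    for _ in range(int(48 * 3600 / 60)):          # 48 hours with a 60 s time step
        iodine, xenon = step(iodine, xenon, 60.0)
    print(f"I-135: {iodine:.3e}  Xe-135: {xenon:.3e} atoms/cm^3")
    ```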

  6. Three-Dimensional Computer-Assisted Two-Layer Elastic Models of the Face.

    Science.gov (United States)

    Ueda, Koichi; Shigemura, Yuka; Otsuki, Yuki; Fuse, Asuka; Mitsuno, Daisuke

    2017-11-01

    To make three-dimensional computer-assisted elastic models for the face, we decided on five requirements: (1) an elastic texture like skin and subcutaneous tissue; (2) the ability to take pen marking for incisions; (3) the ability to be cut with a surgical knife; (4) the ability to keep stitches in place for a long time; and (5) a layered structure. After testing many elastic solvents, we have made realistic three-dimensional computer-assisted two-layer elastic models of the face and cleft lip from the computed tomographic and magnetic resonance imaging stereolithographic data. The surface layer is made of polyurethane and the inner layer is silicone. Using this elastic model, we taught residents and young doctors how to make several typical local flaps and to perform cheiloplasty. They could experience realistic simulated surgery and understand three-dimensional movement of the flaps.

  7. A stochastic large deformation model for computational anatomy

    DEFF Research Database (Denmark)

    Arnaudon, Alexis; Holm, Darryl D.; Pai, Akshay Sadananda Uppinakudru

    2017-01-01

    In the study of shapes of human organs using computational anatomy, variations are found to arise from inter-subject anatomical differences, disease-specific effects, and measurement noise. This paper introduces a stochastic model for incorporating random variations into the Large Deformation...

  8. Parameter Estimation for a Computable General Equilibrium Model

    DEFF Research Database (Denmark)

    Arndt, Channing; Robinson, Sherman; Tarp, Finn

    We introduce a maximum entropy approach to parameter estimation for computable general equilibrium (CGE) models. The approach applies information theory to estimating a system of nonlinear simultaneous equations. It has a number of advantages. First, it imposes all general equilibrium constraints...

  9. Dependence of Computational Models on Input Dimension: Tractability of Approximation and Optimization Tasks

    Czech Academy of Sciences Publication Activity Database

    Kainen, P.C.; Kůrková, Věra; Sanguineti, M.

    2012-01-01

    Roč. 58, č. 2 (2012), s. 1203-1214 ISSN 0018-9448 R&D Projects: GA MŠk(CZ) ME10023; GA ČR GA201/08/1744; GA ČR GAP202/11/1368 Grant - others:CNR-AV ČR(CZ-IT) Project 2010–2012 Complexity of Neural -Network and Kernel Computational Models Institutional research plan: CEZ:AV0Z10300504 Keywords : dictionary-based computational models * high-dimensional approximation and optimization * model complexity * polynomial upper bounds Subject RIV: IN - Informatics, Computer Science Impact factor: 2.621, year: 2012

  10. Computer modelling for ecosystem service assessment: Chapter 4.4

    Science.gov (United States)

    Dunford, Robert; Harrison, Paula; Bagstad, Kenneth J.

    2017-01-01

    Computer models are simplified representations of the environment that allow biophysical, ecological, and/or socio-economic characteristics to be quantified and explored. Modelling approaches differ from mapping approaches (Chapter 5) as (i) they are not necessarily spatial (although many models do produce spatial outputs); (ii) they focus on understanding and quantifying the interactions between different components of social and/or environmental systems; and (iii)

  11. A Computational Model for Observation in Quantum Mechanics.

    Science.gov (United States)

    1987-03-16

    Contents include the interferometer experiment, the EPR paradox experiment, an overview of the computational model, its implementation (with code for the EPR paradox and double-slit interferometer experiments), and conclusions. ...particle run counter to fact. The EPR paradox experiment (see section 2.3) is hard to resolve with this class of models, collectively called hidden...

  12. Let Documents Talk to Each Other: A Computer Model for Connection of Short Documents.

    Science.gov (United States)

    Chen, Z.

    1993-01-01

    Discusses the integration of scientific texts through the connection of documents and describes a computer model that can connect short documents. Information retrieval and artificial intelligence are discussed; a prototype system of the model is explained; and the model is compared to other computer models. (17 references) (LRW)

  13. Recent progress and modern challenges in applied mathematics, modeling and computational science

    CERN Document Server

    Makarov, Roman; Belair, Jacques

    2017-01-01

    This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science.  The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.

  14. Computational modeling of neural plasticity for self-organization of neural networks.

    Science.gov (United States)

    Chrol-Cannon, Joseph; Jin, Yaochu

    2014-11-01

    Self-organization in biological nervous systems during the lifetime is known to largely occur through a process of plasticity that is dependent upon the spike-timing activity in connected neurons. In the field of computational neuroscience, much effort has been dedicated to building up computational models of neural plasticity to replicate experimental data. Most recently, increasing attention has been paid to understanding the role of neural plasticity in functional and structural neural self-organization, as well as its influence on the learning performance of neural networks for accomplishing machine learning tasks such as classification and regression. Although many ideas and hypotheses have been suggested, the relationship between the structure, dynamics and learning performance of neural networks remains elusive. The purpose of this article is to review the most important computational models for neural plasticity and discuss various ideas about neural plasticity's role. Finally, we suggest a few promising research directions, in particular those along the line of combining findings in computational neuroscience and systems biology, and their synergetic roles in understanding learning, memory and cognition, thereby bridging the gap between computational neuroscience, systems biology and computational intelligence. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. Computational Modeling of Ultrafast Pulse Propagation in Nonlinear Optical Materials

    Science.gov (United States)

    Goorjian, Peter M.; Agrawal, Govind P.; Kwak, Dochan (Technical Monitor)

    1996-01-01

    There is an emerging technology of photonic (or optoelectronic) integrated circuits (PICs or OEICs). In PICs, optical and electronic components are grown together on the same chip. To build such devices and subsystems, one needs to model the entire chip. Accurate computer modeling of electromagnetic wave propagation in semiconductors is necessary for the successful development of PICs. More specifically, these computer codes would enable the modeling of such devices, including their subsystems, such as semiconductor lasers and semiconductor amplifiers in which there is femtosecond pulse propagation. Here, the computer simulations are made by solving the full vector, nonlinear, Maxwell's equations, coupled with the semiconductor Bloch equations, without any approximations. The carrier is retained in the description of the optical pulse (i.e. the envelope approximation is not made in the Maxwell's equations), and the rotating wave approximation is not made in the Bloch equations. These coupled equations are solved to simulate the propagation of femtosecond optical pulses in semiconductor materials. The simulations describe the dynamics of the optical pulses, as well as the interband and intraband dynamics.

  16. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The methods described are applicable to low-level radioactive waste disposal system performance assessment.
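
    A minimal stand-in for this kind of derivative-based uncertainty propagation is sketched below: the output variance is estimated to first order from parameter variances and derivatives. The model function and the parameter uncertainties are invented for the example, and finite differences merely stand in for the automated derivatives that systems like GRESS/ADGEN would provide.

    ```python
    # First-order (linear) propagation of independent parameter uncertainties
    # through a model using its derivatives; illustrative stand-in only.
    import numpy as np

    def model(p):
        """Hypothetical response; replace with the code's actual output function."""
        k, h, q = p
        return q * h / k

    def gradient(f, p, h=1e-6):
        """Central finite-difference gradient (stands in for automated derivatives)."""
        p = np.asarray(p, dtype=float)
        g = np.zeros_like(p)
        for i in range(p.size):
            dp = np.zeros_like(p)
            dp[i] = h
            g[i] = (f(p + dp) - f(p - dp)) / (2 * h)
        return g

    p_nominal = np.array([2.0, 5.0, 1.3])      # nominal parameter values (made up)
    p_sigma   = np.array([0.1, 0.5, 0.05])     # parameter standard deviations (made up)

    g = gradient(model, p_nominal)
    var_y = np.sum((g * p_sigma) ** 2)         # sigma_y^2 = sum_i (dy/dp_i * sigma_i)^2
    print(f"y = {model(p_nominal):.3f} +/- {np.sqrt(var_y):.3f} (1 sigma, first order)")
    ```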

  17. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  18. Computational modeling of pedunculopontine nucleus deep brain stimulation

    Science.gov (United States)

    Zitella, Laura M.; Mohsenian, Kevin; Pahwa, Mrinal; Gloeckner, Cory; Johnson, Matthew D.

    2013-08-01

    Objective. Deep brain stimulation (DBS) near the pedunculopontine nucleus (PPN) has been posited to improve medication-intractable gait and balance problems in patients with Parkinson's disease. However, clinical studies evaluating this DBS target have not demonstrated consistent therapeutic effects, with several studies reporting the emergence of paresthesia and oculomotor side effects. The spatial and pathway-specific extent to which brainstem regions are modulated during PPN-DBS is not well understood. Approach. Here, we describe two computational models that estimate the direct effects of DBS in the PPN region for human and translational non-human primate (NHP) studies. The three-dimensional models were constructed from segmented histological images from each species, multi-compartment neuron models and inhomogeneous finite element models of the voltage distribution in the brainstem during DBS. Main Results. The computational models predicted that: (1) the majority of PPN neurons are activated with -3 V monopolar cathodic stimulation; (2) surgical targeting errors of as little as 1 mm in both species decrement activation selectivity; (3) specifically, monopolar stimulation in caudal, medial, or anterior PPN activates a significant proportion of the superior cerebellar peduncle (up to 60% in the human model and 90% in the NHP model at -3 V); (4) monopolar stimulation in rostral, lateral or anterior PPN activates a large percentage of medial lemniscus fibers (up to 33% in the human model and 40% in the NHP model at -3 V); and (5) the current clinical cylindrical electrode design is suboptimal for isolating the modulatory effects to PPN neurons. Significance. We show that a DBS lead design with radially-segmented electrodes may yield improved functional outcome for PPN-DBS.

  19. Human Inspired Self-developmental Model of Neural Network (HIM): Introducing Content/Form Computing

    Science.gov (United States)

    Krajíček, Jiří

    This paper presents cross-disciplinary research that connects medical/psychological evidence on human abilities with the need in informatics to update current models in computer science so that they support alternative methods of computation and communication. In [10] we have already proposed a hypothesis introducing the concept of a human information model (HIM) as a cooperative system. Here we continue with the HIM design in detail. In our design, we first introduce the Content/Form computing system, which is a new principle relative to present methods in evolutionary computing (genetic algorithms, genetic programming). Then we apply this system to the HIM model (a type of artificial neural network) as a basic network self-development paradigm. The main inspiration for our natural/human design comes from the well-known concept of artificial neural networks, medical/psychological evidence and Sheldrake's theory of "Nature as Alive" [22].

  20. An agent-based computational model for tuberculosis spreading on age-structured populations

    Science.gov (United States)

    Graciani Rodrigues, C. C.; Espíndola, Aquino L.; Penna, T. J. P.

    2015-06-01

    In this work we present an agent-based computational model to study the spreading of tuberculosis (TB) in age-structured populations. The proposed model is a merger of two previous models: an agent-based computational model for the spreading of tuberculosis and a bit-string model for biological aging. The combination of TB with population aging reproduces the coexistence of health states seen in real populations. In addition, the universal exponential behavior of mortality curves is still preserved. Finally, the population distribution as a function of age shows that TB is most prevalent in the elderly when treatments are highly effective.
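
    The bit-string aging ingredient referred to above (the Penna model) can be sketched as follows; the genome length, mutation threshold, reproduction age and carrying capacity are illustrative values, and the TB dynamics are omitted.

    ```python
    # Minimal sketch of the Penna bit-string aging model; parameters are illustrative.
    import random

    GENOME_BITS = 32          # one bit per time step of life
    THRESHOLD = 3             # deleterious mutations tolerated before death
    REPRO_AGE = 8             # minimum reproduction age
    NMAX = 20000              # carrying capacity (Verhulst factor)

    def child_genome(parent_genome=0):
        """Copy the parent genome and add one new deleterious mutation."""
        return parent_genome | (1 << random.randrange(GENOME_BITS))

    population = [{"age": 0, "genome": child_genome()} for _ in range(1000)]

    for step in range(100):
        survival = max(0.0, 1.0 - len(population) / NMAX)     # crowding penalty
        newborn, survivors = [], []
        for agent in population:
            agent["age"] += 1
            mask = (1 << min(agent["age"], GENOME_BITS)) - 1
            expressed = bin(agent["genome"] & mask).count("1")  # mutations active so far
            if expressed < THRESHOLD and agent["age"] < GENOME_BITS and random.random() < survival:
                survivors.append(agent)
                if agent["age"] >= REPRO_AGE:
                    newborn.append({"age": 0, "genome": child_genome(agent["genome"])})
        population = survivors + newborn
    print("population size after 100 steps:", len(population))
    ```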

  1. A Reputation-Based Identity Management Model for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Lifa Wu

    2015-01-01

    Full Text Available In the field of cloud computing, most research on identity management has concentrated on protecting user data. However, users typically leave a trail when they access cloud services, and the resulting user traceability can potentially lead to the leakage of sensitive user information. Meanwhile, malicious users can do harm to cloud providers through the use of pseudonyms. To solve these problems, we introduce a reputation mechanism and design a reputation-based identity management model for cloud computing. In the model, pseudonyms are generated based on a reputation signature so as to guarantee the untraceability of pseudonyms, and a mechanism that calculates user reputation is proposed, which helps cloud service providers to identify malicious users. Analysis verifies that the model can ensure that users access cloud services anonymously and that cloud providers assess the credibility of users effectively without violating user privacy.

  2. Parameter Estimation for a Computable General Equilibrium Model

    DEFF Research Database (Denmark)

    Arndt, Channing; Robinson, Sherman; Tarp, Finn

    2002-01-01

    We introduce a maximum entropy approach to parameter estimation for computable general equilibrium (CGE) models. The approach applies information theory to estimating a system of non-linear simultaneous equations. It has a number of advantages. First, it imposes all general equilibrium constraints...

  3. Computer model for noise in the dc Squid

    International Nuclear Information System (INIS)

    Tesche, C.D.; Clarke, J.

    1976-08-01

    A computer model for the dc SQUID is described which predicts signal and noise as a function of various SQUID parameters. Differential equations for the voltage across the SQUID, including the Johnson noise in the shunted junctions, are integrated stepwise in time.
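
    A rough sketch of this kind of stepwise integration, for an overdamped (resistively shunted) dc SQUID with Johnson noise in dimensionless units, is given below; the parameter values and sign conventions are illustrative and the code is not taken from the cited report.

    ```python
    # Stepwise Langevin integration of an overdamped dc SQUID with Johnson noise.
    # Dimensionless units: time in Phi_0/(2*pi*I0*R), currents in I0, flux in Phi_0.
    # Parameters and conventions are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    beta_L = 1.0              # 2*L*I0/Phi_0, screening parameter (illustrative)
    Gamma = 0.05              # 2*pi*k_B*T/(I0*Phi_0), noise strength (illustrative)
    i_bias = 2.5              # bias current in units of I0
    phi_a = 0.25              # applied flux in units of Phi_0
    dt, nsteps = 0.01, 200_000

    d1 = d2 = 0.0             # junction phases
    sigma = np.sqrt(2.0 * Gamma / dt)        # std of the discretized Johnson noise
    v_sum = 0.0
    for _ in range(nsteps):
        j = (d2 - d1 - 2.0 * np.pi * phi_a) / (np.pi * beta_L)   # circulating current
        n1, n2 = sigma * rng.standard_normal(2)
        dd1 = (0.5 * i_bias + j - np.sin(d1) + n1) * dt
        dd2 = (0.5 * i_bias - j - np.sin(d2) + n2) * dt
        d1, d2 = d1 + dd1, d2 + dd2
        v_sum += 0.5 * (dd1 + dd2) / dt       # instantaneous voltage, units of I0*R
    print("mean voltage (I0*R units):", v_sum / nsteps)
    ```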

  4. Direct Monte Carlo dose calculation using polygon-surface computational human model

    International Nuclear Information System (INIS)

    Jeong, Jong Hwi; Kim, Chan Hyeong; Yeom, Yeon Su; Cho, Sungkoo; Chung, Min Suk; Cho, Kun-Woo

    2011-01-01

    In the present study, a voxel-type computational human model was converted to a polygon-surface model, after which it was imported directly to the Geant4 code without using a voxelization process, that is, without converting back to a voxel model. The original voxel model was also imported to the Geant4 code, in order to compare the calculated dose values and the computational speed. The average polygon size of the polygon-surface model was ∼0.5 cm², whereas the voxel resolution of the voxel model was 1.981 × 1.981 × 2.0854 mm³. The results showed a good agreement between the calculated dose values of the two models. The polygon-surface model was, however, slower than the voxel model by a factor of 6–9 for the photon energies and irradiation geometries considered in the present study, which nonetheless is considered acceptable, considering that direct use of the polygon-surface model does not require a separate voxelization process. (author)

  5. Computational modeling of geometry dependent phonon transport in silicon nanostructures

    Science.gov (United States)

    Cheney, Drew A.

    Recent experiments have demonstrated that thermal properties of semiconductor nanostructures depend on nanostructure boundary geometry. Phonons are quantized mechanical vibrations that are the dominant carrier of heat in semiconductor materials and their aggregate behavior determine a nanostructure's thermal performance. Phonon-geometry scattering processes as well as waveguiding effects which result from coherent phonon interference are responsible for the shape dependence of thermal transport in these systems. Nanoscale phonon-geometry interactions provide a mechanism by which nanostructure geometry may be used to create materials with targeted thermal properties. However, the ability to manipulate material thermal properties via controlling nanostructure geometry is contingent upon first obtaining increased theoretical understanding of fundamental geometry induced phonon scattering processes and having robust analytical and computational models capable of exploring the nanostructure design space, simulating the phonon scattering events, and linking the behavior of individual phonon modes to overall thermal behavior. The overall goal of this research is to predict and analyze the effect of nanostructure geometry on thermal transport. To this end, a harmonic lattice-dynamics based atomistic computational modeling tool was created to calculate phonon spectra and modal phonon transmission coefficients in geometrically irregular nanostructures. The computational tool is used to evaluate the accuracy and regimes of applicability of alternative computational techniques based upon continuum elastic wave theory. The model is also used to investigate phonon transmission and thermal conductance in diameter modulated silicon nanowires. Motivated by the complexity of the transmission results, a simplified model based upon long wavelength beam theory was derived and helps explain geometry induced phonon scattering of low frequency nanowire phonon modes.

  6. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…

  7. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    Full Text Available The aim of this study is to present an approach to the introduction of pipeline and parallel computing, using a model of a multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included in it. At the same time, the topic is among the most motivating ones due to its comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. It allows an educational platform for a constructivist learning process to be implemented, enabling learners to experiment with the provided programming models, acquire competences in modern scientific research and computational thinking, and capture the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The C programming language was chosen for developing the programming models, together with the message passing interface (MPI) and OpenMP parallelization tools, for the implementation.

  8. The extended RBAC model based on grid computing

    Institute of Scientific and Technical Information of China (English)

    CHEN Jian-gang; WANG Ru-chuan; WANG Hai-yan

    2006-01-01

    This article proposes an extended role-based access control (RBAC) model for solving dynamic and multidomain problems in grid computing, and a formal description of the model is provided. The introduction of context and the mapping relations of context-to-role and context-to-permission help the model adapt to the dynamic properties of the grid environment. The multidomain role inheritance relation, established through the authorization agent service, realizes multidomain authorization among autonomous domains. A function is proposed for resolving role inheritance conflicts during the establishment of the multidomain role inheritance relation.

  9. Modeling and simulation the computer science of illusion

    CERN Document Server

    Raczynski, Stanislaw

    2006-01-01

    Simulation is the art of using tools - physical or conceptual models, or computer hardware and software, to attempt to create the illusion of reality. The discipline has in recent years expanded to include the modelling of systems that rely on human factors and therefore possess a large proportion of uncertainty, such as social, economic or commercial systems. These new applications make the discipline of modelling and simulation a field of dynamic growth and new research. Stanislaw Raczynski outlines the considerable and promising research that is being conducted to counter the problems of

  10. A Computational Model of Cellular Engraftment on Lung Scaffolds

    Directory of Open Access Journals (Sweden)

    Joshua J. Pothen

    2016-10-01

    Full Text Available The possibility that stem cells might be used to regenerate tissue is now being investigated for a variety of organs, but these investigations are still essentially exploratory and have few predictive tools available to guide experimentation. We propose, in this study, that the field of lung tissue regeneration might be better served by predictive tools that treat stem cells as agents that obey certain rules of behavior governed by both their phenotype and their environment. Sufficient knowledge of these rules of behavior would then, in principle, allow lung tissue development to be simulated computationally. Toward this end, we developed a simple agent-based computational model to simulate geographic patterns of cells seeded onto a lung scaffold. Comparison of the simulated patterns to those observed experimentally supports the hypothesis that mesenchymal stem cells proliferate preferentially toward the scaffold boundary, whereas alveolar epithelial cells do not. This demonstrates that a computational model of this type has the potential to assist in the discovery of rules of cellular behavior

  11. Mesh influence on the fire computer modeling in nuclear power plants

    Directory of Open Access Journals (Sweden)

    D. Lázaro

    2018-04-01

    Full Text Available Fire computer models allow the consequences of real fire scenarios to be studied. Their use in nuclear power plants has increased with the new regulations requiring risk-informed, performance-based methods for the analysis and design of fire safety solutions. The selection of the cell size is very important in these kinds of models: the mesh must strike a compromise between the fit to the geometry, the resolution of the equations and the computation times. This paper studies the impact of several cell sizes, using the fire computer model FDS, to evaluate their relative effect on the final simulation results. To this end, we have employed several scenarios of interest for nuclear power plants. The conclusions offer relevant data for users and show cell sizes that can be selected to guarantee the quality of the simulations and reduce the uncertainty of the results.

  12. Task-and-role-based access-control model for computational grid

    Institute of Scientific and Technical Information of China (English)

    LONG Tao; HONG Fan; WU Chi; SUN Ling-li

    2007-01-01

    Access control in a grid environment is a challenging issue because the heterogeneous nature and independent administration of geographically dispersed resources in grid require access control to use fine-grained policies. We established a task-and-role-based access-control model for computational grid (CG-TRBAC model), integrating the concepts of role-based access control (RBAC) and task-based access control (TBAC). In this model, condition restrictions are defined and concepts specifically tailored to Workflow Management System are simplified or omitted so that role assignment and security administration fit computational grid better than traditional models; permissions are mutable with the task status and system variables, and can be dynamically controlled. The CG-TRBAC model is proved flexible and extendible. It can implement different control policies. It embodies the security principle of least privilege and executes active dynamic authorization. A task attribute can be extended to satisfy different requirements in a real grid system.

  13. Computer-Aided Transformation of PDE Models: Languages, Representations, and a Calculus of Operations

    Science.gov (United States)

    2016-01-05

    A domain-specific embedded language called ibvp was developed to model initial-boundary-value problems, supporting the computer-aided transformation of PDE models through languages, representations, and a calculus of operations.

  14. Traffic Flow Prediction Model for Large-Scale Road Network Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhaosheng Yang

    2014-01-01

    Full Text Available To increase the efficiency and precision of large-scale road network traffic flow prediction, a genetic algorithm-support vector machine (GA-SVM) model based on cloud computing is proposed in this paper, building on an analysis of the characteristics and shortcomings of the genetic algorithm and the support vector machine. In the cloud computing environment, the SVM parameters are first optimized by a parallel genetic algorithm, and the optimized parallel SVM model is then used to predict traffic flow. Using traffic flow data from Haizhu District in Guangzhou City, the proposed model was verified and compared with the serial GA-SVM model and a parallel GA-SVM model based on MPI (message passing interface). The results demonstrate that the parallel GA-SVM model based on cloud computing has higher prediction accuracy, shorter running time, and higher speedup.
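
    The coupling of a genetic algorithm to SVM hyperparameter selection can be sketched compactly. The fragment below uses scikit-learn's SVR with a small, serial genetic loop on a synthetic series; it only illustrates the GA-SVM idea, not the parallel cloud implementation of the paper, and the population size, mutation scale, window length and fitness definition are assumptions.

      import numpy as np
      from sklearn.svm import SVR
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(1)

      # synthetic "traffic flow" series standing in for real loop-detector data (assumption)
      t = np.arange(300, dtype=float)
      flow = 50 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 3, t.size)
      window = 4
      X = np.array([flow[i:i + window] for i in range(len(flow) - window)])
      y = flow[window:]
      X_train, X_val, y_train, y_val = X[:200], X[200:], y[:200], y[200:]

      def fitness(genome):
          """Fitness = negative validation MSE of an SVR whose (C, gamma) come from the genome."""
          C, gamma = np.exp(genome)                  # genome stores log-parameters
          model = SVR(C=C, gamma=gamma).fit(X_train, y_train)
          return -mean_squared_error(y_val, model.predict(X_val))

      pop = rng.normal(0.0, 2.0, size=(12, 2))       # 12 individuals, genes = [log C, log gamma]
      for generation in range(15):
          scores = np.array([fitness(g) for g in pop])
          parents = pop[np.argsort(scores)[-6:]]                         # keep the best half
          children = parents[rng.integers(0, 6, size=(6, 2)), [0, 1]]    # uniform crossover
          children += rng.normal(0.0, 0.3, size=children.shape)          # mutation
          pop = np.vstack([parents, children])

      best = pop[np.argmax([fitness(g) for g in pop])]
      print("selected C and gamma:", np.exp(best))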

  15. Computational modelling of memory retention from synapse to behaviour

    Science.gov (United States)

    van Rossum, Mark C. W.; Shippi, Maria

    2013-03-01

    One of our most intriguing mental abilities is the capacity to store information and recall it from memory. Computational neuroscience has been influential in developing models and concepts of learning and memory. In this tutorial review we focus on the interplay between learning and forgetting. We discuss recent advances in the computational description of the learning and forgetting processes on synaptic, neuronal, and systems levels, as well as recent data that open up new challenges for statistical physicists.

  16. Computational modelling of memory retention from synapse to behaviour

    International Nuclear Information System (INIS)

    Van Rossum, Mark C W; Shippi, Maria

    2013-01-01

    One of our most intriguing mental abilities is the capacity to store information and recall it from memory. Computational neuroscience has been influential in developing models and concepts of learning and memory. In this tutorial review we focus on the interplay between learning and forgetting. We discuss recent advances in the computational description of the learning and forgetting processes on synaptic, neuronal, and systems levels, as well as recent data that open up new challenges for statistical physicists. (paper)

  17. Computational modeling of turn-taking dynamics in spoken conversations

    OpenAIRE

    Chowdhury, Shammur Absar

    2017-01-01

    The study of human interaction dynamics has been at the center of multiple research disciplines, including computer and social sciences, conversational analysis and psychology, for decades. Recent interest has focused on designing computational models to improve human-machine interaction systems as well as to support humans in their decision-making processes. Turn-taking is one of the key aspects of conversational dynamics in dyadic conversations and is an integral part of hu...

  18. A New Computationally Frugal Method For Sensitivity Analysis Of Environmental Models

    Science.gov (United States)

    Rakovec, O.; Hill, M. C.; Clark, M. P.; Weerts, A.; Teuling, R.; Borgonovo, E.; Uijlenhoet, R.

    2013-12-01

    Effective and efficient parameter sensitivity analysis methods are crucial to understand the behaviour of complex environmental models and use of models in risk assessment. This paper proposes a new computationally frugal method for analyzing parameter sensitivity: the Distributed Evaluation of Local Sensitivity Analysis (DELSA). The DELSA method can be considered a hybrid of local and global methods, and focuses explicitly on multiscale evaluation of parameter sensitivity across the parameter space. Results of the DELSA method are compared with the popular global, variance-based Sobol' method and the delta method. We assess the parameter sensitivity of both (1) a simple non-linear reservoir model with only two parameters, and (2) five different "bucket-style" hydrologic models applied to a medium-sized catchment (200 km2) in the Belgian Ardennes. Results show that in both the synthetic and real-world examples, the global Sobol' method and the DELSA method provide similar sensitivities, with the DELSA method providing more detailed insight at much lower computational cost. The ability to understand how sensitivity measures vary through parameter space with modest computational requirements provides exciting new opportunities.
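
    The core DELSA-style idea, local derivative-based indices evaluated at many points across the parameter space, can be illustrated with a finite-difference sketch. This is not the DELSA code; the toy two-parameter "reservoir" model, the uniform parameter bounds and the sample size are assumptions made for the example.

      import numpy as np

      rng = np.random.default_rng(2)

      def reservoir(params):
          """Toy non-linear two-parameter model standing in for a hydrologic model (assumption)."""
          k, s_max = params
          return s_max * (1.0 - np.exp(-k))            # single scalar output for simplicity

      bounds = np.array([[0.1, 2.0],                    # k
                         [10.0, 200.0]])                # s_max

      def local_first_order_indices(params, rel_step=1e-3):
          """Finite-difference local derivatives, scaled by the prior variance of each
          parameter and normalised to sum to one at this sample point (DELSA-style)."""
          base = reservoir(params)
          var = (bounds[:, 1] - bounds[:, 0]) ** 2 / 12.0     # variance of a uniform prior
          contrib = np.empty(len(params))
          for j in range(len(params)):
              h = rel_step * (bounds[j, 1] - bounds[j, 0])
              perturbed = params.copy()
              perturbed[j] += h
              contrib[j] = ((reservoir(perturbed) - base) / h) ** 2 * var[j]
          return contrib / contrib.sum()

      # evaluate the local indices at many points of the parameter space
      samples = rng.uniform(bounds[:, 0], bounds[:, 1], size=(500, 2))
      indices = np.array([local_first_order_indices(p) for p in samples])
      print("median first-order index of k:    ", np.median(indices[:, 0]))
      print("median first-order index of s_max:", np.median(indices[:, 1]))

    Plotting the distribution of these per-point indices, rather than a single global number, is what gives the multiscale view of sensitivity described above.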

  19. Computer-based modeling in extract sciences research -I ...

    African Journals Online (AJOL)

    Specifically, in the discipline of chemistry, it has been of great utility. Its use dates back to the 17th Century and includes such wide areas as computational chemistry, chemoinformatics, molecular mechanics, chemical dynamics, molecular dynamics, molecular graphics and algorithms. Modeling has been employed ...

  20. Computational modeling of a forward lunge

    DEFF Research Database (Denmark)

    Alkjær, Tine; Wieland, Maja Rose; Andersen, Michael Skipper

    2012-01-01

    during forward lunging. Thus, the purpose of the present study was to establish a musculoskeletal model of the forward lunge to computationally investigate the complete mechanical force equilibrium of the tibia during the movement to examine the loading pattern of the cruciate ligaments. A healthy female...... was selected from a group of healthy subjects who all performed a forward lunge on a force platform, targeting a knee flexion angle of 90°. Skin-markers were placed on anatomical landmarks on the subject and the movement was recorded by five video cameras. The three-dimensional kinematic data describing...... the forward lunge movement were extracted and used to develop a biomechanical model of the lunge movement. The model comprised two legs including femur, crus, rigid foot segments and the pelvis. Each leg had 35 independent muscle units, which were recruited according to a minimum fatigue criterion...

  1. Investigations of incorporating source directivity into room acoustics computer models to improve auralizations

    Science.gov (United States)

    Vigeant, Michelle C.

    Room acoustics computer modeling and auralizations are useful tools when designing or modifying acoustically sensitive spaces. In this dissertation, the input parameter of source directivity has been studied in great detail to determine first its effect in room acoustics computer models and secondly how to better incorporate the directional source characteristics into these models to improve auralizations. To increase the accuracy of room acoustics computer models, the source directivity of real sources, such as musical instruments, must be included in the models. The traditional method for incorporating source directivity into room acoustics computer models involves inputting the measured static directivity data taken every 10° in a sphere-shaped pattern around the source. This data can be entered into the room acoustics software to create a directivity balloon, which is used in the ray tracing algorithm to simulate the room impulse response. The first study in this dissertation shows that using directional sources over an omni-directional source in room acoustics computer models produces significant differences both in terms of calculated room acoustics parameters and auralizations. The room acoustics computer model was also validated in terms of accurately incorporating the input source directivity. A recently proposed technique for creating auralizations using a multi-channel source representation has been investigated with numerous subjective studies, applied to both solo instruments and an orchestra. The method of multi-channel auralizations involves obtaining multi-channel anechoic recordings of short melodies from various instruments and creating individual channel auralizations. These auralizations are then combined to create a total multi-channel auralization. Through many subjective studies, this process was shown to be effective in terms of improving the realism and source width of the auralizations in a number of cases, and also modeling different

  2. Computational models of music perception and cognition II: Domain-specific music processing

    Science.gov (United States)

    Purwins, Hendrik; Grachten, Maarten; Herrera, Perfecto; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    In Part I [Purwins H, Herrera P, Grachten M, Hazan A, Marxer R, Serra X. Computational models of music perception and cognition I: The perceptual and cognitive processing chain. Physics of Life Reviews 2008, in press, doi:10.1016/j.plrev.2008.03.004], we addressed the study of cognitive processes that underlie auditory perception of music, and their neural correlates. The aim of the present paper is to summarize empirical findings from music cognition research that are relevant to three prominent music theoretic domains: rhythm, melody, and tonality. Attention is paid to how cognitive processes like category formation, stimulus grouping, and expectation can account for the music theoretic key concepts in these domains, such as beat, meter, voice, consonance. We give an overview of computational models that have been proposed in the literature for a variety of music processing tasks related to rhythm, melody, and tonality. Although the present state-of-the-art in computational modeling of music cognition definitely provides valuable resources for testing specific hypotheses and theories, we observe the need for models that integrate the various aspects of music perception and cognition into a single framework. Such models should be able to account for aspects that until now have only rarely been addressed in computational models of music cognition, like the active nature of perception and the development of cognitive capacities from infancy to adulthood.

  3. Model Reduction of Computational Aerothermodynamics for Multi-Discipline Analysis in High Speed Flows

    Science.gov (United States)

    Crowell, Andrew Rippetoe

    This dissertation describes model reduction techniques for the computation of aerodynamic heat flux and pressure loads for multi-disciplinary analysis of hypersonic vehicles. NASA and the Department of Defense have expressed renewed interest in the development of responsive, reusable hypersonic cruise vehicles capable of sustained high-speed flight and access to space. However, an extensive set of technical challenges have obstructed the development of such vehicles. These technical challenges are partially due to both the inability to accurately test scaled vehicles in wind tunnels and to the time intensive nature of high-fidelity computational modeling, particularly for the fluid using Computational Fluid Dynamics (CFD). The aim of this dissertation is to develop efficient and accurate models for the aerodynamic heat flux and pressure loads to replace the need for computationally expensive, high-fidelity CFD during coupled analysis. Furthermore, aerodynamic heating and pressure loads are systematically evaluated for a number of different operating conditions, including: simple two-dimensional flow over flat surfaces up to three-dimensional flows over deformed surfaces with shock-shock interaction and shock-boundary layer interaction. An additional focus of this dissertation is on the implementation and computation of results using the developed aerodynamic heating and pressure models in complex fluid-thermal-structural simulations. Model reduction is achieved using a two-pronged approach. One prong focuses on developing analytical corrections to isothermal, steady-state CFD flow solutions in order to capture flow effects associated with transient spatially-varying surface temperatures and surface pressures (e.g., surface deformation, surface vibration, shock impingements, etc.). The second prong is focused on minimizing the computational expense of computing the steady-state CFD solutions by developing an efficient surrogate CFD model. The developed two

  4. A conceptual and computational model of moral decision making in human and artificial agents.

    Science.gov (United States)

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

    Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we

  5. Materials and nanosystems : interdisciplinary computational modeling at multiple scales

    International Nuclear Information System (INIS)

    Huber, S.E.

    2014-01-01

    Over the last five decades, computer simulation and numerical modeling have become valuable tools complementing the traditional pillars of science, experiment and theory. In this thesis, several applications of computer-based simulation and modeling shall be explored in order to address problems and open issues in chemical and molecular physics. Attention shall be paid especially to the different degrees of interrelatedness and multiscale-flavor, which may - at least to some extent - be regarded as inherent properties of computational chemistry. In order to do so, a variety of computational methods are used to study features of molecular systems which are of relevance in various branches of science and which correspond to different spatial and/or temporal scales. Proceeding from small to large measures, first, an application in astrochemistry, the investigation of spectroscopic and energetic aspects of carbonic acid isomers shall be discussed. In this respect, very accurate and hence at the same time computationally very demanding electronic structure methods like the coupled-cluster approach are employed. These studies are followed by the discussion of an application in the scope of plasma-wall interaction which is related to nuclear fusion research. There, the interactions of atoms and molecules with graphite surfaces are explored using density functional theory methods. The latter are computationally cheaper than coupled-cluster methods and thus allow the treatment of larger molecular systems, but yield less accuracy and especially reduced error control at the same time. The subsequently presented exploration of surface defects at low-index polar zinc oxide surfaces, which are of interest in materials science and surface science, is another surface science application. The necessity to treat even larger systems of several hundreds of atoms requires the use of approximate density functional theory methods. Thin gold nanowires consisting of several thousands of

  6. Conversion of IVA Human Computer Model to EVA Use and Evaluation and Comparison of the Result to Existing EVA Models

    Science.gov (United States)

    Hamilton, George S.; Williams, Jermaine C.

    1998-01-01

    This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e. g. metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken. The HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry such as chemical or fire protective clothing. In summary, the approach provides a moderate fidelity, usable tool which will run on current notebook computers.

  7. Revision history aware repositories of computational models of biological systems.

    Science.gov (United States)

    Miller, Andrew K; Yu, Tommy; Britten, Randall; Cooling, Mike T; Lawson, James; Cowan, Dougal; Garny, Alan; Halstead, Matt D B; Hunter, Peter J; Nickerson, David P; Nunns, Geo; Wimalaratne, Sarala M; Nielsen, Poul M F

    2011-01-14

    Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. We have extended the Physiome Model Repository software to be fully revision history aware

  8. Revision history aware repositories of computational models of biological systems

    Directory of Open Access Journals (Sweden)

    Nickerson David P

    2011-01-01

    Full Text Available Abstract Background Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. Results We have extended the Physiome Model

  9. Computational model for turbulent flow around a grid spacer with mixing vane

    International Nuclear Information System (INIS)

    Tsutomu Ikeno; Takeo Kajishima

    2005-01-01

    The turbulent mixing coefficient and the pressure drop are important factors in subchannel analysis for predicting the onset of DNB. However, universal correlations are difficult to obtain because these factors are significantly affected by the geometry of the subchannel and of the grid spacer with its mixing vane. We therefore propose a computational model to estimate these factors. Computational model: to represent the effect of the grid spacer geometry, we applied a large eddy simulation (LES) technique coupled with an improved immersed-boundary method. In our previous work (Ikeno et al., NURETH-10), detailed properties of turbulence in the subchannel were successfully investigated by developing the immersed-boundary method within LES. In this study, additional improvements are made: a new one-equation dynamic sub-grid scale (SGS) model is introduced to account for the complex geometry without any artificial modification, and higher-order accuracy is maintained by a consistent treatment of the boundary conditions for velocity and pressure. Numerical test and discussion: the turbulent mixing coefficient and the pressure drop are strongly affected by the arrangement and inclination of the mixing vane. Computations are therefore carried out for both convolute and periodic arrangements, and for inclinations of 30 and 20 degrees. The differences in turbulent mixing coefficient due to these factors are reasonably predicted by our method. (An example of this numerical test is shown in Fig. 1.) The turbulent flow in this problem includes unsteady separation behind the mixing vane and vortex shedding downstream, and an anisotropic distribution of turbulent stress also appears in the rod gap. Our computational model is therefore well suited to assessing the influence of the arrangement and inclination of the mixing vane. With a coarser computational mesh, one can screen several candidate spacer designs; with a finer mesh, a more quantitative analysis is possible. With such a scheme, we believe this method is useful

  10. Converting differential-equation models of biological systems to membrane computing.

    Science.gov (United States)

    Muniyandi, Ravie Chandren; Zin, Abdullah Mohd; Sanders, J W

    2013-12-01

    This paper presents a method to convert the deterministic, continuous representation of a biological system by ordinary differential equations into a non-deterministic, discrete membrane computation. The dynamics of the membrane computation are governed by rewrite rules operating at certain rates. This has the advantage of applying accurately to small systems, and of expressing rates of change that are determined locally, by region, but not necessarily globally. Such spatial information augments the standard differential-equation approach to provide a more realistic model. A biological case study of the ligand-receptor network of the protein TGF-β is used to validate the effectiveness of the conversion method. It demonstrates the sense in which the behaviours and properties of the system are better preserved in the membrane computing model, suggesting that the proposed conversion method may prove useful for biological systems in particular. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
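
    The flavour of such a conversion, a continuous rate law re-expressed as discrete rules applied at rates, can be shown with a Gillespie-style simulation of a single reversible binding reaction. This is only an analogy to the membrane computing conversion described above, not the authors' method; the species names, rate constants and initial counts are invented.

      import numpy as np

      rng = np.random.default_rng(3)

      # ODE view:           d[LR]/dt = k_on*[L][R] - k_off*[LR]
      # rewrite-rule view:  L + R -> LR  at rate k_on,   LR -> L + R  at rate k_off,
      # applied to discrete molecule counts inside one compartment (assumption)
      k_on, k_off = 0.001, 0.1
      state = {"L": 300, "R": 200, "LR": 0}
      rules = [
          ({"L": -1, "R": -1, "LR": +1}, lambda s: k_on * s["L"] * s["R"]),
          ({"L": +1, "R": +1, "LR": -1}, lambda s: k_off * s["LR"]),
      ]

      t, t_end = 0.0, 50.0
      while t < t_end:
          propensities = np.array([rate(state) for _, rate in rules])
          total = propensities.sum()
          if total == 0.0:
              break
          t += rng.exponential(1.0 / total)                       # time to the next rule firing
          update, _ = rules[rng.choice(len(rules), p=propensities / total)]
          for species, delta in update.items():
              state[species] += delta

      print("discrete counts at t =", round(t, 1), ":", state)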

  11. Customized computer models of eyes with intraocular lenses.

    Science.gov (United States)

    Rosales, P; Marcos, S

    2007-03-05

    We compared experimental wave aberrations in pseudophakic eyes with aspheric intraocular lenses (IOLs) to simulate aberrations from numerical ray tracing on customized computer eye models using corneal topography, angle lambda, ocular biometry, IOL geometry, and IOL tilt and decentration measured on the same eyes. We found high correlations between real and simulated aberrations even for the eye with only the cornea, and these increased on average when the IOL geometry and position were included. Relevant individual aberrations were well predicted by the complete eye model. Corneal spherical aberration and horizontal coma were compensated by the IOL, and in 58.3% of the cases IOL tilt and decentration contributed to compensation of horizontal coma. We conclude that customized computer eye models are a good representation of real eyes with IOLs and allow understanding of the relative contribution of optical, geometrical and surgically-related factors to image quality. Corneal spherical aberration is reduced by aspheric IOLs, although other corneal high order aberrations are still a major contributor to total aberrations in pseudophakic eyes. Tilt and decentration of the IOLs represent a relatively minor contribution of the overall optical quality of the eye.

  12. Quantitative, steady-state properties of Catania's computational model of the operant reserve.

    Science.gov (United States)

    Berg, John P; McDowell, J J

    2011-05-01

    Catania (2005) found that a computational model of the operant reserve (Skinner, 1938) produced realistic behavior in initial, exploratory analyses. Although Catania's operant reserve computational model demonstrated potential to simulate varied behavioral phenomena, the model was not systematically tested. The current project replicated and extended the Catania model, clarified its capabilities through systematic testing, and determined the extent to which it produces behavior corresponding to matching theory. Significant departures from both classic and modern matching theory were found in behavior generated by the model across all conditions. The results suggest that a simple, dynamic operant model of the reflex reserve does not simulate realistic steady state behavior. Copyright © 2011 Elsevier B.V. All rights reserved.
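
    The reserve idea itself is easy to render as a small stochastic simulation: the probability of emitting a response tracks the current reserve, each response depletes it, and each reinforcer replenishes it. The sketch below is a generic operant-reserve-style loop, not Catania's (2005) implementation or the replication tested here; the increment, decrement and schedule values are invented.

      import numpy as np

      rng = np.random.default_rng(7)

      reserve = 0.5            # reserve starts half full (assumption)
      depletion = 0.01         # cost of each emitted response (assumption)
      replenishment = 0.2      # added to the reserve by each reinforcer (assumption)
      vi_rate = 1 / 30         # probability per tick that the schedule arms a reinforcer

      responses = reinforcers = 0
      armed = False
      for tick in range(100_000):
          if not armed and rng.random() < vi_rate:
              armed = True                         # a reinforcer becomes available
          if rng.random() < reserve:               # response probability follows the reserve
              responses += 1
              reserve = max(0.0, reserve - depletion)
              if armed:
                  reinforcers += 1
                  reserve = min(1.0, reserve + replenishment)
                  armed = False

      print("responses:", responses, "reinforcers:", reinforcers)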

  13. Computer-aided and predictive models for design of controlled release of pesticides

    DEFF Research Database (Denmark)

    Suné, Nuria Muro; Gani, Rafiqul

    2004-01-01

    In the field of pesticide controlled-release technology, a computer-based model that can predict the delivery of the Active Ingredient (AI) from fabricated units is important for purposes of product design and marketing. A model for the release of an AI from a microcapsule device is presented...... in this paper, together with a specific case study application to highlight its scope and significance. The paper also addresses the need for predictive models and proposes a computer-aided modelling framework for achieving it through the development and introduction of reliable and predictive constitutive...... models. A group-contribution based model for one of the constitutive variables (AI solubility in polymers) is presented together with examples of application and validation....

  14. Development of Computational Models for Pyrochemical Electrorefiners of Nuclear Waste Transmutation Systems

    International Nuclear Information System (INIS)

    Kim, K. R.; Lee, H. S.; Hwang, I. S.

    2010-12-01

    The objective of this project is to develop multi-dimensional computational models in order to improve the operation of the uranium electrorefiners currently used in pyroprocessing technology. These 2-D (US) and 3-D (ROK) mathematical models are based on the fundamental physical and chemical properties of the electrorefiner processes. The models, validated against compiled and evaluated experimental data, could provide better information for developing advanced electrorefiners for uranium recovery. The research results in this period are as follows: - Successfully assessed a common computational platform for the modeling work and identified spatial characterization requirements. - Successfully developed a 3-D electro-fluid-dynamic electrorefiner model. - Successfully validated and benchmarked the two multi-dimensional models against compiled experimental data sets

  15. Computer Models in Biomechanics From Nano to Macro

    CERN Document Server

    Kuhl, Ellen

    2013-01-01

    This book contains a collection of papers that were presented at the IUTAM Symposium on “Computer Models in Biomechanics: From Nano to Macro” held at Stanford University, California, USA, from August 29 to September 2, 2011. It contains state-of-the-art papers on: - Protein and Cell Mechanics: coarse-grained model for unfolded proteins, collagen-proteoglycan structural interactions in the cornea, simulations of cell behavior on substrates - Muscle Mechanics: modeling approaches for Ca2+–regulated smooth muscle contraction, smooth muscle modeling using continuum thermodynamical frameworks, cross-bridge model describing the mechanoenergetics of actomyosin interaction, multiscale skeletal muscle modeling - Cardiovascular Mechanics: multiscale modeling of arterial adaptations by incorporating molecular mechanisms, cardiovascular tissue damage, dissection properties of aortic aneurysms, intracranial aneurysms, electromechanics of the heart, hemodynamic alterations associated with arterial remodeling followin...

  16. Automatic procedure for realistic 3D finite element modelling of human brain for bioelectromagnetic computations

    International Nuclear Information System (INIS)

    Aristovich, K Y; Khan, S H

    2010-01-01

    Realistic computer modelling of biological objects requires building very accurate and realistic computer models based on geometric and material data, and on the type and accuracy of the numerical analyses. This paper presents some of the automatic tools and algorithms that were used to build an accurate and realistic 3D finite element (FE) model of the whole brain. These models were used to solve the forward problem in magnetic field tomography (MFT) based on magnetoencephalography (MEG). The forward problem involves modelling and computing the magnetic fields produced by the human brain during cognitive processing. The geometric parameters of the model were obtained from accurate Magnetic Resonance Imaging (MRI) data, and the material properties from Diffusion Tensor MRI (DTMRI). The 3D FE models of the brain built using this approach have been shown to be very accurate in terms of both geometric and material properties. The model is stored on the computer in Computer-Aided Parametrical Design (CAD) format, which allows it to be used in a wide range of methods of analysis, such as the finite element method (FEM), the Boundary Element Method (BEM), Monte Carlo simulations, etc. The generic model-building approach presented here could be used for accurate and realistic modelling of the human brain and many other biological objects.

  17. DISTING: A web application for fast algorithmic computation of alternative indistinguishable linear compartmental models.

    Science.gov (United States)

    Davidson, Natalie R; Godfrey, Keith R; Alquaddoomi, Faisal; Nola, David; DiStefano, Joseph J

    2017-05-01

    We describe and illustrate use of DISTING, a novel web application for computing alternative structurally identifiable linear compartmental models that are input-output indistinguishable from a postulated linear compartmental model. Several computer packages are available for analysing the structural identifiability of such models, but DISTING is the first to be made available for assessing indistinguishability. The computational algorithms embedded in DISTING are based on advanced versions of established geometric and algebraic properties of linear compartmental models, embedded in a user-friendly graphic model user interface. Novel computational tools greatly speed up the overall procedure. These include algorithms for Jacobian matrix reduction, submatrix rank reduction, and parallelization of candidate rank computations in symbolic matrix analysis. The application of DISTING to three postulated models with respectively two, three and four compartments is given. The 2-compartment example is used to illustrate the indistinguishability problem; the original (unidentifiable) model is found to have two structurally identifiable models that are indistinguishable from it. The 3-compartment example has three structurally identifiable indistinguishable models. It is found from DISTING that the four-compartment example has five structurally identifiable models indistinguishable from the original postulated model. This example shows that care is needed when dealing with models that have two or more compartments which are neither perturbed nor observed, because the numbering of these compartments may be arbitrary. DISTING is universally and freely available via the Internet. It is easy to use and circumvents tedious and complicated algebraic analysis previously done by hand. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with 'direct' and 'adjoint' sensitivity theory to the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
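
    The notion of "computer calculus", propagating derivatives through a program alongside its values, can be illustrated with forward-mode automatic differentiation using dual numbers. The Python fragment below is not GRESS (which instruments FORTRAN at compile time); it simply shows how the derivatives needed for sensitivity equations can be generated mechanically rather than by hand.

      import math
      from dataclasses import dataclass

      @dataclass
      class Dual:
          """A value carried together with its derivative with respect to one chosen input."""
          val: float
          der: float = 0.0

          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.val + other.val, self.der + other.der)

          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.val * other.val,
                          self.der * other.val + self.val * other.der)

          __radd__, __rmul__ = __add__, __mul__

      def dexp(x: Dual) -> Dual:
          return Dual(math.exp(x.val), math.exp(x.val) * x.der)

      def response(k, t):
          """Toy model response whose sensitivity to k is wanted: y = 1 - exp(-k t) + 0.1 k."""
          return 1 + (-1) * dexp((-t) * k) + 0.1 * k

      k = Dual(2.0, 1.0)           # seed derivative: differentiate with respect to k
      y = response(k, 0.5)
      print("y     =", y.val)
      print("dy/dk =", y.der)      # equals t*exp(-k t) + 0.1 analytically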

  19. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  20. A Cloud Theory-Based Trust Computing Model in Social Networks

    Directory of Open Access Journals (Sweden)

    Fengming Liu

    2016-12-01

    Full Text Available How to develop a trust management model and then efficiently control and manage nodes is an important issue in social network security. In this paper, a trust management model based on a cloud model is proposed. The cloud model uses a specific computation operator to transform qualitative concepts into quantitative computation, and it can also effectively express the fuzziness and randomness of subjective trust and the relationship between them. Node trust is divided into reputation trust and transaction trust, and evaluation methods are designed for each. First, a two-dimensional trust cloud evaluation model is designed, based on a node's overall and trading experience, to determine the reputation trust: the expected value reflects the average trust status of a node, while entropy and hyper-entropy describe the uncertainty of trust. Second, calculation methods for the proposed direct transaction trust and recommendation transaction trust together yield a comprehensive transaction trust for each node. Strategies for choosing trading partners are then designed based on the trust cloud. Finally, the results of a simulation experiment on P2P network file sharing reflect the objectivity, accuracy and robustness of the proposed model, which can also effectively identify malicious or unreliable service nodes in the system and promote the service reliability of highly credible nodes, thereby improving the stability of the whole network.
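
    The qualitative-to-quantitative step rests on a forward normal cloud generator: each "cloud drop" is drawn from a normal distribution whose own dispersion is randomised by the hyper-entropy. The sketch below is a generic generator plus a naive aggregation of reputation and transaction trust; it is illustrative only, and the expectation, entropy, hyper-entropy and weighting values are invented.

      import numpy as np

      rng = np.random.default_rng(4)

      def normal_cloud_drops(ex, en, he, n):
          """Forward normal cloud generator: He perturbs the dispersion of every drop,
          so the drops express both the fuzziness (En) and randomness (He) of trust."""
          en_prime = rng.normal(en, he, size=n)        # per-drop entropy
          return rng.normal(ex, np.abs(en_prime))      # per-drop trust samples

      # invented (expectation, entropy, hyper-entropy) triples for one node
      reputation_cloud = (0.80, 0.05, 0.01)
      transaction_cloud = (0.70, 0.10, 0.02)

      rep = normal_cloud_drops(*reputation_cloud, n=10_000)
      txn = normal_cloud_drops(*transaction_cloud, n=10_000)
      overall = 0.5 * rep + 0.5 * txn                  # naive equal weighting (assumption)

      print("mean trust:", round(overall.mean(), 3))
      print("trust spread (std):", round(overall.std(), 3))
      print("drops above a 0.7 acceptance threshold:", round((overall > 0.7).mean(), 3))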

  1. A way forward for the development of an exposure computational model to computed tomography dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, C.C., E-mail: cassio.c.ferreira@gmail.co [Nucleo de Fisica, Universidade Federal de Sergipe, Itabaiana-SE, CEP 49500-000 (Brazil); Galvao, L.A., E-mail: lailagalmeida@gmail.co [Departamento de Fisica, Universidade Federal de Sergipe, Sao Cristovao-SE, CEP 49100-000 (Brazil); Vieira, J.W., E-mail: jose.wilson59@uol.com.b [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco, Recife-PE, CEP 50740-540 (Brazil); Escola Politecnica de Pernambuco, Universidade de Pernambuco, Recife-PE, CEP 50720-001 (Brazil); Maia, A.F., E-mail: afmaia@ufs.b [Departamento de Fisica, Universidade Federal de Sergipe, Sao Cristovao-SE, CEP 49100-000 (Brazil)

    2011-04-15

    A way forward for the development of an exposure computational model to computed tomography dosimetry has been presented. In this way, an exposure computational model (ECM) for computed tomography (CT) dosimetry has been developed and validated through comparison with experimental results. For the development of the ECM, X-ray spectra generator codes have been evaluated and the head bow tie filter has been modelled through a mathematical equation. EGS4 and EGSnrc have been used for simulating the radiation transport by the ECM. Geometrical phantoms, commonly used in CT dosimetry, have been modelled by IDN software. MAX06 has also been used to simulate an adult male patient submitted for CT examinations. The evaluation of the X-ray spectra generator codes in CT dosimetry showed dependence with tube filtration (or HVL value). More generally, with the increment of total filtration (or HVL value) the X-raytbc becomes the best X-ray spectra generator code for CT dosimetry. The EGSnrc/X-raytbc combination has calculated C_100,c in better concordance with C_100,c measured in two different CT scanners. For a Toshiba CT scanner, the average percentage difference between the calculated C_100,c values and measured C_100,c values was 8.2%. Whilst for a GE CT scanner, the average percentage difference was 10.4%. By the measurements of air kerma through a prototype head bow tie filter a third-order exponential decay equation was found. C_100,c and C_100,p values calculated by the ECM are in good agreement with values measured at a specific CT scanner. A maximum percentage difference of 2% has been found in the PMMA CT head phantoms, demonstrating effective modelling of the head bow tie filter by the equation. The absorbed and effective doses calculated by the ECM developed in this work have been compared to those calculated by the ECM of Jones and Shrimpton for an adult male patient. For a head examination the absorbed dose values calculated by the

  2. A conceptual gamma shield design using the DRP model computation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, E E [Reactor Department, Nuclear Research Center, Atomic Energy Authority, Cairo (Egypt); Rahman, F A [National Center of Nuclear Safety and Radiation Control, Atomic Energy Authority, Cairo (Egypt)

    1997-12-31

    The purpose of this investigation is to assess basic areas of concern in the development of reactor shielding conceptual design calculations. A spherical shield model composed of low-carbon steel and lead has been constructed to surround a Co-60 gamma point source. Two alternative configurations have been considered in the model computation. The numerical calculations have been performed using both the ANISN code and the DRP model computation, together with the DLC 75-Bugle 80 data library. A summary of results for deep penetration in different shield materials with different packing densities is presented and analysed. The results show that the attenuation of the gamma fluxes increases with increasing packing density of the shield material, which highlights the importance of considering packing density as a safety parameter in shielding design. 3 figs.
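
    The reported dependence of attenuation on packing density follows from simple exponential attenuation, in which the effective linear attenuation coefficient scales with the density of material actually present. The narrow-beam sketch below illustrates that trend for Co-60 photons; it is not the ANISN/DRP calculation, build-up is ignored, and the mass attenuation coefficients and densities are approximate textbook values.

      import numpy as np

      # approximate mass attenuation coefficients (cm^2/g) near the 1.25 MeV mean
      # Co-60 photon energy, and nominal densities (g/cm^3); illustrative values only
      MATERIALS = {
          "low-carbon steel": {"mu_rho": 0.053, "density": 7.87},
          "lead":             {"mu_rho": 0.059, "density": 11.35},
      }

      def transmission(material, thickness_cm, packing_fraction=1.0):
          """Narrow-beam transmission I/I0 = exp(-mu_rho * rho_eff * x); the effective
          density rho_eff scales with how densely the shield material is packed."""
          m = MATERIALS[material]
          return np.exp(-m["mu_rho"] * m["density"] * packing_fraction * thickness_cm)

      for packing in (0.8, 0.9, 1.0):
          t = transmission("lead", thickness_cm=10.0, packing_fraction=packing)
          print(f"lead, 10 cm, packing fraction {packing:.1f}: I/I0 = {t:.3e}")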

  3. Large Scale Computing for the Modelling of Whole Brain Connectivity

    DEFF Research Database (Denmark)

    Albers, Kristoffer Jon

    organization of the brain in continuously increasing resolution. From these images, networks of structural and functional connectivity can be constructed. Bayesian stochastic block modelling provides a prominent data-driven approach for uncovering the latent organization by clustering the networks into groups...... of neurons. Relying on Markov Chain Monte Carlo (MCMC) simulations as the workhorse of Bayesian inference, however, poses significant computational challenges, especially when modelling networks at the scale and complexity supported by high-resolution whole-brain MRI. In this thesis, we present how to overcome...... these computational limitations and apply Bayesian stochastic block models for unsupervised, data-driven clustering of whole-brain connectivity at full image resolution. We implement high-performance software that allows us to efficiently apply stochastic block modelling with MCMC sampling on large complex networks...
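
    The modelling idea can be condensed into a toy example: a collapsed Bernoulli stochastic block model likelihood with Beta priors on the block-to-block link probabilities, explored by a naive single-node Metropolis sampler. This is a drastically simplified sketch of the kind of inference described above, not the thesis software; the synthetic network, priors and iteration count are assumptions.

      import numpy as np
      from math import lgamma

      rng = np.random.default_rng(5)

      def betaln(a, b):
          return lgamma(a) + lgamma(b) - lgamma(a + b)

      def sbm_log_marginal(A, z, K, alpha=1.0, beta=1.0):
          """Collapsed Bernoulli SBM: Beta(alpha, beta) priors on the link probabilities
          are integrated out, leaving a likelihood of the partition z alone."""
          ll = 0.0
          for a in range(K):
              for b in range(a, K):
                  ia, ib = np.where(z == a)[0], np.where(z == b)[0]
                  if a == b:
                      links = np.triu(A[np.ix_(ia, ia)], 1).sum()
                      pairs = len(ia) * (len(ia) - 1) // 2
                  else:
                      links = A[np.ix_(ia, ib)].sum()
                      pairs = len(ia) * len(ib)
                  ll += betaln(alpha + links, beta + pairs - links) - betaln(alpha, beta)
          return ll

      # tiny synthetic network with two planted groups (stand-in for a brain graph)
      n, K = 40, 2
      truth = np.repeat([0, 1], n // 2)
      P = np.where(truth[:, None] == truth[None, :], 0.40, 0.05)
      A = np.triu(rng.random((n, n)) < P, 1).astype(int)
      A = A + A.T

      # naive Metropolis sampler over single-node block reassignments
      z = rng.integers(0, K, n)
      ll = sbm_log_marginal(A, z, K)
      for it in range(2000):
          proposal = z.copy()
          proposal[rng.integers(n)] = rng.integers(K)
          ll_new = sbm_log_marginal(A, proposal, K)
          if np.log(rng.random()) < ll_new - ll:
              z, ll = proposal, ll_new
      print("log marginal likelihood:", round(ll, 1), " block sizes:", np.bincount(z, minlength=K))

    Real whole-brain networks have orders of magnitude more nodes and edges, which is precisely why the thesis focuses on high-performance sampling.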

  4. A Multidisciplinary Model for Development of Intelligent Computer-Assisted Instruction.

    Science.gov (United States)

    Park, Ok-choon; Seidel, Robert J.

    1989-01-01

    Proposes a schematic multidisciplinary model to help developers of intelligent computer-assisted instruction (ICAI) identify the types of required expertise and integrate them into a system. Highlights include domain types and expertise; knowledge acquisition; task analysis; knowledge representation; student modeling; diagnosis of learning needs;…

  5. Implementation and use of Gaussian process meta model for sensitivity analysis of numerical models: application to a hydrogeological transport computer code

    International Nuclear Information System (INIS)

    Marrel, A.

    2008-01-01

    In studies of environmental transfer and risk assessment, numerical models are used to simulate, understand and predict the transfer of pollutants. These computer codes can depend on a large number of uncertain input parameters (geophysical variables, chemical parameters, etc.) and can often be too expensive in computation time. To conduct uncertainty propagation studies and to measure the importance of each input on the response variability, the computer code has to be approximated by a meta model, which is built on an acceptable number of simulations of the code and requires a negligible calculation time. We focused our research work on the use of a Gaussian process meta model to perform the sensitivity analysis of the code. We proposed a methodology with estimation and input selection procedures in order to build the meta model in the case of a high number of inputs and few available simulations. We then compared two approaches to computing the sensitivity indices with the meta model and proposed an algorithm to build prediction intervals for these indices. Afterwards, we were interested in the choice of the code simulations and studied the influence of different sampling strategies on the predictiveness of the Gaussian process meta model. Finally, we extended our statistical tools to a functional output of a computer code, combining a decomposition on a wavelet basis with the Gaussian process modelling before computing the functional sensitivity indices. All the tools and statistical methodologies that we developed were applied to the real case of a complex hydrogeological computer code simulating radionuclide transport in groundwater. (author)
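
    A compressed illustration of the meta model idea: fit a Gaussian process to a modest number of runs of an expensive code, then use the cheap surrogate for sensitivity screening. The fragment below uses scikit-learn rather than the thesis implementation, the "code" is a toy analytic function, and the one-at-a-time screening on the surrogate is a simplification of the variance-based indices developed in the thesis.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      rng = np.random.default_rng(6)

      def expensive_code(x):
          """Toy stand-in for a costly transport code with three uncertain inputs."""
          return np.sin(x[:, 0]) + 5.0 * x[:, 1] ** 2 + 0.1 * x[:, 2]

      # a small, affordable design of code runs
      X_train = rng.uniform(-1, 1, size=(60, 3))
      y_train = expensive_code(X_train)

      gp = GaussianProcessRegressor(
          kernel=ConstantKernel() * RBF(length_scale=[1.0, 1.0, 1.0]),
          normalize_y=True,
      ).fit(X_train, y_train)

      # crude one-at-a-time screening on the surrogate (not a full Sobol' analysis):
      # vary one input over many samples while the others are frozen at zero
      N = 20_000
      total_var = gp.predict(rng.uniform(-1, 1, size=(N, 3))).var()
      for j in range(3):
          X = np.zeros((N, 3))
          X[:, j] = rng.uniform(-1, 1, N)
          print(f"approximate first-order index of x{j + 1}: {gp.predict(X).var() / total_var:.2f}")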

  6. A Computational Model of Linguistic Humor in Puns

    Science.gov (United States)

    Kao, Justine T.; Levy, Roger; Goodman, Noah D.

    2016-01-01

    Humor plays an essential role in human interactions. Precisely what makes something funny, however, remains elusive. While research on natural language understanding has made significant advancements in recent years, there has been little direct integration of humor research with computational models of language understanding. In this paper, we…

  7. Cloud computing models and their application in LTE based cellular systems

    NARCIS (Netherlands)

    Staring, A.J.; Karagiannis, Georgios

    2013-01-01

    As cloud computing emerges as the next novel concept in computer science, it becomes clear that the model applied in large data storage systems used to resolve issues coming forth from an increasing demand, could also be used to resolve the very high bandwidth requirements on access network, core

  8. Computation and experiment results of the grounding model of Three Gorges Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Wen Xishan; Zhang Yuanfang; Yu Jianhui; Chen Cixuan [Wuhan University of Hydraulic and Electrical Engineering (China); Qin Liming; Xu Jun; Shu Lianfu [Yangtze River Water Resources Commission, Wuhan (China)

    1999-07-01

    A model for the computation of the grounding parameters of the grids of the Three Gorges Power Plant (TGPP) on the Yangtze River is presented in this paper. Using this model, computation and analysis of the grounding grids are carried out. The results show that the reinforcement grid of the dam is the main body for current dissipation, so it must be reliably welded to form a good grounding grid. The experimental results show that the computational method and program are correct. (UK)

  9. Stability and Hopf bifurcation for a delayed SLBRS computer virus model.

    Science.gov (United States)

    Zhang, Zizhen; Yang, Huizhong

    2014-01-01

    By incorporating into the SLBRS model the time delay due to the period during which computers use antivirus software to clean the virus, a delayed SLBRS computer virus model is proposed in this paper. The dynamical behaviors, which include local stability and Hopf bifurcation, are investigated by regarding the delay as the bifurcation parameter. In particular, the direction and stability of the Hopf bifurcation are derived by applying the normal form method and center manifold theory. Finally, an illustrative example is presented to verify our analytical results.
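
    Delay models of this kind can be explored numerically with a fixed-step integrator that keeps a history buffer for the delayed term. The sketch below integrates a generic susceptible/latent/breaking-out/recovered system with a cleaning delay tau; it is a stand-in for, not a reproduction of, the SLBRS equations analysed in the paper, and all rate constants and initial fractions are invented.

      import numpy as np

      beta, eps, gamma, delta = 0.5, 0.3, 0.2, 0.05   # infection, outbreak, cleaning, loss of immunity
      tau, dt, t_end = 5.0, 0.01, 200.0               # cleaning delay and integration settings
      lag = int(round(tau / dt))
      steps = int(t_end / dt)

      S, L, B, R = (np.empty(steps + 1) for _ in range(4))
      S[0], L[0], B[0], R[0] = 0.96, 0.02, 0.02, 0.0  # initial fractions (assumption)

      for n in range(steps):
          B_lag = B[n - lag] if n >= lag else B[0]    # constant pre-history before t = tau
          dS = -beta * S[n] * (L[n] + B[n]) + delta * R[n]
          dL = beta * S[n] * (L[n] + B[n]) - eps * L[n]
          dB = eps * L[n] - gamma * B_lag             # cleaning acts on the delayed state
          dR = gamma * B_lag - delta * R[n]
          S[n + 1], L[n + 1] = S[n] + dt * dS, L[n] + dt * dL
          B[n + 1], R[n + 1] = B[n] + dt * dB, R[n] + dt * dR

      print("final fractions S, L, B, R:", [round(x[-1], 3) for x in (S, L, B, R)])

    Scanning tau and watching whether the trajectories settle or start to oscillate is the numerical counterpart of the Hopf bifurcation analysis described above.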

  10. Stability and Hopf Bifurcation for a Delayed SLBRS Computer Virus Model

    Directory of Open Access Journals (Sweden)

    Zizhen Zhang

    2014-01-01

    Full Text Available By incorporating into the SLBRS model the time delay due to the period during which computers use antivirus software to clean the virus, a delayed SLBRS computer virus model is proposed in this paper. The dynamical behaviors, which include local stability and Hopf bifurcation, are investigated by regarding the delay as the bifurcation parameter. In particular, the direction and stability of the Hopf bifurcation are derived by applying the normal form method and center manifold theory. Finally, an illustrative example is presented to verify our analytical results.

  11. A framework to establish credibility of computational models in biology.

    Science.gov (United States)

    Patterson, Eann A; Whelan, Maurice P

    2017-10-01

    Computational models in biology and biomedical science are often constructed to aid people's understanding of phenomena or to inform decisions with socioeconomic consequences. Model credibility is the willingness of people to trust a model's predictions and is often difficult to establish for computational biology models. A 3 × 3 matrix has been proposed to allow such models to be categorised with respect to their testability and epistemic foundation in order to guide the selection of an appropriate process of validation to supply evidence to establish credibility. Three approaches to validation are identified that can be deployed depending on whether a model is deemed untestable, testable or lies somewhere in between. In the latter two cases, the validation process involves the quantification of uncertainty which is a key output. The issues arising due to the complexity and inherent variability of biological systems are discussed and the creation of 'digital twins' proposed as a means to alleviate the issues and provide a more robust, transparent and traceable route to model credibility and acceptance. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  12. Cloud Computing Adoption Model for Universities to Increase ICT Proficiency

    Directory of Open Access Journals (Sweden)

    Safiya Okai

    2014-08-01

    Full Text Available Universities around the world, especially those in developing countries, are faced with the problem of delivering the level of information and communications technology (ICT) needed to facilitate the teaching, learning, research, and development activities expected of a typical university, and to meet educational needs in line with technological advances and the growing dependence on IT. This is mainly due to the high cost involved in providing and maintaining the necessary hardware and software. A technology such as cloud computing, which delivers on-demand provisioning of IT resources on a pay-per-use basis, can be used to address this problem. Cloud computing promises better delivery of IT services, as well as availability whenever and wherever needed, at reduced costs, with users paying only for what they consume through the services of cloud providers. Cloud technology reduces complexity while increasing the speed and quality of the IT services provided; despite these benefits, however, the challenges that come with its adoption have left many sectors, especially higher education, skeptical about committing to this technology. This article identifies the reasons for the slow rate of adoption of cloud computing at university level, discusses the challenges faced, and proposes a cloud computing adoption model that contains strategic guidelines to overcome the major challenges identified and a roadmap for the successful adoption of cloud computing by universities. The model was tested in one of the universities and found to be both useful and appropriate for adopting cloud computing at university level.

  13. Advanced computational model for three-phase slurry reactors

    International Nuclear Information System (INIS)

    Goodarz Ahmadi

    2001-10-01

    In the second year of the project, the Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column is further developed. The approach uses an Eulerian analysis of liquid flows in the bubble column, and makes use of the Lagrangian trajectory analysis for the bubbles and particle motions. An experimental set for studying a two-dimensional bubble column is also developed. The operation of the bubble column is being tested and diagnostic methodology for quantitative measurements is being developed. An Eulerian computational model for the flow condition in the two-dimensional bubble column is also being developed. The liquid and bubble motions are being analyzed and the results are being compared with the experimental setup. Solid-fluid mixture flows in ducts and passages at different angle of orientations were analyzed. The model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures is also being studied. Further progress was also made in developing a thermodynamically consistent model for multiphase slurry flows with and without chemical reaction in a state of turbulent motion. The balance laws are obtained and the constitutive laws are being developed. Progress was also made in measuring concentration and velocity of particles of different sizes near a wall in a duct flow. The technique of Phase-Doppler anemometry was used in these studies. The general objective of this project is to provide the needed fundamental understanding of three-phase slurry reactors in Fischer-Tropsch (F-T) liquid fuel synthesis. The other main goal is to develop a computational capability for predicting the transport and processing of three-phase coal slurries. The specific objectives are: (1) To develop a thermodynamically consistent rate-dependent anisotropic model for multiphase slurry flows with and without chemical reaction for application to coal liquefaction. Also establish the

  14. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation.

    Science.gov (United States)

    Mangado, Nerea; Ceresa, Mario; Duchateau, Nicolas; Kjer, Hans Martin; Vera, Sergio; Dejea Velardo, Hector; Mistrik, Pavel; Paulsen, Rasmus R; Fagertun, Jens; Noailly, Jérôme; Piella, Gemma; González Ballester, Miguel Ángel

    2016-08-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns constitutive parameters to all components of the finite element model. This model can then be used to study in silico the effects of the electrical stimulation of the cochlear implant. Results are shown on a total of 25 models of patients. In all cases, a final mesh suitable for finite element simulations was obtained, in an average time of 94 s. The framework has proven to be fast and robust, and is promising for a detailed prognosis of the cochlear implantation surgery.

  15. Computer model of the MFTF-B neutral beam Accel dc power supply

    International Nuclear Information System (INIS)

    Wilson, J.H.

    1983-01-01

    Using the SCEPTRE circuit modeling code, a computer model was developed for the MFTF Neutral Beam Power Supply System (NBPSS) Accel dc Power Supply (ADCPS). The ADCPS provides 90 kV at 88 A to the Accel Modulator. Because of the power supply's complex behavior, the computer model is necessary to adequately understand its response over a wide range of load conditions and faults. The model includes all the circuit components and parameters, and some of the stray values. It has been well validated for transients on the order of milliseconds and, with one exception, for steady-state operation. When using a circuit modeling code for a system with a wide range of time constants, it can become impossible to obtain good solutions for all time ranges at once. The present model concentrates on the millisecond-range transients because the compensating capacitor bank tends to isolate the power supply from the load for faster transients. Attempts to include stray circuit elements with time constants in the microsecond and shorter range have had little success because of the huge increases in computing time that result. The model has been successfully extended to include the accel modulator
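
    The difficulty noted above, that a single circuit model cannot easily resolve millisecond-range and microsecond-range time constants at once, is a classic stiffness problem. A minimal sketch, with made-up component values unrelated to the MFTF-B supply, shows how an implicit stiff integrator copes with two RC branches whose time constants differ by three orders of magnitude.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two independent RC branches driven by a 90 kV step; time constants of
# 1 ms and 1 µs stand in for a "slow" supply element and a stray element.
V_SRC = 90e3
TAU_SLOW, TAU_FAST = 1e-3, 1e-6

def rhs(t, v):
    v_slow, v_fast = v
    return [(V_SRC - v_slow) / TAU_SLOW,
            (V_SRC - v_fast) / TAU_FAST]

# BDF is an implicit, stiffness-tolerant method; an explicit solver would need
# sub-microsecond steps over the whole 10 ms window just to remain stable.
sol = solve_ivp(rhs, (0.0, 10e-3), [0.0, 0.0], method="BDF", rtol=1e-6)
print("steps taken:", sol.t.size)
print("final branch voltages (V):", sol.y[:, -1])
```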

  16. Modeling Reality: How Computers Mirror Life

    International Nuclear Information System (INIS)

    Inoue, J-I

    2005-01-01

    Modeling Reality: How Computers Mirror Life covers a wide range of modern subjects in complex systems, suitable not only for undergraduate students who want to learn about modelling 'reality' by using computer simulations, but also for researchers who want to learn something about subjects outside of their majors and need a simple guide. Readers are not required to have specialized training before they start the book. Each chapter is organized so as to train the reader to grasp the essential idea of simulating phenomena and guide him/her towards more advanced areas. The topics presented in this textbook fall into two categories. The first is at graduate level, namely probability, statistics, information theory, graph theory, and the Turing machine, which are standard topics in the course of information science and information engineering departments. The second addresses more advanced topics, namely cellular automata, deterministic chaos, fractals, game theory, neural networks, and genetic algorithms. Several topics included here (neural networks, game theory, information processing, etc) are now some of the main subjects of statistical mechanics, and many papers related to these interdisciplinary fields are published in Journal of Physics A: Mathematical and General, so readers of this journal will be familiar with the subject areas of this book. However, each area is restricted to an elementary level and if readers wish to know more about the topics they are interested in, they will need more advanced books. For example, on neural networks, the text deals with the back-propagation algorithm for perceptron learning. Nowadays, however, this is a rather old topic, so the reader might well choose, for example, Introduction to the Theory of Neural Computation by J Hertz et al (Perseus books, 1991) or Statistical Physics of Spin Glasses and Information Processing by H Nishimori (Oxford University Press, 2001) for further reading. Nevertheless, this book is worthwhile

  17. Direct modeling for computational fluid dynamics

    Science.gov (United States)

    Xu, Kun

    2015-06-01

    All fluid dynamic equations are valid at their modeling scales, such as the particle mean free path and mean collision time scale of the Boltzmann equation and the hydrodynamic scale of the Navier-Stokes (NS) equations. Current computational fluid dynamics (CFD) focuses on the numerical solution of partial differential equations (PDEs), and its aim is to obtain accurate solutions of these governing equations. Under such a CFD practice, it is hard to develop a unified scheme that covers flow physics from the kinetic to the hydrodynamic scale continuously, because there is no governing equation that makes a smooth transition from the Boltzmann to the NS modeling. The study of fluid dynamics needs to go beyond the traditional numerical solution of partial differential equations. Emerging engineering applications, such as air-vehicle design for near-space flight and flow and heat transfer in micro-devices, require the concept of gas dynamics to be extended to a larger domain of physical reality than the traditional, cleanly separated governing equations allow. At the current stage, non-equilibrium flow physics has not yet been well explored or clearly understood due to the lack of appropriate tools, and under the current numerical PDE approach it is hard to develop such a tool because valid PDEs are absent in these regimes. In order to construct multiscale and multiphysics simulation methods analogous to the modeling process that produced the Boltzmann and NS governing equations, the development of a numerical algorithm should be based on the first principle of physical modeling. In this paper, instead of following the traditional numerical PDE path, we introduce direct modeling as a principle for CFD algorithm development. Since all computations are conducted in a discretized space with limited cell resolution, the flow physics has to be modeled at the mesh-size and time-step scales. Here, the CFD is more or less a direct
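
    As a toy illustration of the direct-modeling viewpoint, the update of cell averages below is written purely in terms of the quantity transported across each cell interface during one time step, rather than as a discretization of a fixed PDE. The scheme (first-order upwind transport of a scalar on a periodic domain) is far simpler than the unified gas-kinetic schemes the paper has in mind and is only meant to show the flux-based bookkeeping at the mesh-size and time-step scales.

```python
import numpy as np

# Direct, flux-based update of cell averages: over one time step, each cell
# gains what crosses its left interface and loses what crosses its right one.
def step(q, a, dx, dt):
    flux = a * q                       # for a > 0 the interface flux comes from the upwind cell
    flux_in = np.roll(flux, 1)         # flux through the left interface (periodic domain)
    return q + dt / dx * (flux_in - flux)

nx, a = 200, 1.0
dx = 1.0 / nx
x = (np.arange(nx) + 0.5) * dx
q = np.exp(-200 * (x - 0.3) ** 2)      # initial scalar profile
dt = 0.5 * dx / a                      # CFL-limited time step
for _ in range(int(0.4 / dt)):
    q = step(q, a, dx, dt)
print("peak after transport:", q.max(), "located at x =", x[q.argmax()])
```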

  18. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...
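
    For readers new to the subject, the deterministic equivalent of a small two-stage stochastic linear program can be written out and solved directly. The sketch below uses an invented two-scenario production/recourse problem and SciPy's LP solver; it is unrelated to the SLP-IOR software or the specific models discussed in the book.

```python
import numpy as np
from scipy.optimize import linprog

# Two-stage stochastic LP, deterministic equivalent (all numbers illustrative).
# First stage: produce x at cost 1/unit, capacity 100.
# Second stage: in scenario s, buy shortfall y_s at cost 3/unit so that
#               x + y_s >= demand_s; the two scenarios are equally likely.
probs   = np.array([0.5, 0.5])
demands = np.array([60.0, 120.0])

c = np.concatenate(([1.0], 3.0 * probs))          # objective: x, y_1, y_2
A_ub = np.array([[-1.0, -1.0,  0.0],              # -(x + y_1) <= -d_1
                 [-1.0,  0.0, -1.0]])             # -(x + y_2) <= -d_2
b_ub = -demands
bounds = [(0, 100), (0, None), (0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("first-stage production x:", res.x[0])
print("recourse purchases y:", res.x[1:])
print("expected total cost:", res.fun)
```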

  19. Computational methods for a three-dimensional model of the petroleum-discovery process

    Science.gov (United States)

    Schuenemeyer, J.H.; Bawiec, W.J.; Drew, L.J.

    1980-01-01

    A discovery-process model devised by Drew, Schuenemeyer, and Root can be used to predict the amount of petroleum to be discovered in a basin from some future level of exploratory effort: the predictions are based on historical drilling and discovery data. Because marginal costs of discovery and production are a function of field size, the model can be used to make estimates of future discoveries within deposit size classes. The modeling approach is a geometric one in which the area searched is a function of the size and shape of the targets being sought. A high correlation is assumed between the surface-projection area of the fields and the volume of petroleum. To predict how much oil remains to be found, the area searched must be computed, and the basin size and discovery efficiency must be estimated. The basin is assumed to be explored randomly rather than by pattern drilling. The model may be used to compute independent estimates of future oil at different depth intervals for a play involving multiple producing horizons. We have written FORTRAN computer programs that are used with Drew, Schuenemeyer, and Root's model to merge the discovery and drilling information and perform the necessary computations to estimate undiscovered petroleum. These programs may easily be modified to estimate remaining quantities of commodities other than petroleum. © 1980.
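
    The geometric idea, that a random exploratory well finds a field with probability proportional to the field's surface-projection area (scaled by a discovery-efficiency factor), can be illustrated with a small Monte Carlo sketch. The field-size distribution, basin area, and efficiency value below are invented, and the code is unrelated to the authors' FORTRAN programs; it merely shows why large fields tend to be discovered early.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented basin: 10,000 km^2 containing fields whose areas follow a lognormal
# size distribution; a wildcat well hits some field with probability proportional
# to the remaining undiscovered area, scaled by a discovery-efficiency factor c.
basin_area = 10_000.0
field_areas = rng.lognormal(mean=1.0, sigma=1.2, size=300)
efficiency = 2.0                       # c > 1: exploration beats purely random search

remaining = list(field_areas)
discovered = []
for well in range(2_000):              # sequence of exploratory wells
    p_hit = efficiency * np.sum(remaining) / basin_area
    if remaining and rng.random() < min(p_hit, 1.0):
        # Which field is hit is itself proportional to the individual field areas.
        probs = np.array(remaining) / np.sum(remaining)
        i = rng.choice(len(remaining), p=probs)
        discovered.append(remaining.pop(i))

print("fields discovered:", len(discovered))
print("mean size of discovered fields:", np.mean(discovered))
print("mean size of undiscovered fields:", np.mean(remaining))
```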

  20. A PROFICIENT MODEL FOR HIGH END SECURITY IN CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    R. Bala Chandar

    2014-01-01

    Full Text Available Cloud computing is an appealing technology because it offers scalable services and reduces the burden of local hardware and software management while increasing flexibility and scalability. A key trait of cloud services is the remote processing of data. Although this technology offers many services, several concerns remain: stored data may be mishandled on the server side, the data owner loses direct control over the data, and the cloud does not enforce the access controls over outsourced data that the owner desires. To handle these issues, we propose a new model that provides data-correctness checks for assurance of stored data, distributed accountability for authentication, and efficient access control of outsourced data for authorization. This model strengthens data correctness and helps achieve cloud data integrity, supports the data owner in retaining control over their own data through tracking, and improves the access control of outsourced data.
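
    One ingredient of the proposed model, checking the correctness of remotely stored data, is commonly built on keyed digests computed by the data owner before outsourcing. The sketch below shows only that generic idea (per-block HMAC tags verified against blocks returned by the provider); it is an illustrative stand-in, not the authors' scheme.

```python
import hmac
import hashlib
import os

# Owner side: split the outsourced file into blocks and keep a keyed tag per block.
def tag_blocks(data: bytes, key: bytes, block_size: int = 4096):
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    tags = [hmac.new(key, (b"%d" % i) + blk, hashlib.sha256).digest()
            for i, blk in enumerate(blocks)]
    return blocks, tags

# Owner side: challenge the provider for block `index` and check the returned content.
def verify_block(index: int, returned_block: bytes, key: bytes, tags) -> bool:
    expected = hmac.new(key, (b"%d" % index) + returned_block, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tags[index])

key = os.urandom(32)
data = os.urandom(20_000)              # stand-in for the outsourced file
blocks, tags = tag_blocks(data, key)

print(verify_block(2, blocks[2], key, tags))                              # True: block intact
print(verify_block(2, bytes([blocks[2][0] ^ 1]) + blocks[2][1:], key, tags))  # False: tampered block
```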