WorldWideScience

Sample records for regulation computer model

  1. Computing Active Power Losses Using a Mathematical Model of a Regulated Street Luminaire

    Directory of Open Access Journals (Sweden)

    Roman Sikora

    2018-05-01

    Full Text Available Before the use of regulated street luminaires with variable power and luminous flux, computations were performed using constant values for their electrical and photometric parameters. At present, where such lighting is in use, it is no longer possible to base calculations on such assumptions. Computations of energy and power losses, for example, need to be performed for all dimming levels and based on the applied regulation algorithm. Based on measurements carried out on regulated luminaires, it was found that certain electrical parameters have a nonlinear dependence on the dimming level. Electrical parameters were also observed to depend on the value of the supply voltage. The results of the measurements are presented in this article. Failure to take account of power losses in computations of the energy efficiency of street lighting in accordance with the applicable EN 13201 standard causes values of energy efficiency indicators to be overstated. Power loss computations are presented in this article for a sample street lighting system with regulated luminaires, for the whole range of dimming levels and additionally for fluctuations of ±10% in the supply voltage. In addition, a mathematical model of a regulated luminaire is constructed with the use of regression methods, and a practical application of that model is described.
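The regression idea described in this record can be illustrated with a minimal sketch: fit active power as a function of dimming level by ordinary least squares. The linear form, the measurements, and the `fit_linear` helper below are hypothetical illustrations only; the paper's model covers several electrical parameters and nonlinear dependencies.

```python
# Toy least-squares fit of active power P (W) vs. dimming level d (%).
# All numbers here are invented for illustration, not the paper's data.

def fit_linear(xs, ys):
    """Least-squares fit of y = a + b * x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hypothetical measurements: dimming level (%) and active power (W)
dim = [10, 30, 50, 70, 100]
power = [18.0, 38.5, 61.0, 82.5, 115.0]

a, b = fit_linear(dim, power)
predicted_80 = a + b * 80  # model's prediction at 80 % dimming
```

A model of this kind would be refitted per electrical parameter and per regulation algorithm; a single linear fit is enough to show the mechanics.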

  2. Computational Modelling of the Metabolic States Regulated by the Kinase Akt

    Directory of Open Access Journals (Sweden)

Ettore Mosca

    2012-11-01

Full Text Available Signal transduction pathways and gene regulation determine a major reorganization of metabolic activities in order to support cell proliferation. Protein Kinase B (PKB, also known as Akt, participates in the PI3K/Akt/mTOR pathway, a master regulator of aerobic glycolysis and cellular biosynthesis, two activities shown by both normal and cancer proliferating cells. Not surprisingly, given its relevance for cellular metabolism, Akt/PKB is often found to be hyperactive in cancer cells. In the last decade, many efforts have been made to improve the understanding of the control of glucose metabolism and the identification of a therapeutic window between proliferating cancer cells and proliferating normal cells. In this context, we have modelled the link between the PI3K/Akt/mTOR pathway, glycolysis, lactic acid production and nucleotide biosynthesis. We used a computational model in order to compare two metabolic states generated by the specific variation of the metabolic fluxes regulated by the activity of the PI3K/Akt/mTOR pathway. One of the two states represented the metabolism of a growing cancer cell characterised by aerobic glycolysis and cellular biosynthesis, while the other state represented the same metabolic network with a reduced glycolytic rate and a higher mitochondrial pyruvate metabolism, as reported in the literature in relation to the activity of the PI3K/Akt/mTOR pathway. Some steps that link glycolysis and the pentose phosphate pathway proved important for controlling the dynamics of cancer glucose metabolism.

  3. Computer simulation studies in fluid and calcium regulation and orthostatic intolerance

    Science.gov (United States)

    1985-01-01

The systems analysis approach to physiological research uses mathematical models and computer simulation. Major areas of concern during prolonged space flight are discussed, including fluid and blood volume regulation; cardiovascular response during shuttle reentry; countermeasures for orthostatic intolerance; and calcium regulation and bone atrophy. Potential contributions of physiological mathematical models to future flight experiments are examined.

  4. Computational Methods for Modeling Aptamers and Designing Riboswitches

    Directory of Open Access Journals (Sweden)

    Sha Gong

    2017-11-01

Full Text Available Riboswitches, which are located within certain noncoding RNA regions, function as genetic “switches”, regulating when and where genes are expressed in response to certain ligands. Understanding the numerous functions of riboswitches requires computational models to predict the structures and structural changes of the aptamer domains. Although aptamers often form complex structures, computational approaches such as RNAComposer and Rosetta have already been applied to model the tertiary (three-dimensional, 3D) structure of several aptamers. As structural changes in aptamers must be achieved within a certain time window for effective regulation, kinetics is another key point for understanding aptamer function in riboswitch-mediated gene regulation. The coarse-grained self-organized polymer (SOP) model using Langevin dynamics simulation has been successfully developed to investigate the folding kinetics of aptamers, while their co-transcriptional folding kinetics can be modeled by the helix-based computational method and the BarMap approach. Based on the known aptamers, the web server Riboswitch Calculator and other theoretical methods provide new tools for designing synthetic riboswitches. This review presents an overview of these computational methods for modeling the structure and kinetics of riboswitch aptamers and for designing riboswitches.

  5. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  6. Reconciliation of the cloud computing model with US federal electronic health record regulations.

    Science.gov (United States)

    Schweitzer, Eugene J

    2012-01-01

    Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing.

  7. Life system modeling and intelligent computing. Pt. I. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

Li, Kang; Irwin, George W. (eds.) [Belfast Queen's Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation

    2010-07-01

    This book is part I of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010 and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The 55 papers in this volume are organized in topical sections on intelligent modeling, monitoring, and control of complex nonlinear systems; autonomy-oriented computing and intelligent agents; advanced theory and methodology in fuzzy systems and soft computing; computational intelligence in utilization of clean and renewable energy resources; intelligent modeling, control and supervision for energy saving and pollution reduction; intelligent methods in developing vehicles, engines and equipments; computational methods and intelligence in modeling genetic and biochemical networks and regulation. (orig.)

  8. Reconciliation of the cloud computing model with US federal electronic health record regulations

    Science.gov (United States)

    2011-01-01

    Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing. PMID:21727204

  9. Computational Prediction of MicroRNAs from Toxoplasma gondii Potentially Regulating the Hosts’ Gene Expression

    Directory of Open Access Journals (Sweden)

    Müşerref Duygu Saçar

    2014-10-01

Full Text Available MicroRNAs (miRNAs) were discovered two decades ago, yet there is still a great need for further studies elucidating their genesis and targeting in different phyla. Since experimental discovery and validation of miRNAs is difficult, computational predictions are indispensable, and today most computational approaches employ machine learning. Toxoplasma gondii, a parasite residing within the cells of its hosts, such as humans, uses miRNAs for its post-transcriptional gene regulation. It may also regulate its hosts’ gene expression, as has been shown in brain cancer. Since previous studies have shown that overexpressed miRNAs within the host are causal for disease onset, we hypothesized that T. gondii could export miRNAs into its host cell. We computationally predicted all hairpins from the genome of T. gondii and used mouse and human models to filter possible candidates. These were then further compared to known miRNAs in humans and rodents, and their expression was examined for T. gondii grown in mouse and human hosts, respectively. We found that among the millions of potential hairpins in T. gondii, only a few thousand pass filtering using a human or mouse model, and that even fewer of those are expressed. Since they are expressed and differentially expressed in rodents and humans, we suggest that there is a chance that T. gondii may export miRNAs into its hosts for direct regulation.

  10. Computational modelling and analysis of the molecular network regulating sporulation initiation in Bacillus subtilis.

    Science.gov (United States)

    Ihekwaba, Adaoha E C; Mura, Ivan; Barker, Gary C

    2014-10-24

Bacterial spores are important contaminants in food, and spore-forming bacteria are often implicated in food safety and food quality considerations. Spore formation is a complex developmental process involving the expression of more than 500 genes over the course of 6 to 8 hours. The process culminates in the formation of resting cells capable of resisting environmental extremes and remaining dormant for long periods of time, germinating when conditions promote further vegetative growth. Experimental observations of sporulation and germination are problematic and time-consuming, so reliable models are an invaluable asset in terms of prediction and risk assessment. In this report we develop a model which assists in the interpretation of sporulation dynamics. This paper defines and analyses a mathematical model for the network regulating Bacillus subtilis sporulation initiation, from sensing of sporulation signals down to the activation of the early genes under control of the master regulator Spo0A. Our model summarises and extends other published modelling studies by allowing the user to execute sporulation initiation in a scenario where isopropyl β-D-1-thiogalactopyranoside (IPTG) is used as an artificial sporulation initiator, as well as in modelling the induction of sporulation in wild-type cells. The analysis of the model results and the comparison with experimental data indicate that the model is good at predicting inducible responses to sporulation signals. However, the model is unable to reproduce the experimentally observed accumulation of phosphorelay sporulation proteins in wild-type B. subtilis. This model also highlights that the phosphorelay sub-component, which relays the signals detected by the sensor kinases to the master regulator Spo0A, is crucial in determining the response dynamics of the system. We show that there is a complex connectivity between the phosphorelay features and the master regulator Spo0A. Additionally, we discovered that the

  11. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

Full Text Available While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
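Models built in Cell Collective are logical (Boolean) networks. A toy synchronous Boolean network, invented here purely for illustration (it is not an actual Cell Collective model), shows the kind of dynamics students can build and break:

```python
# Toy three-node Boolean network with synchronous updates.
# The nodes and rules are hypothetical, chosen only to illustrate
# how a logical model is stepped forward in time.

def step(state):
    """Compute the next state: each node is a Boolean function
    of the current state."""
    a, b, c = state["A"], state["B"], state["C"]
    return {
        "A": not c,       # C inhibits A
        "B": a,           # A activates B
        "C": a and b,     # A and B jointly activate C
    }

state = {"A": True, "B": False, "C": False}
trajectory = [state]
for _ in range(6):
    state = step(state)
    trajectory.append(state)
```

This particular toy network never reaches a fixed point: the negative feedback through C drives it around a five-state limit cycle, the kind of behavior a student can discover by simulation.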

  12. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  13. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biology.

  14. Modeling and performance improvement of the constant power regulator systems in variable displacement axial piston pump.

    Science.gov (United States)

    Park, Sung Hwan; Lee, Ji Min; Kim, Jong Shik

    2013-01-01

An irregular performance of a mechanical-type constant power regulator is considered. In order to find the cause of an irregular discharge flow at the cut-off pressure area, modeling and numerical simulations are performed to observe the dynamic behavior of the internal parts of the constant power regulator system for a swashplate-type axial piston pump. The commercial numerical simulation software AMESim is applied to model the mechanical-type regulator with the hydraulic pump and to simulate its performance. The validity of the simulation model of the constant power regulator system is verified by comparing simulation results with experiments. In order to find the cause of the irregular performance of the mechanical-type constant power regulator system, the behavior of main components such as the spool, sleeve, and counterbalance piston is investigated using computer simulation. A shape modification of the counterbalance piston is proposed to improve the undesirable performance of the mechanical-type constant power regulator. The performance improvement is verified by computer simulation using AMESim software.

  15. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

NREL uses computational modeling to study plant cell walls, which are the source of biofuels and biomaterials, and applies quantum mechanical models of chemical and electronic properties and processes to reduce conversion barriers.

  16. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  17. A computational model for telomere-dependent cell-replicative aging.

    Science.gov (United States)

    Portugal, R D; Land, M G P; Svaiter, B F

    2008-01-01

Telomere shortening provides a molecular basis for the Hayflick limit. Recent data suggest that telomere shortening also influences the mitotic rate. We propose a stochastic growth model of this phenomenon, assuming that cell division in each time interval is a random process whose probability decreases linearly with telomere shortening. Computer simulations of the proposed stochastic telomere-regulated model provide a good approximation of the qualitative growth of cultured human mesenchymal stem cells.
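The model described in this record can be sketched in a few lines: in each time step a cell divides with a probability that falls linearly with the number of divisions it has already completed (a proxy for telomere shortening), stopping at the Hayflick limit. All parameter values below are illustrative, not the paper's fitted values.

```python
import random

def simulate(n_steps, hayflick=50, seed=1):
    """Stochastic, telomere-regulated population growth (toy version)."""
    random.seed(seed)
    cells = [0]  # each entry = divisions completed by that cell
    for _ in range(n_steps):
        next_gen = []
        for d in cells:
            p = max(0.0, 1.0 - d / hayflick)   # linear decline in division probability
            if d < hayflick and random.random() < p:
                next_gen += [d + 1, d + 1]     # divide into two daughters
            else:
                next_gen.append(d)             # quiescent this step
        cells = next_gen
    return cells

population = simulate(10)
```

Early on nearly every cell divides each step; as division counts accumulate, growth slows, reproducing the qualitative saturation seen in cultured cells.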

  18. Modeling and Performance Improvement of the Constant Power Regulator Systems in Variable Displacement Axial Piston Pump

    Science.gov (United States)

    Park, Sung Hwan; Lee, Ji Min; Kim, Jong Shik

    2013-01-01

An irregular performance of a mechanical-type constant power regulator is considered. In order to find the cause of an irregular discharge flow at the cut-off pressure area, modeling and numerical simulations are performed to observe the dynamic behavior of the internal parts of the constant power regulator system for a swashplate-type axial piston pump. The commercial numerical simulation software AMESim is applied to model the mechanical-type regulator with the hydraulic pump and to simulate its performance. The validity of the simulation model of the constant power regulator system is verified by comparing simulation results with experiments. In order to find the cause of the irregular performance of the mechanical-type constant power regulator system, the behavior of main components such as the spool, sleeve, and counterbalance piston is investigated using computer simulation. A shape modification of the counterbalance piston is proposed to improve the undesirable performance of the mechanical-type constant power regulator. The performance improvement is verified by computer simulation using AMESim software. PMID:24282389

  19. Modeling and Performance Improvement of the Constant Power Regulator Systems in Variable Displacement Axial Piston Pump

    Directory of Open Access Journals (Sweden)

    Sung Hwan Park

    2013-01-01

Full Text Available An irregular performance of a mechanical-type constant power regulator is considered. In order to find the cause of an irregular discharge flow at the cut-off pressure area, modeling and numerical simulations are performed to observe the dynamic behavior of the internal parts of the constant power regulator system for a swashplate-type axial piston pump. The commercial numerical simulation software AMESim is applied to model the mechanical-type regulator with the hydraulic pump and to simulate its performance. The validity of the simulation model of the constant power regulator system is verified by comparing simulation results with experiments. In order to find the cause of the irregular performance of the mechanical-type constant power regulator system, the behavior of main components such as the spool, sleeve, and counterbalance piston is investigated using computer simulation. A shape modification of the counterbalance piston is proposed to improve the undesirable performance of the mechanical-type constant power regulator. The performance improvement is verified by computer simulation using AMESim software.

  20. Government regulations and other influences on the medical use of computers.

    Science.gov (United States)

    Mishelevich, D J; Grams, R R; Mize, S G; Smith, J P

    1979-01-01

    This paper presents points brought out in a panel discussion held at the 12th Hawaiian International Conference on System Sciences, January 1979. The session was attended by approximately two dozen interested parties from various segments of the academic, government, and health care communities. The broad categories covered include the specific problems of government regulations and their impact on specific clinical information systems installed at The University of Texas Health Science Center at Dallas, opportunities in a regulated environment, problems in a regulated environment, vendor-related issues in the marketing and manufacture of computer-based information systems, rational approaches to government control, and specific issues related to medical computer science.

  1. Modeling and simulation of CANDU reactor and its regulating system

    Science.gov (United States)

    Javidnia, Hooman

Analytical computer codes are indispensable tools in the design, optimization, and control of nuclear power plants. Numerous codes have been developed to perform different types of analyses related to nuclear power plants. A large number of these codes are designed to perform safety analyses. In the context of safety analyses, the control system is often neglected. Although there are good reasons for such a decision, that does not mean that the study of control systems in nuclear power plants should be neglected altogether. In this thesis, a proof-of-concept code is developed as a tool that can be used in the design, optimization, and operation stages of the control system. The main objective in the design of this computer code is to provide a tool that is easy to use by its target audience and is capable of producing high-fidelity results that can be trusted to design the control system and optimize its performance. Since the overall plant control system covers a very wide range of processes, this thesis focuses on one particular module of the overall plant control system, namely the reactor regulating system. At the center of the reactor regulating system is the CANDU reactor. A nodal model for the reactor is used to represent the spatial neutronic kinetics of the core. The nodal model produces better results than the point kinetics model, which is often used in the design and analysis of control systems for nuclear reactors. The model can capture spatial effects to some extent, although it is not as detailed as finite difference methods. The criteria for choosing a nodal model of the core are: (1) the model should provide more detail than point kinetics and capture spatial effects; (2) it should not be so complex or detailed that it slows down the simulation and provides details that are extraneous or unnecessary for a control engineer.
Other than the reactor itself, there are auxiliary models that describe dynamics of different
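For contrast with the nodal approach favored in this thesis, the simpler point-kinetics model it improves upon can be sketched in a few lines (one delayed-neutron group, explicit Euler integration). The parameter values are generic illustrative numbers, not CANDU data.

```python
# Point-kinetics sketch: dn/dt = ((rho - BETA)/GEN_TIME) * n + LAMBDA * c
#                        dc/dt = (BETA/GEN_TIME) * n - LAMBDA * c
# Illustrative parameters only; a real reactor model would use
# measured kinetics data and several delayed-neutron groups.

BETA = 0.0065      # delayed-neutron fraction
LAMBDA = 0.08      # precursor decay constant (1/s)
GEN_TIME = 1e-3    # neutron generation time (s)

def simulate(rho, t_end, dt=1e-4):
    """Integrate normalized power n for a constant reactivity rho."""
    n = 1.0
    c = BETA * n / (LAMBDA * GEN_TIME)  # equilibrium precursor level
    t = 0.0
    while t < t_end:
        dn = ((rho - BETA) / GEN_TIME) * n + LAMBDA * c
        dc = (BETA / GEN_TIME) * n - LAMBDA * c
        n += dn * dt
        c += dc * dt
        t += dt
    return n

# A small positive reactivity step produces a prompt jump followed by
# slow growth on the stable period.
power = simulate(rho=0.001, t_end=1.0)
```

A nodal model replaces the single amplitude `n` with coupled equations per core region, which is what lets it capture the spatial effects discussed above.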

  2. Teacher regulation of multiple computer-supported collaborating groups

    NARCIS (Netherlands)

    Van Leeuwen, Anouschka; Janssen, Jeroen; Erkens, Gijsbert; Brekelmans, Mieke

    2015-01-01

    Teachers regulating groups of students during computer-supported collaborative learning (CSCL) face the challenge of orchestrating their guidance at student, group, and class level. During CSCL, teachers can monitor all student activity and interact with multiple groups at the same time. Not much is

  3. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community

  4. Computer modeling for optimal placement of gloveboxes

    International Nuclear Information System (INIS)

    Hench, K.W.; Olivas, J.D.; Finch, P.R.

    1997-08-01

Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.
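The first stage of the approach in this record (an evolutionary heuristic searching layouts that minimize material movement) can be caricatured as a quadratic-assignment-style search. The flow matrix, line-of-slots distance metric, and mutate-and-select loop below are invented for illustration; the paper's formulation and heuristic are richer.

```python
import random

# Material flow between four hypothetical process steps (symmetric,
# invented numbers). Placing heavily-coupled steps in adjacent slots
# reduces total movement.
FLOW = [[0, 5, 2, 0],
        [5, 0, 3, 1],
        [2, 3, 0, 4],
        [0, 1, 4, 0]]

def cost(order):
    """Total movement when step order[k] occupies slot k of a line
    (distance between slots k and l is |k - l|)."""
    n = len(order)
    return sum(FLOW[order[k]][order[l]] * abs(k - l)
               for k in range(n) for l in range(n))

def search(generations=200, seed=0):
    """Evolutionary-style (1+1) search: mutate by swapping two slots,
    keep the child if it is no worse."""
    random.seed(seed)
    best = list(range(4))
    random.shuffle(best)
    for _ in range(generations):
        child = best[:]
        i, j = random.sample(range(4), 2)
        child[i], child[j] = child[j], child[i]
        if cost(child) <= cost(best):
            best = child
    return best, cost(best)

layout, movement = search()
```

The second stage of the paper's approach would then feed layouts like `layout` into a discrete-event simulation to score radiation exposure and throughput; that stage is beyond this sketch.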

  5. Computer modeling for optimal placement of gloveboxes

    Energy Technology Data Exchange (ETDEWEB)

    Hench, K.W.; Olivas, J.D. [Los Alamos National Lab., NM (United States); Finch, P.R. [New Mexico State Univ., Las Cruces, NM (United States)

    1997-08-01

Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.

  6. Reverse Engineering the Inflammatory "Clock": From Computational Modeling to Rational Resetting.

    Science.gov (United States)

    Vodovotz, Yoram

    2016-01-01

    Properly-regulated inflammation is central to homeostasis. Traumatic injury, hemorrhagic shock, septic shock, and other injury-related processes such as wound healing are associated with dysregulated inflammation. Like many biological processes, inflammation is a dynamic, complex system whose function, like that of an analog clock, cannot be discerned simply from a laundry list of its parts (data). The advent of multiplexed platforms for gathering biological data, while providing an unprecedented level of detailed information about the inflammatory response, has paradoxically also proven to be overwhelming. This problem is especially acute when the datasets involve time courses, since typical statistical analyses and data-driven modeling are geared towards single time points. Various groups have addressed this problem using dynamic approaches to data-driven and mechanistic computational modeling. These modeling tools can be thought of as the "gears" and "hands" of the "clock," and have led to insights regarding principal drivers, dynamic networks, feedbacks, and regulatory switches that characterize and perhaps regulate the inflammatory response. In parallel, mechanistic computational models have given an abstracted sense of how the inflammatory "clock" works, leading to in silico models of critically ill individuals and populations. Integrating data-driven and mechanistic modeling may point the way to a rational "resetting" of inflammation via model-driven precision medicine.

  7. A computational analysis of the long-term regulation of arterial pressure.

    Science.gov (United States)

    Beard, Daniel A; Pettersen, Klas H; Carlson, Brian E; Omholt, Stig W; Bugenhagen, Scott M

    2013-01-01

    The asserted dominant role of the kidneys in the chronic regulation of blood pressure and in the etiology of hypertension has been debated since the 1970s. At the center of the theory is the observation that the acute relationships between arterial pressure and urine production-the acute pressure-diuresis and pressure-natriuresis curves-physiologically adapt to perturbations in pressure and/or changes in the rate of salt and volume intake. These adaptations, modulated by various interacting neurohumoral mechanisms, result in chronic relationships between water and salt excretion and pressure that are much steeper than the acute relationships. While the view that renal function is the dominant controller of arterial pressure has been supported by computer models of the cardiovascular system known as the "Guyton-Coleman model", no unambiguous description of a computer model capturing chronic adaptation of acute renal function in blood pressure control has been presented. Here, such a model is developed with the goals of: 1. representing the relevant mechanisms in an identifiable mathematical model; 2. identifying model parameters using appropriate data; 3. validating model predictions in comparison to data; and 4. probing hypotheses regarding the long-term control of arterial pressure and the etiology of primary hypertension. The developed model reveals: long-term control of arterial blood pressure is primarily through the baroreflex arc and the renin-angiotensin system; and arterial stiffening provides a sufficient explanation for the etiology of primary hypertension associated with ageing. Furthermore, the model provides the first consistent explanation of the physiological response to chronic stimulation of the baroreflex.
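A toy illustration (emphatically not the authors' model) of the abstract's central observation: the acute pressure-natriuresis relation U = k(P - P0) has a shallow slope, but a slow adapting loop moves the set point P0 until pressure returns toward a reference, making the chronic pressure-intake relation far steeper. All constants are invented.

```python
# Acute relation: U = K * (P - P0). A slow loop drifts the set point P0
# until pressure returns to P_REF, so the chronic curve is near-vertical.
K = 1.0        # acute pressure-natriuresis slope (invented units)
P_REF = 100.0  # mmHg, pressure restored by slow neurohumoral mechanisms
A, B, DT = 0.5, 0.05, 0.01  # fast gain, slow adaptation rate, time step

def simulate(intake, steps, adapt, p, p0):
    """Euler integration of pressure P and slowly adapting set point P0."""
    for _ in range(steps):
        u = K * (p - p0)                   # acute pressure-natriuresis
        p += DT * A * (intake - u)         # volume loading raises pressure
        if adapt:
            p0 -= DT * B * (p - P_REF)     # slow set-point adaptation
    return p, p0

p0_base = P_REF - 1.0 / K  # steady state at baseline intake = 1
p_acute, _ = simulate(4.0, 2000, False, P_REF, p0_base)     # hours scale
p_chronic, _ = simulate(4.0, 200000, True, P_REF, p0_base)  # weeks scale
print(p_acute, p_chronic)  # pressure rises ~3 mmHg acutely, ~0 chronically
```

Stepping salt intake from 1 to 4 raises pressure by intake/K along the acute curve, but the adapted (chronic) steady state returns to the reference pressure: the chronic curve is effectively vertical, the "infinite gain" idea associated with the Guyton-Coleman tradition.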

  8. Dreaming Your Fear Away: A Computational Model for Fear Extinction Learning During Dreaming

    NARCIS (Netherlands)

Treur, J.; Lu, B.L.; et al.

    2011-01-01

In this paper a computational model is presented of how dreaming is used to learn fear extinction. The approach addresses dreaming as internal simulation incorporating memory elements in the form of sensory representations and their associated fear. During dream episodes, regulation of fear ...

  9. Internal cholinergic regulation of learning and recall in a model of olfactory processing

    Directory of Open Access Journals (Sweden)

    Licurgo Benemann Almeida

    2016-11-01

In the olfactory system, cholinergic modulation has been associated with contrast modulation and changes in receptive fields in the olfactory bulb, as well as the learning of odor associations in olfactory cortex. Computational modeling and behavioral studies suggest that cholinergic modulation could improve sensory processing and learning while preventing proactive interference when task demands are high. However, how sensory inputs and/or learning regulate incoming modulation has not yet been elucidated. We here use a computational model of the olfactory bulb, piriform cortex (PC) and horizontal limb of the diagonal band of Broca (HDB) to explore how olfactory learning could regulate cholinergic inputs to the system in a closed feedback loop. In our model, the novelty of an odor is reflected in the firing rates and sparseness of cortical neurons in response to that odor, and these firing rates can directly regulate learning in the system by modifying cholinergic inputs to the system. In the model, cholinergic neurons reduce their firing in response to familiar odors, reducing plasticity in the PC, but increase their firing in response to novel odors, increasing PC plasticity. Recordings from HDB neurons in awake behaving rats reflect predictions from the model by showing that a subset of neurons decrease their firing as an odor becomes familiar.
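The closed feedback loop described here can be caricatured in a few lines (a deliberately minimal sketch, not the published network model): cortical novelty drives cholinergic (ACh) output, ACh gates plasticity, and learning in turn reduces novelty, so ACh declines as an odor becomes familiar.

```python
def present_odor(trials=20, eta=0.5):
    """Toy closed loop: novelty -> ACh -> gated plasticity -> familiarity.
    All quantities are dimensionless abstractions, not measured rates."""
    familiarity = 0.0
    ach_trace = []
    for _ in range(trials):
        novelty = 1.0 - familiarity   # strong, dense cortical response when novel
        ach = novelty                 # HDB cholinergic drive tracks novelty
        familiarity += eta * ach * (1.0 - familiarity)  # ACh gates learning
        ach_trace.append(ach)
    return ach_trace, familiarity

ach, fam = present_odor()
print(ach[0], ach[-1], fam)  # ACh falls across presentations; familiarity rises
```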

  10. An agent-based model for integrated emotion regulation and contagion in socially affected decision making

    OpenAIRE

    Manzoor, A.; Treur, J.

    2015-01-01

    This paper addresses an agent-based computational social agent model for the integration of emotion regulation, emotion contagion and decision making in a social context. The model integrates emotion-related valuing, in order to analyse the role of emotions in socially affected decision making. The agent-based model is illustrated for the interaction between two persons. Simulation experiments for different kinds of scenarios help to understand how decisions can be affected by regulating the ...

  11. Fast Performance Computing Model for Smart Distributed Power Systems

    Directory of Open Access Journals (Sweden)

    Umair Younas

    2017-06-01

Plug-in Electric Vehicles (PEVs) are becoming a more prominent alternative to fossil-fuel car technology due to their significant role in Greenhouse Gas (GHG) reduction, flexible storage, and ancillary service provision as a Distributed Generation (DG) resource in Vehicle-to-Grid (V2G) regulation mode. However, large-scale penetration of PEVs and the growing demand of energy-intensive Data Centers (DCs) bring undesirable load peaks in electricity demand and hence impose supply-demand imbalance and threaten the reliability of the wholesale and retail power market. To overcome these challenges, the proposed research considers a smart Distributed Power System (DPS) comprising conventional sources, renewable energy, V2G regulation, and flexible storage energy resources. Moreover, price- and incentive-based Demand Response (DR) programs are implemented to sustain the balance between net demand and available generating resources in the DPS. In addition, a novel strategy is adapted to implement the computationally intensive jobs of the proposed DPS model, including incoming load profiles, V2G regulation, battery State of Charge (SOC) indication, and fast computation in the decision-based automated DR algorithm, using the fast performance computing resources of DCs. In response, the DPS provides economical and stable power to DCs under strict power quality constraints. Finally, the results are verified using a case study of ISO California integrated with hybrid generation.
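A hedged sketch of the kind of dispatch rule such a DR scheme implies (the priority order and all numbers are invented, not taken from the paper): cover any shortfall first from V2G discharge, then by deferring flexible data-center load.

```python
def balance_step(demand, generation, v2g_capacity, flexible_load):
    """One dispatch step of a toy demand-response rule: meet a shortfall
    first with V2G discharge, then by deferring flexible DC load.
    Returns (v2g_used, load_deferred, unmet), all in the same power units."""
    shortfall = max(0.0, demand - generation)
    v2g_used = min(shortfall, v2g_capacity)
    shortfall -= v2g_used
    deferred = min(shortfall, flexible_load)
    shortfall -= deferred
    return v2g_used, deferred, shortfall

print(balance_step(120, 100, 15, 10))  # 20 MW short: 15 from V2G, 5 deferred
```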

  12. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  13. Computer Simulation of Noise Effects of the Neighborhood of Stimulus Threshold for a Mathematical Model of Homeostatic Regulation of Sleep-Wake Cycles

    Directory of Open Access Journals (Sweden)

    Wuyin Jin

    2017-01-01

This work studies noise effects near the stimulus threshold in a mathematical model of homeostatic sleep-wake regulation governed by the spatiotemporal behavior of hypocretin/orexin neurons and local glutamate interneurons. Neuronal noise added to the stimulus, to the conductance, and to the activation variable of the modulation function is investigated, based on a skewed sinusoidal circadian input. The simulation results suggest that increasing the amplitude of the external current input advances the waking time while leaving sleep onset unchanged; for larger conductance and modulation noise, the regulatory mechanism of the model sometimes collapses and the two coupled neurons show very irregular activity, with sleep-wake transitions occurring at indeterminate times.

  14. Computer Models in Biomechanics From Nano to Macro

    CERN Document Server

    Kuhl, Ellen

    2013-01-01

    This book contains a collection of papers that were presented at the IUTAM Symposium on “Computer Models in Biomechanics: From Nano to Macro” held at Stanford University, California, USA, from August 29 to September 2, 2011. It contains state-of-the-art papers on: - Protein and Cell Mechanics: coarse-grained model for unfolded proteins, collagen-proteoglycan structural interactions in the cornea, simulations of cell behavior on substrates - Muscle Mechanics: modeling approaches for Ca2+–regulated smooth muscle contraction, smooth muscle modeling using continuum thermodynamical frameworks, cross-bridge model describing the mechanoenergetics of actomyosin interaction, multiscale skeletal muscle modeling - Cardiovascular Mechanics: multiscale modeling of arterial adaptations by incorporating molecular mechanisms, cardiovascular tissue damage, dissection properties of aortic aneurysms, intracranial aneurysms, electromechanics of the heart, hemodynamic alterations associated with arterial remodeling followin...

  15. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  16. Mesh influence on the fire computer modeling in nuclear power plants

    Directory of Open Access Journals (Sweden)

    D. Lázaro

    2018-04-01

Fire computer models allow the consequences of real fire scenarios to be studied. Their use in nuclear power plants has increased with the new regulations that apply risk-informed, performance-based methods to the analysis and design of fire safety solutions. The selection of the cell size is very important in these kinds of models: the mesh must strike a compromise between fitting the geometry, resolving the governing equations, and keeping computation times reasonable. This paper studies the impact of several cell sizes, using the fire computer model FDS, on the final simulation results. To that end, we employ several scenarios of interest for nuclear power plants. The conclusions offer relevant data for users and identify cell sizes that can be selected to guarantee the quality of the simulations and reduce the uncertainty of the results.
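For context on how such cell sizes are commonly chosen (this is standard FDS guidance, not a result of this paper): the FDS User's Guide characterizes mesh resolution by the ratio D*/dx, where D* is the characteristic fire diameter, and ratios of roughly 4 to 16 are often cited as moderate-to-fine resolution.

```python
# Characteristic fire diameter D* = (Q / (rho_inf * cp * T_inf * sqrt(g)))^(2/5)
# with Q in kW, per the FDS User's Guide. Ambient values are standard assumptions.
RHO_INF = 1.204   # kg/m^3, ambient air density
CP = 1.005        # kJ/(kg K), specific heat of air
T_INF = 293.0     # K, ambient temperature
G = 9.81          # m/s^2

def d_star(q_kw):
    """Characteristic fire diameter (m) for heat release rate q_kw (kW)."""
    return (q_kw / (RHO_INF * CP * T_INF * G ** 0.5)) ** 0.4

def cell_size_range(q_kw, coarse=4.0, fine=16.0):
    """Cell sizes (m) bracketing the often-cited D*/dx range of 4-16.
    Returns (finest, coarsest) recommended dx."""
    d = d_star(q_kw)
    return d / fine, d / coarse

print(d_star(1000.0))  # ~0.96 m for a 1 MW fire
```

For a 1 MW fire this gives recommended cell sizes of roughly 6 cm to 24 cm, which is the order of magnitude at which mesh-sensitivity studies like the one above are typically run.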

  17. Computational Modeling of Oxygen Transport in the Microcirculation: From an Experiment-Based Model to Theoretical Analyses

    OpenAIRE

    Lücker, Adrien

    2017-01-01

    Oxygen supply to cells by the cardiovascular system involves multiple physical and chemical processes that aim to satisfy fluctuating metabolic demand. Regulation mechanisms range from increased heart rate to minute adaptations in the microvasculature. The challenges and limitations of experimental studies in vivo make computational models an invaluable complement. In this thesis, oxygen transport from capillaries to tissue is investigated using a new numerical model that is tailored for vali...

  18. A computational model predicting disruption of blood vessel development.

    Directory of Open Access Journals (Sweden)

    Nicole Kleinstreuer

    2013-04-01

Vascular development is a complex process regulated by dynamic biological networks that vary in topology and state across different tissues and developmental stages. Signals regulating de novo blood vessel formation (vasculogenesis) and remodeling (angiogenesis) come from a variety of biological pathways linked to endothelial cell (EC) behavior, extracellular matrix (ECM) remodeling, and the local generation of chemokines and growth factors. Simulating these interactions at a systems level requires sufficient biological detail about the relevant molecular pathways and associated cellular behaviors, and tractable computational models that offset mathematical and biological complexity. Here, we describe a novel multicellular agent-based model of vasculogenesis using the CompuCell3D (http://www.compucell3d.org/) modeling environment supplemented with semi-automatic knowledgebase creation. The model incorporates vascular endothelial growth factor signals, pro- and anti-angiogenic inflammatory chemokine signals, and the plasminogen activating system of enzymes and proteases linked to ECM interactions, to simulate nascent EC organization, growth and remodeling. The model was shown to recapitulate stereotypical capillary plexus formation and structural emergence of non-coded cellular behaviors, such as a heterologous bridging phenomenon linking endothelial tip cells together during formation of polygonal endothelial cords. Molecular targets in the computational model were mapped to signatures of vascular disruption derived from in vitro chemical profiling using the EPA's ToxCast high-throughput screening (HTS) dataset. Simulating the HTS data with the cell-agent based model of vascular development predicted adverse effects of a reference anti-angiogenic thalidomide analog, 5HPP-33, on in vitro angiogenesis with respect to both concentration-response and morphological consequences. These findings support the utility of cell agent-based models for simulating a ...

  19. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    Models are playing important roles in design and analysis of chemicals based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working...... development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines....... In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse...

  20. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

Computer profiling is used for computer forensic analysis; this paper proposes and elaborates a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...

  1. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.
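To make the dual-rail encoding mentioned above concrete, here is a minimal single-photon sketch (one illustrative convention among several; beam-splitter phase conventions vary): the photon occupies one of two modes, |0⟩ = upper rail, |1⟩ = lower rail, and a 50:50 beam splitter mixes the rails, so a Mach-Zehnder interferometer enacts a phase-dependent single-qubit rotation.

```python
import numpy as np

# Dual-rail qubit: amplitudes over (upper rail, lower rail) for one photon.
# 50:50 beam splitter in the real "Hadamard" convention (one common choice).
BS = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

def mach_zehnder(phi):
    """Unitary of a Mach-Zehnder interferometer with phase phi in one arm."""
    phase = np.diag([1.0, np.exp(1j * phi)])
    return BS @ phase @ BS

psi_in = np.array([1.0, 0.0])  # photon enters the upper rail
for phi in (0.0, np.pi / 2, np.pi):
    p_upper = abs(mach_zehnder(phi) @ psi_in)[0] ** 2
    print(phi, p_upper)  # interference fringe: cos^2(phi/2)
```

The detection probability in the upper rail follows cos²(φ/2), the basic interference on which these circuit-model implementations build.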

  2. An agent-based model for integrated emotion regulation and contagion in socially affected decision making

    NARCIS (Netherlands)

    Manzoor, A.; Treur, J.

    2015-01-01

    This paper addresses an agent-based computational social agent model for the integration of emotion regulation, emotion contagion and decision making in a social context. The model integrates emotion-related valuing, in order to analyse the role of emotions in socially affected decision making. The

  3. Overhead Crane Computer Model

    Science.gov (United States)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

The paper describes a computer model of an overhead crane system. The overhead crane system consists of hoisting, trolley and crane mechanisms as well as a two-axis payload system. Using the differential equations of motion for these mechanisms, derived through the Lagrange equation of the second kind, an overhead crane computer model can be built. The computer model was implemented in Matlab. Transients of the coordinate, linear speed and motor torque of the trolley and crane mechanism systems were simulated. In addition, transients of payload sway about the vertical axis were obtained. A trajectory of the trolley mechanism operating simultaneously with the crane mechanism is presented in the paper, as well as a two-axis trajectory of the payload. The designed computer model of an overhead crane is a useful tool for studying positioning control and anti-sway control systems.
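A minimal planar sketch of the payload-sway dynamics such a model contains (not the paper's full Lagrangian system; rope length and the pendulum reduction are illustrative assumptions): treating the payload as a pendulum hanging from the accelerating trolley gives θ'' = -(g/L)·sin θ - (a_x/L)·cos θ.

```python
import math

G = 9.81  # m/s^2
L = 2.5   # m, hypothetical rope length

def sway_rhs(theta, omega, trolley_acc):
    """Planar payload sway: theta'' = -(g/L) sin(theta) - (a_x/L) cos(theta)."""
    return omega, -(G / L) * math.sin(theta) - (trolley_acc / L) * math.cos(theta)

def simulate(theta0=0.05, t_end=10.0, dt=1e-3, trolley_acc=0.0):
    """Fixed-step RK4 integration; returns (times, sway angles)."""
    theta, omega, t = theta0, 0.0, 0.0
    ts, thetas = [], []
    while t < t_end:
        k1 = sway_rhs(theta, omega, trolley_acc)
        k2 = sway_rhs(theta + 0.5 * dt * k1[0], omega + 0.5 * dt * k1[1], trolley_acc)
        k3 = sway_rhs(theta + 0.5 * dt * k2[0], omega + 0.5 * dt * k2[1], trolley_acc)
        k4 = sway_rhs(theta + dt * k3[0], omega + dt * k3[1], trolley_acc)
        theta += dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
        omega += dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
        ts.append(t)
        thetas.append(theta)
        t += dt
    return ts, thetas

ts, thetas = simulate()
print(max(thetas))  # free small-angle sway with period ~ 2*pi*sqrt(L/g)
```

For small angles and a stationary trolley the sway period matches the pendulum value 2π√(L/g), a quick sanity check before adding trolley acceleration profiles for anti-sway studies.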

  4. Improving self-regulated learning junior high school students through computer-based learning

    Science.gov (United States)

    Nurjanah; Dahlan, J. A.

    2018-05-01

This study is motivated by the importance of self-regulated learning as an affective aspect that determines students' success in learning mathematics. The purpose of this research is to examine the improvement of junior high school students' self-regulated learning through computer-based learning, both overall and by school level. A quasi-experimental method was used, because the individual subjects were not randomly selected. The research design is the Pretest-and-Posttest Control Group Design. Subjects in this study were grade VIII junior high school students in Bandung, taken from high school (A) and middle school (B). The results show that the increase in self-regulated learning of students who received computer-based learning is higher than that of students who received conventional learning. School-level factors have a significant effect on the increase in students' self-regulated learning.

  5. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    NARCIS (Netherlands)

    Molenaar, I.; Roda, Claudia; van Boxtel, Carla A.M.; Sleegers, P.J.C.

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N = 56) are supported with computer-generated scaffolds and students in the control condition (N =

  6. A Computational Model of Hydraulic Volume Displacement Drive

    Directory of Open Access Journals (Sweden)

    V. N. Pil'gunov

    2014-01-01

The paper offers a computational model of an industrial-purpose hydraulic drive with two adjustable working-chamber hydraulic volume machines (pump and motor). An adjustable pump equipped with a pressure control unit can run together with several adjustable hydraulic motors on the principle of a three-phase hydraulic socket-outlet with high-pressure lines, drain, and drainage system. The paper considers the pressure-controlled hydrostatic transmission with the hydraulic motor as an output link. It shows the possibility of creating an energy-saving hydraulic drive using a functional tie between the adjusting parameters of the pump and hydraulic motor through the pressure difference, torque, and angular rate of the hydraulic motor shaft rotation; a programmable logic controller can implement such a tie. Coulomb and viscous friction are taken into consideration in developing the computational model of the hydraulic volume displacement drive. The discharge balance considers external and internal leakages in equivalent clearances of the hydraulic machines, as well as compression loss volume caused by hydraulic fluid compressibility and deformation of pipe walls. To correct the dynamic properties of the hydraulic drive, the paper proposes including in the discharge balance additional regulated external leakages in the open circuit of the hydraulic drive and regulated internal leakages in the closed-loop circuit. Generalized differential equations with functional multipliers and a multilinked nature have been obtained to describe the operation of a hydraulic positioning and speed drive with two adjustable working-chamber machines. It is shown that the proposed computational model can be taken into consideration in the development of LS («Load-Sensing») drives, in which the pumping pressure is tuned to the value required for the most loaded slave motor to overcome its load. Results attained can be used both in designing the industrial-purpose heavy ...

  7. Computational Models and Emergent Properties of Respiratory Neural Networks

    Science.gov (United States)

    Lindsey, Bruce G.; Rybak, Ilya A.; Smith, Jeffrey C.

    2012-01-01

    Computational models of the neural control system for breathing in mammals provide a theoretical and computational framework bringing together experimental data obtained from different animal preparations under various experimental conditions. Many of these models were developed in parallel and iteratively with experimental studies and provided predictions guiding new experiments. This data-driven modeling approach has advanced our understanding of respiratory network architecture and neural mechanisms underlying generation of the respiratory rhythm and pattern, including their functional reorganization under different physiological conditions. Models reviewed here vary in neurobiological details and computational complexity and span multiple spatiotemporal scales of respiratory control mechanisms. Recent models describe interacting populations of respiratory neurons spatially distributed within the Bötzinger and pre-Bötzinger complexes and rostral ventrolateral medulla that contain core circuits of the respiratory central pattern generator (CPG). Network interactions within these circuits along with intrinsic rhythmogenic properties of neurons form a hierarchy of multiple rhythm generation mechanisms. The functional expression of these mechanisms is controlled by input drives from other brainstem components, including the retrotrapezoid nucleus and pons, which regulate the dynamic behavior of the core circuitry. The emerging view is that the brainstem respiratory network has rhythmogenic capabilities at multiple levels of circuit organization. This allows flexible, state-dependent expression of different neural pattern-generation mechanisms under various physiological conditions, enabling a wide repertoire of respiratory behaviors. Some models consider control of the respiratory CPG by pulmonary feedback and network reconfiguration during defensive behaviors such as cough. Future directions in modeling of the respiratory CPG are considered. PMID:23687564

  8. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

Jin Joo Lee

    2013-12-01

We present a computational model capable of predicting, above human accuracy, the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
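To illustrate the mechanics behind "emulating different levels of trust with HMMs" (all states, cue labels, and probabilities below are invented for illustration, not the paper's learned parameters): fit or posit one HMM per trust level over a discrete alphabet of nonverbal cues, then classify a cue sequence by which model assigns it higher likelihood via the forward algorithm.

```python
import numpy as np

# Toy discrete HMMs for "trusting" vs "distrusting" interaction styles.
# Observation alphabet of cues of the kind reported in this literature
# (0 = neutral gesture, 1 = arms crossed, 2 = leaning backward).
CUES = {"neutral": 0, "arms_crossed": 1, "lean_back": 2}

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of an observation sequence under a discrete HMM
    (forward algorithm with per-step scaling for numerical stability)."""
    alpha = start * emit[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

start = np.array([0.5, 0.5])
trans = np.array([[0.8, 0.2], [0.2, 0.8]])
# Hidden states: 0 = relaxed, 1 = guarded. The "distrust" model emits
# trust-lowering cues far more often from its guarded state.
emit_trust = np.array([[0.8, 0.1, 0.1], [0.6, 0.2, 0.2]])
emit_distrust = np.array([[0.5, 0.25, 0.25], [0.1, 0.45, 0.45]])

seq = [CUES[c] for c in ["arms_crossed", "lean_back", "arms_crossed",
                         "lean_back", "neutral", "arms_crossed"]]
print(forward_loglik(seq, start, trans, emit_trust),
      forward_loglik(seq, start, trans, emit_distrust))
```

A cue-heavy sequence scores higher under the "distrust" model; inspecting the learned transition structure is what yields the sequence-based temporal features the abstract mentions.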

  9. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long-duration spaceflight. Over the past decade, DAP developed models to provide insights into spaceflight-related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, and bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long-duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short- and long-duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.

  10. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    Science.gov (United States)

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  11. Regulation of flow computers for the measurement of biofuels

    Science.gov (United States)

    Almeida, R. O.; Aguiar Júnior, E. A.; Costa-Felix, R. P. B.

    2018-03-01

    This article aims to discuss the need to develop a standard or regulation applicable to flow computers in the measurement of biofuels. International standards and recommendations are presented which are possibly adequate to fill this gap and at the end of the article a way is proposed to obtain a single document on the subject.

  12. Eukaryotic Cell Cycle as a Test Case for Modeling Cellular Regulation in a Collaborative Problem-Solving Environment

    Science.gov (United States)

    2007-03-01

computer models of cell cycle regulation in a variety of organisms, including yeast cells, amphibian embryos, bacterial cells and human cells. These ... and meiosis), but they do not nullify the central role played by irreversible, alternating START and FINISH transitions in the cell cycle. (AFRL-IF-RS-TR-2007-69, Final Technical Report, March 2007.)

  13. Computational modeling identifies key gene regulatory interactions underlying phenobarbital-mediated tumor promotion

    Science.gov (United States)

    Luisier, Raphaëlle; Unterberger, Elif B.; Goodman, Jay I.; Schwarz, Michael; Moggs, Jonathan; Terranova, Rémi; van Nimwegen, Erik

    2014-01-01

    Gene regulatory interactions underlying the early stages of non-genotoxic carcinogenesis are poorly understood. Here, we have identified key candidate regulators of phenobarbital (PB)-mediated mouse liver tumorigenesis, a well-characterized model of non-genotoxic carcinogenesis, by applying a new computational modeling approach to a comprehensive collection of in vivo gene expression studies. We have combined our previously developed motif activity response analysis (MARA), which models gene expression patterns in terms of computationally predicted transcription factor binding sites with singular value decomposition (SVD) of the inferred motif activities, to disentangle the roles that different transcriptional regulators play in specific biological pathways of tumor promotion. Furthermore, transgenic mouse models enabled us to identify which of these regulatory activities was downstream of constitutive androstane receptor and β-catenin signaling, both crucial components of PB-mediated liver tumorigenesis. We propose novel roles for E2F and ZFP161 in PB-mediated hepatocyte proliferation and suggest that PB-mediated suppression of ESR1 activity contributes to the development of a tumor-prone environment. Our study shows that combining MARA with SVD allows for automated identification of independent transcription regulatory programs within a complex in vivo tissue environment and provides novel mechanistic insights into PB-mediated hepatocarcinogenesis. PMID:24464994
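The MARA-plus-SVD combination described above can be sketched on synthetic data (a hedged toy version, not the authors' penalized implementation): model expression E (promoters × samples) as predicted binding-site counts N (promoters × motifs) times unknown motif activities A (motifs × samples), infer A by least squares, then use SVD of the inferred activities to separate independent regulatory programs.

```python
import numpy as np

# Synthetic data with two underlying "regulatory programs" baked in.
rng = np.random.default_rng(0)
P, M, S = 200, 10, 8  # promoters, motifs, samples
N = rng.poisson(2.0, size=(P, M)).astype(float)          # site counts
A_true = rng.normal(size=(M, 2)) @ rng.normal(size=(2, S))  # rank-2 activities
E = N @ A_true + 1e-3 * rng.normal(size=(P, S))          # expression + noise

# MARA-style step: infer motif activities from E = N @ A.
A_hat, *_ = np.linalg.lstsq(N, E, rcond=None)

# SVD step: singular vectors of the activity matrix expose independent
# transcriptional programs; here the first two singular values dominate.
U, s, Vt = np.linalg.svd(A_hat)
print(s[:4])
```

In the real analysis the left singular vectors group motifs acting together and the right singular vectors give each program's profile across conditions, which is how regulator roles were disentangled.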

  14. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, medical imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  15. Thermodynamic state ensemble models of cis-regulation.

    Directory of Open Access Journals (Sweden)

    Marc S Sherman

    Full Text Available A major goal in computational biology is to develop models that accurately predict a gene's expression from its surrounding regulatory DNA. Here we present one class of such models, thermodynamic state ensemble models. We describe the biochemical derivation of the thermodynamic framework in simple terms, and lay out the mathematical components that comprise each model. These components include (1) the possible states of a promoter, where a state is defined as a particular arrangement of transcription factors bound to a DNA promoter, (2) the binding constants that describe the affinity of the protein-protein and protein-DNA interactions that occur in each state, and (3) whether each state is capable of transcribing. Using these components, we demonstrate how to compute a cis-regulatory function that encodes the probability of a promoter being active. Our intention is to provide enough detail so that readers with little background in thermodynamics can compose their own cis-regulatory functions. To facilitate this goal, we also describe a matrix form of the model that can be easily coded in any programming language. This formalism has great flexibility, which we show by illustrating how phenomena such as competition between transcription factors and cooperativity are readily incorporated into these models. Using this framework, we also demonstrate that Michaelis-like functions, another class of cis-regulatory models, are a subset of the thermodynamic framework with specific assumptions. By recasting Michaelis-like functions as thermodynamic functions, we emphasize the relationship between these models and delineate the specific circumstances representable by each approach. Application of thermodynamic state ensemble models is likely to be an important tool in unraveling the physical basis of combinatorial cis-regulation and in generating formalisms that accurately predict gene expression from DNA sequence.
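The state-ensemble formalism described in this abstract is straightforward to code. The sketch below is an illustration of the general idea, not the authors' implementation: each state is an arrangement of bound factors, its statistical weight is the product of binding constant times concentration over the bound factors (the empty state has weight 1), and the cis-regulatory function is the total weight of transcribing states divided by the total weight of all states. The two-factor promoter at the end is a hypothetical example.

```python
from itertools import product

def promoter_activity(tf_concs, binding_consts, is_active):
    """Probability that the promoter is transcriptionally active.

    tf_concs: {TF name: concentration}
    binding_consts: {TF name: binding constant}
    is_active: predicate on the set of bound TFs, True if that
               arrangement is capable of transcribing
    """
    tfs = list(tf_concs)
    z_active = z_total = 0.0
    # Enumerate all 2^n occupancy states of the promoter.
    for occupancy in product([False, True], repeat=len(tfs)):
        bound = frozenset(t for t, b in zip(tfs, occupancy) if b)
        # Statistical weight: product of K * [TF] over bound factors.
        w = 1.0
        for t in bound:
            w *= binding_consts[t] * tf_concs[t]
        z_total += w
        if is_active(bound):
            z_active += w
    return z_active / z_total

# Hypothetical two-factor promoter: activator A switches transcription
# on; repressor R silences it regardless of A.
p = promoter_activity({"A": 2.0, "R": 0.5}, {"A": 1.0, "R": 4.0},
                      lambda bound: "A" in bound and "R" not in bound)
```

With only the activator present, the function reduces to K_A[A] / (1 + K_A[A]), a Michaelis-like form, which illustrates the subset relationship the abstract describes.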

  16. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    Science.gov (United States)

    Molenaar, Inge; Roda, Claudia; van Boxtel, Carla; Sleegers, Peter

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N=56) are supported with computer-generated scaffolds and students in the control condition (N=54) do not receive scaffolds. The scaffolds are…

  17. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for developing asynchronous and concurrent programs compositionally, which frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and develop compositionally. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for controlling and operating can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have three terminals and four ordered rules at most, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can briefly express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it most applicable to bit-level computation compositionally, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our research may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.
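The dataflow picture in this abstract — serial bit-level processes connected by FIFO buffers, composed into larger components — can be illustrated with a toy sketch. This is generic Python, not APEC or A-BITS syntax, and the two "primitives" here are invented for illustration:

```python
from collections import deque

def and_process(in_a, in_b, out):
    """Serial bit-level process: consumes one bit from each input FIFO
    and emits their AND on the output FIFO."""
    while in_a and in_b:
        out.append(in_a.popleft() & in_b.popleft())

def not_process(in_src, out):
    """Serial inverter over a FIFO of bits."""
    while in_src:
        out.append(1 - in_src.popleft())

# Compose two primitives through FIFO buffers: out = NOT(a AND b).
a, b, mid, out = deque(), deque(), deque(), deque()
a.extend([0, 0, 1, 1])
b.extend([0, 1, 0, 1])
and_process(a, b, mid)
not_process(mid, out)
# out now holds the NAND truth-table column: [1, 1, 1, 0]
```

Building a NAND by wiring an AND process into a NOT process through a buffer shows the compositional style such bit-level dataflow models are designed for.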

  18. Business Models and Regulation | Distributed Generation Interconnection

    Science.gov (United States)

    The growing role of distributed resources in the electricity system is leading to a shift in business models and regulation for electric utilities. These...

  19. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry presented at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India during December 17–19, 2015. The book covers innovations in broad areas of research such as computational modeling, computational intelligence and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing design, analysis and modeling of the aforementioned key areas.

  20. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed when a model fails. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  1. Modeling Computer Virus and Its Dynamics

    Directory of Open Access Journals (Sweden)

    Mei Peng

    2013-01-01

    Full Text Available Based on the fact that a computer can be infected by infected and exposed computers, and that some computers in susceptible or exposed status can gain immunity through antivirus ability, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for computer virus spreading on the internet, is determined. Second, this model has a virus-free equilibrium P0, which means that the infected part of the computers disappears and the virus dies out; P0 is a globally asymptotically stable equilibrium if R0 < 1. If R0 > 1, the model has only one viral equilibrium P*, which means that the computers persist at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
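The threshold behaviour described above can be reproduced with a generic SEIR-type simulation. The sketch below uses an illustrative compartment structure and made-up parameter values, not the exact equations of the paper:

```python
def r0(beta, sigma, gamma, mu):
    """Basic reproduction number for this illustrative SEIR-type model."""
    return beta * sigma / ((sigma + mu) * (gamma + mu))

def simulate(beta, sigma, gamma, mu, steps=100000, dt=0.01):
    """Forward-Euler integration; returns the final infected fraction.
    Compartments: susceptible S, exposed E, infected I (fractions).
    Parameters: beta = infection rate, sigma = latency exit rate,
    gamma = cure/immunization rate, mu = computer turnover rate."""
    S, E, I = 0.99, 0.0, 0.01
    for _ in range(steps):
        dS = mu - beta * S * I - mu * S       # replacement minus infection
        dE = beta * S * I - (sigma + mu) * E  # exposed become infected
        dI = sigma * E - (gamma + mu) * I     # infected are cured/removed
        S += dt * dS
        E += dt * dE
        I += dt * dI
    return I
```

For example, with (beta, sigma, gamma, mu) = (0.3, 0.5, 0.5, 0.1), R0 ≈ 0.42 and the infected fraction decays to the virus-free equilibrium; raising beta to 1.5 gives R0 ≈ 2.1 and the infection settles at a positive endemic level.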

  2. Substrate stiffness and oxygen as regulators of stem cell differentiation during skeletal tissue regeneration: a mechanobiological model.

    Directory of Open Access Journals (Sweden)

    Darren Paul Burke

    Full Text Available Extrinsic mechanical signals have been implicated as key regulators of mesenchymal stem cell (MSC) differentiation. It has been possible to test different hypotheses for mechano-regulated MSC differentiation by attempting to simulate regenerative events such as bone fracture repair, where repeatable spatial and temporal patterns of tissue differentiation occur. More recently, in vitro studies have identified other environmental cues such as substrate stiffness and oxygen tension as key regulators of MSC differentiation; however, it remains unclear if and how such cues determine stem cell fate in vivo. As part of this study, a computational model was developed to test the hypothesis that substrate stiffness and oxygen tension regulate stem cell differentiation during fracture healing. Rather than assuming mechanical signals act directly on stem cells to determine their differentiation pathway, it is postulated that they act indirectly to regulate angiogenesis and hence partially determine the local oxygen environment within a regenerating tissue. Chondrogenesis of MSCs was hypothesised to occur in low oxygen regions, while in well vascularised regions of the regenerating tissue a soft local substrate was hypothesised to facilitate adipogenesis and a stiff substrate to facilitate osteogenesis. Predictions from the model were compared to both experimental data and to the predictions of a well-established computational mechanobiological model in which tissue differentiation is assumed to be regulated directly by the local mechanical environment. The model predicted all the major events of fracture repair, including cartilaginous bridging, endosteal and periosteal bony bridging and bone remodelling. It therefore provides support for the hypothesis that substrate stiffness and oxygen play a key role in regulating MSC fate during regenerative events such as fracture healing.
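The hypothesised decision logic in this abstract — low oxygen favours chondrogenesis; in well-vascularised regions a soft substrate favours adipogenesis and a stiff one osteogenesis — can be summarised as a simple rule. The thresholds below are illustrative placeholders, not values from the model (which couples these cues to angiogenesis and mechanics rather than applying fixed cutoffs):

```python
def predicted_fate(oxygen_tension_pct, substrate_stiffness_kpa,
                   hypoxia_threshold=5.0, stiff_threshold=30.0):
    """Toy encoding of the stated hypotheses; thresholds are
    illustrative placeholders, not parameters from the paper."""
    if oxygen_tension_pct < hypoxia_threshold:
        return "chondrogenesis"       # low-oxygen regions
    if substrate_stiffness_kpa < stiff_threshold:
        return "adipogenesis"         # vascularised region, soft substrate
    return "osteogenesis"             # vascularised region, stiff substrate
```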

  3. COMPUTATIONAL MODELS FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar

    2011-01-01

    Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...

  4. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

    The study of biological systems demands computational support. If targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding among potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to decide on the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
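As a rough illustration of the retrieval idea — ranking models by the relevance of their annotations to a query — here is a minimal TF-IDF cosine ranker. It is a generic Information Retrieval sketch, not the BioModels Database implementation, and the three "model" descriptions are invented:

```python
import math
from collections import Counter

def tfidf_rank(query, docs):
    """Rank documents ({id: text}) against a query by TF-IDF cosine
    similarity, most relevant first."""
    tokenized = {d: text.lower().split() for d, text in docs.items()}
    df = Counter()                         # document frequency per term
    for toks in tokenized.values():
        df.update(set(toks))
    n = len(docs)

    def vec(tokens):
        tf = Counter(tokens)
        return {t: tf[t] * math.log(n / df[t]) for t in tf if df.get(t)}

    def cosine(a, b):
        num = sum(w * b.get(t, 0.0) for t, w in a.items())
        na = math.sqrt(sum(w * w for w in a.values()))
        nb = math.sqrt(sum(w * w for w in b.values()))
        return num / (na * nb) if na and nb else 0.0

    q = vec(query.lower().split())
    scores = {d: cosine(q, vec(toks)) for d, toks in tokenized.items()}
    return sorted(scores, key=scores.get, reverse=True)

docs = {"m1": "glycolysis model of yeast metabolism",
        "m2": "calcium signaling model in cardiomyocytes",
        "m3": "cell cycle regulation in yeast"}
ranking = tfidf_rank("yeast glycolysis", docs)   # ["m1", "m3", "m2"]
```

In a real model repository, the indexed text would be the model's MIRIAM annotations and meta-information rather than free text.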

  5. CMS computing model evolution

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  6. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  7. The IceCube Computing Infrastructure Model

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Besides the big LHC experiments, a number of mid-size experiments are coming online which need to define new computing models to meet the demands on processing and storage requirements of those experiments. We present the hybrid computing model of IceCube, which leverages GRID models with a more flexible direct user model, as an example of a possible solution. In IceCube a central datacenter at UW-Madison serves as the Tier-0, with a single Tier-1 datacenter at DESY Zeuthen. We describe the setup of the IceCube computing infrastructure and report on our experience in successfully provisioning the IceCube computing needs.

  8. Computational modeling of epidural cortical stimulation

    Science.gov (United States)

    Wongsarnpigoon, Amorn; Grill, Warren M.

    2008-12-01

    Epidural cortical stimulation (ECS) is a developing therapy to treat neurological disorders. However, it is not clear how the cortical anatomy or the polarity and position of the electrode affects current flow and neural activation in the cortex. We developed a 3D computational model simulating ECS over the precentral gyrus. With the electrode placed directly above the gyrus, about half of the stimulus current flowed through the crown of the gyrus while current density was low along the banks deep in the sulci. Beneath the electrode, neurons oriented perpendicular to the cortical surface were depolarized by anodic stimulation, and neurons oriented parallel to the boundary were depolarized by cathodic stimulation. Activation was localized to the crown of the gyrus, and neurons on the banks deep in the sulci were not polarized. During regulated voltage stimulation, the magnitude of the activating function was inversely proportional to the thickness of the CSF and dura. During regulated current stimulation, the activating function was not sensitive to the thickness of the dura but was slightly more sensitive than during regulated voltage stimulation to the thickness of the CSF. Varying the width of the gyrus and the position of the electrode altered the distribution of the activating function due to changes in the orientation of the neurons beneath the electrode. Bipolar stimulation, although often used in clinical practice, reduced spatial selectivity as well as selectivity for neuron orientation.
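The "activating function" referred to above is, for a straight fiber, proportional to the second spatial difference of the extracellular potential along it (Rattay's formulation). The sketch below is a textbook point-source illustration of that quantity, not the study's 3D finite-element model of the precentral gyrus; the geometry, current and resistivity values are illustrative:

```python
import math

def extracellular_potential(x, elec_x, elec_dist, current, rho=3.0):
    """Potential from a point current source in a homogeneous medium
    (rho = resistivity, illustrative value)."""
    r = math.hypot(x - elec_x, elec_dist)
    return rho * current / (4 * math.pi * r)

def activating_function(xs, ve):
    """Second spatial difference of Ve along the fiber; positive values
    predict depolarization of a straight axon (Rattay)."""
    dx = xs[1] - xs[0]
    return [(ve[i - 1] - 2 * ve[i] + ve[i + 1]) / dx ** 2
            for i in range(1, len(ve) - 1)]

xs = [i * 0.1 for i in range(-50, 51)]   # positions along the fiber (illustrative units)
ve_cathodic = [extracellular_potential(x, 0.0, 1.0, -1.0) for x in xs]
f_cathodic = activating_function(xs, ve_cathodic)
# Directly under a cathode the activating function is positive
# (depolarizing); flipping the electrode polarity flips its sign.
```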

  9. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that the current trends of strengthening the general educational and worldview functions of computer science define the necessity of additional research of the…

  10. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  11. Modeling post-transcriptional regulation activity of small non-coding RNAs in Escherichia coli.

    Science.gov (United States)

    Wang, Rui-Sheng; Jin, Guangxu; Zhang, Xiang-Sun; Chen, Luonan

    2009-04-29

    Transcriptional regulation is a fundamental process in biological systems, where transcription factors (TFs) have been revealed to play crucial roles. In recent years, in addition to TFs, an increasing number of non-coding RNAs (ncRNAs) have been shown to mediate post-transcriptional processes and regulate many critical pathways in both prokaryotes and eukaryotes. On the other hand, with more and more high-throughput biological data becoming available, it is possible and imperative to quantitatively study gene regulation in a systematic and detailed manner. Most existing studies for inferring transcriptional regulatory interactions and the activity of TFs ignore the possible post-transcriptional effects of ncRNAs. In this work, we propose a novel framework to infer the activity of regulators including both TFs and ncRNAs by exploring the expression profiles of target genes and (post)transcriptional regulatory relationships. We model the integrated regulatory system by a set of biochemical reactions which lead to a log-bilinear problem. The inference process is achieved by an iterative algorithm, in which two linear programming models are efficiently solved. In contrast to available related studies, the effects of ncRNAs on transcription process are considered in this work, and thus more reasonable and accurate reconstruction can be expected. In addition, the approach is suitable for large-scale problems from the viewpoint of computation. Experiments on two synthesized data sets and a model system of Escherichia coli (E. coli) carbon source transition from glucose to acetate illustrate the effectiveness of our model and algorithm. Our results show that incorporating the post-transcriptional regulation of ncRNAs into system model can mine the hidden effects from the regulation activity of TFs in transcription processes and thus can uncover the biological mechanisms in gene regulation in a more accurate manner. 
The software for the algorithm in this paper is available

  12. Modelling computer networks

    International Nuclear Information System (INIS)

    Max, G

    2011-01-01

    Traffic in computer networks can be described as a complicated system. These systems show non-linear features, and their behaviour is difficult to simulate. Before deploying network equipment, users want to know the capability of their computer network. They do not want the servers to be overloaded during temporary traffic peaks, when more requests arrive than the server is designed for. As a starting point for our study, a non-linear system model of network traffic is established to examine the behaviour of the planned network. The paper presents the setting up of a non-linear simulation model that helps us to observe dataflow problems of the networks. This simple model captures the relationship between the competing traffic and the input and output dataflow. In this paper, we also focus on measuring the bottleneck of the network, which was defined as the difference between the link capacity and the competing traffic volume on the link that limits end-to-end throughput. We validate the model using measurements on a working network. The results show that the initial model estimates well the main behaviours and critical parameters of the network. Based on this study, we propose to develop a new algorithm, which experimentally determines and predicts the available parameters of the network modelled.
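The bottleneck definition used in this abstract — the difference between link capacity and competing traffic on the link that limits end-to-end throughput — is easy to state in code. The path and traffic values below are illustrative:

```python
def bottleneck(links):
    """links: list of (capacity, competing_traffic) per hop, in Mb/s.
    The available end-to-end throughput is limited by the link with
    the smallest spare capacity (never below zero)."""
    spare = [cap - cross for cap, cross in links]
    return max(0.0, min(spare))

# Illustrative three-hop path: the middle 100 Mb/s link carrying
# 70 Mb/s of competing traffic limits the flow to 30 Mb/s.
path = [(1000, 400), (100, 70), (1000, 850)]
print(bottleneck(path))   # 30 Mb/s
```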

  13. Mathematical Modeling and Computational Thinking

    Science.gov (United States)

    Sanford, John F.; Naidu, Jaideep T.

    2017-01-01

    The paper argues that mathematical modeling is the essence of computational thinking. Learning a computer language is a valuable aid in learning logical thinking, but of less assistance when learning problem-solving skills. The paper is the third in a series and presents some examples of mathematical modeling using spreadsheets at an advanced…

  14. Myndplay: Measuring Attention Regulation with Single Dry Electrode Brain Computer Interface

    NARCIS (Netherlands)

    van der Wal, C.N.; Irrmischer, M.; Guo, Y.; Friston, K.; Faisal, A.; Hill, S.; Peng, H.

    2015-01-01

    Future applications for the detection of attention can be helped by the development and validation of single electrode brain computer interfaces that are small and user-friendly. The two objectives of this study were: to (1) understand the correlates of attention regulation as detected with the

  15. Global parameter estimation for thermodynamic models of transcriptional regulation.

    Science.gov (United States)

    Suleimenov, Yerzhan; Ay, Ahmet; Samee, Md Abul Hassan; Dresch, Jacqueline M; Sinha, Saurabh; Arnosti, David N

    2013-07-15

    Deciphering the mechanisms involved in gene regulation holds the key to understanding the control of central biological processes, including human disease, population variation, and the evolution of morphological innovations. New experimental techniques including whole genome sequencing and transcriptome analysis have enabled comprehensive modeling approaches to study gene regulation. In many cases, it is useful to be able to assign biological significance to the inferred model parameters, but such interpretation should take into account features that affect these parameters, including model construction and sensitivity, the type of fitness calculation, and the effectiveness of parameter estimation. This last point is often neglected, as estimation methods are often selected for historical reasons or for computational ease. Here, we compare the performance of two parameter estimation techniques broadly representative of local and global approaches, namely, a quasi-Newton/Nelder-Mead simplex (QN/NMS) method and a covariance matrix adaptation-evolutionary strategy (CMA-ES) method. The estimation methods were applied to a set of thermodynamic models of gene transcription applied to regulatory elements active in the Drosophila embryo. Measuring overall fit, the global CMA-ES method performed significantly better than the local QN/NMS method on high quality data sets, but this difference was negligible on lower quality data sets with increased noise or on data sets simplified by stringent thresholding. Our results suggest that the choice of parameter estimation technique for evaluation of gene expression models depends on the quality of the data, the nature of the models, and the aims of the modeling effort. Copyright © 2013 Elsevier Inc. All rights reserved.
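The local-versus-global distinction at the heart of this comparison can be demonstrated on a multimodal test function. The sketch below substitutes simple stand-ins for the methods named above — greedy coordinate descent in place of QN/NMS, and an elitist (1+λ) evolution strategy in place of CMA-ES — on the Rastrigin function; it illustrates the phenomenon, not the paper's setup:

```python
import math
import random

def rastrigin(x):
    """Multimodal test function; global minimum 0 at the origin."""
    return sum(xi * xi + 10 - 10 * math.cos(2 * math.pi * xi) for xi in x)

def local_descent(f, x, step=0.5, shrink=0.5, iters=60):
    """Greedy coordinate descent with a shrinking step: a toy stand-in
    for local methods such as quasi-Newton/Nelder-Mead."""
    x = list(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                if f(trial) < f(x):
                    x, improved = trial, True
        if not improved:
            step *= shrink   # no progress: refine the step size
    return f(x)

def evolution_strategy(f, x, sigma=2.0, lam=40, gens=80, seed=1):
    """Elitist (1+lambda) evolution strategy with decaying mutation
    strength: a toy stand-in for global methods such as CMA-ES."""
    rng = random.Random(seed)
    best, best_val = list(x), f(x)
    for _ in range(gens):
        for _ in range(lam):
            cand = [xi + rng.gauss(0, sigma) for xi in best]
            v = f(cand)
            if v < best_val:
                best, best_val = cand, v
        sigma *= 0.97
    return best_val

start = [3.0, 3.0]
local_val = local_descent(rastrigin, start)
global_val = evolution_strategy(rastrigin, start)
# The local method stalls in the basin nearest the start; the ES
# explores widely enough to find a much lower minimum.
```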

  16. A computational model of self-efficacy's various effects on performance: Moving the debate forward.

    Science.gov (United States)

    Vancouver, Jeffrey B; Purl, Justin D

    2017-04-01

    Self-efficacy, which is one's belief in one's capacity, has been found to both positively and negatively influence effort and performance. The reasons for these different effects have been a major topic of debate among social-cognitive and perceptual control theorists. In particular, research into the various self-efficacy effects has been motivated by a perceptual control theory view of self-regulation that social-cognitive theorists question. To provide more clarity to the theoretical arguments, a computational model of the multiple processes presumed to create the positive, negative, and null effects for self-efficacy is presented. Building on an existing computational model of goal choice that produces a positive effect for self-efficacy, the current article adds a symbolic processing structure used during goal striving that explains the negative self-efficacy effect observed in recent studies. Moreover, the multiple processes, operating together, allow the model to recreate the various effects found in a published study of feedback ambiguity's moderating role on the self-efficacy to performance relationship (Schmidt & DeShon, 2010). Discussion focuses on the implications of the model for the self-efficacy debate, alternative computational models, the overlap between control theory and social-cognitive theory explanations, the value of using computational models for resolving theoretical disputes, and future research and directions the model inspires. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. LHCb computing model

    CERN Document Server

    Frank, M; Pacheco, Andreu

    1998-01-01

    This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10^6 MIPS. This will be installed at the experiment and will be reused during non data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10^6 MIPS) and physics analysis (0.5 × 10^6 MIPS): CPU resources may either be located at the physicist's home lab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu...

  18. 40 CFR 194.23 - Models and computer codes.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  19. Trust Models in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2008-01-01

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  20. Predictive model identifies key network regulators of cardiomyocyte mechano-signaling.

    Directory of Open Access Journals (Sweden)

    Philip M Tan

    2017-11-01

    Full Text Available Mechanical strain is a potent stimulus for growth and remodeling in cells. Although many pathways have been implicated in stretch-induced remodeling, the control structures by which signals from distinct mechano-sensors are integrated to modulate hypertrophy and gene expression in cardiomyocytes remain unclear. Here, we constructed and validated a predictive computational model of the cardiac mechano-signaling network in order to elucidate the mechanisms underlying signal integration. The model identifies calcium, actin, Ras, Raf1, PI3K, and JAK as key regulators of cardiac mechano-signaling and characterizes crosstalk logic imparting differential control of transcription by AT1R, integrins, and calcium channels. We find that while these regulators maintain mostly independent control over distinct groups of transcription factors, synergy between multiple pathways is necessary to activate all the transcription factors necessary for gene transcription and hypertrophy. We also identify a PKG-dependent mechanism by which valsartan/sacubitril, a combination drug recently approved for treating heart failure, inhibits stretch-induced hypertrophy, and predict further efficacious pairs of drug targets in the network through a network-wide combinatorial search.

  1. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students to improve their mathematical or programming knowledge while focusing on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes, taking advantage of basic scientific computation methods and tools.

  2. Computer models for economic and silvicultural decisions

    Science.gov (United States)

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  3. The effect of regulation feedback in a computer-based formative assessment on information problem solving

    NARCIS (Netherlands)

    Timmers, Caroline; Walraven, Amber; Veldkamp, Bernard P.

    2015-01-01

    This study examines the effect of regulation feedback in a computer-based formative assessment in the context of searching for information online. Fifty 13-year-old students completed two randomly selected assessment tasks, receiving automated regulation feedback between them. Student performance

  4. Quantum vertex model for reversible classical computing.

    Science.gov (United States)

    Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C

    2017-05-12

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.
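The core construction can be illustrated at the level of a single vertex. The toy sketch below (not the paper's actual lattice model) shows how a penalty energy over six Boolean link variables makes the zero-energy states of one vertex encode a reversible Toffoli gate, so that a lattice of such vertices encodes a whole circuit in its ground space:

```python
from itertools import product

def toffoli(a, b, c):
    # Universal reversible gate: flips c exactly when a and b are both 1.
    return a, b, c ^ (a & b)

def vertex_energy(spins):
    # Penalty energy of one vertex: zero iff the three "output" links carry
    # the Toffoli image of the three "input" links, a positive cost otherwise.
    a, b, c, x, y, z = spins
    return 0 if (x, y, z) == toffoli(a, b, c) else 1

ground = [s for s in product((0, 1), repeat=6) if vertex_energy(s) == 0]

# One zero-energy state per input: the vertex "computes" the gate in its
# ground space, and reversibility makes the output side a bijection too.
assert len(ground) == 8
assert len({g[3:] for g in ground}) == 8
```

Fixing the boundary spins to the circuit's input then selects, among the degenerate zero-energy states, the one carrying the computation's output.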

  5. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    Science.gov (United States)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, Remove-Compute-Restore (RCR) has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models, by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents a package called GRAVTool, developed in MATLAB, to compute local geoid models by the RCR technique, and its application in a study area. The study area comprises the Federal District of Brazil, with ~6000 km², wavy relief, and heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example on the study area show the local geoid model computed by the GRAVTool package using 1377 terrestrial gravity observations, SRTM data with 3 arc seconds of resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ±0.071 m, RMS = 0.069 m, maximum = 0.178 m and minimum = -0.123 m) matches the uncertainty (σ = ±0.073 m) of 21 randomly spaced points where the geoid was computed by the geometric levelling technique supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ±0.099 m, RMS = 0.208 m, maximum = 0.419 m and minimum = -0.040 m).
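The remove-compute-restore arithmetic itself can be sketched in a few lines. In this illustration the Stokes integration is replaced by a stand-in linear operator and all numbers are synthetic; the point is only that linearity of the operator makes the remove and restore steps lossless:

```python
import numpy as np

# Synthetic 1-D profile standing in for gridded gravity anomalies (mGal);
# values and the "stokes" operator below are illustrative only.
observed = np.array([20.3, 18.7, 25.1, 30.4, 22.9])
ggm = np.array([19.0, 19.2, 19.5, 19.8, 20.0])      # long wavelengths (GGM)
terrain = np.array([0.8, -0.4, 4.2, 9.1, 2.3])      # short wavelengths (DTM)

# Remove: subtract the modelled long- and short-wavelength parts.
residual = observed - ggm - terrain

# Compute: map residual anomalies to a residual geoid. A stand-in linear
# operator is used here in place of the actual Stokes integration.
stokes = lambda dg: 0.01 * dg
n_res = stokes(residual)

# Restore: add back the geoid contributions of the removed parts.
n_total = stokes(ggm) + n_res + stokes(terrain)

# Because the operator is linear, remove + restore is lossless.
assert np.allclose(n_total, stokes(observed))
```

The practical benefit of RCR is that the residual field is small and smooth, so the "compute" step converges well; the bookkeeping above is why nothing is lost in the process.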

  6. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden, Renier; Pieterse, Heloise; Irwin, Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer-based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance stages. The...

  7. A bargaining model of regulated markets' integration with an application to electricity supply market

    International Nuclear Information System (INIS)

    Wei Jingyuan; Smeers, Y.; Canon, E.

    1995-01-01

    An integrated market organized by regulated electric utilities is modelled. It is assumed that, given a price vector for the exchange of electricity between each pair of neighboring utilities, utilities independently maximize their own domestic social welfare subject to a zero-profit constraint. An equilibrium price vector for exchanges among utilities is defined as one which clears the exchanges for all pairs of business partners. A single piecewise-linear model is formulated for computing market equilibria. The model is used to simulate the electricity supply market organized by 11 western European countries

  8. Assessing medical students' self-regulation as aptitude in computer-based learning.

    Science.gov (United States)

    Song, Hyuksoon S; Kalet, Adina L; Plass, Jan L

    2011-03-01

    We developed a Self-Regulation Measure for Computer-based learning (SRMC) tailored toward medical students, by modifying Zimmerman's Self-Regulated Learning Interview Schedule (SRLIS) for K-12 learners. The SRMC's reliability and validity were examined in 2 studies. In Study 1, 109 first-year medical students were asked to complete the SRMC. Bivariate correlation analysis results indicated that the SRMC scores had a moderate degree of correlation with student achievement in a teacher-developed test. In Study 2, 58 third-year clerkship students completed the SRMC. Regression analysis results indicated that the frequency of medical students' usage of self-regulation strategies was associated with their general clinical knowledge measured by a nationally standardized licensing exam. These two studies provided evidence for the reliability and concurrent validity of the SRMC to assess medical students' self-regulation as aptitude. Future work should provide evidence to guide and improve instructional design as well as inform educational policy.

  9. A Consumer Protection Model for Regulating Lawyers.

    Science.gov (United States)

    Chalfie, Deborah M.

    1992-01-01

    Describes and critiques the "discipline model" of lawyer regulation from a consumer point of view and outlines an alternative model for regulating lawyers that is grounded in consumer protection principles. (JOW)

  10. The international radioactive transportation regulations: A model for national regulations

    International Nuclear Information System (INIS)

    Pope, R.B.; Rawl, R.R.

    1990-06-01

    The International Atomic Energy Agency's (IAEA) Regulations for the Safe Transport of Radioactive Material, Safety Series No. 6 (hereinafter the ''International Regulations'') serve as the model for the regulations of individual countries and of international modal organizations controlling the packaging and transportation of radioactive materials. The purpose of this paper is to outline the background and history of the International Regulations, the general principles behind their requirements, the structure and general contents of their latest edition, the roles of various international bodies in their development and implementation, and the current status of regulatory and supportive document development at both the international and domestic levels. This review will provide a basis for users and potential users to better understand the source and application of the International Regulations. 1 tab

  11. Models of parallel computation :a survey and classification

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yunquan; CHEN Guoliang; SUN Guangzhong; MIAO Qiankun

    2007-01-01

    In this paper, the state of the art in parallel computational model research is reviewed. We introduce various models developed during the past decades and, according to their target architecture features, especially memory organization, classify these parallel computational models into three generations. The models and their characteristics are discussed on the basis of this classification. We believe that with the ever-increasing speed gap between CPU and memory systems, incorporating a non-uniform memory hierarchy into computational models will become unavoidable. With the emergence of multi-core CPUs, the parallelism hierarchy of current computing platforms becomes more and more complicated, and describing this complicated parallelism hierarchy in future computational models becomes more and more important. A semi-automatic toolkit that can extract model parameters and their values on real computers can reduce the complexity of model analysis, thus allowing more complicated models with more parameters to be adopted. Hierarchical memory and hierarchical parallelism will be two very important features that should be considered in future model design and research.

  12. Computer model for ductile fracture

    International Nuclear Information System (INIS)

    Moran, B.; Reaugh, J. E.

    1979-01-01

    A computer model is described for predicting ductile fracture initiation and propagation. The computer fracture model is calibrated by simple and notched round-bar tension tests and a precracked compact tension test. The model is used to predict fracture initiation and propagation in a Charpy specimen, and the results are compared with experiments. The calibrated model provides a correlation between Charpy V-notch (CVN) fracture energy and any measure of fracture toughness, such as J/sub Ic/. A second, simpler empirical correlation was obtained using the energy to initiate fracture in the Charpy specimen rather than the total CVN energy, and the results are compared with the empirical correlation of Rolfe and Novak

  13. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing has a very important significance for fulfilling intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it handles simple emotions efficiently.

  14. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  15. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  16. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and performance characteristics of such a hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics, using the identified objectives of computing, which can be used on any platform and any type of computing system, and for application automation, without modifications in the form of structure, hardware, and software coding by an exte...

  17. A physicist's model of computation

    International Nuclear Information System (INIS)

    Fredkin, E.

    1991-01-01

    An attempt is presented to make a statement about what a computer is and how it works from the perspective of physics. The single observation that computation can be a reversible process allows for the same kind of insight into computing as was obtained by Carnot's discovery that heat engines could be modelled as reversible processes. It allows us to bring computation into the realm of physics, where the power of physics allows us to ask and answer questions that seemed intractable from the viewpoint of computer science. Strangely enough, this effort makes it clear why computers get cheaper every year. (author) 14 refs., 4 figs

  18. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Economic models for management of resources in peer-to-peer and grid computing

    Science.gov (United States)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next-generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage models in these environments are a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of each of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real-world market, there exist various economic models for setting the price of goods based on supply and demand and their value to the user. They include commodity market, posted price, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to the normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline- and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed, which contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
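The commodity-market model mentioned in this record can be sketched with a minimal Walrasian price-adjustment (tatonnement) loop. The demand curve and all constants below are toy assumptions, not anything from Nimrod/G:

```python
def clearing_price(demand_fn, supply, p0=1.0, lr=0.01, iters=500):
    # Walrasian tatonnement: raise the price while demand exceeds supply,
    # lower it when supply exceeds demand, until the market clears.
    p = p0
    for _ in range(iters):
        excess = demand_fn(p) - supply
        p = max(1e-6, p + lr * excess)
    return p

# Toy setting: aggregate broker demand for CPU-hours falls with price,
# while a fixed pool of resource providers offers 50 units.
demand = lambda p: 100.0 / p
supply = 50.0
p_eq = clearing_price(demand, supply)
assert abs(demand(p_eq) - supply) < 0.1   # near the market-clearing price
```

Posted-price, tender, and auction models replace this iterative adjustment with one-shot or bid-driven price discovery, but all of them reduce to balancing the same excess-demand signal.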

  20. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided...... methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task....... The methodology has been implemented into a computer-aided modeling framework, which combines expert skills, tools, and database connections that are required for the different steps of the model development work-flow with the goal to increase the efficiency of the modeling process. The framework has two main...

  1. Substrate Curvature Regulates Cell Migration -A Computational Study

    Science.gov (United States)

    He, Xiuxiu; Jiang, Yi

    Cell migration in the host microenvironment is essential to cancer etiology, progression and metastasis. Cellular processes of adhesion, cytoskeletal polymerization, contraction, and matrix remodeling act in concert to regulate cell migration, while the local extracellular matrix architecture modulates these processes. In this work we study how a stromal microenvironment with native and cell-derived curvature at the micrometer scale regulates cell motility patterns. We developed a 3D model of single-cell migration on a curved substrate. Mathematical analysis of cell morphological adaptation to the cell-substrate interface shows that cells migrating on convex surfaces deform more than those on concave surfaces. Both analytical and simulation results show that curved surfaces regulate the motile force at the cell's protruding front through force balance with focal adhesion and cell contraction. We also found that cell migration on concave substrates is more persistent. These results offer a novel biomechanical explanation for substrate curvature regulation of cell migration. NIH 1U01CA143069.

  2. Elements of matrix modeling and computing with Matlab

    CERN Document Server

    White, Robert E

    2006-01-01

    As discrete models and computing have become more common, there is a need to study matrix computation and numerical linear algebra. Encompassing a diverse mathematical core, Elements of Matrix Modeling and Computing with MATLAB examines a variety of applications and their modeling processes, showing you how to develop matrix models and solve algebraic systems. Emphasizing practical skills, it creates a bridge from problems with two and three variables to more realistic problems that have additional variables. Elements of Matrix Modeling and Computing with MATLAB focuses on seven basic applicat

  3. Vehicle - Bridge interaction, comparison of two computing models

    Science.gov (United States)

    Melcer, Jozef; Kuchárová, Daniela

    2017-07-01

    The paper presents the calculation of the bridge response to a vehicle moving along the bridge at various velocities. A planar multi-body computing model of the vehicle is adopted. The bridge computing models are created in two variants: one represents the bridge as a Bernoulli-Euler beam with continuously distributed mass, and the second represents the bridge as a lumped-mass model with one degree of freedom. The mid-span bridge dynamic deflections are calculated for both computing models. The results are mutually compared and quantitatively evaluated.
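For the first bending mode, the lumped-mass variant reduces to a single-degree-of-freedom oscillator driven by the moving load. A sketch with assumed bridge parameters (not the paper's data) shows how crossing speed changes the mid-span deflection:

```python
import numpy as np

# First-mode ("lumped") view of a simply supported bridge under a moving
# constant force P: (m*L/2)*q'' + (m*L/2)*w^2*q = P*sin(pi*v*t/L) while
# the load is on the span. All parameter values below are assumed.
m, L, EI, P = 1000.0, 30.0, 1.0e9, 1.0e5   # kg/m, m, N*m^2, N
w = (np.pi / L) ** 2 * np.sqrt(EI / m)     # first natural frequency, rad/s

def midspan_deflection_max(v, dt=1e-3):
    # Semi-implicit Euler integration of the modal equation.
    q, qd, qmax, t = 0.0, 0.0, 0.0, 0.0
    t_end = L / v                          # time for the load to cross
    while t < t_end:
        f = (2.0 * P / (m * L)) * np.sin(np.pi * v * t / L)
        qd += dt * (f - w ** 2 * q)
        q += dt * qd
        qmax = max(qmax, q)
        t += dt
    return qmax

q_static = (2.0 * P / (m * L)) / w ** 2    # modal static midspan deflection
slow = midspan_deflection_max(v=2.0)       # crawl speed: quasi-static
fast = midspan_deflection_max(v=25.0)      # higher speed: dynamic overshoot
```

At crawl speed the peak deflection tracks the static value; at higher speed the inertial term produces a dynamic amplification, which is the effect the two bridge models in the paper are compared on.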

  4. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  5. Creation of 'Ukrytie' objects computer model

    International Nuclear Information System (INIS)

    Mazur, A.B.; Kotlyarov, V.T.; Ermolenko, A.I.; Podbereznyj, S.S.; Postil, S.D.; Shaptala, D.V.

    1999-01-01

    A partial computer model of the 'Ukrytie' object was created with the use of geoinformation technologies. The computer model makes it possible to provide information support for the works related to stabilization of the 'Ukrytie' object and its conversion into an ecologically safe system, for analyzing, forecasting and controlling the processes occurring in the 'Ukrytie' object. Elements and structures of the 'Ukrytie' object were designed and input into the model

  6. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized, both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model, and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics are reviewed in this paper that have a direct bearing on the model input process, and reasons are given for using probability-based modeling of the inputs. The authors also present ways to model distributions for individual inputs and multivariate input structures when dependence and other constraints may be present
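One standard way to model dependence among inputs is rank-correlation induction in the spirit of Iman and Conover. The sketch below is a simplified two-input version (the full method handles many inputs via a target correlation matrix); the reordering preserves each input's marginal distribution exactly while imposing the desired rank correlation:

```python
import numpy as np

def iman_conover_pair(x, y, rho, seed=None):
    # Induce a target rank correlation between two independently sampled
    # model-input vectors by reordering each to follow the ranks of a
    # correlated Gaussian reference sample (a two-input simplification of
    # the Iman-Conover restricted-pairing idea).
    rng = np.random.default_rng(seed)
    n = len(x)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    ranks0 = np.argsort(np.argsort(z[:, 0]))
    ranks1 = np.argsort(np.argsort(z[:, 1]))
    # The k-th smallest value of each input goes where the reference
    # sample has rank k, so the marginal distributions are untouched.
    return np.sort(x)[ranks0], np.sort(y)[ranks1]
```

Because only the pairing is changed, any marginals can be used (lognormal flow rates with uniform retardation factors, say), which is exactly the joint-input-modeling need the abstract describes.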

  7. Computational modeling of spiking neural network with learning rules from STDP and intrinsic plasticity

    Science.gov (United States)

    Li, Xiumin; Wang, Wei; Xue, Fangzheng; Song, Yongduan

    2018-02-01

    Recently there has been continuously increasing interest in building computational models of spiking neural networks (SNN), such as the Liquid State Machine (LSM). Biologically inspired self-organized neural networks with neural plasticity can enhance computational performance, with the characteristic features of dynamical memory and recurrent connection cycles which distinguish them from the more widely used feedforward neural networks. Although a variety of computational models for brain-like learning and information processing have been proposed, the modeling of self-organized neural networks with multiple forms of neural plasticity is still an important open challenge. The main difficulties lie in the interplay among different forms of neural plasticity rules and in understanding how the structures and dynamics of neural networks shape computational performance. In this paper, we propose a novel approach to developing models of LSM with a biologically inspired self-organizing network based on two neural plasticity learning rules. The connectivity among excitatory neurons is adapted by spike-timing-dependent plasticity (STDP) learning; meanwhile, the degrees of neuronal excitability are regulated to maintain a moderate average activity level by another learning rule: intrinsic plasticity (IP). Our study shows that LSM with STDP+IP performs better than LSM with a random SNN or an SNN obtained by STDP alone. The noticeable improvement with the proposed method is due to the better-reflected competition among different neurons in the developed SNN model, as well as the more effectively encoded and processed relevant dynamic information with its learning and self-organizing mechanism. This result gives insights into the optimization of computational models of spiking neural networks with neural plasticity.
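The two learning rules can be caricatured in a few lines. A toy sketch of a pair-based STDP weight update plus a homeostatic intrinsic-plasticity threshold update follows; all constants, spike times, and rates are illustrative, not the paper's:

```python
import numpy as np

def stdp(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    # Pair-based STDP: potentiate when the presynaptic spike precedes the
    # postsynaptic one (dt = t_post - t_pre > 0), depress otherwise; the
    # magnitude decays exponentially with the spike-time lag (ms).
    return a_plus * np.exp(-dt / tau) if dt > 0 else -a_minus * np.exp(dt / tau)

# Apply STDP to one synapse for a few illustrative spike pairings (ms).
w = 0.5
for t_pre, t_post in [(0.0, 5.0), (10.0, 12.0), (30.0, 25.0)]:
    w = float(np.clip(w + stdp(t_post - t_pre), 0.0, 1.0))

def ip_update(theta, rate, target=0.1, eta=0.05):
    # Intrinsic plasticity: nudge the neuron's firing threshold so its
    # observed rate drifts toward a homeostatic target activity level.
    return theta + eta * (rate - target)

# A neuron firing above target sees its threshold rise toward homeostasis.
theta = 1.0
for rate in [0.3, 0.25, 0.2, 0.15, 0.1]:
    theta = ip_update(theta, rate)
```

In the full LSM setting, STDP shapes which recurrent connections compete and survive, while IP keeps each neuron's excitability in a moderate operating range; the interplay of the two is what the paper credits for the performance gain.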

  8. Computational modelling in fluid mechanics

    International Nuclear Information System (INIS)

    Hauguel, A.

    1985-01-01

    The modelling of the greatest part of environmental or industrial flow problems gives very similar types of equations. The considerable increase in computing capacity over the last ten years has consequently allowed numerical models of growing complexity to be processed. The varied group of computer codes presented is now a tool complementary to experimental facilities for carrying out studies in the field of fluid mechanics. Several codes applied in the nuclear field (reactors, cooling towers, exchangers, plumes...) are presented among others [fr

  9. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets and increased time for analysis are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources, where computations are parallelized and run on the Amazon Elastic Compute Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investment makes cloud computing an attractive alternative for scientists, especially those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  10. Model of cap-dependent translation initiation in sea urchin: a step towards the eukaryotic translation regulation network.

    Science.gov (United States)

    Bellé, Robert; Prigent, Sylvain; Siegel, Anne; Cormier, Patrick

    2010-03-01

    The large and rapid increase in the rate of protein synthesis following fertilization of the sea urchin egg has long been a paradigm of translational control, an important component of the regulation of gene expression in cells. This translational up-regulation is linked to physiological changes that occur upon fertilization and is necessary for entry into the first cell division cycle. Accumulated knowledge on cap-dependent initiation of translation makes it fitting and timely to start integrating the data into a system view of biological functions. Using a programming environment for systems biology coupled with model validation (named Biocham), we have built an integrative model for cap-dependent initiation of translation. The model is described by abstract rules. It contains 51 reactions involved in 74 molecular complexes. The model proved to be coherent with existing knowledge by using queries based on computational tree logic (CTL) as well as Boolean simulations. The model could simulate the change in translation occurring at fertilization in the sea urchin model. It could also be coupled with an existing model designed for cell-cycle control. Therefore, the cap-dependent translation initiation model can be considered a first step towards the eukaryotic translation regulation network.

  11. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    GENERAL | ARTICLE. Computer Based ... universities, and later did system analysis, ... personal computers (PC) and low-cost software packages and tools. They can serve as useful learning experience through student projects. Models are .... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  12. Computational multiscale modeling of intergranular cracking

    International Nuclear Information System (INIS)

    Simonovski, Igor; Cizelj, Leon

    2011-01-01

    A novel computational approach for simulation of intergranular cracks in a polycrystalline aggregate is proposed in this paper. The computational model includes a topological model of the experimentally determined microstructure of a 400 μm diameter stainless steel wire and automatic finite element discretization of the grains and grain boundaries. The microstructure was spatially characterized by X-ray diffraction contrast tomography and contains 362 grains and some 1600 grain boundaries. Available constitutive models currently include isotropic elasticity for the grain interior and cohesive behavior with damage for the grain boundaries. The experimentally determined lattice orientations are employed to distinguish between resistant low energy and susceptible high energy grain boundaries in the model. The feasibility and performance of the proposed computational approach is demonstrated by simulating the onset and propagation of intergranular cracking. The preliminary numerical results are outlined and discussed.

  13. Quantum Vertex Model for Reversible Classical Computing

    Science.gov (United States)

    Chamon, Claudio; Mucciolo, Eduardo; Ruckenstein, Andrei; Yang, Zhicheng

    We present a planar vertex model that encodes the result of a universal reversible classical computation in its ground state. The approach involves Boolean variables (spins) placed on links of a two-dimensional lattice, with vertices representing logic gates. Large short-ranged interactions between at most two spins implement the operation of each gate. The lattice is anisotropic, with one direction corresponding to computational time and with transverse boundaries storing the computation's input and output. The model displays no finite-temperature phase transitions, including no glass transitions, independent of the circuit. The computational complexity is encoded in the scaling of the relaxation rate into the ground state with the system size. We use thermal annealing and a novel, more efficient heuristic, 'annealing with learning', to study various computational problems. To explore faster relaxation routes, we construct an explicit mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating a novel approach to reversible classical computation based on quantum annealing.

  14. Computational and Organotypic Modeling of Microcephaly ...

    Science.gov (United States)

    Microcephaly is associated with reduced cortical surface area and ventricular dilations. Many genetic and environmental factors precipitate this malformation, including prenatal alcohol exposure and maternal Zika infection. This complexity motivates the engineering of computational and experimental models to probe the underlying molecular targets, cellular consequences, and biological processes. We describe an Adverse Outcome Pathway (AOP) framework for microcephaly derived from literature on all gene, chemical, or viral effects and brain development. Overlap with NTDs is likely, although the AOP connections identified here focused on microcephaly as the adverse outcome. A query of the Mammalian Phenotype Browser database for ‘microcephaly’ (MP:0000433) returned 85 gene associations; several function in microtubule assembly and the centrosome cycle regulated by microcephalin (MCPH1), a gene for primary microcephaly in humans. The developing ventricular zone is the likely target. In this zone, neuroprogenitor cells (NPCs) self-replicate during the first trimester, setting brain size, followed by neural differentiation of the neocortex. Recent studies with human NPCs confirmed infectivity with Zika virions invoking critical cell loss (apoptosis) of precursor NPCs; similar findings have been shown with fetal alcohol or methylmercury exposure in rodent studies, leading to mathematical models of NPC dynamics in size determination of the ventricular zone. A key event

  15. Impact of implementation choices on quantitative predictions of cell-based computational models

    Science.gov (United States)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
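
The time-step sensitivity noted above can be illustrated on a toy relaxation problem (not an actual vertex model): forward-Euler integration of a single vertex coordinate relaxing to its rest position is accurate for small time steps but becomes unstable once the step exceeds the stability limit 2/k. All parameters below are illustrative assumptions.

```python
def relax(dt, k=1.0, x_rest=1.0, x=0.0, t_end=10.0):
    """Forward-Euler relaxation of one coordinate toward its rest position x_rest,
    integrating dx/dt = -k * (x - x_rest) with step size dt."""
    for _ in range(int(t_end / dt)):
        x += dt * (-k * (x - x_rest))
    return x

fine = relax(dt=0.01)    # small step: converges close to the rest position
coarse = relax(dt=2.5)   # step beyond the stability limit 2/k: oscillates and diverges
```

The same mechanism is at work, in a more subtle form, when large time steps suppress cell rearrangements in a vertex model: the discretization, not the physics, determines the outcome.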

  16. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through analysis of the model, the disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
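
The equilibrium behaviour described above can be illustrated with a small numerical sketch (the rates below are hypothetical, not taken from the paper): an SIR-type compartment model in which new computers join the network susceptible, old ones are removed, and antivirus software cures infected machines.

```python
def simulate_virus(beta=0.3, gamma=0.1, b=0.02, mu=0.02,
                   S=0.9, I=0.1, R=0.0, dt=0.01, steps=200_000):
    """Forward-Euler integration of an SIR-type network model:
    b    -- rate at which new (susceptible) computers join the network
    mu   -- rate at which old computers are removed
    beta -- infection rate; gamma -- cure rate via antivirus software."""
    for _ in range(steps):
        dS = b - beta * S * I - mu * S
        dI = beta * S * I - (gamma + mu) * I
        dR = gamma * I - mu * R
        S, I, R = S + dS * dt, I + dI * dt, R + dR * dt
    return S, I, R

# Basic reproduction number R0 = beta / (gamma + mu) = 2.5 > 1 here, so the
# trajectory relaxes to the endemic equilibrium (S*, I*, R*) = (0.4, 0.1, 0.5).
S, I, R = simulate_virus()
```

With `beta=0.1` instead, R0 < 1 and the infected fraction decays to the disease-free equilibrium, matching the stability analysis in the paper's setting.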

  17. Model Regulations for Decommissioning of Facilities

    International Nuclear Information System (INIS)

    2017-07-01

    The IAEA has systematic programmes to provide Member States with the guidance, services and training necessary for establishing a legal and regulatory framework, including the planning and implementation of decommissioning. The model regulations provided in this publication cover all aspects of the planning, conduct and termination of the decommissioning of facilities and management of the associated waste, in accordance with the relevant requirements of the IAEA safety standards. They provide a framework for establishing regulatory requirements and conditions of authorization to be incorporated into individual authorizations for the decommissioning of specific facilities. The model regulations also establish criteria to be used for assessing compliance with regulatory requirements. The publication will be of assistance to Member States in appraising the adequacy of their existing regulations and regulatory guides, and serves as a reference for those Member States developing regulations for the first time.

  18. Regulator Loss Functions and Hierarchical Modeling for Safety Decision Making.

    Science.gov (United States)

    Hatfield, Laura A; Baugh, Christine M; Azzone, Vanessa; Normand, Sharon-Lise T

    2017-07-01

    Regulators must act to protect the public when evidence indicates safety problems with medical devices. This requires complex tradeoffs among risks and benefits, which conventional safety surveillance methods do not incorporate. To combine explicit regulator loss functions with statistical evidence on medical device safety signals to improve decision making. In the Hospital Cost and Utilization Project National Inpatient Sample, we select pediatric inpatient admissions and identify adverse medical device events (AMDEs). We fit hierarchical Bayesian models to the annual hospital-level AMDE rates, accounting for patient and hospital characteristics. These models produce expected AMDE rates (a safety target), against which we compare the observed rates in a test year to compute a safety signal. We specify a set of loss functions that quantify the costs and benefits of each action as a function of the safety signal. We integrate the loss functions over the posterior distribution of the safety signal to obtain the posterior (Bayes) risk; the preferred action has the smallest Bayes risk. Using simulation and an analysis of AMDE data, we compare our minimum-risk decisions to a conventional Z score approach for classifying safety signals. The 2 rules produced different actions for nearly half of hospitals (45%). In the simulation, decisions that minimize Bayes risk outperform Z score-based decisions, even when the loss functions or hierarchical models are misspecified. Our method is sensitive to the choice of loss functions; eliciting quantitative inputs to the loss functions from regulators is challenging. A decision-theoretic approach to acting on safety signals is potentially promising but requires careful specification of loss functions in consultation with subject matter experts.
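
The decision rule described above (integrate each action's loss function over the posterior of the safety signal, then pick the action with the smallest Bayes risk) can be sketched as follows. The loss functions and posteriors below are illustrative placeholders, not the ones elicited in the study.

```python
import random

def bayes_risk_decision(posterior_samples, loss_fns):
    """Return the action minimizing the posterior expected loss (Bayes risk)."""
    risks = {action: sum(L(s) for s in posterior_samples) / len(posterior_samples)
             for action, L in loss_fns.items()}
    return min(risks, key=risks.get), risks

# Hypothetical loss functions over the safety signal s (observed minus expected rate):
loss_fns = {
    "no_action":   lambda s: max(s, 0.0) * 10.0,       # harm of ignoring a true excess
    "investigate": lambda s: 1.0 + max(s, 0.0) * 2.0,  # fixed cost, reduced harm
    "recall":      lambda s: 5.0,                      # large fixed cost regardless
}

random.seed(0)
samples_bad = [random.gauss(0.8, 0.2) for _ in range(10_000)]  # clear excess signal
samples_ok = [random.gauss(0.0, 0.2) for _ in range(10_000)]   # no excess

action_bad, _ = bayes_risk_decision(samples_bad, loss_fns)
action_ok, _ = bayes_risk_decision(samples_ok, loss_fns)
```

Unlike a Z-score threshold, the chosen action here depends on the whole posterior and on the stated costs, so the same signal can warrant different actions under different loss functions.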

  19. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling in the function bodies of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.

  1. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar; Henkel, Ron; Hoehndorf, Robert; Kacprowski, Tim; Knuepfer, Christian; Liebermeister, Wolfram

    2016-01-01

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.
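
One way to realize such an aspect-based similarity measure is sketched below: per-aspect Jaccard scores (here over referenced biological entities and network edges) combined with user-chosen weights. The toy models and weights are illustrative assumptions, not part of the survey.

```python
def jaccard(a, b):
    """Jaccard similarity of two sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

def model_similarity(m1, m2, weights=None):
    """Combine per-aspect similarities with problem-specific weights,
    a crude stand-in for a user's intuition about model similarity."""
    weights = weights or {"entities": 0.5, "edges": 0.5}
    scores = {aspect: jaccard(m1[aspect], m2[aspect]) for aspect in weights}
    return sum(weights[a] * scores[a] for a in weights)

# Two hypothetical glycolysis-like models sharing most entities and all edges.
model_a = {"entities": {"GLC", "ATP", "PYR"}, "edges": {("GLC", "PYR")}}
model_b = {"entities": {"GLC", "ATP", "LAC"}, "edges": {("GLC", "PYR")}}
similarity = model_similarity(model_a, model_b)  # 0.5 * 0.5 + 0.5 * 1.0 = 0.75
```

A database search could rank candidate models by such a score, re-weighting the aspects (equations, parameters, dynamics, annotations) per query.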

  2. Computational Models Used to Assess US Tobacco Control Policies.

    Science.gov (United States)

    Feirman, Shari P; Glasser, Allison M; Rose, Shyanika; Niaura, Ray; Abrams, David B; Teplitskaya, Lyubov; Villanti, Andrea C

    2017-11-01

    Simulation models can be used to evaluate existing and potential tobacco control interventions, including policies. The purpose of this systematic review was to synthesize evidence from computational models used to project population-level effects of tobacco control interventions. We provide recommendations to strengthen simulation models that evaluate tobacco control interventions. Studies were eligible for review if they employed a computational model to predict the expected effects of a non-clinical US-based tobacco control intervention. We searched five electronic databases on July 1, 2013 with no date restrictions and synthesized studies qualitatively. Six primary non-clinical intervention types were examined across the 40 studies: taxation, youth prevention, smoke-free policies, mass media campaigns, marketing/advertising restrictions, and product regulation. Simulation models demonstrated the independent and combined effects of these interventions on decreasing projected future smoking prevalence. Taxation effects were the most robust, as studies examining other interventions exhibited substantial heterogeneity with regard to the outcomes and specific policies examined across models. Models should project the impact of interventions on overall tobacco use, including nicotine delivery product use, to estimate preventable health and cost-saving outcomes. Model validation, transparency, more sophisticated models, and modeling policy interactions are also needed to inform policymakers to make decisions that will minimize harm and maximize health. In this systematic review, evidence from multiple studies demonstrated the independent effect of taxation on decreasing future smoking prevalence, and models for other tobacco control interventions showed that these strategies are expected to decrease smoking, benefit population health, and are reasonable to implement from a cost perspective. Our recommendations aim to help policymakers and researchers minimize harm and

  3. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation, as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  4. Computational models of airway branching morphogenesis.

    Science.gov (United States)

    Varner, Victor D; Nelson, Celeste M

    2017-07-01

    The bronchial network of the mammalian lung consists of millions of dichotomous branches arranged in a highly complex, space-filling tree. Recent computational models of branching morphogenesis in the lung have helped uncover the biological mechanisms that construct this ramified architecture. In this review, we focus on three different theoretical approaches - geometric modeling, reaction-diffusion modeling, and continuum mechanical modeling - and discuss how, taken together, these models have identified the geometric principles necessary to build an efficient bronchial network, as well as the patterning mechanisms that specify airway geometry in the developing embryo. We emphasize models that are integrated with biological experiments and suggest how recent progress in computational modeling has advanced our understanding of airway branching morphogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Model to Implement Virtual Computing Labs via Cloud Computing Services

    Directory of Open Access Journals (Sweden)

    Washington Luna Encalada

    2017-07-01

    Full Text Available In recent years, we have seen a significant number of new technological ideas appearing in literature discussing the future of education. For example, E-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs), and bring your own device (BYOD) are all new concepts of immersive and global education that have emerged in educational literature. One of the greatest challenges presented to e-learning solutions is the reproduction of the benefits of an educational institution’s physical laboratory. For a university without a computing lab, to obtain hands-on IT training with software, operating systems, networks, servers, storage, and cloud computing similar to that which could be received on a university campus computing lab, it is necessary to use a combination of technological tools. Such teaching tools must promote the transmission of knowledge, encourage interaction and collaboration, and ensure students obtain valuable hands-on experience. That, in turn, allows the universities to focus more on teaching and research activities than on the implementation and configuration of complex physical systems. In this article, we present a model for implementing ecosystems which allow universities to teach practical Information Technology (IT) skills. The model utilizes what is called a “social cloud”, which draws on all cloud computing services, such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Additionally, it integrates the cloud learning aspects of a MOOC and several aspects of social networking and support. Social clouds have striking benefits such as centrality, ease of use, scalability, and ubiquity, providing a superior learning environment when compared to that of a simple physical lab. The proposed model allows students to foster all the educational pillars such as learning to know, learning to be, learning

  6. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with applications in several fields of engineering, such as automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, in a multidisciplinary approach. Authors from five countries and 16 different research centers contribute their expertise in both the fundamentals and real-world applications, based upon their strong backgrounds in modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools such as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  7. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
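
As a minimal illustration of the discrete-event/queueing style of modeling discussed above, the following sketch estimates patient waits at a single-provider emergency department treated as an M/M/1 queue, using the Lindley recursion for successive waiting times. The arrival and service rates are hypothetical.

```python
import random

def mean_queue_wait(lam=1.0, mu=2.0, n_patients=200_000, seed=42):
    """Mean waiting time in a single-provider M/M/1 queue, simulated via the
    Lindley recursion W[k+1] = max(0, W[k] + S[k] - A[k+1])."""
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n_patients):
        total += wait
        service = rng.expovariate(mu)        # treatment duration
        interarrival = rng.expovariate(lam)  # gap until the next arrival
        wait = max(0.0, wait + service - interarrival)
    return total / n_patients

# M/M/1 theory: mean wait in queue = lam / (mu * (mu - lam)) = 0.5 for these rates.
w = mean_queue_wait()
```

Richer discrete-event models add triage classes, multiple providers, and boarding, but the event-by-event bookkeeping generalizes this same idea.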

  8. Stochastic modeling for the expression of a gene regulated by competing transcription factors.

    Directory of Open Access Journals (Sweden)

    Hsih-Te Yang

    Full Text Available It is widely accepted that gene expression regulation is a stochastic event. The common approach to its computer simulation requires detailed information on the interactions of individual molecules, which is often not available for the analysis of biological experiments. As an alternative approach, we employed a more intuitive model, the Markov-chain model, to simulate the experimental results; in this model a gene is regulated by activators and repressors, which bind the same site in a mutually exclusive manner. Our stochastic simulation in the presence of both activators and repressors predicted a Hill coefficient of the dose-response curve closer to the experimentally observed value than the value calculated from the simple additive effects of activators alone and repressors alone. The simulation also reproduced the heterogeneity of gene expression levels among individual cells observed by fluorescence-activated cell sorting (FACS) analysis. Therefore, our approach may help to apply stochastic simulations to broader experimental data.
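
A minimal version of such a Markov-chain simulation might look like the following sketch: one regulatory site is either empty, activator-bound, or repressor-bound, binding is mutually exclusive, and expression is scored while the activator is bound. All probabilities are illustrative assumptions, not values fitted to the paper's data.

```python
import random

def simulate_cell(p_act, p_rep, steps=2000, seed=None):
    """Markov chain for one promoter site: empty (E), activator-bound (A)
    or repressor-bound (R). Expression is scored while in state A."""
    rng = random.Random(seed)
    p_off = 0.1  # assumed unbinding probability per step
    state, expressed = "E", 0
    for _ in range(steps):
        if state == "E":
            u = rng.random()
            if u < p_act:
                state = "A"
            elif u < p_act + p_rep:
                state = "R"
        elif rng.random() < p_off:
            state = "E"
        expressed += state == "A"
    return expressed / steps

# 200 'cells' with identical parameters still show cell-to-cell heterogeneity,
# mimicking the spread seen in FACS measurements.
levels = [simulate_cell(p_act=0.05, p_rep=0.05, seed=i) for i in range(200)]
mean_level = sum(levels) / len(levels)  # stationary occupancy of A is 0.25 here
spread = max(levels) - min(levels)
```

The finite simulation length is what produces the heterogeneity: every cell shares the same chain, yet each realization samples a different trajectory.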

  9. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  10. Do's and Don'ts of Computer Models for Planning

    Science.gov (United States)

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  11. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  12. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  13. Towards The Deep Model : Understanding Visual Recognition Through Computational Models

    OpenAIRE

    Wang, Panqu

    2017-01-01

    Understanding how visual recognition is achieved in the human brain is one of the most fundamental questions in vision research. In this thesis I seek to tackle this problem from a neurocomputational modeling perspective. More specifically, I build machine learning-based models to simulate and explain cognitive phenomena related to human visual recognition, and I improve computational models using brain-inspired principles to excel at computer vision tasks. I first describe how a neurocomputat...

  14. Symmetry structure in discrete models of biochemical systems: natural subsystems and the weak control hierarchy in a new model of computation driven by interactions.

    Science.gov (United States)

    Nehaniv, Chrystopher L; Rhodes, John; Egri-Nagy, Attila; Dini, Paolo; Morris, Eric Rothstein; Horváth, Gábor; Karimi, Fariba; Schreckling, Daniel; Schilstra, Maria J

    2015-07-28

    Interaction computing is inspired by the observation that cell metabolic/regulatory systems construct order dynamically, through constrained interactions between their components and based on a wide range of possible inputs and environmental conditions. The goals of this work are to (i) identify and understand mathematically the natural subsystems and hierarchical relations in natural systems enabling this and (ii) use the resulting insights to define a new model of computation based on interactions that is useful for both biology and computation. The dynamical characteristics of the cellular pathways studied in systems biology relate, mathematically, to the computational characteristics of automata derived from them, and their internal symmetry structures to computational power. Finite discrete automata models of biological systems such as the lac operon, the Krebs cycle and p53-mdm2 genetic regulation constructed from systems biology models have canonically associated algebraic structures (their transformation semigroups). These contain permutation groups (local substructures exhibiting symmetry) that correspond to 'pools of reversibility'. These natural subsystems are related to one another in a hierarchical manner by the notion of 'weak control'. We present natural subsystems arising from several biological examples and their weak control hierarchies in detail. Finite simple non-Abelian groups are found in biological examples and can be harnessed to realize finitary universal computation. This allows ensembles of cells to achieve any desired finitary computational transformation, depending on external inputs, via suitably constrained interactions. Based on this, interaction machines that grow and change their structure recursively are introduced and applied, providing a natural model of computation driven by interactions.
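
The transformation semigroup of a finite automaton, and the permutation groups ('pools of reversibility') inside it, can be computed by brute-force closure for tiny examples. The automaton below is a made-up illustration, not one of the biological models (lac operon, Krebs cycle, p53-mdm2) studied in the paper.

```python
def compose(f, g):
    """(f o g)(i) = f(g(i)); maps on states {0, ..., n-1} stored as tuples."""
    return tuple(f[g[i]] for i in range(len(g)))

def transformation_semigroup(generators):
    """Close the generator maps under composition to get the full semigroup."""
    elems = set(generators)
    while True:
        new = {compose(f, g) for f in elems for g in elems} - elems
        if not new:
            return elems
        elems |= new

# Toy automaton on 3 states: input 'a' cycles the states (a permutation),
# input 'b' resets every state to 0 (an irreversible map).
a = (1, 2, 0)
b = (0, 0, 0)
semigroup = transformation_semigroup({a, b})
# The bijective elements form a 'pool of reversibility' -- here the cyclic group C3.
perms = {f for f in semigroup if len(set(f)) == len(f)}
```

For this toy example the semigroup has 6 elements: the three cyclic permutations and the three constant (reset) maps; the permutations form the local symmetry group the paper's hierarchy analysis looks for.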

  15. Early Prediction of Student Self-Regulation Strategies by Combining Multiple Models

    Science.gov (United States)

    Sabourin, Jennifer L.; Mott, Bradford W.; Lester, James C.

    2012-01-01

    Self-regulated learning behaviors such as goal setting and monitoring have been found to be crucial to students' success in computer-based learning environments. Consequently, understanding students' self-regulated learning behavior has been the subject of increasing interest. Unfortunately, monitoring these behaviors in real-time has proven…

  16. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market is much bigger than the community of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), arguably the pivotal theory in the turn to computational quantum chemistry around 1990.

  17. Research on application of intelligent computation based LUCC model in urbanization process

    Science.gov (United States)

    Chen, Zemin

    2007-06-01

own characteristics in detail, discuss their feasibility for LUCC simulation research, and propose methods and measures for improving the existing problems of this class of model. 4. Establishment of an LUCC simulation model of the urbanization process based on intelligent computation and complexity science. Building on the above work on BP artificial neural networks, genetic algorithms, CA models and multi-agent technology, I propose improvements and application assumptions for their extension to geography, and construct an LUCC simulation model of the urbanization process based on CA and agent models. The model combines the learning mechanism of a BP artificial neural network with fuzzy logic reasoning, expresses the transition rules as explicit formulas, and amends the initial rules through self-learning; the network structure of the model and the procedures for setting model parameters are optimized with genetic algorithms. In this paper, I introduce the theory and methods of complexity science into LUCC simulation research and present an LUCC simulation model based on CA models and MAS theory. I also extend the traditional Markov model, introducing fuzzy set theory into the data screening and parameter amendment of the improved model to raise the accuracy and feasibility of the Markov model for research on land use/cover change.
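
The cellular-automaton component described above can be sketched minimally. The grid, states, threshold rule and probabilities below are illustrative assumptions, not the paper's calibrated model:

```python
# A minimal cellular-automaton sketch of urban land-use change, loosely
# inspired by the CA component described above; states, threshold and
# adoption probability are invented for illustration.
import random

URBAN, RURAL = 1, 0

def step(grid, threshold=2, seed=None):
    """One CA update: a rural cell may urbanize if enough of its eight
    neighbors are already urban (toroidal boundary)."""
    rng = random.Random(seed)
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == RURAL:
                urban_neighbors = sum(
                    grid[(i + di) % n][(j + dj) % n]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    if (di, dj) != (0, 0)
                )
                # Stochastic adoption once neighborhood pressure is high.
                if urban_neighbors >= threshold and rng.random() < 0.5:
                    new[i][j] = URBAN
    return new

grid = [[RURAL] * 8 for _ in range(8)]
for i, j in [(3, 3), (3, 4), (4, 3), (4, 4)]:   # an urban seed cluster
    grid[i][j] = URBAN
for _ in range(5):
    grid = step(grid, seed=0)
print("urban cells:", sum(map(sum, grid)))
```

In the full model, the fixed threshold rule would be replaced by transition rules learned by the neural network and tuned by the genetic algorithm.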

  18. Industrial applications of formal methods to model, design and analyze computer systems

    CERN Document Server

    Craigen, Dan

    1995-01-01

Formal methods are mathematically-based techniques, often supported by reasoning tools, that can offer a rigorous and effective way to model, design and analyze computer systems. The purpose of this study is to evaluate international industrial experience in using formal methods. The cases selected are representative of industrial-grade projects and span a variety of application domains. The study had three main objectives: · To better inform deliberations within industry and government on standards and regulations; · To provide an authoritative record on the practical experience of formal methods…

  19. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  20. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

In this paper, we propose a novel use for computational cognitive models. In cognitive science, computational models have played a critical role as theories of human cognition. Many computational models have successfully simulated the results of controlled psychological experiments. However, there have been only a few attempts to apply the models to complex realistic phenomena; we call such a situation an ``open-ended situation''. In this study, MAC/FAC (``many are called, but few are chosen''), proposed by [Forbus 95], which models the two stages of analogical reasoning, was applied to our open-ended psychological experiment. In our experiment, subjects were presented with a cue story and retrieved cases that had been learned in their everyday life. Following this, they rated the inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support MAC/FAC's theoretical assumption that different similarities are involved in the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognition.
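
The cheap first stage of MAC/FAC, matching content vectors by dot product before any expensive structural evaluation, can be sketched as follows. The predicate vocabulary and cases are invented toy material, not the study's stimuli:

```python
# Sketch of the MAC ("many are called") stage: inexpensive dot-product
# matching of content vectors, in the spirit of MAC/FAC; the feature
# encoding and cases here are invented toys, not the paper's materials.
def content_vector(case_predicates, vocabulary):
    """Count-vector of predicate occurrences over a fixed vocabulary."""
    return [case_predicates.count(p) for p in vocabulary]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

vocabulary = ["cause", "flow", "greater", "pressure", "temperature"]
probe = content_vector(["cause", "flow", "greater", "pressure"], vocabulary)
cases = {
    "water-flow": ["cause", "flow", "greater", "pressure"],
    "heat-flow": ["cause", "flow", "greater", "temperature"],
    "solar-system": ["greater"],
}
scores = {name: dot(probe, content_vector(preds, vocabulary))
          for name, preds in cases.items()}
best = max(scores, key=scores.get)
print(best, scores)
```

The few best-scoring cases would then be passed to the FAC stage, where structural alignment produces the soundness estimates compared against subjects' ratings.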

  1. Modeling multimodal human-computer interaction

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2004-01-01

Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range of software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze…

  2. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 6; Issue 3. Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article Volume 6 Issue 3 March 2001 pp 46-54. Fulltext. Click here to view fulltext PDF. Permanent link:

  3. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  4. Editorial: Modelling and computational challenges in granular materials

    OpenAIRE

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss the current progress and latest advancements in the field of advanced numerical methods and modelling of granular materials. The focus will be on computational methods, improved algorithms and the m...

  5. SmartShadow models and methods for pervasive computing

    CERN Document Server

    Wu, Zhaohui

    2013-01-01

SmartShadow: Models and Methods for Pervasive Computing offers a new perspective on pervasive computing with SmartShadow, which is designed to model a user as a personality "shadow" and to model pervasive computing environments as user-centric dynamic virtual personal spaces. Just like human beings' shadows in the physical world, it follows people wherever they go, providing them with pervasive services. The model, methods, and software infrastructure for SmartShadow are presented and an application for smart cars is also introduced. The book can serve as a valuable reference work for researchers…

  6. Flow regulation in coronary vascular tree: a model study.

    Directory of Open Access Journals (Sweden)

    Xinzhou Xie

Full Text Available Coronary blood flow can always be matched to the metabolic demand of the myocardium due to the regulation of vasoactive segments. Myocardial compressive forces play an important role in determining coronary blood flow, but their impact on flow regulation is still unknown. The purpose of this study was to develop a coronary-specific flow regulation model, which can integrate myocardial compressive forces and other identified regulation factors, to further investigate coronary blood flow regulation behavior. A theoretical coronary flow regulation model including the myogenic, shear-dependent and metabolic responses was developed. Myocardial compressive forces were included in the modified wall tension model. The shear-dependent response was estimated by using experimental data from the coronary circulation. Capillary density and basal oxygen consumption were specified to correspond to those in the coronary circulation. Zero-flow pressure was also modeled by using a simplified capillary model. Pressure-flow relations predicted by the proposed model are consistent with previous experimental data. The predicted diameter changes in small arteries are in good agreement with experimental observations under adenosine infusion and inhibition of NO synthesis conditions. The results demonstrate that the myocardial compressive forces acting on the vessel wall extend the auto-regulatory range by decreasing the myogenic tone at a given perfusion pressure. Myocardial compressive forces have a great impact on coronary auto-regulation. The proposed model was shown to be consistent with experimental observations and can be employed to investigate coronary blood flow regulation in physiological and pathophysiological conditions.

  7. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  8. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

Full Text Available Abstract Background Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results The workshop, "ESF Exploratory Workshop on Computational disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  9. A computational model of selection by consequences.

    OpenAIRE

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...
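
The selection-reproduction-mutation loop described above can be sketched in a few lines. The population size, mutation range and "reinforced class" below are invented parameters, not McDowell's, and the reinforcement schedule is reduced to a fixed target class rather than an RI schedule:

```python
# A stripped-down sketch in the spirit of selection by consequences: a
# repertoire of integer "behaviors" evolves toward a reinforced class.
# All parameters are illustrative assumptions, not the paper's.
import random

rng = random.Random(42)
TARGET_CLASS = range(470, 512)   # behaviors in this range get reinforced

def fitness(b):
    """0 inside the reinforced class, else minus the distance to it."""
    if b in TARGET_CLASS:
        return 0
    return -min(abs(b - TARGET_CLASS.start), abs(b - (TARGET_CLASS.stop - 1)))

population = [rng.randrange(1024) for _ in range(100)]
for _ in range(200):
    # Selection: the fitter half of the repertoire become parents.
    parents = sorted(population, key=fitness, reverse=True)[:50]
    # Reproduction with mutation: offspring are noisy copies of parents.
    population = [max(0, min(1023, rng.choice(parents) + rng.randint(-16, 16)))
                  for _ in range(100)]

hits = sum(b in TARGET_CLASS for b in population)
print("behaviors in reinforced class:", hits)
```

After a few generations the emitted behavior concentrates around the reinforced class, the digital-organism analogue of operant shaping.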

  10. A Categorisation of Cloud Computing Business Models

    OpenAIRE

    Chang, Victor; Bacigalupo, David; Wills, Gary; De Roure, David

    2010-01-01

    This paper reviews current cloud computing business models and presents proposals on how organisations can achieve sustainability by adopting appropriate models. We classify cloud computing business models into eight types: (1) Service Provider and Service Orientation; (2) Support and Services Contracts; (3) In-House Private Clouds; (4) All-In-One Enterprise Cloud; (5) One-Stop Resources and Services; (6) Government funding; (7) Venture Capitals; and (8) Entertainment and Social Networking. U...

  11. Improving public transportation systems with self-organization: A headway-based model and regulation of passenger alighting and boarding.

    Science.gov (United States)

    Carreón, Gustavo; Gershenson, Carlos; Pineda, Luis A

    2017-01-01

The equal headway instability, the fact that a configuration with regular time intervals between vehicles tends to be volatile, is a common regulation problem in public transportation systems. An unsatisfactory regulation results in low efficiency and possible collapses of the service. Computational simulations have shown that self-organizing methods can regulate the headway adaptively beyond the theoretical optimum. In this work, we develop a computer simulation for metro systems fed with real data from the Mexico City Metro to compare the current regulatory method with a novel self-organizing approach. The current model considers system-wide data such as minimum and maximum waiting times at stations, while the self-organizing method regulates the headway in a decentralized manner using local information such as the passenger inflow and the positions of neighboring trains. The simulation shows that the self-organizing method improves the performance over the current one, as it adapts to environmental changes at the timescale at which they occur. The correlation between the simulation of the current model and empirical observations carried out in the Mexico City Metro provides a base to calculate the expected performance of the self-organizing method in case it is implemented in the real system. We also performed a pilot study at the Balderas station to regulate the alighting and boarding of passengers through guide signs on platforms. The analysis of empirical data shows a reduction in the waiting time of trains at stations. Finally, we provide recommendations to improve public transportation systems.
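
The decentralized idea, each train deciding from local headways only, can be sketched as a single dwell-time rule. The base dwell, gain and bounds below are illustrative assumptions, not the paper's calibrated method:

```python
# An illustrative sketch (not the paper's simulator) of decentralized
# headway regulation: each train sets its station dwell time from the
# local headways to its neighbors, with invented parameters.
def dwell_time(gap_ahead, gap_behind, base=30.0, gain=0.5):
    """Hold longer when too close to the train ahead (letting it pull
    away), shorter when the follower is catching up; all in seconds,
    clamped to operational bounds."""
    correction = gain * (gap_behind - gap_ahead)
    return max(10.0, min(120.0, base + correction))

# A bunched train (small gap ahead, large gap behind) dwells longer,
# reopening the gap; an evenly spaced train keeps the base dwell.
print(dwell_time(gap_ahead=60, gap_behind=180))   # bunched: longer dwell
print(dwell_time(gap_ahead=120, gap_behind=120))  # even: base dwell
```

Because each train only needs its neighbors' positions, the rule adapts at the timescale of local perturbations instead of waiting for centralized rescheduling.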

  12. Chaos Modelling with Computers

    Indian Academy of Sciences (India)

Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Chaos Modelling with Computers Unpredictable Behaviour of Deterministic Systems. Balakrishnan Ramasamy T S K V Iyer. General Article Volume 1 Issue 5 May 1996 pp 29-39 ...

  13. A computer simulation model to compute the radiation transfer of mountainous regions

    Science.gov (United States)

    Li, Yuguang; Zhao, Feng; Song, Rui

    2011-11-01

    In mountainous regions, the radiometric signal recorded at the sensor depends on a number of factors such as sun angle, atmospheric conditions, surface cover type, and topography. In this paper, a computer simulation model of radiation transfer is designed and evaluated. This model implements the Monte Carlo ray-tracing techniques and is specifically dedicated to the study of light propagation in mountainous regions. The radiative processes between sun light and the objects within the mountainous region are realized by using forward Monte Carlo ray-tracing methods. The performance of the model is evaluated through detailed comparisons with the well-established 3D computer simulation model: RGM (Radiosity-Graphics combined Model) based on the same scenes and identical spectral parameters, which shows good agreements between these two models' results. By using the newly developed computer model, series of typical mountainous scenes are generated to analyze the physical mechanism of mountainous radiation transfer. The results show that the effects of the adjacent slopes are important for deep valleys and they particularly affect shadowed pixels, and the topographic effect needs to be considered in mountainous terrain before accurate inferences from remotely sensed data can be made.

  14. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling

  15. The Self-Regulated Learning Model and Music Education

    OpenAIRE

    Maja Marijan

    2017-01-01

Self-regulation and self-regulated learning (SRL) are important features in music education. In this research the self-regulated learning model is presented as a complex, multidimensional structure. SRL starts with self-regulation. Self-regulation is formed through interaction with the environment; thus self-learning, self-analysis, self-judgment, self-instruction, and self-monitoring are the main functions in the self-regulatory structure. Co-regulation is needed, and helps self-regulation to be...

  16. Computational Analysis of Residue Interaction Networks and Coevolutionary Relationships in the Hsp70 Chaperones: A Community-Hopping Model of Allosteric Regulation and Communication.

    Directory of Open Access Journals (Sweden)

    Gabrielle Stetz

    2017-01-01

Full Text Available Allosteric interactions in the Hsp70 proteins are linked with their regulatory mechanisms and cellular functions. Despite significant progress in structural and functional characterization of the Hsp70 proteins, fundamental questions concerning the modularity of the allosteric interaction networks and the hierarchy of signaling pathways in the Hsp70 chaperones have remained largely unexplored and poorly understood. In this work, we proposed an integrated computational strategy that combined atomistic and coarse-grained simulations with coevolutionary analysis and network modeling of the residue interactions. A novel aspect of this work is the incorporation of dynamic residue correlations and coevolutionary residue dependencies in the construction of allosteric interaction networks and signaling pathways. We found that functional sites involved in allosteric regulation of Hsp70 may be characterized by structural stability, proximity to global hinge centers and a local structural environment that is enriched by highly coevolving flexible residues. These specific characteristics may be necessary for regulation of allosteric structural transitions and could distinguish regulatory sites from nonfunctional conserved residues. The observed confluence of dynamics correlations and coevolutionary residue couplings with global networking features may determine the modular organization of allosteric interactions and dictate the localization of key mediating sites. Community analysis of the residue interaction networks revealed that concerted rearrangements of local interacting modules at the inter-domain interface may be responsible for global structural changes and a population shift in the DnaK chaperone. The inter-domain communities in the Hsp70 structures harbor the majority of regulatory residues involved in allosteric signaling, suggesting that these sites could be integral to the network organization and coordination of structural changes.
Using a network-based formalism of…

  17. Trust models in ubiquitous computing.

    Science.gov (United States)

    Krukow, Karl; Nielsen, Mogens; Sassone, Vladimiro

    2008-10-28

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  18. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
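
The hybrid information-sharing idea, a central server fanning telemetry out while peers also publish derived parameters onto the same bus, can be sketched in-process. The class, message shape and parameter names below are invented for illustration; the abstract does not disclose the actual protocol:

```python
# A minimal in-process sketch of the information-sharing idea: a hybrid
# bus where a central server fans telemetry out to subscribers, while
# any peer may also publish ground-synthesized parameters. Names and
# message shapes are invented assumptions, not the actual protocol.
from collections import defaultdict

class TelemetryBus:
    def __init__(self):
        self._subscribers = defaultdict(list)   # parameter -> callbacks

    def subscribe(self, parameter, callback):
        self._subscribers[parameter].append(callback)

    def publish(self, parameter, value, source):
        """Any node (server or peer) may publish; all subscribers hear it."""
        for callback in self._subscribers[parameter]:
            callback(parameter, value, source)

bus = TelemetryBus()
log = []
bus.subscribe("cabin_pressure", lambda p, v, s: log.append((p, v, s)))

bus.publish("cabin_pressure", 14.7, source="downlink-server")   # client-server path
bus.publish("cabin_pressure", 14.6, source="peer-flight-ctrl")  # peer-to-peer path
print(log)
```

The same subscription mechanism would serve live monitoring, playback and training consumers without the publishers knowing who is listening.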

  20. Blackboard architecture and qualitative model in a computer aided assistant designed to define computers for HEP computing

    International Nuclear Information System (INIS)

    Nodarse, F.F.; Ivanov, V.G.

    1991-01-01

Using BLACKBOARD architecture and a qualitative model, an expert system was developed to assist the user in defining the computers for High Energy Physics computing. The COMEX system requires an IBM AT personal computer or compatible with more than 640 Kb RAM and a hard disk. 5 refs.; 9 figs

  1. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that would greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  2. Wind farms providing secondary frequency regulation: evaluating the performance of model-based receding horizon control

    Directory of Open Access Journals (Sweden)

    C. R. Shapiro

    2018-01-01

Full Text Available This paper is an extended version of our paper presented at the 2016 TORQUE conference (Shapiro et al., 2016). We investigate the use of wind farms to provide secondary frequency regulation for a power grid using a model-based receding horizon control framework. In order to enable real-time implementation, the control actions are computed based on a time-varying one-dimensional wake model. This model describes wake advection and wake interactions, both of which play an important role in wind farm power production. In order to test the control strategy, it is implemented in a large-eddy simulation (LES) model of an 84-turbine wind farm using the actuator disk turbine representation. Rotor-averaged velocity measurements at each turbine are used to provide feedback for error correction. The importance of including the dynamics of wake advection in the underlying wake model is tested by comparing the performance of this dynamic-model control approach to a comparable static-model control approach that relies on a modified Jensen model. We compare the performance of both control approaches using two types of regulation signals, RegA and RegD, which are used by PJM, an independent system operator in the eastern United States. The poor performance of the static-model control relative to the dynamic-model control demonstrates that modeling the dynamics of wake advection is key to providing the proposed type of model-based coordinated control of large wind farms. We further explore the performance of the dynamic-model control via composite performance scores used by PJM to qualify plants for regulation services or markets. Our results demonstrate that the dynamic-model-controlled wind farm consistently performs well, passing the qualification threshold for all fast-acting RegD signals. For the RegA signal, which changes over slower timescales, the dynamic-model control leads to average performance that surpasses the qualification threshold, but further…
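
The receding-horizon loop itself, predict over a short horizon with an internal model, apply only the first action, then re-plan with feedback, can be sketched on a scalar plant. The first-order model and candidate set below stand in for the paper's one-dimensional wake model and are purely illustrative:

```python
# A scalar sketch of the receding-horizon idea (not the paper's wake
# model): at each step we score a few candidate set-points over a short
# horizon against a toy first-order farm model, apply the first action,
# then re-plan with feedback. All dynamics and numbers are invented.
def simulate(power, setpoint, steps, tau=0.5):
    """First-order response of farm power toward a set-point (toy model)."""
    for _ in range(steps):
        power += tau * (setpoint - power)
    return power

def receding_horizon_step(power, reference, horizon=5):
    """Pick the candidate set-point whose predicted end-of-horizon power
    tracks the regulation reference best."""
    candidates = [reference + d for d in (-2.0, -1.0, 0.0, 1.0, 2.0)]
    return min(candidates,
               key=lambda sp: abs(simulate(power, sp, horizon) - reference))

power, reference = 80.0, 100.0   # e.g. MW, tracking a regulation signal
for _ in range(20):
    setpoint = receding_horizon_step(power, reference)
    power = simulate(power, setpoint, 1)   # apply only the first action
print(round(power, 2))
```

Feedback enters through the measured `power` at each re-planning step, which is the role the rotor-averaged velocity measurements play in the full controller.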

  3. Computational models in physics teaching: a framework

    Directory of Open Access Journals (Sweden)

    Marco Antonio Moreira

    2012-08-01

    Full Text Available The purpose of the present paper is to present a theoretical framework to promote and assist meaningful physics learning through computational models. Our proposal is based on the use of a tool, the AVM diagram, to design educational activities involving modeling and computer simulations. The idea is to provide a starting point for the construction and implementation of didactical approaches grounded in a coherent epistemological view about scientific modeling.

  4. A model for calculating the optimal replacement interval of computer systems

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1981-08-01

A mathematical model for calculating the optimal replacement interval of computer systems is described. The model estimates the most economical replacement interval when the computing demand and the cost and performance of the computer are known. The computing demand is assumed to increase monotonically every year. Four kinds of models are described. In model 1, a computer system is represented by only a central processing unit (CPU), and all the computing demand must be processed on the present computer until the next replacement. In model 2, by contrast, excess demand is admitted and may be transferred to another computing center and processed there at a cost. In model 3, the computer system is represented by a CPU, memories (MEM) and input/output devices (I/O), and it must process all the demand. Model 4 is the same as model 3, but excess demand is admitted and may be processed at another center. (1) Computing demand at the JAERI, (2) conformity of Grosch's law for recent computers, and (3) replacement cost of computer systems are also described. (author)
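As a toy illustration of the model-1 idea (all prices, demand figures, and the growth rate below are invented, not taken from the JAERI report), one can pick the replacement interval T that minimizes the purchase cost amortized over the interval, given that capacity must cover the monotonically growing demand at the end of the interval:

```python
# Hypothetical cost model: demand grows geometrically, and the machine bought
# at each replacement must cover peak demand at the end of its interval.
def avg_annual_cost(T, demand0=100.0, growth=1.3, price_per_capacity=2.0,
                    fixed_replace_cost=50.0):
    capacity = demand0 * growth ** T                 # demand at end of interval
    purchase = fixed_replace_cost + price_per_capacity * capacity
    return purchase / T                              # amortized over T years

best_T = min(range(1, 11), key=avg_annual_cost)      # search T = 1..10 years
```

With these invented numbers the amortized cost is minimized at T = 4: shorter intervals waste the fixed replacement cost, while longer ones force buying a much larger machine up front.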

  5. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  6. Proposing a Model of Co-Regulated Learning for Graduate Medical Education.

    Science.gov (United States)

    Rich, Jessica V

    2017-08-01

Primarily grounded in Zimmerman's social cognitive model of self-regulation, graduate medical education is guided by principles that self-regulated learning takes place within social context and influence, and that the social context and physical environment reciprocally influence persons and their cognition, behavior, and development. However, contemporary perspectives on self-regulation are moving beyond Zimmerman's triadic reciprocal orientation to models that consider social transactions as the central core of regulated learning. Such co-regulated learning models emphasize shared control of learning and the role more advanced others play in scaffolding novices' metacognitive engagement. Models of co-regulated learning describe social transactions as periods of distributed regulation among individuals, which instrumentally promote or inhibit the capacity for individuals to independently self-regulate. Social transactions with other regulators, including attending physicians, more experienced residents, and allied health care professionals, are known to mediate residents' learning and to support or hamper the development of their self-regulated learning competence. Given that social transactions are at the heart of learning-oriented assessment and entrustment decisions, an appreciation for co-regulated learning is likely important for advancing medical education research and practice, especially given the momentum of new innovations such as entrustable professional activities. In this article, the author explains why graduate medical educators should consider adopting a model of co-regulated learning to complement and extend Zimmerman's models of self-regulated learning. In doing so, the author suggests a model of co-regulated learning and provides practical examples of how the model is relevant to graduate medical education research and practice.

  7. Computer models for kinetic equations of magnetically confined plasmas

    International Nuclear Information System (INIS)

    Killeen, J.; Kerbel, G.D.; McCoy, M.G.; Mirin, A.A.; Horowitz, E.J.; Shumaker, D.E.

    1987-01-01

This paper presents four working computer models developed by the computational physics group of the National Magnetic Fusion Energy Computer Center. All of the models employ a kinetic description of plasma species. Three of the models are collisional, i.e., they include the solution of the Fokker-Planck equation in velocity space. The fourth model is collisionless and treats the plasma ions by a fully three-dimensional particle-in-cell method.

  8. Wind farms providing secondary frequency regulation: Evaluating the performance of model-based receding horizon control

    International Nuclear Information System (INIS)

    Shapiro, Carl R.; Meneveau, Charles; Gayme, Dennice F.; Meyers, Johan

    2016-01-01

We investigate the use of wind farms to provide secondary frequency regulation for a power grid. Our approach uses model-based receding horizon control of a wind farm that is tested using a large eddy simulation (LES) framework. In order to enable real-time implementation, the control actions are computed based on a time-varying one-dimensional wake model. This model describes wake advection and interactions, both of which play an important role in wind farm power production. This controller is implemented in an LES model of an 84-turbine wind farm represented by actuator disk turbine models. Differences between the velocities at each turbine predicted by the wake model and measured in LES are used for closed-loop feedback. The controller is tested on two types of regulation signals, “RegA” and “RegD”, obtained from PJM, an independent system operator in the eastern United States. Composite performance scores, which are used by PJM to qualify plants for regulation, are used to evaluate the performance of the controlled wind farm. Our results demonstrate that the controlled wind farm consistently performs well, passing the qualification threshold for all fast-acting RegD signals. For the RegA signal, which changes over slower time scales, the controlled wind farm's average performance surpasses the threshold, but further work is needed to enable the controlled system to achieve qualifying performance all of the time. (paper)
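PJM's actual qualification scoring combines accuracy, delay, and precision components. As a simplified stand-in (this is NOT PJM's published formula, just an illustration of the idea of scoring how closely plant output tracks a regulation signal), one can average a correlation term with a normalized-error term:

```python
import numpy as np

# Simplified composite tracking score: 1.0 for perfect tracking, lower for
# lagged or inaccurate responses. The weighting and terms are invented.
def composite_score(signal, response):
    signal = np.asarray(signal, float)
    response = np.asarray(response, float)
    corr = np.corrcoef(signal, response)[0, 1]
    precision = 1.0 - np.mean(np.abs(response - signal)) / (np.mean(np.abs(signal)) + 1e-12)
    return 0.5 * (corr + max(precision, 0.0))

t = np.linspace(0, 1, 200)
sig = np.sin(2 * np.pi * 3 * t)                 # stand-in regulation signal
good = composite_score(sig, sig)                # perfect tracking
lagged = composite_score(sig, np.roll(sig, 10)) # delayed response scores lower
```

A delayed response loses score through both reduced correlation and larger tracking error, which is qualitatively how slow-responding plants fail qualification.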

  9. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
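For a binary output, the second-order maximum noise entropy model described here is a logistic function whose argument is quadratic in the stimulus. A minimal sketch with invented parameters (the vector h and matrix J would in practice be fitted to the measured first- and second-order input/output moments):

```python
import numpy as np

# Second-order maximum-noise-entropy response function:
# P(spike | s) = 1 / (1 + exp(-(a + h·s + s^T J s)))
def second_order_mne(s, a, h, J):
    s = np.asarray(s, float)
    arg = a + h @ s + s @ J @ s
    return 1.0 / (1.0 + np.exp(-arg))

# Invented parameters for a two-dimensional stimulus space
h = np.array([1.0, -0.5])
J = np.array([[0.2, 0.0],
              [0.0, -0.3]])
p = second_order_mne([1.0, 1.0], a=-1.0, h=h, J=J)
```

Constraining only the mean output and first-order moments would drop the quadratic term, recovering the familiar linear-logistic (minimum mutual information) form.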

  10. A response-modeling alternative to surrogate models for support in computational analyses

    International Nuclear Information System (INIS)

    Rutherford, Brian

    2006-01-01

Often, the objectives in a computational analysis involve characterization of system performance based on some function of the computed response. In general, this characterization includes (at least) an estimate or prediction for some performance measure and an estimate of the associated uncertainty. Surrogate models can be used to approximate the response in regions where simulations were not performed. For most surrogate modeling approaches, however, (1) estimates are based on smoothing of available data and (2) uncertainty in the response is specified in a point-wise (in the input space) fashion. These aspects of the surrogate model construction might limit their capabilities. One alternative is to construct a probability measure, G(r), for the computer response, r, based on available data. This 'response-modeling' approach will permit probability estimation for an arbitrary event, E(r), based on the computer response. In this general setting, event probabilities can be computed: prob(E) = ∫_r I(E(r)) dG(r), where I is the indicator function. Furthermore, one can use G(r) to calculate an induced distribution on a performance measure, pm. For prediction problems where the performance measure is a scalar, its distribution F_pm is determined by: F_pm(z) = ∫_r I(pm(r) ≤ z) dG(r). We introduce response models for scalar computer output and then generalize the approach to more complicated responses that utilize multiple response models.
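Given samples drawn from a fitted response measure G, both integrals can be estimated by Monte Carlo. A sketch with a stand-in Gaussian G and an invented event and performance measure (none of these choices come from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
r = rng.normal(loc=2.0, scale=1.0, size=100_000)  # stand-in for draws from G

event = lambda r: r > 3.0     # example event E(r)
pm = lambda r: r ** 2         # example scalar performance measure pm(r)

# prob(E) = ∫ I(E(r)) dG(r)  ≈  fraction of samples satisfying the event
prob_E = np.mean(event(r))

# induced distribution: F_pm(z) = ∫ I(pm(r) <= z) dG(r)
F_pm = lambda z: np.mean(pm(r) <= z)
```

For this Gaussian stand-in, prob_E approximates P(N(2,1) > 3) ≈ 0.159, and F_pm(4) approximates P(-2 ≤ r ≤ 2) ≈ 0.5, illustrating how a single fitted G yields probabilities for arbitrary events and performance measures.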

  11. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  12. The Architectural Designs of a Nanoscale Computing Model

    Directory of Open Access Journals (Sweden)

    Mary M. Eshaghian-Wilner

    2004-08-01

    Full Text Available A generic nanoscale computing model is presented in this paper. The model consists of a collection of fully interconnected nanoscale computing modules, where each module is a cube of cells made out of quantum dots, spins, or molecules. The cells dynamically switch between two states by quantum interactions among their neighbors in all three dimensions. This paper includes a brief introduction to the field of nanotechnology from a computing point of view and presents a set of preliminary architectural designs for fabricating the nanoscale model studied.
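A toy version of one such module, with the quantum interactions abstracted into a plain majority rule over the six face neighbors (and periodic boundaries for simplicity; the cube size and update rule are invented for illustration), can be written as a three-dimensional two-state cellular update:

```python
import numpy as np

# One synchronous update of a cube of two-state cells: each cell takes the
# strict majority of its six face neighbors in all three dimensions.
# np.roll gives periodic boundaries, a simplification of the module edges.
def step(cube):
    n = np.zeros_like(cube)
    for axis in range(3):
        n += np.roll(cube, 1, axis) + np.roll(cube, -1, axis)
    return (n > 3).astype(int)     # 4 or more of 6 neighbors set

rng = np.random.default_rng(2)
cube = (rng.random((8, 8, 8)) < 0.5).astype(int)  # random initial states
for _ in range(5):
    cube = step(cube)
```

The synchronous neighbor-driven update is the structural point: each cell's next state depends only on local interactions in all three dimensions, as in the proposed module.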

  13. Geometric and computer-aided spline hob modeling

    Science.gov (United States)

    Brailov, I. G.; Myasoedova, T. M.; Panchuk, K. L.; Krysova, I. V.; Rogoza, YU A.

    2018-03-01

The paper considers the construction of a geometric model of a spline hob. The objective of the research is the development of a mathematical model of a spline hob for spline shaft machining. The structure of the spline hob is described taking into consideration the machine tool system motions that position and orient the cutting edge. The computer-aided study is performed with the use of CAD and on the basis of 3D modeling methods. Vector representation of cutting edge geometry is accepted as the principal method of developing the spline hob mathematical model. The paper defines the correlations described by parametric vector functions representing helical cutting edges designed for spline shaft machining, with consideration for helical movement in two dimensions. An application for generating the 3D model of the spline hob is developed on the basis of AutoLISP for the AutoCAD environment. The application makes it possible to use the acquired model for simulating the milling process. An example of evaluation, analytical representation and computer modeling of the proposed geometric model is reviewed. In this example, a calculation of the key spline hob parameters ensuring the capability of hobbing a spline shaft of standard design is performed. The polygonal and solid spline hob 3D models are acquired by means of imitational computer modeling.

  14. Structure, function, and behaviour of computational models in systems biology.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  15. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

Full Text Available Results of numerical modeling of dynamic problems are summarized in the article. These problems are characteristic of various areas of human activity, in particular of problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas cleaning equipment, and modeling of biogas formation processes.

  16. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems...

  17. Computer Modeling of Direct Metal Laser Sintering

    Science.gov (United States)

    Cross, Matthew

    2014-01-01

A computational approach to modeling direct metal laser sintering (DMLS) additive manufacturing process is presented. The primary application of the model is for determining the temperature history of parts fabricated using DMLS to evaluate residual stresses found in finished pieces and to assess manufacturing process strategies to reduce part slumping. The model utilizes MSC SINDA as a heat transfer solver with embedded FORTRAN computer code to direct laser motion, apply laser heating as a boundary condition, and simulate the addition of metal powder layers during part fabrication. Model results are compared to available data collected during in situ DMLS part manufacture.
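As a toy stand-in for the described thermal model (not the MSC SINDA/FORTRAN implementation; the diffusivity, grid, and absorbed laser power below are all invented), explicit 1D conduction through the depth of a column while the laser dwells over it, with the absorbed power lumped into the surface node, looks like:

```python
import numpy as np

# Explicit finite-difference conduction through the powder/part depth.
# q_surf is the absorbed laser heating rate at the surface node [K/s];
# the bottom node is held at its initial temperature.
def step_T(T, alpha, dx, dt, q_surf):
    Tn = T.copy()
    Tn[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    Tn[0] += 2 * alpha * dt / dx**2 * (T[1] - T[0]) + q_surf * dt
    return Tn

T = np.full(50, 300.0)   # K, uniform start; invented material and laser numbers
for _ in range(200):     # stability: alpha*dt/dx^2 = 0.25 <= 0.5
    T = step_T(T, alpha=5e-6, dx=1e-4, dt=5e-4, q_surf=2e4)
```

A full DMLS model would additionally move the source across the surface, add powder layers between passes, and track phase change; this sketch only shows the transient temperature gradient that drives residual stress.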

  18. Climate Ocean Modeling on Parallel Computers

    Science.gov (United States)

    Wang, P.; Cheng, B. N.; Chao, Y.

    1998-01-01

    Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.

  19. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  20. A comparison of computer- and hand-generated clinical dental notes with statutory regulations in record keeping.

    Science.gov (United States)

    McAndrew, R; Ban, J; Playle, R

    2012-02-01

Dental patient records should be of high quality, contain information to allow for good continuity of care and clinical defence (should the need ever arise) and, ideally, facilitate clinical audit. Handwritten dental records have been assessed for their compliance with statutory regulations, but the same assessment has not been applied to computer-generated notes. This study aimed to compare and analyse the compliance of both methods of data recording with statutory regulations. Fifty consecutive sets of handwritten notes and 50 sets of computer-generated notes were audited for compliance with a number of legal requirements and desirable characteristics for dental records, and the results were compared. The standard set for compliance with all characteristics was 100%. The computer-generated notes satisfied the set standard for 8 of the 11 legal requirements and 3 of the 6 desirable characteristics. The handwritten notes satisfied the set standard for 1 of the 11 legal requirements and none of the desirable characteristics. A statistically significant difference (using a 95% confidence interval) between the two methods was observed in 5 of the 11 legal characteristics and 3 of the 6 desirable characteristics, all in favour of computer-generated notes. Within the limitations of this study, computer-generated notes achieved a much higher compliance rate with the set parameters, making defence in cases of litigation, continuity of care and clinical audit easier and more efficient. © 2011 John Wiley & Sons A/S.
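The kind of statistical comparison reported here can be illustrated with a normal-approximation 95% confidence interval for a difference of two proportions; the compliance counts below are invented, not the study's data:

```python
import math

# 95% CI for the difference p1 - p2 of two independent sample proportions,
# using the standard normal approximation.
def diff_ci(x1, n1, x2, n2, z=1.96):
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p1 - p2
    return d - z * se, d + z * se

# Invented example: 45/50 computer-generated vs 30/50 handwritten notes
# compliant on some characteristic.
lo, hi = diff_ci(45, 50, 30, 50)
significant = lo > 0 or hi < 0   # CI excludes zero → significant at ~5%
```

With these made-up counts the interval is roughly (0.14, 0.46), excluding zero, so the difference would be declared significant in favour of the first group.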

  1. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres and providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements, we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  2. Ch. 33 Modeling: Computational Thermodynamics

    International Nuclear Information System (INIS)

    Besmann, Theodore M.

    2012-01-01

    This chapter considers methods and techniques for computational modeling for nuclear materials with a focus on fuels. The basic concepts for chemical thermodynamics are described and various current models for complex crystalline and liquid phases are illustrated. Also included are descriptions of available databases for use in chemical thermodynamic studies and commercial codes for performing complex equilibrium calculations.

  3. Multiscale Mechanics of Articular Cartilage: Potentials and Challenges of Coupling Musculoskeletal, Joint, and Microscale Computational Models

    Science.gov (United States)

    Halloran, J. P.; Sibole, S.; van Donkelaar, C. C.; van Turnhout, M. C.; Oomens, C. W. J.; Weiss, J. A.; Guilak, F.; Erdemir, A.

    2012-01-01

    Articular cartilage experiences significant mechanical loads during daily activities. Healthy cartilage provides the capacity for load bearing and regulates the mechanobiological processes for tissue development, maintenance, and repair. Experimental studies at multiple scales have provided a fundamental understanding of macroscopic mechanical function, evaluation of the micromechanical environment of chondrocytes, and the foundations for mechanobiological response. In addition, computational models of cartilage have offered a concise description of experimental data at many spatial levels under healthy and diseased conditions, and have served to generate hypotheses for the mechanical and biological function. Further, modeling and simulation provides a platform for predictive risk assessment, management of dysfunction, as well as a means to relate multiple spatial scales. Simulation-based investigation of cartilage comes with many challenges including both the computational burden and often insufficient availability of data for model development and validation. This review outlines recent modeling and simulation approaches to understand cartilage function from a mechanical systems perspective, and illustrates pathways to associate mechanics with biological function. Computational representations at single scales are provided from the body down to the microstructure, along with attempts to explore multiscale mechanisms of load sharing that dictate the mechanical environment of the cartilage and chondrocytes. PMID:22648577

  4. Novel dimer based descriptors with solvational computation for QSAR study of oxadiazoylbenzoyl-ureas as novel insect-growth regulators.

    Science.gov (United States)

    Fan, Feng; Cheng, Jiagao; Li, Zhong; Xu, Xiaoyong; Qian, Xuhong

    2010-02-01

The molecular aggregation state of bioactive compounds plays a key role in their bio-interactive processes. In this article, based on the structural information of dimers, the simplest model of the molecular aggregation state, combined with solvational computation, a total of four descriptors (ΔV, MR2, ΔE1, and ΔE2) were calculated for a QSAR study of a novel insect-growth regulator, N-(5-phenyl-1,3,4-oxadiazol-2-yl)-N'-benzoyl urea. Two QSAR models were constructed with r² = 0.671, q² = 0.516 and r² = 0.816, q² = 0.695, respectively. This implies that the bioactivity may strongly depend on the characteristics of the molecular aggregation state, especially on the dimeric transport ability from the oil phase to the water phase. Copyright 2009 Wiley Periodicals, Inc.
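The reported fit statistics can be reproduced on made-up data: r² of an ordinary least-squares fit and the leave-one-out cross-validated q² (1 − PRESS/SS_tot). The descriptor matrix and activities below are random stand-ins, not the paper's dataset:

```python
import numpy as np

# r^2 of a least-squares linear model and leave-one-out q^2 (PRESS-based),
# the two statistics usually quoted for small QSAR models.
def r2_and_q2(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1 - np.sum((y - X1 @ beta) ** 2) / ss_tot
    press = 0.0
    for i in range(len(y)):                      # leave-one-out refits
        mask = np.arange(len(y)) != i
        b = np.linalg.lstsq(X1[mask], y[mask], rcond=None)[0]
        press += (y[i] - X1[i] @ b) ** 2
    return r2, 1 - press / ss_tot

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 4))                     # 20 compounds, 4 descriptors
y = X @ np.array([1.0, -0.5, 0.3, 0.0]) + 0.2 * rng.normal(size=20)
r2, q2 = r2_and_q2(X, y)
```

Since leave-one-out residuals are never smaller than in-sample residuals, q² ≤ r² always holds, which matches the pattern of the reported pairs (0.671/0.516 and 0.816/0.695).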

  5. Dynamical Analysis of bantam-Regulated Drosophila Circadian Rhythm Model

    Science.gov (United States)

    Li, Ying; Liu, Zengrong

MicroRNAs (miRNAs) interact with 3′ untranslated region (UTR) elements of target genes to regulate mRNA stability or translation, and play a crucial role in regulating many different biological processes. bantam, a conserved miRNA, is involved in several functions, such as regulating Drosophila growth and circadian rhythm. Recently, it has been discovered that bantam plays a crucial role in the core circadian pacemaker. In this paper, based on experimental observations, a detailed dynamical model of the bantam-regulated circadian clock system is developed to show the post-transcriptional behaviors in the modulation of Drosophila circadian rhythm, in which the regulation of bantam is incorporated into a classical model. The dynamical behaviors of the model are consistent with the experimental observations, which shows that bantam is an important regulator of Drosophila circadian rhythm. The sensitivity analysis of parameters demonstrates that with the regulation of bantam the system is more sensitive to perturbations, indicating that bantam regulation makes it easier for the organism to modulate its period against environmental perturbations. The effectiveness in rescuing locomotor activity rhythms of mutated flies shows that bantam is necessary for strong and sustained rhythms. In addition, the biological mechanisms of bantam regulation are analyzed, which may help us more clearly understand Drosophila circadian rhythm regulated by other miRNAs.
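The paper's detailed model is not reproduced here. As a generic sketch of the mechanism it builds on (a Goodwin-style negative feedback loop in which a miRNA term simply adds to the mRNA degradation rate, with all rates, the Hill coefficient, and initial conditions invented), one can integrate:

```python
import numpy as np

# Minimal three-variable clock loop: mRNA m -> protein p -> nuclear repressor r
# which shuts off transcription of m. mirna_deg models miRNA-enhanced mRNA decay.
def simulate(mirna_deg=0.1, dt=0.01, steps=20000, n=9):
    m, p, r = 1.0, 1.0, 1.0
    traj = []
    for _ in range(steps):
        dm = 1.0 / (1.0 + r**n) - (0.2 + mirna_deg) * m   # miRNA adds to decay
        dp = m - 0.2 * p
        dr = p - 0.2 * r
        m, p, r = m + dt * dm, p + dt * dp, r + dt * dr
        traj.append(m)
    return np.array(traj)

traj = simulate()
```

Changing mirna_deg retunes the loop's effective time constants, which is the qualitative route by which a miRNA like bantam can shift the free-running period in such models.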

  6. A model of clearance rate regulation in mussels

    Science.gov (United States)

    Fréchette, Marcel

    2012-10-01

    Clearance rate regulation has been modelled as an instantaneous response to food availability, independent of the internal state of the animals. This view is incompatible with latent effects during ontogeny and phenotypic flexibility in clearance rate. Internal-state regulation of clearance rate is required to account for these patterns. Here I develop a model of internal-state based regulation of clearance rate. External factors such as suspended sediments are included in the model. To assess the relative merits of instantaneous regulation and internal-state regulation, I modelled blue mussel clearance rate and growth using a DEB model. In the usual standard feeding module, feeding is governed by a Holling's Type II response to food concentration. In the internal-state feeding module, gill ciliary activity and thus clearance rate are driven by internal reserve level. Factors such as suspended sediments were not included in the simulations. The two feeding modules were compared on the basis of their ability to capture the impact of latent effects, of environmental heterogeneity in food abundance and of physiological flexibility on clearance rate and individual growth. The Holling feeding module was unable to capture the effect of any of these sources of variability. In contrast, the internal-state feeding module did so without any modification or ad hoc calibration. Latent effects, however, appeared transient. With simple annual variability in temperature and food concentration, the relationship between clearance rate and food availability predicted by the internal-state feeding module was quite similar to that observed in Norwegian fjords. I conclude that in contrast with the usual Holling feeding module, internal-state regulation of clearance rate is consistent with well-documented growth and clearance rate patterns.
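The contrast between the two feeding modules can be stated in a few lines: the usual Holling Type II response depends only on the instantaneous food density X, while an internal-state module drives clearance rate from the reserve level. The functional forms and parameters below are simplified stand-ins for illustration, not the DEB model's equations:

```python
# Instantaneous regulation: ingestion rises with food density and saturates;
# the animal's internal state plays no role. CR_max and K are invented.
def holling_type2(X, CR_max=8.0, K=2.0):
    return CR_max * X / (K + X)

# Internal-state regulation (simplified): clearance rate falls as the
# scaled reserve level e (0..1) fills up, regardless of current food.
def internal_state_cr(e, CR_max=8.0):
    return CR_max * (1.0 - e)

low_food = holling_type2(0.5)
high_food = holling_type2(10.0)
hungry = internal_state_cr(0.1)    # depleted reserves -> high clearance
sated = internal_state_cr(0.9)     # full reserves -> low clearance
```

Only the second form can carry a memory of past feeding conditions, which is why it can reproduce latent effects and phenotypic flexibility that an instantaneous Holling response cannot.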

  7. NEURO-SYSTEM OF AIMING AND STABILIZING WITH A REGULATOR ON THE BASIS OF STANDARD MODEL MODEL REFERENCE CONTROLLER

    Directory of Open Access Journals (Sweden)

    B.I. Kuznetsov

    2015-08-01

Full Text Available The aim of this work is to synthesize a neural network aiming and stabilization system for the special equipment of moving objects, with a neuro-controller based on a reference model, and to compare the performance of this neural network system with a neural network predictive control system. A block diagram of the neural network aiming and stabilization system is built based on the subordinate control principle, with a PD regulator in the position loop and a reference-model neuro-controller in the velocity loop. The Model Reference Controller neuro-controller is synthesized in the MATLAB Neural Network Toolbox and the system simulation is performed. The studies show that the transients of the system state variables are oscillatory. Therefore, the predictive NN Predictive Controller should be used in the aiming and stabilizing system to provide high dynamic characteristics, achieved at the cost of higher complexity and computational cost.

  8. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    Science.gov (United States)

    When the news media talks about models they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  9. Development of an external exposure computational model for studying entrance skin dose in radiographs of the thorax and vertebral column

    International Nuclear Information System (INIS)

    Muniz, Bianca C.; Menezes, Claudio J.M.; Vieira, Jose W.

    2014-01-01

    Dosimetric measurements cannot always be performed directly on the human body. Such assessments can instead be performed using anthropomorphic models (phantoms) in computational exposure models (MCE), with Monte Carlo techniques for virtual simulations. These processing techniques, coupled with ever more powerful and affordable computers, make the Monte Carlo method one of the most widely used tools in the radiation transport area. In this work, the Monte Carlo EGS4 code was used to develop a computational model of external exposure to study the entrance skin dose in chest and vertebral column radiography, aiming to optimize these practices by reducing doses to patients, the professionals involved and the general public. The results obtained experimentally with the Radcal electrometer, model 9015, associated with the model 10X5-6 ionization chamber for radiology, showed that the proposed computational model can be used in quality assurance programs in radiodiagnostics, evaluating the entrance skin dose while varying parameters of the radiation beam such as peak kilovoltage (kVp), current-time product (mAs), total filtration and source-to-surface distance (DFS), optimizing radiodiagnostic practices and meeting current regulations.

  10. Deployment Models: Towards Eliminating Security Concerns From Cloud Computing

    OpenAIRE

    Zhao, Gansen; Chunming, Rong; Jaatun, Martin Gilje; Sandnes, Frode Eika

    2010-01-01

    Cloud computing has become a popular choice as an alternative to investing in new IT systems. When making decisions on adopting cloud computing related solutions, security has always been a major concern. This article summarizes security concerns in cloud computing and proposes five service deployment models to ease these concerns. The proposed models provide different security related features to address different requirements and scenarios and can serve as reference models for deployment. D...

  11. [The implementation of a computer model in research on the dynamics of proliferation of thyroid gland follicle cells].

    Science.gov (United States)

    Abduvaliev, A A; Gil'dieva, M S; Khidirov, B N; Saĭdalieva, M; Khasanov, A A; Musaeva, Sh N; Saatov, T S

    2012-04-01

    The article presents the results of computational experiments on the dynamics of proliferation of thyroid gland follicle cells in the normal state and in malignant neoplasm. The model studies demonstrated that a chronic increase in the proliferation parameter of thyroid follicle cells results in abnormal behavior of the cell cenosis numbers of the thyroid follicle: the stationary state breaks down, auto-oscillations arise and pass into irregular oscillations with unpredictable cell proliferation, and further into the "black hole" effect. It is demonstrated that the available medical and biological experimental data, together with theoretical propositions concerning the structural and functional organization of the thyroid gland at the cellular level, permit the development of mathematical models for quantitative analysis of the cell cenosis numbers of the thyroid follicle under normal conditions. The modeling technique for regulatory mechanisms of living systems and the equations of cell cenosis regulation were used.
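
    The qualitative sequence described (stable steady state, then auto-oscillations, then irregular oscillations as a proliferation parameter is chronically increased) can be illustrated with the simplest discrete growth model, the logistic map; this is a generic stand-in for such dynamics, not the authors' cell-cenosis equations:

```python
def trajectory(r, x0=0.2, n=2000, discard=1500):
    """Iterate x -> r*x*(1-x), a minimal proliferation map, and keep the tail."""
    x = x0
    tail = []
    for i in range(n):
        x = r * x * (1.0 - x)
        if i >= discard:
            tail.append(x)
    return tail

def count_distinct(xs, tol=1e-6):
    """Rough count of distinct values visited on the attractor."""
    vals = []
    for x in xs:
        if all(abs(x - v) > tol for v in vals):
            vals.append(x)
    return len(vals)

steady = count_distinct(trajectory(2.8))  # stationary state: one value
cycle = count_distinct(trajectory(3.2))   # auto-oscillations: two values
chaos = count_distinct(trajectory(3.9))   # irregular oscillations: many values
```

    For r > 4 trajectories escape to minus infinity, a collapse loosely analogous to the "black hole" effect mentioned in the abstract.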

  12. The emerging role of cloud computing in molecular modelling.

    Science.gov (United States)

    Ebejer, Jean-Paul; Fulle, Simone; Morris, Garrett M; Finn, Paul W

    2013-07-01

    There is a growing recognition of the importance of cloud computing for large-scale and data-intensive applications. The distinguishing features of cloud computing and their relationship to other distributed computing paradigms are described, as are the strengths and weaknesses of the approach. We review the use made to date of cloud computing for molecular modelling projects and the availability of front ends for molecular modelling applications. Although the use of cloud computing technologies for molecular modelling is still in its infancy, we demonstrate its potential by presenting several case studies. Rapid growth can be expected as more applications become available and costs continue to fall; cloud computing can make a major contribution not just in terms of the availability of on-demand computing power, but could also spur innovation in the development of novel approaches that utilize that capacity in more effective ways. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  14. Computational Models of Rock Failure

    Science.gov (United States)

    May, Dave A.; Spiegelman, Marc

    2017-04-01

    Practitioners in computational geodynamics, as in many other branches of applied science, typically do not analyse the underlying PDEs being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or have never been proven. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDEs, we first attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers / risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of

  15. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to the modeling of tree crown development. These approaches are experimental (i.e. regressive), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The common assumption of these approaches is that a tree can be regarded as a fractal object, a collection of self-similar objects that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between the mathematical models of crown growth and light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model on experimental data. In the paper different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling and a variant of the computer model are presented. (author). 9 refs, 4 figs
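
    The link drawn between self-similarity and a crown measure can be made concrete with a minimal recursive sketch (the branch count b and scale factor s below are arbitrary illustrative values, not calibrated to spruce data): a crown in which every segment spawns b children scaled by s has similarity dimension log b / log(1/s), which here lies strictly between 1 and 2, i.e. between a curve and a surface:

```python
import math

def total_branch_length(depth, length=1.0, b=2, scale=0.6):
    """Total segment length of a self-similar tree: each segment spawns
    b children shortened by `scale`, down to the given recursion depth."""
    if depth == 0:
        return length
    return length + b * total_branch_length(depth - 1, length * scale, b, scale)

def similarity_dimension(b, scale):
    """Self-similarity (fractal) dimension of the branching pattern."""
    return math.log(b) / math.log(1.0 / scale)

dim = similarity_dimension(2, 0.6)   # ~1.36: between a curve and a surface
length10 = total_branch_length(10)   # geometric sum of (b*scale)**k terms
```

    Because b * scale > 1 here, total branch length keeps growing with depth even as individual segments shrink, which is the sense in which such crowns fill space more densely than a one-dimensional object.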

  16. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    Science.gov (United States)

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  17. Computational modeling of neural activities for statistical inference

    CERN Document Server

    Kolossa, Antonio

    2016-01-01

    This authored monograph supplies empirical evidence for the Bayesian brain hypothesis by modeling event-related potentials (ERP) of the human electroencephalogram (EEG) during successive trials in cognitive tasks. The employed observer models are useful to compute probability distributions over observable events and hidden states, depending on which are present in the respective tasks. Bayesian model selection is then used to choose the model which best explains the ERP amplitude fluctuations. Thus, this book constitutes a decisive step towards a better understanding of the neural coding and computing of probabilities following Bayesian rules. The target audience primarily comprises research experts in the field of computational neurosciences, but the book may also be beneficial for graduate students who want to specialize in this field.

  18. An integrative computational modelling of music structure apprehension

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2014-01-01

    An objectivization of music analysis requires a detailed formalization of the underlying principles and methods. The formalization of the most elementary structural processes is hindered by the complexity of music, both in terms of the profusion of entities (such as notes) and of the tight interactions between a large number of dimensions. Computational modeling would enable systematic and exhaustive tests on sizeable pieces of music, yet current research covers particular musical dimensions with limited success. The aim of this research is to conceive a computational modeling of music analysis; by virtue of its generality, extensiveness and operationality, the computational model is suggested as a blueprint for the establishment of a cognitively validated model of music structure apprehension. Available as a Matlab module, it can be used for practical musicological purposes.

  19. Computational algebraic geometry of epidemic models

    Science.gov (United States)

    Rodríguez Vega, Martín.

    2014-06-01

    Computational Algebraic Geometry is applied to the analysis of various epidemic models for Schistosomiasis and Dengue, both for the case without control measures and for the case where control measures are applied. The models were analyzed using the mathematical software Maple. Explicitly, the analysis is performed using Groebner bases, Hilbert dimension and Hilbert polynomials; these computational tools are included automatically in Maple. Each of these models is represented by a system of ordinary differential equations, and for each model the basic reproductive number (R0) is calculated. The effects of the control measures are observed through the changes in the algebraic structure of R0, in the Groebner basis, in the Hilbert dimension, and in the Hilbert polynomials. It is hoped that the results obtained in this paper will be of importance for designing control measures against the epidemic diseases described. For future research, the use of algebraic epidemiology to analyze models for airborne and waterborne diseases is proposed.
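
    The kind of computation described (a Groebner basis of the equilibrium equations, with R0 read off from the parameters) can be reproduced in SymPy rather than Maple. The SIR-with-vital-dynamics system and the parameter values below are illustrative choices, not taken from the paper:

```python
from sympy import Rational, groebner, symbols

S, I = symbols("S I")
beta, gamma, mu, Lam = Rational(3, 10), Rational(1, 10), Rational(1, 20), 1

# Equilibrium equations of an SIR model with vital dynamics:
#   dS/dt = Lam - beta*S*I - mu*S = 0
#   dI/dt = beta*S*I - (gamma + mu)*I = 0
eqs = [Lam - beta * S * I - mu * S, beta * S * I - (gamma + mu) * I]
basis = groebner(eqs, S, I, order="lex")

# Basic reproductive number for this system
R0 = Lam * beta / (mu * (gamma + mu))

# The two equilibria: disease-free and endemic (the latter exists since R0 > 1)
disease_free = {S: Lam / mu, I: 0}
endemic = {S: (gamma + mu) / beta, I: Rational(13, 2)}
```

    Every polynomial in the Groebner basis vanishes at both equilibria, and changing a control parameter (e.g. lowering beta) changes both R0 and the basis itself, which is the algebraic signature of the control measure.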

  20. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of a plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and finally errors of commission in nuclear power plants. Quantitative or prescriptive models that predict an operator's situation assessment, i.e. the results of situation assessment, provide many benefits such as HSI design solutions, human performance data, and human reliability estimates. Unfortunately, only a few computational situation assessment models for NPP operators have been proposed, and those insufficiently embed human cognitive characteristics. Thus we propose a new computational situation assessment model of nuclear power plant operators. The proposed model, incorporating significant cognitive factors, uses a Bayesian belief network (BBN) as its model architecture. It is believed that communication between nuclear power plant operators affects their situation assessment and its result, situation awareness. We tried to verify that the proposed model represents the effects of communication on situation assessment. As a result, the proposed model succeeded in representing the operators' behavior; this paper shows the details.
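
    The core of BBN-based situation assessment is a Bayes update over hidden plant states given indicator readings. A minimal hand-rolled sketch follows; the plant states, the indicator, and every probability value are hypothetical numbers chosen purely for illustration, not taken from the paper's model:

```python
def bayes_update(prior, likelihood, observation):
    """One step of situation assessment: posterior over plant states
    given an observed indicator reading, via Bayes' rule."""
    post = {s: prior[s] * likelihood[s][observation] for s in prior}
    z = sum(post.values())  # normalizing constant P(observation)
    return {s: p / z for s, p in post.items()}

# Hypothetical plant states and indicator likelihoods (illustrative only)
prior = {"normal": 0.90, "small_LOCA": 0.05, "SGTR": 0.05}
likelihood = {
    "normal":     {"pressure_low": 0.02, "pressure_ok": 0.98},
    "small_LOCA": {"pressure_low": 0.90, "pressure_ok": 0.10},
    "SGTR":       {"pressure_low": 0.60, "pressure_ok": 0.40},
}

belief = bayes_update(prior, likelihood, "pressure_low")
```

    Repeating the update as further readings arrive (or as a crew member reports one) is the mechanism by which such a model can represent communication sharpening or biasing the shared situation awareness.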

  1. Computer models of vocal tract evolution: an overview and critique

    NARCIS (Netherlands)

    de Boer, B.; Fitch, W. T.

    2010-01-01

    Human speech has been investigated with computer models since the invention of digital computers, and models of the evolution of speech first appeared in the late 1960s and early 1970s. Speech science and computer models have a long shared history because speech is a physical signal and can be

  2. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments

  3. COMPUTATIONAL MODELING OF AIRFLOW IN NONREGULAR SHAPED CHANNELS

    Directory of Open Access Journals (Sweden)

    A. A. Voronin

    2013-05-01

    Full Text Available The basic approaches to computational modeling of airflow in the human nasal cavity are analyzed. Different models of turbulent flow which may be used to calculate air velocity and pressure are discussed. Experimental measurement results for airflow temperature are illustrated. A geometrical model of the human nasal cavity reconstructed from computed tomography scans and numerical simulation results of airflow inside this model are also given. Spatial distributions of velocity and temperature for inhaled and exhaled air are shown.

  4. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  5. Editorial: Modelling and computational challenges in granular materials

    NARCIS (Netherlands)

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss

  6. Generating Computational Models for Serious Gaming

    NARCIS (Netherlands)

    Westera, Wim

    2018-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  7. Airfoil Computations using the γ - Reθ Model

    DEFF Research Database (Denmark)

    Sørensen, Niels N.

    computations. Based on this, an estimate of the error in the computations is determined to be approximately one percent in the attached region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64- 018, NACA64-218, NACA64-418 and NACA64-618 and the results...

  8. A System Computational Model of Implicit Emotional Learning.

    Science.gov (United States)

    Puviani, Luca; Rama, Sidita

    2016-01-01

    Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unfortunately, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of the extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders and in panic attacks). In this manuscript, starting from the experimental results available in the literature, a computational model of implicit emotional learning based both on prediction error computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance-to-extinction of traumatic emotional responses, (c) the mathematical relation between classical conditioning and unconditioned stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for resistant-to-extinction emotional reactions and novel methodologies of emotion modulation.
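
    A useful baseline for the prediction-error component of such a model is the classical Rescorla-Wagner delta rule; the paper's model goes well beyond this (adding statistical inference to capture resistance to extinction), so the sketch below is only the textbook starting point it builds on:

```python
def rescorla_wagner(lam_seq, alpha=0.3, v0=0.0):
    """Associative strength V updated by prediction error:
    V <- V + alpha * (lam - V), with lam = 1 on reinforced trials, 0 otherwise."""
    V = v0
    history = []
    for lam in lam_seq:
        V += alpha * (lam - V)  # delta rule: (lam - V) is the prediction error
        history.append(V)
    return history

acq = rescorla_wagner([1.0] * 30)              # acquisition trials
ext = rescorla_wagner([0.0] * 30, v0=acq[-1])  # extinction trials
```

    In this baseline, extinction always drives V back toward zero; the phenomena the paper targets are precisely the cases where real emotional responses fail to extinguish like this.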

  9. Category-theoretic models of algebraic computer systems

    Science.gov (United States)

    Kovalyov, S. P.

    2016-01-01

    A computer system is said to be algebraic if it contains nodes that implement unconventional computation paradigms based on universal algebra. A category-based approach to modeling such systems that provides a theoretical basis for mapping tasks to these systems' architecture is proposed. The construction of algebraic models of general-purpose computations involving conditional statements and overflow control is formally described by a reflector in an appropriate category of algebras. It is proved that this reflector takes the modulo ring whose operations are implemented in the conventional arithmetic processors to the Łukasiewicz logic matrix. Enrichments of the set of ring operations that form bases in the Łukasiewicz logic matrix are found.

  10. THE EUROPEAN MODEL OF STATE REGULATION OF TOURISM ACTIVITIES

    Directory of Open Access Journals (Sweden)

    О. Davydova

    2013-11-01

    Full Text Available The article considers existing models of state regulation of tourism development and substantiates the expediency of applying the European model of state regulation of tourism development in Ukraine. It is noted that the European model of state regulation of tourism activities is based on the coordination of marketing activities and the development of cooperation between the public and private sectors. The basic forms of public-private partnership are described, namely contracts, production sharing agreements, leases and joint ventures, along with the advantages of using a cluster model of tourism development. The transport sector, housing and utilities, energy and the tourism sector are identified as promising areas for applying PPP. The features of cluster formations in the country and the prospects for tourism clusters are considered.

  11. Cosmic logic: a computational model

    International Nuclear Information System (INIS)

    Vanchurin, Vitaly

    2016-01-01

    We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take a CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take a CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal, which halt in finite time, and immortal, which run forever. In the context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps

  12. Computational modelling of the impact of AIDS on business.

    Science.gov (United States)

    Matthews, Alan P

    2007-07-01

    An overview is given of computational modelling of the impact of AIDS on business in South Africa, with a detailed description of the AIDS Projection Model (APM) for companies, developed by the author, and suggestions for further work. Computational modelling of the impact of AIDS on business in South Africa requires modelling of the epidemic as a whole, and of its impact on a company. This paper gives an overview of epidemiological modelling, with an introduction to the Actuarial Society of South Africa (ASSA) model, the most widely used such model for South Africa. The APM produces projections of HIV prevalence, new infections, and AIDS mortality for a company, based on the anonymous HIV testing of company employees, and projections from the ASSA model. A smoothed statistical model of the prevalence test data is computed, and then the ASSA model projection for each category of employees is adjusted so that it matches the measured prevalence in the year of testing. FURTHER WORK: Further techniques that could be developed are microsimulation (representing individuals in the computer), scenario planning for testing strategies, and models for the business environment, such as models of entire sectors, and mapping of HIV prevalence in time and space, based on workplace and community data.

  13. Computational neurorehabilitation: modeling plasticity and learning to predict recovery.

    Science.gov (United States)

    Reinkensmeyer, David J; Burdet, Etienne; Casadio, Maura; Krakauer, John W; Kwakkel, Gert; Lang, Catherine E; Swinnen, Stephan P; Ward, Nick S; Schweighofer, Nicolas

    2016-04-30

    Despite progress in using computational approaches to inform medicine and neuroscience in the last 30 years, there have been few attempts to model the mechanisms underlying sensorimotor rehabilitation. We argue that a fundamental understanding of neurologic recovery, and as a result accurate predictions at the individual level, will be facilitated by developing computational models of the salient neural processes, including plasticity and learning systems of the brain, and integrating them into a context specific to rehabilitation. Here, we therefore discuss Computational Neurorehabilitation, a newly emerging field aimed at modeling plasticity and motor learning to understand and improve movement recovery of individuals with neurologic impairment. We first explain how the emergence of robotics and wearable sensors for rehabilitation is providing data that make development and testing of such models increasingly feasible. We then review key aspects of plasticity and motor learning that such models will incorporate. We proceed by discussing how computational neurorehabilitation models relate to the current benchmark in rehabilitation modeling - regression-based, prognostic modeling. We then critically discuss the first computational neurorehabilitation models, which have primarily focused on modeling rehabilitation of the upper extremity after stroke, and show how even simple models have produced novel ideas for future investigation. Finally, we conclude with key directions for future research, anticipating that soon we will see the emergence of mechanistic models of motor recovery that are informed by clinical imaging results and driven by the actual movement content of rehabilitation therapy as well as wearable sensor-based records of daily activity.

  14. Computational modeling and engineering in pediatric and congenital heart disease.

    Science.gov (United States)

    Marsden, Alison L; Feinstein, Jeffrey A

    2015-10-01

    Recent methodological advances in computational simulations are enabling increasingly realistic simulations of hemodynamics and physiology, driving increased clinical utility. We review recent developments in the use of computational simulations in pediatric and congenital heart disease, describe the clinical impact of modeling in single-ventricle patients, and provide an overview of emerging areas. Multiscale modeling combining patient-specific hemodynamics with reduced order (i.e., mathematically and computationally simplified) circulatory models has become the de facto standard for modeling local hemodynamics and 'global' circulatory physiology. We review recent advances that have enabled faster solutions, discuss new methods (e.g., fluid structure interaction and uncertainty quantification), which lend realism both computationally and clinically to results, highlight novel computationally derived surgical methods for single-ventricle patients, and discuss areas in which modeling has begun to exert its influence including Kawasaki disease, fetal circulation, tetralogy of Fallot (and pulmonary tree), and circulatory support. Computational modeling is emerging as a crucial tool for clinical decision-making and evaluation of novel surgical methods and interventions in pediatric cardiology and beyond. Continued development of modeling methods, with an eye towards clinical needs, will enable clinical adoption in a wide range of pediatric and congenital heart diseases.

  15. Climate models on massively parallel computers

    International Nuclear Information System (INIS)

    Vitart, F.; Rouvillois, P.

    1993-01-01

    Initial results obtained on massively parallel computers (Multiple Instruction Multiple Data and Single Instruction Multiple Data) make it possible to consider building coupled models with high resolutions. This would make possible the simulation of thermohaline circulation and other interaction phenomena between atmosphere and ocean. The increase in computer power, and the resulting improvement in resolution, will lead us to revise our approximations. The hydrostatic approximation (in ocean circulation) will no longer be valid when the grid mesh is of a dimension lower than a few kilometers: we shall have to find other models. The expertise gained in numerical analysis at the Center of Limeil-Valenton (CEL-V) will be used again to design global models taking into account atmosphere, ocean, ice floe and biosphere, allowing climate simulation down to a regional scale

  16. Rough – Granular Computing knowledge discovery models

    Directory of Open Access Journals (Sweden)

    Mohammed M. Eissa

    2016-11-01

    Full Text Available The medical domain has become one of the most important areas of research owing to the huge amounts of medical information about the symptoms of diseases and how to distinguish between them to diagnose correctly. Knowledge discovery models play a vital role in the refinement and mining of medical indicators to help medical experts make treatment decisions. This paper introduces four hybrid Rough – Granular Computing knowledge discovery models based on Rough Set Theory, Artificial Neural Networks, Genetic Algorithms and Rough Mereology Theory. A comparative analysis of the various knowledge discovery models, which use different techniques for data pre-processing, reduction and data mining, supports medical experts in extracting the main medical indicators, reducing misdiagnosis rates and improving decision-making for medical diagnosis and treatment. The proposed models utilize two medical datasets: a Coronary Heart Disease dataset and a Hepatitis C Virus dataset. The main purpose of this paper is to explore and evaluate the proposed models, based on the Granular Computing methodology, for knowledge extraction according to different evaluation criteria for the classification of medical datasets. Another purpose is to enhance the frame of KDD processes for supervised learning using the Granular Computing methodology.

  17. Modeling the role of negative cooperativity in metabolic regulation and homeostasis.

    Directory of Open Access Journals (Sweden)

    Eliot C Bush

    Full Text Available A significant proportion of enzymes display cooperativity in binding ligand molecules, and such effects have an important impact on metabolic regulation. This is easiest to understand in the case of positive cooperativity. Sharp responses to changes in metabolite concentrations can allow organisms to better respond to environmental changes and maintain metabolic homeostasis. However, despite the fact that negative cooperativity is almost as common as positive, it has been harder to imagine what advantages it provides. Here we use computational models to explore the utility of negative cooperativity in one particular context: that of an inhibitor binding to an enzyme. We identify several factors which may contribute, and show that acting together they can make negative cooperativity advantageous.
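    The qualitative effect explored here can be illustrated with the standard Hill equation: a Hill coefficient n < 1 (negative cooperativity) flattens the binding curve, so inhibition responds over a much wider range of inhibitor concentration than with n > 1. A minimal sketch with illustrative numbers, not the paper's model:

```python
def hill_fraction_bound(L, K, n):
    """Fraction of enzyme with ligand bound (Hill model):
    theta = L**n / (K**n + L**n)."""
    return L**n / (K**n + L**n)

# Response to a 10-fold rise in inhibitor concentration around K = 1:
for n, label in [(2.0, "positive"), (1.0, "none"), (0.5, "negative")]:
    low = hill_fraction_bound(0.3, 1.0, n)
    high = hill_fraction_bound(3.0, 1.0, n)
    print(f"{label:8s} n={n}: {low:.2f} -> {high:.2f}")
# Positive cooperativity gives a switch-like jump (0.08 -> 0.90);
# negative cooperativity gives a gentle slope (0.35 -> 0.63).
```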

  18. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
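    Systems such as GRESS and ADGEN instrument a code so that derivatives are carried along with the calculation itself. The principle of forward-mode computer calculus can be sketched with a minimal dual-number class (an illustration of the idea only, not the GRESS/ADGEN implementation):

```python
class Dual:
    """Minimal forward-mode AD value: carries f and df/dp together."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule propagated automatically
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def model(k, t):
    # toy response standing in for a large computer model: y = k*t + k*k
    return k * t + k * k

k = Dual(2.0, 1.0)   # seed derivative w.r.t. parameter k
y = model(k, 3.0)
print(y.val, y.der)  # y = 10.0, dy/dk = t + 2k = 7.0
```

    The derivative emerges from the same single model evaluation, which is the "analytic, no prohibitive manpower investment" property the paper emphasizes.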

  19. On turbulence models for rod bundle flow computations

    International Nuclear Information System (INIS)

    Hazi, Gabor

    2005-01-01

    In commercial computational fluid dynamics codes there is more than one turbulence model built in, and it is the user's responsibility to choose a model suitable for the problem studied. In the last decade, several computations were presented using computational fluid dynamics for the simulation of various problems of the nuclear industry. A common feature of a number of those simulations is that they were performed using the standard k-ε turbulence model without justifying the choice of the model, and the results were rarely satisfactory. In this paper, we consider the flow in a fuel rod bundle as a case study and discuss why the application of the standard k-ε model fails to give reasonable results in this situation. We also show that a turbulence model based on the Reynolds stress transport equations can provide qualitatively correct results. Our aim is chiefly pedagogical: we would like to call the reader's attention to the fact that turbulence models have to be selected based on theoretical considerations and/or adequate information obtained from measurements

  20. Assessment of weld thickness loss in offshore pipelines using computed radiography and computational modeling

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Oliveira, D.F.; Silva, A.X.; Lopes, R.T.; Marinho, C.; Camerini, C.S.

    2009-01-01

    In order to guarantee the structural integrity of oil plants it is crucial to monitor the amount of weld thickness loss in offshore pipelines. In spite of its relevance, however, this parameter is very difficult to determine, due to both the large diameter of most pipes and the complexity of the multi-variable system involved. In this study, computational modeling based on the Monte Carlo MCNPX code is combined with computed radiography to estimate weld thickness loss in large-diameter offshore pipelines. Results show that computational modeling is a powerful tool for estimating the intensity variations that weld thickness variations produce in radiographic images, and that it can be combined with computed radiography to assess weld thickness loss in offshore and subsea pipelines.
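    The underlying physics is that transmitted intensity depends exponentially on the material thickness traversed, so a local brightening of the radiograph maps to a thickness loss. A minimal Beer-Lambert sketch (the attenuation coefficient is illustrative, and this is far simpler than the paper's MCNPX model, which also accounts for scatter and geometry):

```python
import math

def thickness_loss(I_ref, I_meas, mu):
    """Infer weld thickness change from transmitted X-ray intensity.

    Beer-Lambert: I = I0 * exp(-mu * t), hence
    delta_t = ln(I_meas / I_ref) / mu  (positive = material lost).
    """
    return math.log(I_meas / I_ref) / mu

mu_steel = 0.055  # 1/mm, illustrative linear attenuation coefficient
I_ref = 1000.0                             # intensity through nominal weld
I_meas = I_ref * math.exp(0.055 * 2.0)     # brighter: 2 mm of steel missing
print(round(thickness_loss(I_ref, I_meas, mu_steel), 3))  # 2.0 (mm)
```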

  1. Modelling and Analysis of a New Piezoelectric Dynamic Balance Regulator

    Directory of Open Access Journals (Sweden)

    Mu-Xun Xu

    2012-11-01

    Full Text Available In this paper, a new piezoelectric dynamic balance regulator, which can be used in motorised spindle systems, is presented. The dynamic balancing adjustment mechanism is driven by an in-plane bending vibration from an annular piezoelectric stator excited by a high-frequency sinusoidal input voltage. This device has a different construction, characteristics and operating principles from those of a conventional balance regulator. In this work, a dynamic model of the regulator is first developed using a detailed analytical method. Thereafter, MATLAB is employed to numerically simulate the relations between the dominant parameters and the characteristics of the regulator based on the dynamic model. Finally, experimental measurements are used to verify the validity of the dynamic model. Consequently, the mathematical model presented and analysed in this paper can be used as a tool for optimising the design of a piezoelectric dynamic balance regulator during steady state operation.

  2. Introduction to computation and modeling for differential equations

    CERN Document Server

    Edsberg, Lennart

    2008-01-01

    An introduction to scientific computing for differential equationsIntroduction to Computation and Modeling for Differential Equations provides a unified and integrated view of numerical analysis, mathematical modeling in applications, and programming to solve differential equations, which is essential in problem-solving across many disciplines, such as engineering, physics, and economics. This book successfully introduces readers to the subject through a unique "Five-M" approach: Modeling, Mathematics, Methods, MATLAB, and Multiphysics. This approach facilitates a thorough understanding of h

  3. A novel patient-specific model to compute coronary fractional flow reserve.

    Science.gov (United States)

    Kwon, Soon-Sung; Chung, Eui-Chul; Park, Jin-Seo; Kim, Gook-Tae; Kim, Jun-Woo; Kim, Keun-Hong; Shin, Eun-Seok; Shim, Eun Bo

    2014-09-01

    The fractional flow reserve (FFR) is a widely used clinical index to evaluate the functional severity of coronary stenosis. A computer simulation method based on patients' computed tomography (CT) data is a plausible non-invasive approach for computing the FFR. This method can provide a detailed solution for the stenosed coronary hemodynamics by coupling computational fluid dynamics (CFD) with the lumped parameter model (LPM) of the cardiovascular system. In this work, we have implemented a simple computational method to compute the FFR. As this method uses only coronary arteries for the CFD model and includes only the LPM of the coronary vascular system, it provides simpler boundary conditions for the coronary geometry and is computationally more efficient than existing approaches. To test the efficacy of this method, we simulated a three-dimensional straight vessel using CFD coupled with the LPM. The computed results were compared with those of the LPM. To validate this method in terms of clinically realistic geometry, a patient-specific model of stenosed coronary arteries was constructed from CT images, and the computed FFR was compared with clinically measured results. We evaluated the effect of a model aorta on the computed FFR and compared this with a model without the aorta. Computationally, the model without the aorta was more efficient than that with the aorta, reducing the CPU time required for computing a cardiac cycle to 43.4%. Copyright © 2014. Published by Elsevier Ltd.
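    The FFR itself is the ratio of mean distal coronary pressure to mean aortic pressure under hyperemia. Its dependence on stenosis severity can be sketched with a two-resistor lumped model, a toy stand-in for the paper's CFD-LPM coupling; the resistance values are illustrative:

```python
def ffr_lumped(P_aortic, R_stenosis, R_micro, P_venous=0.0):
    """FFR from a two-resistor lumped model of a stenosed coronary branch.

    Flow Q = (Pa - Pv) / (Rs + Rm); distal pressure Pd = Pa - Q * Rs;
    FFR = Pd / Pa (venous pressure ~ 0, hyperemia treated as a fixed Rm).
    """
    Q = (P_aortic - P_venous) / (R_stenosis + R_micro)
    P_distal = P_aortic - Q * R_stenosis
    return P_distal / P_aortic

print(round(ffr_lumped(100.0, R_stenosis=0.0, R_micro=1.0), 2))   # 1.0: no stenosis
print(round(ffr_lumped(100.0, R_stenosis=0.25, R_micro=1.0), 2))  # 0.8: borderline
```

    A lumped value of 0.8 is near the commonly cited clinical cutoff for functionally significant stenosis; the CFD part of the paper's method exists precisely because real stenosis resistance is flow-dependent rather than a fixed constant as assumed here.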

  4. Toward a computational model of hemostasis

    Science.gov (United States)

    Leiderman, Karin; Danes, Nicholas; Schoeman, Rogier; Neeves, Keith

    2017-11-01

    Hemostasis is the process by which a blood clot forms to prevent bleeding at a site of injury. The formation time, size and structure of a clot depends on the local hemodynamics and the nature of the injury. Our group has previously developed computational models to study intravascular clot formation, a process confined to the interior of a single vessel. Here we present the first stage of an experimentally-validated, computational model of extravascular clot formation (hemostasis) in which blood through a single vessel initially escapes through a hole in the vessel wall and out a separate injury channel. This stage of the model consists of a system of partial differential equations that describe platelet aggregation and hemodynamics, solved via the finite element method. We also present results from the analogous, in vitro, microfluidic model. In both models, formation of a blood clot occludes the injury channel and stops flow from escaping while blood in the main vessel retains its fluidity. We discuss the different biochemical and hemodynamic effects on clot formation using distinct geometries representing intra- and extravascular injuries.

  5. Computer-Aided Modeling of Lipid Processing Technology

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel

    2011-01-01

    increase along with growing interest in biofuels, the oleochemical industry faces in the upcoming years major challenges in terms of design and development of better products and more sustainable processes to make them. Computer-aided methods and tools for process synthesis, modeling and simulation...... are widely used for design, analysis, and optimization of processes in the chemical and petrochemical industries. These computer-aided tools have helped the chemical industry to evolve beyond commodities toward specialty chemicals and ‘consumer oriented chemicals based products’. Unfortunately...... to develop systematic computer-aided methods (property models) and tools (database) related to the prediction of the necessary physical properties suitable for design and analysis of processes employing lipid technologies. The methods and tools include: the development of a lipid-database (CAPEC...

  6. Profiles of Motivated Self-Regulation in College Computer Science Courses: Differences in Major versus Required Non-Major Courses

    Science.gov (United States)

    Shell, Duane F.; Soh, Leen-Kiat

    2013-12-01

    The goal of the present study was to utilize a profiling approach to understand differences in motivation and strategic self-regulation among post-secondary STEM students in major versus required non-major computer science courses. Participants were 233 students from required introductory computer science courses (194 men; 35 women; 4 unknown) at a large Midwestern state university. Cluster analysis identified five profiles: (1) a strategic profile of a highly motivated by-any-means good strategy user; (2) a knowledge-building profile of an intrinsically motivated autonomous, mastery-oriented student; (3) a surface learning profile of a utility motivated minimally engaged student; (4) an apathetic profile of an amotivational disengaged student; and (5) a learned helpless profile of a motivated but unable to effectively self-regulate student. Among CS majors and students in courses in their major field, the strategic and knowledge-building profiles were the most prevalent. Among non-CS majors and students in required non-major courses, the learned helpless, surface learning, and apathetic profiles were the most prevalent. Students in the strategic and knowledge-building profiles had significantly higher retention of computational thinking knowledge than students in other profiles. Students in the apathetic and surface learning profiles saw little instrumentality of the course for their future academic and career objectives. Findings show that students in STEM fields taking required computer science courses exhibit the same constellation of motivated strategic self-regulation profiles found in other post-secondary and K-12 settings.

  7. Computational model for dosimetric purposes in dental procedures

    International Nuclear Information System (INIS)

    Kawamoto, Renato H.; Campos, Tarcisio R.

    2013-01-01

    This study aims to develop a computational model of the oral region for dosimetric purposes, based on the computational tools SISCODES and MCNP-5, to predict deterministic effects and minimize stochastic effects caused by ionizing radiation in radiodiagnosis. Based on a set of digital information provided by computed tomography, a three-dimensional voxel model was created with its tissues represented, and the model was exported to the MCNP code. In association with SISCODES, the Monte Carlo N-Particle Transport Code (MCNP-5) was used to reproduce, as a statistical process, the corresponding interactions of nuclear particles with human tissues. The study will serve as a source of data for dosimetric studies in the oral region, helping to predict deterministic effects and to minimize the stochastic effects of ionizing radiation

  8. Cholinergic modulation of cognitive processing: insights drawn from computational models

    Directory of Open Access Journals (Sweden)

    Ehren L Newman

    2012-06-01

    Full Text Available Acetylcholine plays an important role in cognitive function, as shown by pharmacological manipulations that impact working memory, attention, episodic memory and spatial memory function. Acetylcholine also shows striking modulatory influences on the cellular physiology of hippocampal and cortical neurons. Modeling of neural circuits provides a framework for understanding how the cognitive functions may arise from the influence of acetylcholine on neural and network dynamics. We review the influences of cholinergic manipulations on behavioral performance in working memory, attention, episodic memory and spatial memory tasks, the physiological effects of acetylcholine on neural and circuit dynamics, and the computational models that provide insight into the functional relationships between the physiology and behavior. Specifically, we discuss the important role of acetylcholine in governing mechanisms of active maintenance in working memory tasks and in regulating network dynamics important for effective processing of stimuli in attention and episodic memory tasks. We also propose that theta rhythms play a crucial role as an intermediary between the physiological influences of acetylcholine and behavior in episodic and spatial memory tasks. We conclude with a synthesis of the existing modeling work and highlight future directions that are likely to be rewarding given the current state of the literature, for both empiricists and modelers.

  9. Tip cell overtaking occurs as a side effect of sprouting in computational models of angiogenesis.

    Science.gov (United States)

    Boas, Sonja E M; Merks, Roeland M H

    2015-11-21

    During angiogenesis, the formation of new blood vessels from existing ones, endothelial cells differentiate into tip and stalk cells, after which one tip cell leads the sprout. More recently, this picture has changed. It has become clear that endothelial cells compete for the tip position during angiogenesis: a phenomenon named tip cell overtaking. The biological function of tip cell overtaking is not yet known. From experimental observations, it is unclear to what extent tip cell overtaking is a side effect of sprouting or to what extent it is regulated through a VEGF-Dll4-Notch signaling network and thus might have a biological function. To address this question, we studied tip cell overtaking in computational models of angiogenic sprouting in absence and in presence of VEGF-Dll4-Notch signaling. We looked for tip cell overtaking in two existing Cellular Potts models of angiogenesis. In these simulation models angiogenic sprouting-like behavior emerges from a small set of plausible cell behaviors. In the first model, cells aggregate through contact-inhibited chemotaxis. In the second model the endothelial cells assume an elongated shape and aggregate through (non-inhibited) chemotaxis. In both these sprouting models the endothelial cells spontaneously migrate forwards and backwards within sprouts, suggesting that tip cell overtaking might occur as a side effect of sprouting. In accordance with other experimental observations, in our simulations the cells' tendency to occupy the tip position can be regulated when two cell lines with different levels of Vegfr2 expression are contributing to sprouting (mosaic sprouting assay), where cell behavior is regulated by a simple VEGF-Dll4-Notch signaling network. Our modeling results suggest that tip cell overtaking can occur spontaneously due to the stochastic motion of cells during sprouting. Thus, tip cell overtaking and sprouting dynamics may be interdependent and should be studied and interpreted in combination. 

  10. Recent advances, and unresolved issues, in the application of computational modelling to the prediction of the biological effects of nanomaterials

    International Nuclear Information System (INIS)

    Winkler, David A.

    2016-01-01

    Nanomaterials research is one of the fastest growing contemporary research areas. The unprecedented properties of these materials have meant that they are being incorporated into products very quickly. Regulatory agencies are concerned they cannot assess the potential hazards of these materials adequately, as data on the biological properties of nanomaterials are still relatively limited and expensive to acquire. Computational modelling methods have much to offer in helping understand the mechanisms by which toxicity may occur, and in predicting the likelihood of adverse biological impacts of materials not yet tested experimentally. This paper reviews the progress these methods, particularly those QSAR-based, have made in understanding and predicting potentially adverse biological effects of nanomaterials, and also the limitations and pitfalls of these methods. - Highlights: • Nanomaterials regulators need good information to make good decisions. • Nanomaterials and their interactions with biology are very complex. • Computational methods use existing data to predict properties of new nanomaterials. • Statistical, data driven modelling methods have been successfully applied to this task. • Much more must be learnt before robust toolkits will be widely usable by regulators.

  11. On the Bayesian calibration of computer model mixtures through experimental data, and the design of predictive models

    Science.gov (United States)

    Karagiannis, Georgios; Lin, Guang

    2017-08-01

    For many real systems, several computer models may exist with different physics and predictive abilities. To achieve more accurate simulations/predictions, it is desirable for these models to be properly combined and calibrated. We propose the Bayesian calibration of computer model mixture method, which relies on the idea of representing the real system output as a mixture of the available computer model outputs with unknown input-dependent weight functions. The method builds a fully Bayesian predictive model as an emulator for the real system output by combining, weighting, and calibrating the available models in the Bayesian framework. Moreover, it fits a mixture of calibrated computer models that can be used by the domain scientist as a means to combine the available computer models, in a flexible and principled manner, and perform reliable simulations. It can address realistic cases where one model may be more accurate than the others at different input values, because the mixture weights, indicating the contribution of each model, are functions of the input. Inference on the calibration parameters can consider multiple computer models associated with different physics. The method does not require knowledge of the fidelity order of the models. We provide a technique able to mitigate the computational overhead due to the consideration of multiple computer models that is suitable to the mixture model framework. We implement the proposed method in a real-world application involving the Weather Research and Forecasting large-scale climate model.
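    The central construction, representing the real system output as an input-dependent weighted mixture of model outputs, y(x) = Σᵢ wᵢ(x) fᵢ(x), can be sketched as follows. Here simple softmax weight functions stand in for the calibrated Bayesian weight posteriors of the paper, and the two "models" are invented toys:

```python
import math

def softmax2(a, b):
    """Normalize two scores into mixture weights that sum to 1."""
    ea, eb = math.exp(a), math.exp(b)
    return ea / (ea + eb), eb / (ea + eb)

def mixture_prediction(x, models, weight_fns):
    """y(x) = sum_i w_i(x) * f_i(x) with input-dependent weights."""
    w = softmax2(weight_fns[0](x), weight_fns[1](x))
    return sum(wi * f(x) for wi, f in zip(w, models))

# Toy setting: model A is accurate near x = 0, model B far from it.
f_a = lambda x: x          # "model A"
f_b = lambda x: x + 0.5    # "model B"
wfn = [lambda x: -x, lambda x: x]  # weight shifts from A to B as x grows

print(round(mixture_prediction(0.0, [f_a, f_b], wfn), 3))  # 0.25 (even blend)
print(round(mixture_prediction(3.0, [f_a, f_b], wfn), 3))  # ~3.499 (mostly B)
```

    Because the weights vary with x, the mixture can track whichever model is locally more accurate, which is the key advantage over a single global weighting.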

  12. Transnational models for regulation of nanotechnology.

    Science.gov (United States)

    Marchant, Gary E; Sylvester, Douglas J

    2006-01-01

    Like all technologies, nanotechnology will inevitably present risks, whether they result from unintentional effects of otherwise beneficial applications, or from the malevolent misuse of technology. Increasingly, risks from new and emerging technologies are being regulated at the international level, although governments and private experts are only beginning to consider the appropriate international responses to nanotechnology. In this paper, we explore both the potential risks posed by nanotechnology and potential regulatory frameworks that law may impose. In so doing, we also explore the various rationales for international regulation including the potential for cross-boundary harms, sharing of regulatory expertise and resources, controlling protectionism and trade conflicts, avoiding a "race to the bottom" in which governments seek economic advantage through lax regulation, and limiting the "nano divide" between North and South. Finally, we examine some models for international regulation and offer tentative thoughts on the prospects for each.

  13. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state of the art and new trends in image analysis and biomechanics. It covers a wide range of scientific and cultural topics, from remodeling of bone tissue under mechanical stimulus to optimizing the performance of sports equipment, through patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image-based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease.   The book will be of interest to researchers, Ph.D. students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, and experimental analysis.

  14. Getting computer models to communicate

    International Nuclear Information System (INIS)

    Caremoli, Ch.; Erhard, P.

    1999-01-01

    Today's computers have the processing power to deliver detailed and global simulations of complex industrial processes such as the operation of a nuclear reactor core. So should we be producing new, global numerical models to take full advantage of this new-found power? If so, it would be a long-term job. There is, however, another solution; to couple the existing validated numerical models together so that they work as one. (authors)

  15. Cloud Computing Adoption Business Model Factors: Does Enterprise Size Matter?

    OpenAIRE

    Bogataj Habjan, Kristina; Pucihar, Andreja

    2017-01-01

    This paper presents the results of research investigating the impact of business model factors on cloud computing adoption. The introduced research model consists of 40 cloud computing business model factors, grouped into eight factor groups. Their impact and importance for cloud computing adoption were investigated among enterprises in Slovenia. Furthermore, differences in opinion according to enterprise size were investigated. Research results show no statistically significant impacts of in...

  16. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented...
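    A minimal version of the symmetric hopping model, a single carrier hopping over quenched random barriers with rate proportional to exp(-βE), can be sketched as a toy Monte Carlo, far smaller than the large-scale simulations reviewed in the paper:

```python
import math
import random

def simulate_hopping(n_sites, n_steps, beta, seed=1):
    """Single carrier on a 1-D ring with quenched random barriers.

    All site energies are equal (symmetric hopping model); barrier
    heights are drawn uniformly on [0, 1) and a hop over barrier E is
    accepted with probability exp(-beta * E).
    Returns (accepted_hops, net_displacement)."""
    rng = random.Random(seed)
    barriers = [rng.random() for _ in range(n_sites)]  # barrier right of site i
    pos, accepted = 0, 0
    for _ in range(n_steps):
        step = rng.choice((-1, 1))
        barrier = barriers[(pos if step == 1 else pos - 1) % n_sites]
        if rng.random() < math.exp(-beta * barrier):
            pos += step
            accepted += 1
    return accepted, pos

free, _ = simulate_hopping(100, 1000, beta=0.0)   # no barriers: all hops accepted
cold, _ = simulate_hopping(100, 1000, beta=10.0)  # strong disorder: rare hops
print(free, cold)  # accepted hops (a proxy for conduction) drop as beta grows
```

    Averaging such trajectories over many disorder realizations and frequencies is what yields the model's characteristic ac conductivity behavior.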

  17. Computational Design Modelling : Proceedings of the Design Modelling Symposium

    CERN Document Server

    Kilian, Axel; Palz, Norbert; Scheurer, Fabian

    2012-01-01

    This book publishes the peer-reviewed proceedings of the third Design Modelling Symposium Berlin. The conference constitutes a platform for dialogue on experimental practice and research within the field of computationally informed architectural design. More than 60 leading experts examine the computational processes within this field with the aim of developing a broader and less exotic building practice, one that bears more subtle but powerful traces of the complex tool set and approaches developed and studied over recent years. The outcomes are new strategies for a reasonable and innovative implementation of digital potential in truly innovative and radical design, guided by responsibility both towards processes and towards the consequences they initiate.

  18. Description of mathematical models and computer programs

    International Nuclear Information System (INIS)

    1977-01-01

    The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed, describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives

  19. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  20. Computational challenges in modeling gene regulatory events.

    Science.gov (United States)

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  1. Shadow Replication: An Energy-Aware, Fault-Tolerant Computational Model for Green Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xiaolong Cui

    2014-08-01

    Full Text Available As the demand for cloud computing continues to increase, cloud service providers face the daunting challenge of meeting the negotiated SLA, in terms of reliability and timely performance, while achieving cost-effectiveness. This challenge is compounded by the growing likelihood of failure in large-scale clouds and the rising impact of energy consumption and CO2 emissions on the environment. This paper proposes Shadow Replication, a novel fault-tolerance model for cloud computing, which seamlessly addresses failure at scale while minimizing energy consumption and reducing its impact on the environment. The basic tenet of the model is to associate a suite of shadow processes that execute concurrently with the main process, but initially at a much reduced execution speed, to overcome failures as they occur. Two computationally feasible schemes are proposed to achieve Shadow Replication. A performance evaluation framework is developed to analyze these schemes and compare their performance to traditional replication-based fault-tolerance methods, focusing on the inherent tradeoff between fault tolerance, the specified SLA and profit maximization. The results show that Shadow Replication leads to significant energy reduction and is better suited to compute-intensive execution models, where up to 30% more profit can be achieved due to reduced energy consumption.
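    The energy argument can be illustrated with a toy calculation. Under a dynamic-power assumption (power scaling roughly as speed cubed, a common model but an assumption here), a slow shadow costs very little energy while it merely keeps pace, and speeds up only if the main process fails. The numbers and the power model are illustrative, not the paper's analytical framework:

```python
def energy(speed, duration, alpha=3.0):
    """Dynamic-power model: power ~ speed**alpha (assumed, alpha = 3)."""
    return (speed ** alpha) * duration

def traditional_replication(work):
    """Two identical replicas, both at full speed 1.0."""
    t = work / 1.0
    return 2 * energy(1.0, t)

def shadow_replication(work, shadow_speed, failed=False):
    """Main at full speed plus one shadow at reduced speed."""
    t_main = work / 1.0
    base = energy(1.0, t_main) + energy(shadow_speed, t_main)
    if not failed:
        return base
    # On failure the shadow finishes the remaining work at full speed.
    work_done_by_shadow = shadow_speed * t_main
    t_extra = (work - work_done_by_shadow) / 1.0
    return base + energy(1.0, t_extra)

w = 100.0
print(traditional_replication(w))               # 200.0
print(shadow_replication(w, 0.3))               # 102.7: failure-free case
print(shadow_replication(w, 0.3, failed=True))  # ~172.7: still below 200
```

    Because failures are rare, the failure-free column dominates the expected cost, which is where the claimed energy (and hence profit) advantage comes from.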

  2. Deep Learning and Applications in Computational Biology

    KAUST Repository

    Zeng, Jianyang

    2016-01-01

    -transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three

  3. Computational Modeling of Fluctuations in Energy and Metabolic Pathways of Methanogenic Archaea

    Energy Technology Data Exchange (ETDEWEB)

    Luthey-Schulten, Zaida [Univ. of Illinois, Urbana-Champaign, IL (United States). Dept. of Chemistry; Carl R. Woese Inst. for Genomic Biology

    2017-01-04

    The methanogenic archaea, anaerobic microbes that convert CO2 and H2 and/or other small organic fermentation products into methane, play an unusually large role in the global carbon cycle. As they perform the final step in the anaerobic breakdown of biomass, methanogens are a biogenic source of an estimated one billion tons of methane each year. Depending on the location, the methane produced can be considered either a greenhouse gas (agricultural byproduct), sequestered carbon storage (methane hydrate deposits), or a potential energy source (organic wastewater treatment). These microbes therefore represent an important target for biotechnology applications. Computational models of methanogens with predictive power are useful aids in the adaptation of methanogenic systems, but need to connect processes of wide-ranging time and length scales. In this project, we developed several computational methodologies for modeling the dynamic behavior of entire cells that connect the stochastic reaction-diffusion dynamics of individual biochemical pathways with genome-scale modeling of metabolic networks. While each of these techniques was a well-defined computational method on its own, here we integrated them to develop several entirely new approaches to systems biology. The first scientific aim of the project was to model how noise in a biochemical pathway propagates into cellular phenotypes. Genetic circuits have been optimized by evolution to regulate molecular processes despite stochastic noise, but the effect of such noise on cellular biochemical networks is currently unknown. An integrated stochastic/systems model of Escherichia coli was created to analyze how noise in protein expression, and therefore noise in metabolic fluxes, gives rise to multiple cellular phenotypes in an isogenic population. After the initial work developing and validating methods that allow characterization of the heterogeneity in the model organism E. coli, the project shifted toward
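    The stochastic pathway dynamics referred to here are typically sampled with Gillespie's stochastic simulation algorithm. A minimal birth-death protein model shows how isogenic "cells" (identical rate constants, different random seeds) scatter around the deterministic mean; this is a generic sketch, not the project's E. coli or methanogen models:

```python
import random

def gillespie_expression(k_make, k_decay, t_end, seed=0):
    """Gillespie SSA for a birth-death protein model:
    0 -> P at rate k_make;  P -> 0 at rate k_decay * P.
    Returns the protein copy number at t_end (one stochastic sample)."""
    rng = random.Random(seed)
    t, p = 0.0, 0
    while True:
        a_make, a_decay = k_make, k_decay * p
        a_total = a_make + a_decay
        t += rng.expovariate(a_total)        # time to the next reaction
        if t > t_end:
            return p
        if rng.random() * a_total < a_make:  # pick which reaction fired
            p += 1
        else:
            p -= 1

# Isogenic "cells" (different seeds) end up with different copy numbers,
# scattered around the deterministic mean k_make / k_decay = 100:
samples = [gillespie_expression(10.0, 0.1, 50.0, seed=s) for s in range(5)]
print(samples)
```

    Coupling such stochastic samples of enzyme levels to a genome-scale flux model is, in outline, how expression noise is propagated into metabolic phenotypes.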

  4. Efficiency using computer simulation of Reverse Threshold Model Theory on assessing a “One Laptop Per Child” computer versus desktop computer

    Directory of Open Access Journals (Sweden)

    Supat Faarungsang

    2017-04-01

    Full Text Available The Reverse Threshold Model Theory (RTMT) model was introduced based on limiting factor concepts, but its efficiency compared to the Conventional Model (CM) has not been published. This investigation assessed the efficiency of RTMT compared to CM using computer simulation on the “One Laptop Per Child” computer and a desktop computer. Based on probability values, it was found that RTMT was more efficient than CM among eight treatment combinations and an earlier study verified that RTMT gives complete elimination of random error. Furthermore, RTMT has several advantages over CM and is therefore proposed to be applied to most research data.

  5. A General Water Resources Regulation Software System in China

    Science.gov (United States)

    LEI, X.

    2017-12-01

    To avoid iterative development of core modules in normal and emergency water resources regulation, and to improve the maintainability and upgradability of regulation models and business logic, a general water resources regulation software framework was developed based on the collection and analysis of common demands for water resources regulation and emergency management. It provides a customizable and extensible software framework, open to secondary development, for the three-level platform "MWR-Basin-Province". Meanwhile, this general software system realizes business collaboration and information sharing of water resources regulation schemes among the three-level platforms, so as to improve national water resources regulation decision-making. The general software system involves four main modules: 1) a complete set of general water resources regulation modules that allows secondary developers to custom-develop water resources regulation decision-making systems; 2) a complete set of model-base and model-computing software released in the form of cloud services; 3) a complete set of tools to build the concept map and model system of basin water resources regulation, as well as a model management system to calibrate and configure model parameters; 4) a database that satisfies the business functions and functional requirements of the general software, providing technical support for building basin or regional water resources regulation models.

  6. The Next Generation ARC Middleware and ATLAS Computing Model

    CERN Document Server

    Filipcic, A; The ATLAS collaboration; Smirnova, O; Konstantinov, A; Karpenko, D

    2012-01-01

    The distributed NDGF Tier-1 and associated Nordugrid clusters are well integrated into the ATLAS computing model but follow a slightly different paradigm than other ATLAS resources. The current strategy does not divide the sites as in the commonly used hierarchical model, but rather treats them as a single storage endpoint and a pool of distributed computing nodes. The next generation ARC middleware with its several new technologies provides new possibilities in development of the ATLAS computing model, such as pilot jobs with pre-cached input files, automatic job migration between the sites, integration of remote sites without connected storage elements, and automatic brokering for jobs with non-standard resource requirements. ARC's data transfer model provides an automatic way for the computing sites to participate in ATLAS' global task management system without requiring centralised brokering or data transfer services. The powerful API combined with Python and Java bindings can easily be used to build new ...

  7. Regulation of electricity distribution: Issues for implementing a norm model

    International Nuclear Information System (INIS)

    Bjoerndal, Endre; Bjoerndal, Mette; Bjoernenak, Trond; Johnsen, Thore

    2005-01-01

    The Norwegian regulation of transmission and distribution of electricity is currently under revision, and several proposals, including price caps, various norm models and adjustments to the present revenue cap model, have been considered by the Norwegian regulator, NVE. Our starting point is that a successful and sustainable income regulation model for electricity distribution should be in accordance with the way of thinking and the managerial tools of modern businesses. The regulation assumes that decisions regarding operations and investments are made by independent, business-oriented entities. The ambition of a dynamically efficient industry therefore requires that the regulatory model and its implementation support best-practice business performance. This will influence how the cost base is determined and the way investments are dealt with. We investigate a possible implementation of a regulatory model based on cost norms, distinguishing between customer-driven costs on the one hand and costs related to the network itself on the other. The network-related costs, which account for approximately 80% of the total cost of electricity distribution, include the costs of operating and maintaining the network, as well as capital costs. These are the “difficult” costs, as their levels depend on structural and climatic factors, as well as the number of customers and the load that is served. Additionally, the costs are not separable, since, for instance, maintenance and investments can be substitutable activities. The work concentrates on verifying the cost model and evaluating implications for the use of the present efficiency model (DEA) in the regulation. Moreover, we consider how network-related costs can be managed in a norm model. Finally, it is highlighted that an important part of a regulatory model based on cost norms is to devise quality measures and ways to use them in the economic regulation. (Author)

  8. Lattice Boltzmann model capable of mesoscopic vorticity computation

    Science.gov (United States)

    Peng, Cheng; Guo, Zhaoli; Wang, Lian-Ping

    2017-11-01

    It is well known that standard lattice Boltzmann (LB) models allow the strain-rate components to be computed mesoscopically (i.e., through the local particle distributions) and as such possess a second-order accuracy in strain rate. This is one of the appealing features of the lattice Boltzmann method (LBM) which is of only second-order accuracy in hydrodynamic velocity itself. However, no known LB model can provide the same quality for vorticity and pressure gradients. In this paper, we design a multiple-relaxation time LB model on a three-dimensional 27-discrete-velocity (D3Q27) lattice. A detailed Chapman-Enskog analysis is presented to illustrate all the necessary constraints in reproducing the isothermal Navier-Stokes equations. The remaining degrees of freedom are carefully analyzed to derive a model that accommodates mesoscopic computation of all the velocity and pressure gradients from the nonequilibrium moments. This way of vorticity calculation naturally ensures a second-order accuracy, which is also proven through an asymptotic analysis. We thus show, with enough degrees of freedom and appropriate modifications, the mesoscopic vorticity computation can be achieved in LBM. The resulting model is then validated in simulations of a three-dimensional decaying Taylor-Green flow, a lid-driven cavity flow, and a uniform flow passing a fixed sphere. Furthermore, it is shown that the mesoscopic vorticity computation can be realized even with single relaxation parameter.

  9. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking.

    Science.gov (United States)

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). 
Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour
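
    The core idea of checking a property against time series data, rather than against the model that generated it, can be illustrated with a toy temporal-property checker. Mule itself is far richer (probabilistic, multiscale, spatio-temporal); the property and trace below are invented for the example.

```python
def eventually_always(series, predicate):
    """Check an LTL-style 'eventually always' property over a finite
    time series: from some sample onward, `predicate` holds at every
    subsequent sample. A toy stand-in for model checking over traces."""
    for start in range(len(series)):
        if all(predicate(x) for x in series[start:]):
            return True
    return False

# Hypothetical simulated concentration trace.
trace = [0.1, 0.4, 0.9, 1.3, 1.2, 1.5, 1.4]
print(eventually_always(trace, lambda c: c > 1.0))  # → True (holds from t=3 on)
```

    Because the check consumes only the trace, it is agnostic to the modelling formalism that produced it, which is the property the meta model checking methodology exploits.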

  10. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    Science.gov (United States)

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, that encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  11. Understanding the physiology of the ageing individual: computational modelling of changes in metabolism and endurance

    Science.gov (United States)

    2016-01-01

    Ageing and lifespan are strongly affected by metabolism. The maximal possible uptake of oxygen is not only a good predictor of performance in endurance sports, but also of life expectancy. Figuratively speaking, healthy ageing is a competitive sport. Although the root cause of ageing is damage to macromolecules, it is the balance with repair processes that is decisive. Reduced or intermittent nutrition, hormones and intracellular signalling pathways that regulate metabolism have strong effects on ageing. Homeostatic regulatory processes tend to keep the environment of the cells within relatively narrow bounds. On the other hand, the body is constantly adapting to physical activity and food consumption. Spontaneous fluctuations in heart rate and other processes indicate youth and health. A (homeo)dynamic aspect of homeostasis deteriorates with age. We are now in a position to develop computational models of human metabolism and the dynamics of heart rhythm and oxygen transport that will advance our understanding of ageing. Computational modelling of the connections between dietary restriction, metabolism and protein turnover may increase insight into homeostasis of the proteins in our body. In this way, the computational reconstruction of human physiological processes, the Physiome, can help prevent frailty and age-related disease. PMID:27051508

  12. Predictive modeling of liquid-sodium thermal–hydraulics experiments and computations

    International Nuclear Information System (INIS)

    Arslan, Erkan; Cacuci, Dan G.

    2014-01-01

    Highlights: • We applied the predictive modeling method of Cacuci and Ionescu-Bujor (2010). • We assimilated data from sodium flow experiments. • We used computational fluid dynamics simulations of sodium experiments. • The predictive modeling method greatly reduced uncertainties in predicted results. - Abstract: This work applies the predictive modeling procedure formulated by Cacuci and Ionescu-Bujor (2010) to assimilate data from liquid-sodium thermal–hydraulics experiments in order to reduce systematically the uncertainties in the predictions of computational fluid dynamics (CFD) simulations. The predicted CFD-results for the best-estimate model parameters and results describing sodium-flow velocities and temperature distributions are shown to be significantly more precise than the original computations and experiments, in that the predicted uncertainties for the best-estimate results and model parameters are significantly smaller than both the originally computed and the experimental uncertainties
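
    The uncertainty-reducing effect of data assimilation can be seen in the simplest possible setting: combining one computed and one measured value by inverse-variance weighting, where the best-estimate variance is always smaller than either input variance. The numbers below are hypothetical, and the full Cacuci and Ionescu-Bujor formalism (correlated multivariate parameters and responses) is not reproduced.

```python
def assimilate(x_comp, var_comp, x_exp, var_exp):
    """Combine a computed and a measured value by inverse-variance
    weighting: a one-parameter caricature of predictive modeling."""
    w_comp = 1.0 / var_comp
    w_exp = 1.0 / var_exp
    x_best = (w_comp * x_comp + w_exp * x_exp) / (w_comp + w_exp)
    var_best = 1.0 / (w_comp + w_exp)   # smaller than both inputs
    return x_best, var_best

# Hypothetical sodium outlet temperature: CFD gives 820 K (variance 225),
# the experiment gives 805 K (variance 100).
x, v = assimilate(820.0, 225.0, 805.0, 100.0)
print(round(x, 1), round(v, 1))  # → 809.6 69.2
```

    The best estimate lands between the two sources, pulled toward the more precise one, and its variance is below both input variances, which is the qualitative behaviour reported in the abstract.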

  13. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  14. Computational and Statistical Models: A Comparison for Policy Modeling of Childhood Obesity

    Science.gov (United States)

    Mabry, Patricia L.; Hammond, Ross; Ip, Edward Hak-Sing; Huang, Terry T.-K.

    As systems science methodologies have begun to emerge as a set of innovative approaches to address complex problems in behavioral, social science, and public health research, some apparent conflicts with traditional statistical methodologies for public health have arisen. Computational modeling is an approach set in context that integrates diverse sources of data to test the plausibility of working hypotheses and to elicit novel ones. Statistical models are reductionist approaches geared towards testing the null hypothesis. While these two approaches may seem contrary to each other, we propose that they are in fact complementary and can be used jointly to advance solutions to complex problems. Outputs from statistical models can be fed into computational models, and outputs from computational models can lead to further empirical data collection and statistical models. Together, this presents an iterative process that refines the models and contributes to a greater understanding of the problem and its potential solutions. The purpose of this panel is to foster communication and understanding between statistical and computational modelers. Our goal is to shed light on the differences between the approaches and convey what kinds of research inquiries each one is best for addressing and how they can serve complementary (and synergistic) roles in the research process, to mutual benefit. For each approach the panel will cover the relevant "assumptions" and how the differences in what is assumed can foster misunderstandings. The interpretations of the results from each approach will be compared and contrasted and the limitations for each approach will be delineated. We will use illustrative examples from CompMod, the Comparative Modeling Network for Childhood Obesity Policy. The panel will also incorporate interactive discussions with the audience on the issues raised here.

  15. Computational Models for Calcium-Mediated Astrocyte Functions

    Directory of Open Access Journals (Sweden)

    Tiina Manninen

    2018-04-01

    Full Text Available The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. 
Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop

  16. Computational Models for Calcium-Mediated Astrocyte Functions.

    Science.gov (United States)

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro , but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. 
Thus

  17. The use of conduction model in laser weld profile computation

    Science.gov (United States)

    Grabas, Bogusław

    2007-02-01

    Profiles of joints resulting from deep penetration laser beam welding of a flat workpiece of carbon steel were computed. A semi-analytical conduction model, solved with the Green's function method, was used in the computations. In the model, the moving heat source was attenuated exponentially in accordance with the Beer-Lambert law. Computational results were compared with experimental ones.
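
    The exponential attenuation assumed for the heat source follows directly from the Beer-Lambert law: the power absorbed down to a depth z grows as 1 − exp(−μz). A minimal sketch, with an invented attenuation coefficient rather than the paper's welding parameters:

```python
import math

def absorbed_fraction(depth, mu):
    """Fraction of beam power absorbed between the surface and `depth`
    for a source attenuated exponentially per the Beer-Lambert law
    (attenuation coefficient `mu` in 1/mm). Illustrative values only."""
    return 1.0 - math.exp(-mu * depth)

mu = 0.8  # hypothetical attenuation coefficient, 1/mm
for z in (0.5, 1.0, 2.0, 4.0):
    print(z, round(absorbed_fraction(z, mu), 3))
```

    The complement, exp(−μz), is the depth profile that such a conduction model distributes along the beam axis before convolving with the moving-source Green's function.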

  18. Approximate Bayesian Computation by Subset Simulation using hierarchical state-space models

    Science.gov (United States)

    Vakilzadeh, Majid K.; Huang, Yong; Beck, James L.; Abrahamsson, Thomas

    2017-02-01

    A new multi-level Markov Chain Monte Carlo algorithm for Approximate Bayesian Computation, ABC-SubSim, has recently appeared that exploits the Subset Simulation method for efficient rare-event simulation. ABC-SubSim adaptively creates a nested decreasing sequence of data-approximating regions in the output space that correspond to increasingly closer approximations of the observed output vector in this output space. At each level, multiple samples of the model parameter vector are generated by a component-wise Metropolis algorithm so that the predicted output corresponding to each parameter value falls in the current data-approximating region. Theoretically, if continued to the limit, the sequence of data-approximating regions would converge on to the observed output vector and the approximate posterior distributions, which are conditional on the data-approximation region, would become exact, but this is not practically feasible. In this paper we study the performance of the ABC-SubSim algorithm for Bayesian updating of the parameters of dynamical systems using a general hierarchical state-space model. We note that the ABC methodology gives an approximate posterior distribution that actually corresponds to an exact posterior where a uniformly distributed combined measurement and modeling error is added. We also note that ABC algorithms have a problem with learning the uncertain error variances in a stochastic state-space model and so we treat them as nuisance parameters and analytically integrate them out of the posterior distribution. In addition, the statistical efficiency of the original ABC-SubSim algorithm is improved by developing a novel strategy to regulate the proposal variance for the component-wise Metropolis algorithm at each level. We demonstrate that Self-regulated ABC-SubSim is well suited for Bayesian system identification by first applying it successfully to model updating of a two degree-of-freedom linear structure for three cases: globally
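
    The basic ABC idea that ABC-SubSim accelerates can be shown with plain rejection ABC: draw parameters from the prior, simulate, and keep only draws whose predicted output falls within a tolerance of the observed data. ABC-SubSim replaces this single tolerance with an adaptively nested sequence of shrinking data-approximating regions; the toy model below is invented for illustration.

```python
import random

def abc_rejection(observed, prior_sample, simulate, tol, n_draws=20000, seed=1):
    """Minimal ABC by plain rejection (a simpler relative of ABC-SubSim):
    keep parameter draws whose simulated output lands within `tol`
    of the observed data."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - observed) < tol:
            accepted.append(theta)
    return accepted

# Toy model: y = theta + Gaussian measurement noise; true theta = 2.0.
observed = 2.0
post = abc_rejection(
    observed,
    prior_sample=lambda rng: rng.uniform(-5.0, 5.0),
    simulate=lambda th, rng: th + rng.gauss(0.0, 0.5),
    tol=0.2,
)
print(sum(post) / len(post))  # posterior mean, close to the true value 2.0
```

    Shrinking `tol` sharpens the approximate posterior but makes acceptance rare, which is exactly the rare-event problem that the Subset Simulation machinery in ABC-SubSim is designed to handle.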

  19. Computational Psychometrics for Modeling System Dynamics during Stressful Disasters

    Directory of Open Access Journals (Sweden)

    Pietro Cipresso

    2017-08-01

    Full Text Available Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.

  20. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes having scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol' indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times, which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows estimation of the sensitivity indices of each scalar model input, while the 'dispersion model' allows derivation of the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
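
    For scalar inputs, first-order Sobol' indices can be estimated directly with a pick-freeze Monte Carlo scheme; the sketch below does this for independent uniform inputs. The joint GLM/GAM metamodelling step for functional inputs described in the abstract is not reproduced here.

```python
import random

def sobol_first_order(model, dim, n=20000, seed=2):
    """Estimate first-order Sobol' indices with a pick-freeze Monte Carlo
    scheme (Saltelli-style estimator) for independent U(0,1) inputs."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [model(x) for x in A]
    yB = [model(x) for x in B]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(dim):
        # A with its i-th column swapped in from B (the "pick-freeze" design)
        ABi = [row[:i] + [B[k][i]] + row[i + 1:] for k, row in enumerate(A)]
        yABi = [model(x) for x in ABi]
        Vi = sum(yB[k] * (yABi[k] - yA[k]) for k in range(n)) / n
        indices.append(Vi / var)
    return indices

# Additive test function Y = X1 + 2*X2: exact indices are 0.2 and 0.8.
s1, s2 = sobol_first_order(lambda x: x[0] + 2.0 * x[1], dim=2)
print(round(s1, 2), round(s2, 2))  # close to 0.2 and 0.8
```

    For the additive test function the indices sum to one; interactions between inputs would leave a gap between the sum of the first-order indices and unity.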

  1. Computer modeling of road bridge for simulation moving load

    Directory of Open Access Journals (Sweden)

    Miličić Ilija M.

    2016-01-01

    Full Text Available This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders made up of 1D finite elements, using the CAA applications Tower and Bridge Designer 2016 (2nd Edition). The computer simulations compare the effects of moving load according to the recommendations of two standards, SRPS and AASHTO. A variant of the bridge structure modelled in Bridge Designer 2016 (2nd Edition) was modelled identically in the Tower environment. An important consideration for the selection of a computer application is that Bridge Designer 2016 (2nd Edition) is unable to treat the moving load model under the national standard V600.
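
    The effect of a moving load can be illustrated with the textbook influence-line result for a simply supported beam: as a point load crosses the span, the midspan bending moment peaks at PL/4 when the load reaches midspan. This is a generic hand-check, not the truss model from the paper.

```python
def midspan_moment(P, L, a):
    """Bending moment at midspan of a simply supported beam of span L
    under a point load P located a distance a from the left support."""
    if a <= L / 2:
        return (P * a / L) * (L / 2)        # right reaction times lever arm
    return (P * (L - a) / L) * (L / 2)      # left reaction times lever arm

# Sweep a 100 kN load across a 20 m span in 0.5 m steps.
P, L = 100.0, 20.0
positions = [i * 0.5 for i in range(41)]
moments = [midspan_moment(P, L, a) for a in positions]
peak = max(moments)
print(peak, positions[moments.index(peak)])  # → 500.0 10.0, i.e. PL/4 at midspan
```

    Code-based moving-load modules in applications such as Tower automate exactly this kind of sweep over influence lines, for every section and load pattern prescribed by the chosen standard.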

  2. Epistemic Gameplay and Discovery in Computational Model-Based Inquiry Activities

    Science.gov (United States)

    Wilkerson, Michelle Hoda; Shareff, Rebecca; Laina, Vasiliki; Gravel, Brian

    2018-01-01

    In computational modeling activities, learners are expected to discover the inner workings of scientific and mathematical systems: First elaborating their understandings of a given system through constructing a computer model, then "debugging" that knowledge by testing and refining the model. While such activities have been shown to…

  3. Computer-aided modeling for efficient and innovative product-process engineering

    DEFF Research Database (Denmark)

    Heitzig, Martina

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy and water. This trend is set to continue due to the substantial benefits computer-aided methods provide. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms and application modes. The development of the models required for the systems under investigation tends to be a challenging, time-consuming and therefore cost... ... in chemical and biochemical engineering have been solved to illustrate the application of the generic modelling methodology, the computer-aided modelling framework and the developed software tool.

  4. Internal and External Regulation to Support Knowledge Construction and Convergence in Computer Supported Collaborative Learning (CSCL)

    Science.gov (United States)

    Romero, Margarida; Lambropoulos, Niki

    2011-01-01

    Computer Supported Collaborative Learning (CSCL) activities aim to promote collaborative knowledge construction and convergence. During the CSCL activity, the students should regulate their learning activity, at the individual and collective level. This implies an organisation cost related to the coordination of the activity with the team-mates…

  5. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  6. A whole-body model for glycogen regulation reveals a critical role for substrate cycling in maintaining blood glucose homeostasis.

    Directory of Open Access Journals (Sweden)

    Ke Xu

    2011-12-01

    Full Text Available Timely, and sometimes rapid, metabolic adaptation to changes in food supply is critical for survival as an organism moves from the fasted to the fed state, and vice versa. These transitions necessitate major metabolic changes to maintain energy homeostasis as the source of blood glucose moves away from ingested carbohydrates, through hepatic glycogen stores, towards gluconeogenesis. The integration of hepatic glycogen regulation with extra-hepatic energetics is a key aspect of these adaptive mechanisms. Here we use computational modeling to explore hepatic glycogen regulation under fed and fasting conditions in the context of a whole-body model. The model was validated against previous experimental results concerning glycogen phosphorylase a (active) and glycogen synthase a dynamics. The model qualitatively reproduced physiological changes that occur during transition from the fed to the fasted state. Analysis of the model reveals a critical role for the inhibition of glycogen synthase phosphatase by glycogen phosphorylase a. This negative regulation leads to high levels of glycogen synthase activity during fasting conditions, which in turn increases substrate (futile) cycling, priming the system for a rapid response once an external source of glucose is restored. This work demonstrates that a mechanistic understanding of the design principles used by metabolic control circuits to maintain homeostasis can benefit from the incorporation of mathematical descriptions of these networks into "whole-body" contextual models that mimic in vivo conditions.

  7. Light reflection models for computer graphics.

    Science.gov (United States)

    Greenberg, D P

    1989-04-14

    During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.
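    The direct-lighting stage that those early algorithms computed can be illustrated with a single Lambertian (diffuse) reflection term; the function below is a generic sketch, not code from the article, and its names and parameters are assumptions:

```python
import math

def lambert_shade(normal, light_dir, albedo, intensity):
    """Direct diffuse lighting: reflected radiance = albedo * intensity * max(0, n.l).
    Both vectors are assumed normalized; no indirect light, interreflections,
    or shadows are modeled, matching the early direct-lighting-only algorithms."""
    n_dot_l = sum(n * l for n, l in zip(normal, light_dir))
    return albedo * intensity * max(0.0, n_dot_l)

# Surface facing straight up, light 60 degrees off the normal: cos(60) = 0.5
shade = lambert_shade((0.0, 0.0, 1.0),
                      (math.sin(math.radians(60)), 0.0, math.cos(math.radians(60))),
                      albedo=0.8, intensity=1.0)
# shade is 0.8 * 0.5 = 0.4 (up to floating point)
```

    Ray tracing, radiosity, and Monte Carlo methods extend exactly this kind of local term with visibility and interreflection computations.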

  8. Python for Scientific Computing Education: Modeling of Queueing Systems

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2014-01-01

    Full Text Available In this paper, we present a methodology for the introduction to scientific computing based on model-centered learning. We propose multiphase queueing systems as a basis for learning objects. We use Python and parallel programming for implementing the models and present the computer code and results of stochastic simulations.
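    As an illustration of the kind of stochastic queueing simulation the record describes, here is a minimal single-server M/M/1 sketch in Python (the paper itself treats multiphase systems with parallel programming; the function and parameter choices below are assumptions for illustration only):

```python
import random

def simulate_mm1(arrival_rate, service_rate, n_customers, seed=0):
    """Estimate the mean time in system for an M/M/1 queue by direct simulation."""
    rng = random.Random(seed)
    arrival = 0.0        # arrival time of the current customer
    server_free = 0.0    # time at which the server next becomes idle
    total_time = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(arrival_rate)        # Poisson arrivals
        start = max(arrival, server_free)               # wait if server is busy
        server_free = start + rng.expovariate(service_rate)
        total_time += server_free - arrival             # waiting + service time
    return total_time / n_customers

# Queueing theory predicts mean time in system W = 1/(mu - lambda) = 2.0 here
est = simulate_mm1(arrival_rate=0.5, service_rate=1.0, n_customers=200_000)
```

    With enough simulated customers the estimate converges to the analytical value, which is what makes such models useful as self-checking learning objects.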

  9. Computational Intelligence. Mortality Models for the Actuary

    NARCIS (Netherlands)

    Willemse, W.J.

    2001-01-01

    This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, this thesis deals with life insurance where mortality modelling is important. Actuaries use ancient models (mortality laws) from the nineteenth century, for example Gompertz' and Makeham's

  10. Mathematical modelling of steam generator and design of temperature regulator

    Energy Technology Data Exchange (ETDEWEB)

    Bogdanovic, S.S. [EE Institute Nikola Tesla, Belgrade (Yugoslavia)

    1999-07-01

    The paper considers mathematical modelling of once-through power station boiler and numerical algorithm for simulation of the model. Fast and numerically stable algorithm based on the linearisation of model equations and on the simultaneous solving of differential and algebraic equations is proposed. The paper also presents the design of steam temperature regulator by using the method of projective controls. Dynamic behaviour of the system closed with optimal linear quadratic regulator is taken as the reference system. The desired proprieties of the reference system are retained and solutions for superheated steam temperature regulator are determined. (author)

  11. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers, and the difficulties encountered on currently available computers, is given. Through the use of an analyzing and measuring methodology, SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization can be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  12. Life system modeling and intelligent computing. Pt. II. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen' s Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation

    2010-07-01

    This book is part II of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010 and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The 56 papers in this volume are organized in topical sections on advanced evolutionary computing theory and algorithms; advanced neural network and fuzzy system theory and algorithms; modeling and simulation of societies and collective behavior; biomedical signal processing, imaging, and visualization; intelligent computing and control in distributed power generation systems; intelligent methods in power and energy infrastructure development; intelligent modeling, monitoring, and control of complex nonlinear systems. (orig.)

  13. Global Stability of an Epidemic Model of Computer Virus

    Directory of Open Access Journals (Sweden)

    Xiaofan Yang

    2014-01-01

    Full Text Available With the rapid popularization of the Internet, computers can enter or leave the Internet increasingly frequently. In fact, no antivirus software can detect and remove all sorts of computer viruses. This implies that viruses would persist on the Internet. To better understand the spread of computer viruses in these situations, a new propagation model is established and analyzed. The unique equilibrium of the model is globally asymptotically stable, in accordance with reality. A parameter analysis of the equilibrium is also conducted.
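    A hedged sketch of the qualitative behavior described, a compartment model whose endemic equilibrium attracts all positive initial states, using a classical SIS equation rather than the paper's specific model:

```python
def simulate_sis(beta, gamma, i0, t_end, dt=0.01):
    """Euler-integrate di/dt = beta*i*(1-i) - gamma*i, the fraction of
    infected computers in a classical SIS compartment model."""
    i = i0
    for _ in range(int(t_end / dt)):
        i += dt * (beta * i * (1.0 - i) - gamma * i)
    return i

# When beta > gamma the endemic equilibrium i* = 1 - gamma/beta attracts every
# positive starting point; here i* = 1 - 0.2/0.6 = 2/3.
i_final = simulate_sis(beta=0.6, gamma=0.2, i0=0.01, t_end=200.0)
```

    The trajectory settles on the same equilibrium regardless of the (positive) initial infected fraction, which is the numerical counterpart of the global stability result.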

  14. Simulation models for computational plasma physics: Concluding report

    International Nuclear Information System (INIS)

    Hewett, D.W.

    1994-01-01

    In this project, the authors enhanced their ability to numerically simulate bounded plasmas that are dominated by low-frequency electric and magnetic fields. They moved towards this goal in several ways; they are now in a position to play significant roles in the modeling of low-frequency electromagnetic plasmas in several new industrial applications. They have significantly increased their facility with the computational methods invented to solve the low-frequency limit of Maxwell's equations (DiPeso, Hewett, accepted, J. Comp. Phys., 1993). This low-frequency model, called the Streamlined Darwin Field (SDF) model (Hewett, Larson, and Doss, J. Comp. Phys., 1992), has now been implemented in a fully non-neutral SDF code BEAGLE (Larson, Ph.D. dissertation, 1993) and has been further extended to the quasi-neutral limit (DiPeso, Hewett, Comp. Phys. Comm., 1993). In addition, they have resurrected the quasi-neutral, zero-electron-inertia model (ZMR) and begun the task of incorporating internal boundary conditions into this model with the flexibility of those in GYMNOS, a magnetostatic code now used in ion source work (Hewett, Chen, ICF Quarterly Report, July--September, 1993). Finally, near the end of this project, they invented a new type of banded matrix solver that can be implemented on a massively parallel computer, thus opening the door for the use of all their ADI schemes on these new computer architectures (Mattor, Williams, Hewett, submitted to Parallel Computing, 1993)

  15. Pulse cleaning flow models and numerical computation of candle ceramic filters.

    Science.gov (United States)

    Tian, Gui-shan; Ma, Zhen-ji; Zhang, Xin-yi; Xu, Ting-xiang

    2002-04-01

    Analytical and numerical models are developed for the reverse pulse cleaning system of candle ceramic filters. A standard turbulence model is shown, from experimental and one-dimensional computational results, to be suitable for the design computation of the reverse pulse cleaning system. The computed results can be used to guide the design of the reverse pulse cleaning system, in particular the optimum Venturi geometry. From the computed results, general conclusions and design methods are obtained.

  16. The role of computer modelling in participatory integrated assessments

    International Nuclear Information System (INIS)

    Siebenhuener, Bernd; Barth, Volker

    2005-01-01

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experience gained in three projects with the use of computer models, from a participatory and a risk management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes, and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models best serve the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes

  17. Distributed parallel computing in stochastic modeling of groundwater systems.

    Science.gov (United States)

    Dong, Yanhui; Li, Guomin; Xu, Haizhen

    2013-03-01

    Stochastic modeling is a rapidly evolving, popular approach to the study of the uncertainty and heterogeneity of groundwater systems. However, the use of Monte Carlo-type simulations to solve practical groundwater problems often encounters computational bottlenecks that hinder the acquisition of meaningful results. To improve the computational efficiency, a system that combines stochastic model generation with MODFLOW-related programs and distributed parallel processing is investigated. The distributed computing framework, called the Java Parallel Processing Framework, is integrated into the system to allow the batch processing of stochastic models in distributed and parallel systems. As an example, the system is applied to the stochastic delineation of well capture zones in the Pinggu Basin in Beijing. Through the use of 50 processing threads on a cluster with 10 multicore nodes, the execution times of 500 realizations are reduced to 3% compared with those of a serial execution. Through this application, the system demonstrates its potential in solving difficult computational problems in practical stochastic modeling. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.
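    The batch-of-realizations pattern behind this speedup can be sketched with a worker pool. This is a simplified stand-in, not the paper's Java Parallel Processing Framework: the Monte Carlo "model run" below (a random-walk particle and a capture radius) is invented for illustration and replaces a full MODFLOW realization.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run_realization(seed):
    """One stochastic model run: release a random-walk particle at the origin
    and report whether it ends inside a capture radius (1) or outside it (0)."""
    rng = random.Random(seed)
    x = y = 0.0
    for _ in range(100):
        x += rng.gauss(0.0, 1.0)
        y += rng.gauss(0.0, 1.0)
    return 1 if x * x + y * y < 100.0 else 0

def capture_probability(n_realizations, n_workers=4):
    """Distribute independent realizations over a worker pool and aggregate.
    Because realizations share no state, they parallelize trivially."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        hits = sum(pool.map(run_realization, range(n_realizations)))
    return hits / n_realizations

p = capture_probability(2000)
```

    The independence of realizations is exactly what makes this class of problems scale so well across distributed nodes, as in the Pinggu Basin application.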

  18. PETRI NET MODELING OF COMPUTER VIRUS LIFE CYCLE

    African Journals Online (AJOL)

    Dr Obe

    dynamic system analysis is applied to model the virus life cycle. Simulation of the derived model ... Keywords: virus lifecycle, Petri nets, modeling, simulation. ... by creating Matlab files for five different computer ...

  19. Airfoil computations using the gamma-Retheta model; Wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, Niels N.

    2009-05-15

    The present work addresses the validation of the implementation of the Menter, Langtry et al. gamma-theta correlation-based transition model [1, 2, 3] in the EllipSys2D code. First, the second-order accuracy of the code is verified using a grid refinement study for laminar, turbulent and transitional computations. Based on this, the error in the computations is estimated to be approximately one percent in the attached region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64-018, NACA64-218, NACA64-418 and NACA64-618, and the results are compared to measurements [4] and computations using the Xfoil code by Drela et al. [5]. In the linear pre-stall region good agreement is observed both for lift and drag, while differences from both measurements and Xfoil computations are observed in stalled conditions. (au)

  20. On the usage of ultrasound computational models for decision making under ambiguity

    Science.gov (United States)

    Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron

    2018-04-01

    Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantifies the differences using maximum amplitude and power spectral density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.

  1. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  2. Computer Modelling «Smart Building»

    Directory of Open Access Journals (Sweden)

    O. Yu. Maryasin

    2016-01-01

    Full Text Available Currently, "smart building" or "smart house" technology is developing actively in industrialized countries. The main idea of a "smart building" or "smart house" is to have a system which is able to identify definite situations happening in the house and respond accordingly. An automated house management system is made for automated control and management, and also for the organization of interaction between separate systems of engineering equipment. This system includes automation subsystems of one or another type of engineering equipment as separate components. In order to study different operating modes of the engineering subsystems and the whole system, mathematical and computer modeling needs to be used. From a mathematical point of view, the description of a "smart building" is a continuous-discrete or hybrid system consisting of interacting elements of different nature, whose behavior is described by continuous and discrete processes. In the article the authors present a computer model of a "smart building" which allows one to model the work of the main engineering subsystems and management algorithms. The model is created in the Simulink Matlab system with the "physical modeling" library Simscape and the Stateflow library. The peculiarity of this model is the use of specialized management and control algorithms which allow providing coordinated interaction of subsystems and optimizing power consumption.

  3. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies, and standardized annotations; and (4) model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  4. Should I Stop Thinking About It: A Computational Exploration of Reappraisal Based Emotion Regulation

    Directory of Open Access Journals (Sweden)

    Mei Si

    2015-01-01

    Full Text Available Agent-based simulation of people's behaviors and minds has become increasingly popular in recent years. It provides a research platform to simulate and compare alternative psychological and social theories, as well as to create virtual characters that can interact with people or with each other to provide pedagogical or entertainment effects. In this paper, we investigate computationally modeling people's coping behaviors, in particular in relation to depression, in decision-theoretic agents. Recent studies have suggested that depression can result from failed emotion regulation under limited cognitive resources. In this work, we demonstrate how reappraisal can fail under high levels of stress and limited cognitive resources using an agent-based simulation. Further, we explore the effectiveness of reappraisal under different conditions. Our experiments suggest that for people who are more likely to recall positive memories, it is more beneficial to think about the recalled events from multiple perspectives. However, for people who are more likely to recall negative memories, the better strategy is to not evaluate the recalled events against multiple goals.

  5. A scalable approach to modeling groundwater flow on massively parallel computers

    International Nuclear Information System (INIS)

    Ashby, S.F.; Falgout, R.D.; Tompson, A.F.B.

    1995-12-01

    We describe a fully scalable approach to the simulation of groundwater flow on a hierarchy of computing platforms, ranging from workstations to massively parallel computers. Specifically, we advocate the use of scalable conceptual models in which the subsurface model is defined independently of the computational grid on which the simulation takes place. We also describe a scalable multigrid algorithm for computing the groundwater flow velocities. We are thus able to leverage both the engineer's time spent developing the conceptual model and the computing resources used in the numerical simulation. We have successfully employed this approach at the LLNL site, where we have run simulations ranging in size from just a few thousand spatial zones (on workstations) to more than eight million spatial zones (on the CRAY T3D)-all using the same conceptual model

  6. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV. OF UTAH

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the greatest losses from natural disasters in the world and the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computational time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated when computations are completed only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting, engineering design, or planning tool. Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
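    The domain-tracking idea, restricting computation to inundated cells and their immediate neighbors rather than sweeping the whole grid, can be sketched as a frontier-based fill. This is a simplification invented here for illustration; the paper's model solves the shallow water equations rather than this static spreading:

```python
def spread_flood(elevation, source, water_level):
    """Frontier-based inundation: only cells adjacent to already-wet cells are
    ever examined, instead of sweeping the entire grid at every step."""
    rows, cols = len(elevation), len(elevation[0])
    wet = {source}
    frontier = [source]
    while frontier:
        next_frontier = []
        for r, c in frontier:
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols
                        and (nr, nc) not in wet
                        and elevation[nr][nc] < water_level):
                    wet.add((nr, nc))
                    next_frontier.append((nr, nc))
        frontier = next_frontier
    return wet

# A 3x3 grid: high ground (elevation 5) blocks the center and top-right cells.
grid = [[0, 0, 5],
        [0, 5, 0],
        [0, 0, 0]]
wet = spread_flood(grid, source=(0, 0), water_level=1)
# 7 of the 9 cells flood; (1, 1) and (0, 2) stay dry.
```

    On a large grid where only a small fraction of cells ever inundates, this kind of active-cell tracking is what eliminates the work that a full-grid sweep would waste on dry cells.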

  7. ADGEN: ADjoint GENerator for computer models

    Energy Technology Data Exchange (ETDEWEB)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated in a factor of 6.8 times that of the reference model for each response of interest, compared with a factor of roughly 3000 for determining these derivatives by parameter perturbations. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs.

  8. ADGEN: ADjoint GENerator for computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated in a factor of 6.8 times that of the reference model for each response of interest, compared with a factor of roughly 3000 for determining these derivatives by parameter perturbations. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs
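    The adjoint idea that ADGEN automates, one backward pass yielding derivatives of a result with respect to every input, can be sketched with a minimal reverse-mode differentiation class. This is purely illustrative: ADGEN itself instruments FORTRAN source, and the class and expression below are assumptions invented for this sketch.

```python
class Node:
    """Minimal reverse-mode ('adjoint') differentiation node. One backward
    pass yields derivatives of a scalar result with respect to every input."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs of (parent node, local derivative)
        self.adjoint = 0.0

    def __add__(self, other):
        return Node(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Node(self.value * other.value,
                    ((self, other.value), (other, self.value)))

def backprop(output):
    """Push adjoints from the output back to the inputs. This simple stack
    walk suffices for the expression below; a general implementation would
    visit nodes in reverse topological order."""
    output.adjoint = 1.0
    stack = [output]
    while stack:
        node = stack.pop()
        for parent, local in node.parents:
            parent.adjoint += node.adjoint * local
            stack.append(parent)

# f(a, b) = a*b + a, so df/da = b + 1 and df/db = a.
a, b = Node(3.0), Node(4.0)
f = a * b + a
backprop(f)
```

    The key property matches the abstract's cost figures: the backward pass is a small constant factor over the forward evaluation regardless of the number of parameters, whereas parameter perturbations cost one forward run per parameter.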

  9. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services. PMID:28112020

  10. Establishing a Cloud Computing Success Model for Hospitals in Taiwan.

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  11. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Directory of Open Access Journals (Sweden)

    Jiunn-Woei Lian PhD

    2017-01-01

    Full Text Available The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  12. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services introduced will slowly replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services are provided. The paper integrates a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses of large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  13. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    Science.gov (United States)

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software has first become a utilitarian interest, and now, it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains, also motivates sharing of modeling resources as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate

  14. A composite computational model of liver glucose homeostasis. I. Building the composite model.

    Science.gov (United States)

    Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A

    2012-04-07

    A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.

  15. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.

  16. Attacker Modelling in Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Papini, Davide

    in with our everyday life. This future is visible to everyone nowadays: terms like smartphone, cloud, sensor, network etc. are widely known and used in our everyday life. But what about the security of such systems? Ubiquitous computing devices can be limited in terms of energy, computing power and memory...... attacker remain somehow undefined and still under extensive investigation. This Thesis explores the nature of the ubiquitous attacker with a focus on how she interacts with the physical world and it defines a model that captures the abilities of the attacker. Furthermore a quantitative implementation

  17. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    Science.gov (United States)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  18. Methods for teaching geometric modelling and computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Rotkov, S.I.; Faitel'son, Yu. Ts.

    1992-05-01

    This paper considers methods for teaching the methods and algorithms of geometric modelling and computer graphics to programmers, designers and users of CAD and computer-aided research systems. There is a bibliography that can be used to prepare lectures and practical classes. 37 refs., 1 tab.

  19. Computational Modeling of Teaching and Learning through Application of Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Richard Lamb

    2015-09-01

    Full Text Available Within the mind, there are a myriad of ideas that make sense within the bounds of everyday experience but are not reflective of how the world actually exists; this is particularly true in the domain of science. Classroom learning with teacher explanation is a bridge through which these naive understandings can be brought in line with scientific reality. The purpose of this paper is to examine how the application of a Multiobjective Evolutionary Algorithm (MOEA) can work in concert with an existing computational model to effectively model critical thinking in the science classroom. An evolutionary algorithm is an algorithm that iteratively optimizes machine-learning-based computational models. The research question is, does the application of an evolutionary algorithm provide a means to optimize the Student Task and Cognition Model (STAC-M), and does the optimized model sufficiently represent and predict teaching and learning outcomes in the science classroom? Within this computational study, the authors outline and simulate the effect of teaching on the ability of a “virtual” student to solve a Piagetian task. Using the Student Task and Cognition Model (STAC-M), a computational model of student cognitive processing in the science classroom developed in 2013, the authors complete a computational experiment which examines the role of cognitive retraining on student learning. Comparison of the STAC-M and the STAC-M with inclusion of the Multiobjective Evolutionary Algorithm shows greater success in solving the Piagetian science tasks post cognitive retraining with the Multiobjective Evolutionary Algorithm. This illustrates the potential uses of cognitive and neuropsychological computational modeling in educational research. The authors also outline the limitations and assumptions of computational modeling.
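
    The iterative optimize-select-vary loop an evolutionary algorithm performs can be sketched generically (a minimal single-objective sketch, not the MOEA or STAC-M from the study; the objective, operators, and all parameters below are hypothetical):

```python
import random

def evolve(fitness, dim=5, pop_size=30, generations=60, mut_rate=0.1, seed=1):
    """Minimal evolutionary loop: rank selection, one-point crossover,
    Gaussian mutation, with the top half of each generation kept (elitism)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)        # selection: best first
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim)             # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(dim):                    # mutation
                if rng.random() < mut_rate:
                    child[i] += rng.gauss(0, 0.1)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy objective: maximize -sum(x^2); the optimum is at the origin.
best = evolve(lambda x: -sum(v * v for v in x))
```

    A multiobjective variant such as an MOEA replaces the scalar ranking step with Pareto-based selection over several objectives, but the generational structure is the same.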

  20. Finite difference computing with exponential decay models

    CERN Document Server

    Langtangen, Hans Petter

    2016-01-01

    This text provides a very simple, initial introduction to the complete scientific computing pipeline: models, discretization, algorithms, programming, verification, and visualization. The pedagogical strategy is to use one case study – an ordinary differential equation describing exponential decay processes – to illustrate fundamental concepts in mathematics and computer science. The book is easy to read and only requires a command of one-variable calculus and some very basic knowledge about computer programming. Contrary to similar texts on numerical methods and programming, this text has a much stronger focus on implementation and teaches testing and software engineering in particular.
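
    The book's case study, an ODE describing exponential decay, can be sketched with the simplest finite difference scheme, forward Euler (a minimal illustration of the idea, not code from the book; parameter names are hypothetical):

```python
import math

def decay_fd(I=1.0, a=2.0, T=1.0, n=1000):
    """Forward Euler for u' = -a*u, u(0) = I, on [0, T] with n steps.

    The derivative is replaced by the finite difference
    (u_{k+1} - u_k)/dt, giving the update u_{k+1} = u_k*(1 - a*dt).
    """
    dt = T / n
    u = I
    for _ in range(n):
        u = u * (1 - a * dt)
    return u

approx = decay_fd()
exact = 1.0 * math.exp(-2.0 * 1.0)   # closed-form solution I*exp(-a*T)
```

    Comparing `approx` against the closed-form solution is exactly the kind of verification step the pipeline above ends with: forward Euler is first-order accurate, so halving `dt` should roughly halve the error.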

  1. Computational modelling of thermo-mechanical and transport properties of carbon nanotubes

    International Nuclear Information System (INIS)

    Rafii-Tabar, H.

    2004-01-01

    Over the recent years, numerical modelling and computer-based simulation of the properties of carbon nanotubes have become the focal points of research in computational nano-science and its associated fields of computational condensed matter physics and materials modelling. Modelling of the mechanical, thermal and transport properties of nanotubes via numerical simulations forms the central part of this research, concerned with the nano-scale mechanics and nano-scale thermodynamics of nanotubes, and nano-scale adsorption, storage and flow properties in nanotubes. A review of these properties, obtained via computational modelling studies, is presented here. We first introduce the physics of carbon nanotubes, and then present the computational simulation tools that are appropriate for conducting a modelling study at the nano-scales. These include the molecular dynamics (MD), the Monte Carlo (MC), and the ab initio MD simulation methods. A complete range of inter-atomic potentials, of two-body and many-body varieties, that underlie all the modelling studies considered in this review is also given. Mechanical models from continuum-based elasticity theory that have been extensively employed in computing the energetics of nanotubes, or to interpret the results from atomistic modelling, are presented and discussed. These include models based on the continuum theory of curved plates, shells, vibrating rods and bending beams. The validity of these continuum-based models has also been examined and the conditions under which they are applicable to nanotube modelling have been listed. Pertinent concepts from continuum theories of stress analysis are included, and the relevant methods for conducting the computation of the stress tensor, elastic constants and elastic moduli at the atomic level are also given. We then survey a comprehensive range of modelling studies concerned with the adsorption and storage of gases, and flow of fluids, in carbon nanotubes of various types.

  2. Computational modelling of thermo-mechanical and transport properties of carbon nanotubes

    Energy Technology Data Exchange (ETDEWEB)

    Rafii-Tabar, H

    2004-02-01

    Over the recent years, numerical modelling and computer-based simulation of the properties of carbon nanotubes have become the focal points of research in computational nano-science and its associated fields of computational condensed matter physics and materials modelling. Modelling of the mechanical, thermal and transport properties of nanotubes via numerical simulations forms the central part of this research, concerned with the nano-scale mechanics and nano-scale thermodynamics of nanotubes, and nano-scale adsorption, storage and flow properties in nanotubes. A review of these properties, obtained via computational modelling studies, is presented here. We first introduce the physics of carbon nanotubes, and then present the computational simulation tools that are appropriate for conducting a modelling study at the nano-scales. These include the molecular dynamics (MD), the Monte Carlo (MC), and the ab initio MD simulation methods. A complete range of inter-atomic potentials, of two-body and many-body varieties, that underlie all the modelling studies considered in this review is also given. Mechanical models from continuum-based elasticity theory that have been extensively employed in computing the energetics of nanotubes, or to interpret the results from atomistic modelling, are presented and discussed. These include models based on the continuum theory of curved plates, shells, vibrating rods and bending beams. The validity of these continuum-based models has also been examined and the conditions under which they are applicable to nanotube modelling have been listed. Pertinent concepts from continuum theories of stress analysis are included, and the relevant methods for conducting the computation of the stress tensor, elastic constants and elastic moduli at the atomic level are also given. We then survey a comprehensive range of modelling studies concerned with the adsorption and storage of gases, and flow of fluids, in carbon nanotubes of various types.

  3. Ocean Modeling and Visualization on Massively Parallel Computer

    Science.gov (United States)

    Chao, Yi; Li, P. Peggy; Wang, Ping; Katz, Daniel S.; Cheng, Benny N.

    1997-01-01

    Climate modeling is one of the grand challenges of computational science, and ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change.

  4. Computer modeling of flow induced in-reactor vibrations

    International Nuclear Information System (INIS)

    Turula, P.; Mulcahy, T.M.

    1977-01-01

    An assessment of the reliability of finite element method computer models, as applied to the computation of flow induced vibration response of components used in nuclear reactors, is presented. The prototype under consideration was the Fast Flux Test Facility reactor being constructed for US-ERDA. Data were available from an extensive test program which used a scale model simulating the hydraulic and structural characteristics of the prototype components, subjected to scaled prototypic flow conditions as well as to laboratory shaker excitations. Corresponding analytical solutions of the component vibration problems were obtained using the NASTRAN computer code. Modal analyses and response analyses were performed. The effect of the surrounding fluid was accounted for. Several possible forcing function definitions were considered. Results indicate that modal computations agree well with experimental data. Response amplitude comparisons are good only under conditions favorable to a clear definition of the structural and hydraulic properties affecting the component motion. 20 refs

  5. A quantitative model of regulator's preference factor (RPF) in electricity-environment coordinated regulation system

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Yulong; Fu, Shijun [Economy and Business Administration School of Chongqing University, Chongqing 400030 (China)

    2010-12-15

    This paper explores quantification of the regulator's preference factor (RPF) in an electricity-environment coordinated regulation system. Based on social welfare economics, we depict the RPF's qualitative concept and its economic meaning. Then, applying abstract functions (i.e., an abstract social welfare function, abstract utility function, and abstract production function), we deduce the partial-social-welfare elasticity and build a mathematical model of maximizing social welfare. We nest this elasticity into the model's Kuhn-Tucker conditions and obtain the RPF's definition formula. By solving the Kuhn-Tucker conditions, we get the RPF's quantitative formula, resolving the long-standing difficulty of quantifying the regulator's preference in an electricity-environment coordinated regulation system. The result shows that the RPF depends only on the subsystems' production functions, and is independent of the social welfare function and the subsystems' utility functions. Finally, we provide an empirical study of the western region of China for the years 1995 to 2004. It reveals that the regulator has a relatively stable preference for mitigating pollutants, and a validity test confirms that the empirical result fits practice well. The RPF is thus a more general and valid instrument for measuring the regulator's preference in its regulated field. (author)

  6. A dynamic model for firm-response to non-credible incentive regulation regimes

    International Nuclear Information System (INIS)

    Agrell, Per J.; Grifell-Tatjé, Emili

    2016-01-01

    Economic network regulation increasingly use quantitative performance models (from econometrics and engineering) to set revenues. In theory, high-powered incentive regulation, such as revenue-caps, induces firms to cost-efficient behavior independent of underlying model. However, anecdotal evidence shows regulated firms occasionally maintaining cost-inefficiency under incentive regulation even under slumping profitability. We present a model for firm-level efficiency under a regime with a probability of failure explaining this phenomenon. The model is based on the hypothesis that the regulatory choice of method can be associated with intrinsic flaws leading to judicial repeal and replacement of it by a low-powered regime. The results show that the cost efficiency policy is proportional to the type of firm (cost of effort), value of time (discount factor) and the credibility of the method (risk of failure). A panel data set for 2000–2006 for 128 electricity distributors in Sweden is used to validate the model predictions (radical productivity slowdown, failing profitability and efficiency) at the launch and demise of a non-credible regulation method. The work highlights the fallacy of viewing incentive regulation as a method-independent instrument, a result applicable in any infrastructure regulation. - Highlights: • Incentive regulation relies on fixed revenue for operators. • In existing theory the efficiency-inducing effect is model-independent. • A dynamic game exposes the firm to a regulation that may fail. • One optimal policy is to pad cost and wait for the failure. • The Swedish DSOs show this policy 2003–2006, when the regime failed.

  7. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    Science.gov (United States)

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer-assisted design/computer-assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)

  8. A non-linear programming approach to the computer-aided design of regulators using a linear-quadratic formulation

    Science.gov (United States)

    Fleming, P.

    1985-01-01

    A design technique is proposed for linear regulators in which a feedback controller of fixed structure is chosen to minimize an integral quadratic objective function subject to the satisfaction of integral quadratic constraint functions. Application of a non-linear programming algorithm to this mathematically tractable formulation results in an efficient and useful computer-aided design tool. Particular attention is paid to computational efficiency and various recommendations are made. Two design examples illustrate the flexibility of the approach and highlight the special insight afforded to the designer.
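
    For the scalar special case, the fixed-structure design problem above can be sketched as follows (a toy illustration, not the author's algorithm; the plant parameters, constraint, and grid search are assumed for the example, and the closed-form cost is the standard one for a stable scalar linear-quadratic loop):

```python
def cost(k, a=1.0, b=1.0, r=0.1):
    """Integral quadratic cost for the scalar plant x' = a*x + b*u with the
    fixed-structure feedback u = -k*x and x(0) = 1:
        J(k) = (1 + r*k**2) / (2*(b*k - a)),  valid when b*k > a (stable loop),
    since the closed loop gives x(t) = exp((a - b*k)*t) and
    J = integral of (x**2 + r*u**2) dt."""
    return (1 + r * k ** 2) / (2 * (b * k - a))

# Constrained design: minimize J(k) subject to a closed-loop decay-rate
# constraint b*k - a >= 1.0 (i.e., k >= 2.0 here), handled by a simple
# grid search over admissible gains rather than a full NLP solver.
admissible = [k / 100 for k in range(200, 1001)]   # k in [2.00, 10.00]
k_best = min(admissible, key=cost)
```

    In the paper's setting, the grid search would be replaced by a non-linear programming algorithm and the integral costs evaluated numerically, but the structure (quadratic objective, quadratic constraints, fixed controller structure) is the same.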

  9. Model Infrastruktur dan Manajemen Platform Server Berbasis Cloud Computing (Infrastructure and Management Model for a Cloud-Computing-Based Server Platform)

    Directory of Open Access Journals (Sweden)

    Mulki Indana Zulfa

    2017-11-01

    Full Text Available Cloud computing is a new technology that is still growing very rapidly. This technology makes the Internet the main medium for managing data and applications remotely. Cloud computing allows users to run an application without having to think about infrastructure and platforms. Other technical aspects, such as memory, storage, and backup and restore, can be handled very easily. This research models the infrastructure and management of the computer platform in the computer network of the Faculty of Engineering, Jenderal Soedirman University. The first stage in this research is a literature study, surveying implementation models from previous research. The results are then combined with a new approach to existing resources and implemented directly on the existing server network. The results show that the implementation of cloud computing technology is able to replace the existing network platform.

  10. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  11. The Mechanics of Embodiment: A Dialog on Embodiment and Computational Modeling

    Science.gov (United States)

    Pezzulo, Giovanni; Barsalou, Lawrence W.; Cangelosi, Angelo; Fischer, Martin H.; McRae, Ken; Spivey, Michael J.

    2011-01-01

    Embodied theories are increasingly challenging traditional views of cognition by arguing that conceptual representations that constitute our knowledge are grounded in sensory and motor experiences, and processed at this sensorimotor level, rather than being represented and processed abstractly in an amodal conceptual system. Given the established empirical foundation, and the relatively underspecified theories to date, many researchers are extremely interested in embodied cognition but are clamoring for more mechanistic implementations. What is needed at this stage is a push toward explicit computational models that implement sensorimotor grounding as intrinsic to cognitive processes. In this article, six authors from varying backgrounds and approaches address issues concerning the construction of embodied computational models, and illustrate what they view as the critical current and next steps toward mechanistic theories of embodiment. The first part has the form of a dialog between two fictional characters: Ernest, the “experimenter,” and Mary, the “computational modeler.” The dialog consists of an interactive sequence of questions, requests for clarification, challenges, and (tentative) answers, and touches the most important aspects of grounded theories that should inform computational modeling and, conversely, the impact that computational modeling could have on embodied theories. The second part of the article discusses the most important open challenges for embodied computational modeling. PMID:21713184

  12. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  13. Computer Modelling of Photochemical Smog Formation

    Science.gov (United States)

    Huebert, Barry J.

    1974-01-01

    Discusses a computer program that has been used in environmental chemistry courses as an example of modelling as a vehicle for teaching chemical dynamics, and as a demonstration of some of the factors which affect the production of smog. (Author/GS)

  14. Hybrid computer modelling in plasma physics

    International Nuclear Information System (INIS)

    Hromadka, J; Ibehej, T; Hrach, R

    2016-01-01

    Our contribution is devoted to development of hybrid modelling techniques. We investigate sheath structures in the vicinity of solids immersed in low temperature argon plasma of different pressures by means of particle and fluid computer models. We discuss the differences in results obtained by these methods and try to propose a way to improve the results of fluid models in the low pressure area. There is a possibility to employ Chapman-Enskog method to find appropriate closure relations of fluid equations in a case when particle distribution function is not Maxwellian. We try to follow this way to enhance fluid model and to use it in hybrid plasma model further. (paper)

  15. A Computational Model of Selection by Consequences

    Science.gov (United States)

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…

  16. Process-Based Development of Competence Models to Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  17. Security Management Model in Cloud Computing Environment

    OpenAIRE

    Ahmadpanah, Seyed Hossein

    2016-01-01

    In the cloud computing environment, the ever-growing number of cloud virtual machines (VMs) confronts VM security and management with giant challenges. In order to address the security issues of the cloud computing virtualization environment, this paper presents an efficient and dynamic VM security management model based on state migration and scheduling, studying the virtual machine security architecture and, based on AHP (Analytic Hierarchy Process), virtual machine de...

  18. Validation of Computer Models for Homeland Security Purposes

    International Nuclear Information System (INIS)

    Schweppe, John E.; Ely, James; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-01-01

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources

  19. Bang-bang Model for Regulation of Local Blood Flow

    Science.gov (United States)

    Golub, Aleksander S.; Pittman, Roland N.

    2013-01-01

    The classical model of metabolic regulation of blood flow in muscle tissue implies the maintenance of basal tone in arterioles of resting muscle and their dilation in response to exercise and/or tissue hypoxia via the evoked production of vasodilator metabolites by myocytes. A century-long effort to identify specific metabolites responsible for explaining active and reactive hyperemia has not been successful. Furthermore, the metabolic theory is not compatible with new knowledge on the role of physiological radicals (e.g., nitric oxide, NO, and superoxide anion, O2−) in the regulation of microvascular tone. We propose a model of regulation in which muscle contraction and active hyperemia are considered the physiologically normal state. We employ the “bang-bang” or “on/off” regulatory model which makes use of a threshold and hysteresis; a float valve to control the water level in a tank is a common example of this type of regulation. Active bang-bang regulation comes into effect when the supply of oxygen and glucose exceeds the demand, leading to activation of membrane NADPH oxidase, release of O2− into the interstitial space and subsequent neutralization of the interstitial NO. Switching arterioles on/off when local blood flow crosses the threshold is realized by a local cell circuit with the properties of a bang-bang controller, determined by its threshold, hysteresis and dead-band. This model provides a clear and unambiguous interpretation of the mechanism to balance tissue demand with a sufficient supply of nutrients and oxygen. PMID:23441827
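
    The on/off regulation with a threshold, hysteresis, and dead-band described here is the float-valve pattern, which can be sketched generically (a toy controller, not the authors' physiological model; the thresholds and the supply signal are hypothetical):

```python
def make_bang_bang(on_below, off_above):
    """Bang-bang controller with hysteresis: switch on when the controlled
    variable falls below the lower threshold, off when it rises above the
    upper one, and hold the previous state in between (the dead-band)."""
    state = {"on": False}
    def control(x):
        if x < on_below:
            state["on"] = True       # demand exceeds supply: switch on
        elif x > off_above:
            state["on"] = False      # supply exceeds demand: switch off
        return state["on"]           # inside the dead-band: hold last state
    return control

# Hypothetical normalized "supply" signal crossing the thresholds.
ctrl = make_bang_bang(on_below=0.4, off_above=0.6)
trace = [ctrl(x) for x in (0.5, 0.3, 0.5, 0.7, 0.5)]
```

    The width of the band between the two thresholds sets the hysteresis: the controller does not chatter when the signal sits between 0.4 and 0.6, which mirrors the dead-band property the abstract attributes to the local cell circuit.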

  20. Computer-animated model of accommodation and presbyopia.

    Science.gov (United States)

    Goldberg, Daniel B

    2015-02-01

    To understand, demonstrate, and further research the mechanisms of accommodation and presbyopia. Private practice, Little Silver, New Jersey, USA. Experimental study. The CAMA 2.0 computer-animated model of accommodation and presbyopia was produced in collaboration with an experienced medical animator using Autodesk Maya animation software and Adobe After Effects. The computer-animated model demonstrates the configuration and synchronous movements of all accommodative elements. A new classification of the zonular apparatus based on structure and function is proposed. There are 3 divisions of zonular fibers; that is, anterior, crossing, and posterior. The crossing zonular fibers form a scaffolding to support the lens; the anterior and posterior zonular fibers work reciprocally to achieve focused vision. The model demonstrates the important support function of Weiger ligament. Dynamic movement of the ora serrata demonstrates that the forces of ciliary muscle contraction store energy for disaccommodation in the elastic choroid. The flow of aqueous and vitreous provides strong evidence for our understanding of the hydrodynamic interactions during the accommodative cycle. The interaction may result from the elastic stretch in the choroid transmitted to the vitreous rather than from vitreous pressure. The model supports the concept that presbyopia results from loss of elasticity and increasing ocular rigidity in both the lenticular and extralenticular structures. The computer-animated model demonstrates the structures of accommodation moving in synchrony and might enhance understanding of the mechanisms of accommodation and presbyopia. Dr. Goldberg is a consultant to Acevision, Inc., and Bausch & Lomb. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  1. Computer-based modeling in exact sciences research - III ...

    African Journals Online (AJOL)

    Molecular modeling techniques have been of great applicability in the study of the biological sciences and other exact science fields like agriculture, mathematics, computer science and the like. In this write up, a list of computer programs for predicting, for instance, the structure of proteins has been provided. Discussions on ...

  2. Application of computer-aided multi-scale modelling framework – Aerosol case study

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Glarborg, Peter

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy and water. This trend is set to continue due to the substantial benefits computer-aided...... methods provide. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task involving...... numerous steps, expert skills and different modelling tools. This motivates the development of a computer-aided modelling framework that supports the user during model development, documentation, analysis, identification, application and re-use with the goal to increase the efficiency of the modelling...

  3. Computer Aided Multi-Data Fusion Dismount Modeling

    Science.gov (United States)

    2012-03-22

    dependent on a particular environmental condition. They are costly, cumbersome, and involve dedicated software practices and particular knowledge to operate...allow manipulation of 2D matrices, like Microsoft Excel or Libre Office. The second alternative is to modify an already created model (MEM). The model... software. Therefore, with the described computer aided multi-data dismount model the researcher will be able to attach signatures to any desired

  4. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  5. One-dimensional computational modeling on nuclear reactor problems

    International Nuclear Information System (INIS)

    Alves Filho, Hermes; Baptista, Josue Costa; Trindade, Luiz Fernando Santos; Heringer, Juan Diego dos Santos

    2013-01-01

    In this article, we present a computational model that gives us a dynamic view of some applications of Nuclear Engineering, specifically power distribution and effective multiplication factor (keff) calculations. We work with one-dimensional problems of deterministic neutron transport theory, using the linearized Boltzmann equation in the discrete ordinates (SN) formulation, independent of time and with isotropic scattering, and built a software simulator for modeling the computational problems used in typical calculations. The program used in the implementation of the simulator was Matlab, version 7.0. (author)
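
    As a rough illustration of the eigenvalue (keff) calculation such a simulator performs, the sketch below runs a power iteration, but in one-group diffusion theory rather than the paper's discrete ordinates (SN) transport formulation; the slab size and cross-section values are invented, not taken from the article.

```python
import numpy as np

def keff_power_iteration(n=50, length=100.0, D=1.0, sigma_a=0.01,
                         nu_sigma_f=0.012, iters=200):
    """Power iteration for k-eff in a one-group, 1-D diffusion slab.

    A much-simplified stand-in for an SN transport solver: the loss operator
    is -D d^2/dx^2 + sigma_a with zero-flux (Dirichlet) boundaries, and the
    fission source is nu_sigma_f * phi. All parameter values are illustrative.
    """
    h = length / n
    A = (np.diag(np.full(n, 2 * D / h**2 + sigma_a))
         + np.diag(np.full(n - 1, -D / h**2), 1)
         + np.diag(np.full(n - 1, -D / h**2), -1))
    phi = np.ones(n)
    k = 1.0
    for _ in range(iters):
        psi = np.linalg.solve(A, nu_sigma_f * phi)   # flux after one fission generation
        k = np.sum(nu_sigma_f * psi) / np.sum(nu_sigma_f * phi)
        phi = psi / k                                # renormalise the flux shape
    return k, phi

k_eff, flux = keff_power_iteration()
```

    The iteration converges to the dominant eigenvalue of the fission/loss operator pair, which is the usual meaning of keff; a transport (SN) treatment replaces the diffusion operator with angular sweeps but keeps the same outer power iteration.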

  6. Modelling of hydro and wind power in the regulation market

    International Nuclear Information System (INIS)

    Kiviluoma, J.; Holttinen, H.; Meibom, P.

    2006-01-01

    The amount of required regulation capacity in the power system is affected by the wind power prediction errors. A model has been developed which can evaluate the monetary effects of prediction errors. The model can be used to evaluate (1) the regulation costs of wind power, (2) regulation market prices including effects related to the participation of power producers in the regulating power market, (3) value of accurate wind forecasts and (4) the effect of decreasing the length of the spot market clearance. This article discusses the problems related to developing a realistic model of the regulating power market including the interaction between the spot market and the regulating power market. There are several issues that make things complicated. (1) How to calculate the minimum amount of needed secondary (minute) reserves. Traditionally the Nordic TSOs have used an N-1 criterion in each country to determine the required amounts of positive secondary reserve, but as installed wind power capacity grows, it will become relevant to include the wind power prediction errors in the estimation of secondary reserves. (2) Consumption forecast errors and plant outages also contribute to activation of regulating power and should have stochastic input series besides wind power. (3) Risk premiums and transaction costs in the regulating power market are difficult to estimate, as are the effects of the possible use of market power. This is especially true in the Nordic system with the high share of hydro power, since the water value and hydrological limitations make things more complex than in a thermal system. (4) The available regulation capacity is not necessarily equal to the truly available capacity. Not all producers participate in the regulation market, although in principle they could. (orig.)

  7. Visual Attention Modeling for Stereoscopic Video: A Benchmark and Computational Model.

    Science.gov (United States)

    Fang, Yuming; Zhang, Chi; Li, Jing; Lei, Jianjun; Perreira Da Silva, Matthieu; Le Callet, Patrick

    2017-10-01

    In this paper, we investigate the visual attention modeling for stereoscopic video from the following two aspects. First, we build one large-scale eye tracking database as the benchmark of visual attention modeling for stereoscopic video. The database includes 47 video sequences and their corresponding eye fixation data. Second, we propose a novel computational model of visual attention for stereoscopic video based on Gestalt theory. In the proposed model, we extract the low-level features, including luminance, color, texture, and depth, from discrete cosine transform coefficients, which are used to calculate feature contrast for the spatial saliency computation. The temporal saliency is calculated by the motion contrast from the planar and depth motion features in the stereoscopic video sequences. The final saliency is estimated by fusing the spatial and temporal saliency with uncertainty weighting, which is estimated by the laws of proximity, continuity, and common fate in Gestalt theory. Experimental results show that the proposed method outperforms the state-of-the-art stereoscopic video saliency detection models on our built large-scale eye tracking database and one other database (DML-ITRACK-3D).
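
    The uncertainty-weighted fusion step can be illustrated schematically. The snippet below uses simple inverse-uncertainty weights to combine a spatial and a temporal saliency map; the paper derives its actual weights from the Gestalt laws of proximity, continuity, and common fate, so this is a structural sketch only, with invented values.

```python
import numpy as np

def fuse_saliency(spatial, temporal, u_s, u_t):
    """Fuse spatial and temporal saliency maps with uncertainty weighting.

    Each map is weighted by the inverse of its uncertainty and the weights
    are normalised, so the more reliable cue dominates the fused map.
    This weighting scheme is illustrative, not the paper's Gestalt-based one.
    """
    w_s, w_t = 1.0 / u_s, 1.0 / u_t
    return (w_s * spatial + w_t * temporal) / (w_s + w_t)

spatial = np.array([[0.2, 0.8], [0.4, 0.6]])    # toy per-pixel saliency maps
temporal = np.array([[0.6, 0.4], [0.8, 0.2]])
fused = fuse_saliency(spatial, temporal, u_s=0.5, u_t=1.0)  # spatial cue is more certain
```

    With u_s half of u_t, the spatial map carries twice the weight of the temporal one in the fused result.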

  8. Computer modeling of commercial refrigerated warehouse facilities

    International Nuclear Information System (INIS)

    Nicoulin, C.V.; Jacobs, P.C.; Tory, S.

    1997-01-01

    The use of computer models to simulate the energy performance of large commercial refrigeration systems typically found in food processing facilities is an area of engineering practice that has seen little development to date. Current techniques employed in predicting energy consumption by such systems have focused on temperature bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems. The HVAC and Refrigeration system performance models in these simulation tools model equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to these building types. The applicability of traditional building energy simulation tools to model refrigerated warehouse performance and analyze energy-saving options is limited. The paper will present the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-stage and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper will be on the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, will also be presented.

  9. Untangling the complexity of blood coagulation network: use of computational modelling in pharmacology and diagnostics.

    Science.gov (United States)

    Shibeko, Alexey M; Panteleev, Mikhail A

    2016-05-01

    Blood coagulation is a complex biochemical network that plays critical roles in haemostasis (a physiological process that stops bleeding on injury) and thrombosis (pathological vessel occlusion). Both up- and down-regulation of coagulation remain a major challenge for modern medicine, with the ultimate goal to correct haemostasis without causing thrombosis and vice versa. Mathematical/computational modelling is potentially an important tool for understanding blood coagulation disorders and their treatment. It can save a huge amount of time and resources, and provide a valuable alternative or supplement when clinical studies are limited, or not ethical, or technically impossible. This article reviews contemporary state of the art in the modelling of blood coagulation for practical purposes: to reveal the molecular basis of a disease, to understand mechanisms of drug action, to predict pharmacodynamics and drug-drug interactions, to suggest potential drug targets or to improve quality of diagnostics. Different model types and designs used for this are discussed. Functional mechanisms of procoagulant bypassing agents and investigations of coagulation inhibitors were the two particularly popular applications of computational modelling that gave non-trivial results. Yet, like any other tool, modelling has its limitations, mainly determined by insufficient knowledge of the system, uncertainty and unreliability of complex models. We show how to some extent this can be overcome and discuss what can be expected from the mathematical modelling of coagulation in not-so-far future. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  10. SPLAI: Computational Finite Element Model for Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ruzana Ishak

    2006-01-01

    Full Text Available A wireless sensor network refers to a group of sensors linked by a wireless medium to perform distributed sensing tasks. The primary interest is their capability in monitoring the physical environment through the deployment of numerous tiny, intelligent, wireless networked sensor nodes. Our interest is in a sensor network that includes a few specialized nodes, called processing elements, that can perform some limited computations. In this paper, we propose a model called SPLAI that allows the network to compute a finite element problem where the processing elements are modeled as the nodes in the linear triangular approximation problem. Our model also considers the case of some sensor failures. A simulation model to visualize this network has been developed using C++ in the Windows environment.

  11. Computer models for fading channels with applications to digital transmission

    Science.gov (United States)

    Loo, Chun; Secord, Norman

    1991-11-01

    The authors describe computer models for Rayleigh, Rician, log-normal, and land-mobile-satellite fading channels. All computer models for the fading channels are based on the manipulation of a white Gaussian random process. This process is approximated by a sum of sinusoids with random phase angle. These models compare very well with analytical models in terms of their probability distribution of envelope and phase of the fading signal. For the land mobile satellite fading channel, results of level crossing rate and average fade duration are given. These results show that the computer models can provide a good coarse estimate of the time statistic of the faded signal. Also, for the land-mobile-satellite fading channel, the results show that a 3-pole Butterworth shaping filter should be used with the model. An example of the application of the land-mobile-satellite fading-channel model to predict the performance of a differential phase-shift keying signal is described.
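
    The sum-of-sinusoids construction the authors describe can be sketched as follows. This assumes a Clarke-style draw of random Doppler shifts and phases for each quadrature component; the spectral shaping step (e.g. the 3-pole Butterworth filter mentioned above) is omitted, and all parameter values are invented.

```python
import numpy as np

def rayleigh_fading(n_samples, n_sinusoids=32, fd=100.0, fs=10_000.0, seed=0):
    """Approximate a Rayleigh fading envelope by a sum of sinusoids.

    Each quadrature component is a sum of sinusoids with random phase and
    random Doppler shift (up to the maximum Doppler frequency fd); the
    magnitude of the complex sum has an approximately Rayleigh-distributed
    envelope, as in the computer models the abstract reviews.
    """
    rng = np.random.default_rng(seed)
    t = np.arange(n_samples) / fs
    phases = rng.uniform(0.0, 2 * np.pi, size=(2, n_sinusoids))
    dopplers = fd * np.cos(rng.uniform(0.0, 2 * np.pi, size=n_sinusoids))
    arg = 2 * np.pi * dopplers[None, :] * t[:, None]
    i = np.cos(arg + phases[0]).sum(axis=1)       # in-phase component
    q = np.sin(arg + phases[1]).sum(axis=1)       # quadrature component
    return np.sqrt(i**2 + q**2) / np.sqrt(n_sinusoids)   # power-normalised envelope

envelope = rayleigh_fading(2000)
```

    Level crossing rate and average fade duration, the time statistics the abstract discusses, can then be estimated directly from threshold crossings of this envelope.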

  12. Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views

    Science.gov (United States)

    Aksela, Maija; Lundell, Jan

    2008-01-01

    Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…

  13. Towards a Computational Model of the Self-attribution of Agency

    NARCIS (Netherlands)

    Hindriks, K.V.; Wiggers, P.; Jonker, C.M.; Haselager, W.F.G.; Mehrotra, K.G.; Mohan, C.K.; Oh, J.C.; Varshney, P.K.; Ali, M.

    2011-01-01

    In this paper, a first step towards a computational model of the self-attribution of agency is presented, based on Wegner’s theory of apparent mental causation. A model to compute a feeling of doing based on first-order Bayesian network theory is introduced that incorporates the main contributing

  14. Towards a computational model of the self-attribution of agency

    NARCIS (Netherlands)

    Hindriks, K.V.; Wiggers, P.; Jonker, C.M.; Haselager, W.F.G.; Olivier, P.; Kray, C.

    2007-01-01

    In this paper, a first step towards a computational model of the self-attribution of agency is presented, based on Wegner’s theory of apparent mental causation. A model to compute a feeling of doing based on first-order Bayesian network theory is introduced that incorporates the main contributing

  15. Ewe: a computer model for ultrasonic inspection

    International Nuclear Information System (INIS)

    Douglas, S.R.; Chaplin, K.R.

    1991-11-01

    The computer program EWE simulates the propagation of elastic waves in solids and liquids. It has been applied to ultrasonic testing to study the echoes generated by cracks and other types of defects. A discussion of the elastic wave equations is given, including the first-order formulation, shear and compression waves, surface waves and boundaries, numerical method of solution, models for cracks and slot defects, input wave generation, returning echo construction, and general computer issues

  16. Soft Tissue Biomechanical Modeling for Computer Assisted Surgery

    CERN Document Server

    2012-01-01

    This volume focuses on the biomechanical modeling of biological tissues in the context of Computer Assisted Surgery (CAS). More specifically, deformable soft tissues are addressed since they are the subject of the most recent developments in this field. The pioneering works on this CAS topic date from the 1980s, with applications in orthopaedics and biomechanical models of bones. More recently, however, biomechanical models of soft tissues have been proposed since most of the human body is made of soft organs that can be deformed by the surgical gesture. Such models are much more complicated to handle since the tissues can be subject to large deformations (non-linear geometrical framework) as well as complex stress/strain relationships (non-linear mechanical framework). Part 1 of the volume presents biomechanical models that have been developed in a CAS context and used during surgery. This is particularly new since most of the soft tissue models already proposed concern Computer Assisted Planning, with ...

  17. A data-driven, mathematical model of mammalian cell cycle regulation.

    Directory of Open Access Journals (Sweden)

    Michael C Weis

    Full Text Available Few of >150 published cell cycle modeling efforts use significant levels of data for tuning and validation. This reflects the difficulty of generating correlated quantitative data, and it points out a critical uncertainty in modeling efforts. To develop a data-driven model of cell cycle regulation, we used contiguous, dynamic measurements over two time scales (minutes and hours) calculated from static multiparametric cytometry data. The approach provided expression profiles of cyclin A2, cyclin B1, and phospho-S10-histone H3. The model was built by integrating and modifying two previously published models such that the model outputs for cyclins A and B fit cyclin expression measurements and the activation of B cyclin/Cdk1 coincided with phosphorylation of histone H3. The model depends on Cdh1-regulated cyclin degradation during G1, regulation of B cyclin/Cdk1 activity by cyclin A/Cdk via Wee1, and transcriptional control of the mitotic cyclins that reflects some of the current literature. We introduced autocatalytic transcription of E2F, E2F-regulated transcription of cyclin B, Cdc20/Cdh1-mediated E2F degradation, enhanced transcription of mitotic cyclins during late S/early G2 phase, and the sustained synthesis of cyclin B during mitosis. These features produced a model with good correlation between state variable output and real measurements. Since the method of data generation is extensible, this model can be continually modified based on new correlated, quantitative data.
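
    A deliberately minimal sketch of one ingredient of such a model, Cdh1-regulated cyclin degradation during G1, is shown below. The single-variable balance equation, the square-wave Cdh1 signal, and all rate constants are invented for illustration and are far simpler than the published multi-variable model.

```python
def simulate_cyclin_b(t_end=20.0, dt=0.01, k_syn=1.0, k_deg=0.1, k_cdh1=1.5):
    """Euler integration of a one-variable cyclin B balance.

    d[cycB]/dt = k_syn - (k_deg + k_cdh1 * Cdh1(t)) * [cycB]
    Cdh1 is imposed as a square wave: active ("G1") for the first half of
    the run, inactive afterwards. All constants are illustrative.
    """
    steps = int(t_end / dt)
    cyc_b, trace = 0.0, []
    for step in range(steps):
        t = step * dt
        cdh1 = 1.0 if t < t_end / 2 else 0.0     # Cdh1 active only in "G1"
        d_cyc = k_syn - (k_deg + k_cdh1 * cdh1) * cyc_b
        cyc_b += dt * d_cyc
        trace.append(cyc_b)
    return trace

trace = simulate_cyclin_b()
```

    While Cdh1 is active, cyclin B settles at a low level set by the enhanced degradation; once Cdh1 switches off, cyclin B accumulates toward a much higher steady state, which is the qualitative behaviour the Cdh1 mechanism provides in the model.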

  18. An approximate fractional Gaussian noise model with computational cost

    KAUST Repository

    Sørbye, Sigrunn H.

    2017-09-18

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood-based approach is $\mathcal{O}(n^2)$, exploiting the Toeplitz structure of the covariance matrix. In most realistic cases, we do not observe the fGn process directly but only through indirect Gaussian observations, so the Toeplitz structure is easily lost and the computational cost increases to $\mathcal{O}(n^3)$. This paper presents an approximate fGn model of $\mathcal{O}(n)$ computational cost, both with direct or indirect Gaussian observations, with or without conditioning. This is achieved by approximating fGn with a weighted sum of independent first-order autoregressive processes, fitting the parameters of the approximation to match the autocorrelation function of the fGn model. The resulting approximation is stationary despite being Markov and gives a remarkably accurate fit using only four components. The performance of the approximate fGn model is demonstrated in simulations and two real data examples.
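
    The core idea, matching the fGn autocorrelation with a weighted sum of AR(1) autocorrelations, can be sketched directly. The AR(1) coefficients below are an arbitrary illustrative grid fitted by unconstrained least squares, not the optimized four-component values of the paper.

```python
import numpy as np

def fgn_acf(lags, H):
    """Exact fGn autocorrelation: rho(k) = 0.5(|k+1|^2H - 2|k|^2H + |k-1|^2H)."""
    k = np.abs(lags).astype(float)
    return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + np.abs(k - 1) ** (2 * H))

def fit_ar1_mixture(H, phis, max_lag=50):
    """Least-squares weights w so that sum_i w_i * phi_i**k matches the fGn ACF."""
    lags = np.arange(max_lag + 1)
    target = fgn_acf(lags, H)
    basis = np.stack([phi ** lags for phi in phis], axis=1)  # (lags+1, components)
    w, *_ = np.linalg.lstsq(basis, target, rcond=None)
    return w, target, basis @ w

phis = [0.30, 0.75, 0.95, 0.995]         # illustrative AR(1) coefficients, not the paper's
w, target, approx = fit_ar1_mixture(H=0.8, phis=phis)
max_err = float(np.max(np.abs(target - approx)))
```

    Each AR(1) component contributes a geometrically decaying term, and the mixture of fast and slow components mimics the slow power-law decay of the fGn autocorrelation over the fitted lag range, which is what makes the Markov approximation work.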

  19. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    Science.gov (United States)

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  20. Validation of a phytoremediation computer model

    Energy Technology Data Exchange (ETDEWEB)

    Corapcioglu, M Y; Sung, K; Rhykerd, R L; Munster, C; Drew, M [Texas A and M Univ., College Station, TX (United States)

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg{sup -1}

  1. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  2. Depth-Averaged Non-Hydrostatic Hydrodynamic Model Using a New Multithreading Parallel Computing Method

    Directory of Open Access Journals (Sweden)

    Ling Kang

    2017-03-01

    Full Text Available Compared to the hydrostatic hydrodynamic model, the non-hydrostatic hydrodynamic model can accurately simulate flows that feature vertical accelerations. The model’s low computational efficiency severely restricts its wider application. This paper proposes a non-hydrostatic hydrodynamic model based on a multithreading parallel computing method. The horizontal momentum equation is obtained by integrating the Navier–Stokes equations from the bottom to the free surface. The vertical momentum equation is approximated by the Keller-box scheme. A two-step method is used to solve the model equations. A parallel strategy based on block decomposition computation is utilized. The original computational domain is subdivided into two subdomains that are physically connected via a virtual boundary technique. Two sub-threads are created and tasked with the computation of the two subdomains. The producer–consumer model and the thread lock technique are used to achieve synchronous communication between sub-threads. The validity of the model was verified by solitary wave propagation experiments over a flat bottom and slope, followed by two sinusoidal wave propagation experiments over a submerged breakwater. The parallel computing method proposed here was found to effectively enhance computational efficiency and save 20%–40% of computation time compared to serial computing. The parallel acceleration rate and acceleration efficiency are approximately 1.45 and 72%, respectively. The parallel computing method makes a contribution to the popularization of non-hydrostatic models.
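
    A toy version of the block-decomposition strategy can be written in a few lines. The sketch below splits a 1-D Jacobi sweep across two threads coupled through a one-cell "virtual boundary", using a barrier for synchronous communication; the paper's actual scheme uses producer-consumer signalling and thread locks on a 2-D non-hydrostatic solver, so everything here is a simplified illustration.

```python
import threading
import numpy as np

def parallel_jacobi(u, n_iters=200):
    """1-D Jacobi smoothing with the domain split into two thread-owned halves.

    Each thread updates only its own subdomain; the halves interact solely
    through the ghost value at the split point. A reusable barrier plays the
    role of the paper's producer-consumer synchronisation: all reads of the
    shared array happen before any thread writes its update.
    """
    n = len(u)
    mid = n // 2
    barrier = threading.Barrier(2)

    def worker(lo, hi):
        for _ in range(n_iters):
            barrier.wait()                                   # ghost values are current
            new = 0.5 * (u[lo - 1:hi - 1] + u[lo + 1:hi + 1])  # read-only sweep
            barrier.wait()                                   # both threads done reading
            u[lo:hi] = new                                   # disjoint writes are safe

    t1 = threading.Thread(target=worker, args=(1, mid))
    t2 = threading.Thread(target=worker, args=(mid, n - 1))
    t1.start(); t2.start(); t1.join(); t2.join()
    return u

u = np.zeros(65)
u[0], u[-1] = 0.0, 1.0          # fixed boundary values
result = parallel_jacobi(u)
```

    The two barrier waits per iteration enforce exactly the read-then-write ordering that the producer-consumer model provides in the paper, while the disjoint subdomain writes need no lock of their own.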

  3. Applying Web-Based Co-Regulated Learning to Develop Students' Learning and Involvement in a Blended Computing Course

    Science.gov (United States)

    Tsai, Chia-Wen

    2015-01-01

    This research investigated, via quasi-experiments, the effects of web-based co-regulated learning (CRL) on developing students' computing skills. Two classes of 68 undergraduates in a one-semester course titled "Applied Information Technology: Data Processing" were chosen for this research. The first class (CRL group, n = 38) received…

  4. Computer modeling of ORNL storage tank sludge mobilization and mixing

    International Nuclear Information System (INIS)

    Terrones, G.; Eyler, L.L.

    1993-09-01

    This report presents and analyzes the results of the computer modeling of mixing and mobilization of sludge in horizontal, cylindrical storage tanks using submerged liquid jets. The computer modeling uses the TEMPEST computational fluid dynamics computer program. The horizontal, cylindrical storage tank configuration is similar to the Melton Valley Storage Tanks (MVST) at Oak Ridge National Laboratory (ORNL). The MVST tank contents exhibit non-homogeneous, non-Newtonian rheology characteristics. The eventual goals of the simulations are to determine under what conditions sludge mobilization using submerged liquid jets is feasible in tanks of this configuration, and to estimate mixing times required to approach homogeneity of the contents of the tanks

  5. Computational implementation of the multi-mechanism deformation coupled fracture model for salt

    International Nuclear Information System (INIS)

    Koteras, J.R.; Munson, D.E.

    1996-01-01

    The Multi-Mechanism Deformation (M-D) model for creep in rock salt has been used in three-dimensional computations for the Waste Isolation Pilot Plant (WIPP), a potential waste repository. These computational studies are relied upon to make key predictions about long-term behavior of the repository. Recently, the M-D model was extended to include creep-induced damage. The extended model, the Multi-Mechanism Deformation Coupled Fracture (MDCF) model, is considerably more complicated than the M-D model and required a different technology from that of the M-D model for a computational implementation

  6. Computational modeling of plasma-flow switched foil implosions

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1985-01-01

    A ''plasma-flow'', or ''commutator'', switch has been proposed as a means of achieving high dI/dt in a radially imploding metallic foil plasma. In this concept, an axially moving foil provides the initial coaxial gun discharge path for the prime power source and provides an ''integral'' inductive storage of magnetic energy. As the axially moving foil reaches the end of the coaxial gun, a radially imploding load foil is switched into the circuit. The authors have begun two-dimensional computer modeling of the two-foil implosion system. They use a magnetohydrodynamic (MHD) model which includes tabulated state and transport properties of the metallic foil material. Moving numerical grids are used to achieve adequate resolution of the moving foils. A variety of radiation models are used to compute the radiation generated when the imploding load foil converges on axis. These computations attempt to examine the interaction of the switching foil with the load foil. In particular, they examine the relationship between foil placement and implosion quality

  7. Comparing the performance of SIMD computers by running large air pollution models

    DEFF Research Database (Denmark)

    Brown, J.; Hansen, Per Christian; Wasniewski, J.

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on these computers. Using a realistic large-scale model, we gained detailed insight about the performance of the computers involved when used to solve large-scale scientific...... problems that involve several types of numerical computations. The computers used in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216...

  8. Reconstructing a Network of Stress-Response Regulators via Dynamic System Modeling of Gene Regulation

    Directory of Open Access Journals (Sweden)

    Wei-Sheng Wu

    2008-01-01

    Full Text Available Unicellular organisms such as yeasts have evolved mechanisms to respond to environmental stresses by rapidly reorganizing the gene expression program. Although many stress-response genes in yeast have been discovered by DNA microarrays, the stress-response transcription factors (TFs) that regulate these stress-response genes remain to be investigated. In this study, we use a dynamic system model of gene regulation to describe the mechanism of how TFs may control a gene’s expression. Then, based on the dynamic system model, we develop the Stress Regulator Identification Algorithm (SRIA) to identify stress-response TFs for six kinds of stresses. We identified some general stress-response TFs that respond to various stresses and some specific stress-response TFs that respond to one specific stress. The biological significance of our findings is validated by the literature. We found that a small number of TFs is probably sufficient to control a wide variety of expression patterns in yeast under different stresses. Two implications can be inferred from this observation. First, the response mechanisms to different stresses may have a bow-tie structure. Second, there may be regulatory cross-talks among different stress responses. In conclusion, this study proposes a network of stress-response regulators and the details of their actions.

  9. Basic definitions for discrete modeling of computer worms epidemics

    Directory of Open Access Journals (Sweden)

    Pedro Guevara López

    2015-01-01

    Full Text Available The information technologies have evolved in such a way that communication between computers or hosts has become common, so much so that organizations worldwide (governments and corporations) depend on it; the consequences if these computers stopped working for a long time would be catastrophic. Unfortunately, networks are attacked by malware such as viruses and worms that could collapse the system. This has served as motivation for the formal study of computer worms and epidemics in order to develop strategies for prevention and protection; this is why in this paper, before analyzing epidemiological models, a set of formal definitions based on set theory and functions is proposed for describing 21 concepts used in the study of worms. These definitions provide a basis for future qualitative research on the behavior of computer worms, and quantitative research on their epidemiological models.
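
    As an example of the kind of epidemiological model such definitions support, here is a discrete-time SIR recursion for a worm outbreak on a homogeneous network. The parameter values are invented for illustration, not taken from the paper.

```python
def simulate_worm_sir(n_hosts=10_000, i0=10, beta=0.3, gamma=0.1, steps=200):
    """Discrete-time SIR dynamics for a worm epidemic on a homogeneous network.

    S, I, R count susceptible, infected and patched/removed hosts; beta is
    the per-step mass-action infection rate and gamma the per-step patch
    rate. All parameter values are illustrative.
    """
    s, i, r = float(n_hosts - i0), float(i0), 0.0
    history = [(s, i, r)]
    for _ in range(steps):
        new_inf = beta * s * i / n_hosts   # mass-action infection term
        new_rec = gamma * i                # hosts patched/removed this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        history.append((s, i, r))
    return history

hist = simulate_worm_sir()
peak_infected = max(i for _, i, _ in hist)
```

    With beta/gamma > 1 the worm spreads exponentially at first, peaks when the susceptible pool is depleted, and then dies out, the qualitative behaviour the epidemic models in the paper are meant to capture.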

  10. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d{sup 3}f and r{sup 3}t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt- and heat transport as well as a free groundwater surface. Development of the basic framework of d{sup 3}f and r{sup 3}t began more than 20 years ago. Since that time, significant advancements have taken place in the requirements for safety assessment as well as in computer hardware development. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d{sup 3}f and r{sup 3}t were applications of the software platform UG /BAS 94/ whose development had begun in the early nineteen-nineties. However, UG has recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d{sup 3}f and r{sup 3}t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d{sup 3}f and r{sup 3}t were combined into one conjoint code d{sup 3}f++.
A direct estimation of uncertainties for complex groundwater flow models with the

  11. The Effect of Computer Models as Formative Assessment on Student Understanding of the Nature of Models

    Science.gov (United States)

    Park, Mihwa; Liu, Xiufeng; Smith, Erica; Waight, Noemi

    2017-01-01

    This study reports the effect of computer models as formative assessment on high school students' understanding of the nature of models. Nine high school teachers integrated computer models and associated formative assessments into their yearlong high school chemistry course. A pre-test and post-test of students' understanding of the nature of…

  12. Undergraduate students’ challenges with computational modelling in physics

    Directory of Open Access Journals (Sweden)

    Simen A. Sørby

    2012-12-01

    Full Text Available In recent years, computational perspectives have become essential parts of several of the University of Oslo’s natural science studies. In this paper we discuss some main findings from a qualitative study of the computational perspectives’ impact on the students’ work with their first course in physics – mechanics – and their learning and meaning making of its contents. Discussions of the students’ learning of physics are based on sociocultural theory, which originates in Vygotsky and Bakhtin, and subsequent physics education research. Results imply that the greatest challenge for students when working with computational assignments is to combine knowledge from previously known, but separate contexts. Integrating knowledge of informatics, numerical and analytical mathematics and conceptual understanding of physics appears as a clear challenge for the students. We also observe a lack of awareness concerning the limitations of physical modelling. The students need help with identifying the appropriate knowledge system or “tool set” for the different tasks at hand; they need help to create a plan for their modelling and to become aware of its limits. In light of this, we propose that an instructive and dialogic text as basis for the exercises, in which the emphasis is on specification, clarification and elaboration, would be of great potential aid for students who are new to computational modelling.

  13. Handbook of nature-inspired and innovative computing integrating classical models with emerging technologies

    CERN Document Server

    2006-01-01

    As computing devices proliferate, demand increases for an understanding of emerging computing paradigms and models based on natural phenomena. This handbook explores the connection between nature-inspired and traditional computational paradigms.

  14. Computer simulation models as a tool to investigate the role of microRNAs in osteoarthritis.

    Directory of Open Access Journals (Sweden)

    Carole J Proctor

    Full Text Available The aim of this study was to show how computational models can be used to increase our understanding of the role of microRNAs in osteoarthritis (OA), using miR-140 as an example. Bioinformatics analysis and experimental results from the literature were used to create and calibrate models of gene regulatory networks in OA involving miR-140 along with key regulators such as NF-κB, SMAD3, and RUNX2. The individual models were created with the modelling standard Systems Biology Markup Language, and integrated to examine the overall effect of miR-140 on cartilage homeostasis. Down-regulation of miR-140 may have either detrimental or protective effects for cartilage, indicating that the role of miR-140 is complex. Studies of individual networks in isolation may therefore lead to different conclusions. This indicated the need to combine the five chosen individual networks involving miR-140 into an integrated model. This model suggests that the overall effect of miR-140 is to change the response to an IL-1 stimulus from a prolonged increase in matrix degrading enzymes to a pulse-like response, so that cartilage degradation is temporary. Our current model can easily be modified and extended as more experimental data become available about the role of miR-140 in OA. In addition, networks of other microRNAs that are important in OA could be incorporated. A fully integrated model could not only aid our understanding of the mechanisms of microRNAs in ageing cartilage but could also provide a useful tool to investigate the effect of potential interventions to prevent cartilage loss.
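
    The pulse-shaping behaviour described above can be illustrated with a toy negative-feedback ODE (an invented sketch, not the paper's calibrated SBML networks): a constant IL-1 stimulus induces a matrix-degrading enzyme, while a co-induced miR-140-like repressor gradually shuts the induction down, turning a sustained response into a pulse.

```python
# Toy Euler-integrated ODE sketch; all rate constants are invented.
# E: matrix-degrading enzyme level; M: miR-140-like repressor level.

def simulate(mir_present=True, steps=5000, dt=0.01):
    e, m = 0.0, 0.0
    il1 = 1.0                          # constant IL-1 stimulus
    e_hist = []
    for _ in range(steps):
        k_e = il1 / (1.0 + 5.0 * m)    # enzyme induction, repressed by M
        e += dt * (k_e - 0.5 * e)      # production minus turnover
        if mir_present:
            m += dt * (0.2 * il1 - 0.05 * m)  # slow repressor build-up
        e_hist.append(e)
    return e_hist

with_mir = simulate(True)
without_mir = simulate(False)
# With the repressor, the enzyme peaks and then declines (a pulse);
# without it, the response settles near a sustained plateau.
assert max(with_mir) > with_mir[-1] * 1.5
```

    The qualitative point matches the abstract: the same stimulus yields temporary degradation when the repressor is present and prolonged degradation when it is not.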

  15. Computational comparison of quantum-mechanical models for multistep direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1993-01-01

    We have carried out a computational comparison of all existing quantum-mechanical models for multistep direct (MSD) reactions. The various MSD models, including the so-called Feshbach-Kerman-Koonin, Tamura-Udagawa-Lenske and Nishioka-Yoshida-Weidenmueller models, have been implemented in a single computer system. All model calculations thus use the same set of parameters and the same numerical techniques; only one adjustable parameter is employed. The computational results have been compared with experimental energy spectra and angular distributions for several nuclear reactions, namely, ⁹⁰Zr(p,p') at 80 MeV, ²⁰⁹Bi(p,p') at 62 MeV, and ⁹³Nb(n,n') at 25.7 MeV. In addition, the results have been compared with the Kalbach systematics and with semiclassical exciton model calculations. All quantum MSD models provide a good fit to the experimental data. In addition, they reproduce the systematics very well and are clearly better than semiclassical model calculations. We furthermore show that the calculated predictions do not differ very strongly between the various quantum MSD models, leading to the conclusion that the simplest MSD model (the Feshbach-Kerman-Koonin model) is adequate for the analysis of experimental data.

  16. An integrated introduction to computer graphics and geometric modeling

    CERN Document Server

    Goldman, Ronald

    2009-01-01

    … this book may be the first book on geometric modelling that also covers computer graphics. In addition, it may be the first book on computer graphics that integrates a thorough introduction to 'freeform' curves and surfaces and to the mathematical foundations for computer graphics. … the book is well suited for an undergraduate course. … The entire book is very well presented and obviously written by a distinguished and creative researcher and educator. It certainly is a textbook I would recommend. …-Computer-Aided Design, 42, 2010… Many books concentrate on computer programming and soon beco

  17. Development of a Sampling-Based Global Sensitivity Analysis Workflow for Multiscale Computational Cancer Models

    Science.gov (United States)

    Wang, Zhihui; Deisboeck, Thomas S.; Cristini, Vittorio

    2014-01-01

    There are two challenges that researchers face when performing global sensitivity analysis (GSA) on multiscale in silico cancer models. The first is increased computational intensity, since a multiscale cancer model generally takes longer to run than does a scale-specific model. The second problem is the lack of a best GSA method that fits all types of models, which implies that multiple methods and their sequence need to be taken into account. In this article, we therefore propose a sampling-based GSA workflow consisting of three phases – pre-analysis, analysis, and post-analysis – by integrating Monte Carlo and resampling methods with the repeated use of analysis of variance (ANOVA); we then exemplify this workflow using a two-dimensional multiscale lung cancer model. By accounting for all parameter rankings produced by multiple GSA methods, a summarized ranking is created at the end of the workflow based on the weighted mean of the rankings for each input parameter. For the cancer model investigated here, this analysis reveals that ERK, a downstream molecule of the EGFR signaling pathway, has the most important impact on regulating both the tumor volume and expansion rate in the algorithm used. PMID:25257020
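
    The final summarization step described above — a weighted mean over the rankings produced by each GSA method — can be sketched as follows (the method names, weights, and rankings are illustrative, not values from the study):

```python
# Sketch of combining per-method parameter rankings into one summarized
# ranking via a weighted mean of ranks. All inputs below are invented.

def summarize_rankings(rankings, weights):
    """rankings: {method: {param: rank}}; weights: {method: float}."""
    total_w = sum(weights.values())
    params = next(iter(rankings.values())).keys()
    mean_rank = {
        p: sum(weights[m] * rankings[m][p] for m in rankings) / total_w
        for p in params
    }
    # Lower mean rank = more influential parameter
    return sorted(mean_rank, key=mean_rank.get)

rankings = {
    "sobol":  {"ERK": 1, "EGFR": 2, "Ras": 3},
    "anova":  {"ERK": 1, "Ras": 2, "EGFR": 3},
    "morris": {"ERK": 2, "EGFR": 1, "Ras": 3},
}
weights = {"sobol": 1.0, "anova": 1.0, "morris": 0.5}
print(summarize_rankings(rankings, weights))  # → ['ERK', 'EGFR', 'Ras']
```

    Weighting lets more trusted GSA methods contribute more to the summarized ranking, which is the design choice the workflow's post-analysis phase relies on.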

  18. Model Regulations for Borehole Disposal Facilities for Radioactive Waste

    International Nuclear Information System (INIS)

    2017-10-01

    This publication is designed to assist in the development of an appropriate set of regulations for the predisposal management and disposal of disused sealed radioactive sources and small volumes of associated radioactive waste using the IAEA borehole disposal concept. It allows States to appraise the adequacy of their existing regulations and regulatory guides, and can be used as a reference by those States developing regulations for the first time. The model regulations set out in this publication will need to be adapted to take account of the existing national legal and regulatory framework and other local conditions in the State.

  19. A security model for saas in cloud computing

    International Nuclear Information System (INIS)

    Abbas, R.; Farooq, A.

    2016-01-01

    Cloud computing is a type of computing that relies on sharing computing resources rather than having local servers or personal devices handle applications. It has several service models, such as Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), and Infrastructure-as-a-Service (IaaS). In the SaaS model, service providers install and activate the applications in the cloud and cloud customers access the software from the cloud, so the user does not need to purchase and install particular software on his/her machine. While using the SaaS model, users face multiple security issues and problems such as data security, data breaches, network security, authentication and authorization, data integrity, availability, web application security and backup. Much work has been done to resolve these problems, but many issues persist and remain to be overcome. In this research work, we have developed a security model that improves the security of data according to the desires of the end-user. The proposed model for different data security options can be helpful to increase data security, through which the trade-off between functionalities can be optimized for private and public data. (author)

  20. Computer modeling of liquid crystals

    International Nuclear Information System (INIS)

    Al-Barwani, M.S.

    1999-01-01

    In this thesis, we investigate several aspects of the behaviour of liquid crystal molecules near interfaces using computer simulation. We briefly discuss experimental, theoretical and computer simulation studies of some liquid crystal interfaces. We then describe three essentially independent research topics. The first of these concerns extensive simulations of a liquid crystal formed by long flexible molecules. We examined the bulk behaviour of the model and its structure. Studies of a film of smectic liquid crystal surrounded by vapour were also carried out. Extensive simulations were also done for a long-molecule/short-molecule mixture; studies were then carried out to investigate the liquid-vapour interface of the mixture. Next, we report the results of large-scale simulations of soft spherocylinders of two different lengths. We examined the bulk coexistence of the nematic and isotropic phases of the model. Once the bulk coexistence behaviour was known, properties of the nematic-isotropic interface were investigated. This was done by fitting order parameter and density profiles to appropriate mathematical functions and calculating the biaxial order parameter. We briefly discuss the ordering at the interfaces and make attempts to calculate the surface tension. Finally, in our third project, we study the effects of different surface topographies on creating bistable nematic liquid crystal devices. This was carried out using a model based on the discretisation of the free energy on a lattice. We use simulation to find the lowest energy states and investigate whether they are degenerate in energy. We also test our model by studying the Frederiks transition and comparing with analytical and other simulation results. (author)

  1. 1st European-Middle Asian Conference on Computer Modelling 2015

    CERN Document Server

    Kolosov, Dmitrii; Snášel, Václav; Karakeyev, Taalaybek; Abraham, Ajith

    2016-01-01

    This volume of Advances in Intelligent Systems and Computing contains papers presented at the 1st European-Middle Asian Conference on Computer Modelling, EMACOM 2015. This international conference was conceived as a brand new scientific and social event of mutual collaboration between the VSB - Technical University of Ostrava (Ostrava, Czech Republic) and the Kyrgyz National University named after J. Balasagyn (Bishkek, Kyrgyz Republic). The aim of EMACOM 2015 was to present the latest development in the field of computer-aided modelling as an essential aspect of research and development of innovative systems and their applications. The conference showed that together with simulations, various modeling techniques, enabled and encouraged by the rapid development of high-performance computing platforms, are crucial for cost-efficient design, verification, and prototyping of solutions in many diverse industrial fields spanning the whole range from manufacturing, mining, machinery, and automotive industries to in...

  2. Visual and Computational Modelling of Minority Games

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2017-02-01

    Full Text Available The paper analyses the Minority Game and focuses on analysis and computational modelling of several variants (variable payoff, coalition-based and ternary voting) of the Minority Game using the UAREI (User-Action-Rule-Entities-Interface) model. UAREI is a model for formal specification of software gamification, and the UAREI visual modelling language is a language used for graphical representation of game mechanics. The UAREI model also provides an embedded executable modelling framework to evaluate how the rules of the game will work for the players in practice. We demonstrate the flexibility of the UAREI model by modelling different variants of Minority Game rules for game design.
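
    For readers unfamiliar with the underlying game, a minimal round of the basic Minority Game (an illustrative sketch, unrelated to the UAREI notation itself) looks like this: an odd number of agents each pick one of two sides, and the agents on the less-crowded side win.

```python
# Minimal one-round Minority Game sketch with invented parameters.
import random

def play_round(n_agents=101, seed=42):
    rng = random.Random(seed)           # seeded for reproducibility
    choices = [rng.randint(0, 1) for _ in range(n_agents)]
    ones = sum(choices)
    minority_side = 1 if ones < n_agents - ones else 0
    payoffs = [1 if c == minority_side else 0 for c in choices]
    return minority_side, payoffs

side, payoffs = play_round()
# With an odd number of agents there is always a strict minority,
# so the winners are always fewer than half the agents.
assert sum(payoffs) < 101 / 2
```

    The variants the paper models change exactly these mechanics: the payoff rule (variable payoff), who decides jointly (coalitions), or the number of choices (ternary voting).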

  3. Computational quench model applicable to the SMES/CICC

    Science.gov (United States)

    Luongo, Cesar A.; Chang, Chih-Lien; Partain, Kenneth D.

    1994-07-01

    A computational quench model accounting for the hydraulic peculiarities of the 200 kA SMES cable-in-conduit conductor has been developed. The model is presented and used to simulate the quench on the SMES-ETM. Conclusions are drawn concerning quench detection and protection. A plan for quench model validation is presented.

  4. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    International Nuclear Information System (INIS)

    Foster, C.

    2001-01-01

    The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost-effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With the widespread availability of computers and cost-effective simulation software, it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system, allowing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computer, and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be addressed early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL).
The Lathe process operation is indicative of

  5. Computer model for harmonic ultrasound imaging.

    Science.gov (United States)

    Li, Y; Zagzebski, J A

    2000-01-01

    Harmonic ultrasound imaging has received great attention from ultrasound scanner manufacturers and researchers. In this paper, we present a computer model that can generate realistic harmonic images. In this model, the incident ultrasound is modeled after the "KZK" equation, and the echo signal is modeled using linear propagation theory because the echo signal is much weaker than the incident pulse. Both time domain and frequency domain numerical solutions to the "KZK" equation were studied. Realistic harmonic images of spherical lesion phantoms were generated for scans by a circular transducer. This model can be a very useful tool for studying the harmonic buildup and dissipation processes in a nonlinear medium, and it can be used to investigate a wide variety of topics related to B-mode harmonic imaging.

  6. A novel low-parameter computational model to aid in-silico glycoengineering

    DEFF Research Database (Denmark)

    Spahn, Philipp N.; Hansen, Anders Holmgaard; Hansen, Henning Gram

    2015-01-01

    benefit from computational models that would better meet the requirements for industrial utilization. Here, we introduce a novel approach combining constraints-based and stochastic techniques to derive a computational model that can predict the effects of gene knockouts on protein glycoprofiles while...... it does not follow any direct equivalent of a genetic code. Instead, its complex biogenesis in the Golgi apparatus (Figure 1A) integrates a variety of influencing factors most of which are only incompletely understood. Various attempts have been undertaken so far to computationally model the process...

  7. English Writing Teaching Model Dependent on Computer Network Corpus Drive Model

    Directory of Open Access Journals (Sweden)

    Shi Lei

    2018-03-01

    Full Text Available At present, mainstream lexicalized English writing methods take only the corpus dependence between words into consideration, without introducing corpus collocation and other issues. “Drive” is a relatively essential feature of words: once the drive structure of a word is determined, it is relatively clear what kinds of words it collocates with, and hence the structure of the sentence can be derived relatively directly. In this paper, an English writing model that relies on the computer network corpus drive model is put forward. In this model, a rich English corpus is introduced in the decomposition of the rules and the calculation of probabilities, which includes not only corpus dependence information, but also the drive structure and other corpus collocation information. The improved computer network corpus drive model is used to carry out an English writing teaching experiment. The experimental results show that the precision and the recall rate are 88.76% and 87.43%, respectively. The F value of the comprehensive index is improved by 6.65% compared with the Collins headword-driven English writing model.

  8. Computational fluid-dynamic model of laser-induced breakdown in air

    International Nuclear Information System (INIS)

    Dors, Ivan G.; Parigger, Christian G.

    2003-01-01

    Temperature and pressure profiles are computed by the use of a two-dimensional, axially symmetric, time-accurate computational fluid-dynamic model for nominal 10-ns optical breakdown laser pulses. The computational model includes a kinetics mechanism that implements plasma equilibrium kinetics in ionized regions and nonequilibrium, multistep, finite-rate reactions in nonionized regions. Fluid-physics phenomena following laser-induced breakdown are recorded with high-speed shadowgraph techniques. The predicted fluid phenomena are shown by direct comparison with experimental records to agree with the flow patterns that are characteristic of laser spark decay

  9. ANS main control complex three-dimensional computer model development

    International Nuclear Information System (INIS)

    Cleaves, J.E.; Fletcher, W.M.

    1993-01-01

    A three-dimensional (3-D) computer model of the Advanced Neutron Source (ANS) main control complex is being developed. The main control complex includes the main control room, the technical support center, the materials irradiation control room, computer equipment rooms, communications equipment rooms, cable-spreading rooms, and some support offices and breakroom facilities. The model will be used to provide facility designers and operations personnel with capabilities for fit-up/interference analysis, visual ''walk-throughs'' for optimizing maintainability, and human factors and operability analyses. It will be used to determine performance design characteristics, to generate construction drawings, and to integrate control room layout, equipment mounting, grounding equipment, electrical cabling, and utility services into ANS building designs. This paper describes the development of the initial phase of the 3-D computer model for the ANS main control complex and plans for its development and use.

  10. An ODP computational model of a cooperative binding object

    Science.gov (United States)

    Logé, Christophe; Najm, Elie; Chen, Ken

    1997-12-01

    A next generation of systems that should appear will have to manage simultaneously several geographically distributed users. These systems belong to the class of computer-supported cooperative work systems (CSCW). The development of such complex systems requires rigorous development methods and flexible open architectures. Open distributed processing (ODP) is a standardization effort that aims at providing such architectures. ODP features appropriate abstraction levels and a clear articulation between requirements, programming and infrastructure support. ODP advocates the use of formal methods for the specification of systems and components. The computational model, an object-based model, one of the abstraction levels identified within ODP, plays a central role in the global architecture. In this model, basic objects can be composed with communication and distribution abstractions (called binding objects) to form a computational specification of distributed systems, or applications. Computational specifications can then be mapped (in a mechanism akin to compilation) onto an engineering solution. We use an ODP-inspired method to computationally specify a cooperative system. We start from a general purpose component that we progressively refine into a collection of basic and binding objects. We focus on two issues of a co-authoring application, namely, dynamic reconfiguration and multiview synchronization. We discuss solutions for these issues and formalize them using the MT-LOTOS specification language that is currently studied in the ISO standardization formal description techniques group.

  11. Phantoms and computational models in therapy, diagnosis and protection

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    The development of realistic body phantoms and computational models is strongly dependent on the availability of comprehensive human anatomical data. This information is often missing, incomplete or not easily available. Therefore, emphasis is given in the Report to organ and body masses and geometries. The influence of age, sex and ethnic origins in human anatomy is considered. Suggestions are given on how suitable anatomical data can be either extracted from published information or obtained from measurements on the local population. Existing types of phantoms and computational models used with photons, electrons, protons and neutrons are reviewed in this Report. Specifications of those considered important to the maintenance and development of reliable radiation dosimetry and measurement are given. The information provided includes a description of the phantom or model, together with diagrams or photographs and physical dimensions. The tissues within body sections are identified and the tissue substitutes used or recommended are listed. The uses of the phantom or model in radiation dosimetry and measurement are outlined. The Report deals predominantly with phantom and computational models representing the human anatomy, with a short Section devoted to animal phantoms in radiobiology

  12. A Novel Computer Virus Propagation Model under Security Classification

    Directory of Open Access Journals (Sweden)

    Qingyi Zhu

    2017-01-01

    Full Text Available In reality, some computers have a specific security classification. For the sake of safety and cost, the security level of computers will be upgraded as threats in networks increase. Here we assume that there exists a threshold value which determines when countermeasures should be taken to level up the security of a fraction of computers with a low security level. And in some specific realistic environments the propagation network can be regarded as fully interconnected. Inspired by these facts, this paper presents a novel computer virus dynamics model considering the impact brought by security classification in a full interconnection network. By using the theory of dynamic stability, the existence of equilibria and stability conditions is analysed and proved. And the above optimal threshold value is given analytically. Then, some numerical experiments are made to justify the model. Besides, some discussions and antivirus measures are given.
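
    The threshold mechanism can be sketched with a simple Euler-integrated toy model (rates, the threshold, and the initial fractions below are invented for illustration; the paper's actual system and its stability analysis are richer): computers split into low- and high-security susceptible classes, and once the infected fraction crosses the threshold, low-security machines are upgraded.

```python
# Toy virus dynamics with two security classes; all parameters invented.
# Fractions: s_low + s_high + i == 1 throughout (conserved by the model).

def simulate(steps=2000, dt=0.01):
    s_low, s_high, i = 0.70, 0.25, 0.05
    beta_low, beta_high, cure = 0.8, 0.1, 0.2
    threshold, upgrade_rate = 0.15, 0.5
    for _ in range(steps):
        infect = (beta_low * s_low + beta_high * s_high) * i
        # Countermeasure: upgrade low-security machines above the threshold
        upgrade = upgrade_rate * s_low if i > threshold else 0.0
        s_low += dt * (cure * i - beta_low * s_low * i - upgrade)
        s_high += dt * (upgrade - beta_high * s_high * i)
        i += dt * (infect - cure * i)
    return s_low, s_high, i

s_low, s_high, i = simulate()
# Population fractions stay conserved under the update rules
assert abs(s_low + s_high + i - 1.0) < 1e-9
```

    Because high-security machines are infected at a much lower rate than they are cured, shifting mass into the high-security class is what eventually suppresses the epidemic in this sketch.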

  13. A computational analysis of the long-term regulation of arterial pressure [v1; ref status: indexed, http://f1000r.es/1xq]

    Directory of Open Access Journals (Sweden)

    Daniel A. Beard

    2013-10-01

    Full Text Available The asserted dominant role of the kidneys in the chronic regulation of blood pressure and in the etiology of hypertension has been debated since the 1970s. At the center of the theory is the observation that the acute relationships between arterial pressure and urine production—the acute pressure-diuresis and pressure-natriuresis curves—physiologically adapt to perturbations in pressure and/or changes in the rate of salt and volume intake. These adaptations, modulated by various interacting neurohumoral mechanisms, result in chronic relationships between water and salt excretion and pressure that are much steeper than the acute relationships. While the view that renal function is the dominant controller of arterial pressure has been supported by computer models of the cardiovascular system known as the “Guyton-Coleman model”, no unambiguous description of a computer model capturing chronic adaptation of acute renal function in blood pressure control has been presented. Here, such a model is developed with the goals of: 1. capturing the relevant mechanisms in an identifiable mathematical model; 2. identifying model parameters using appropriate data; 3. validating model predictions in comparison to data; and 4. probing hypotheses regarding the long-term control of arterial pressure and the etiology of primary hypertension. The developed model reveals: long-term control of arterial blood pressure is primarily through the baroreflex arc and the renin-angiotensin system; and arterial stiffening provides a sufficient explanation for the etiology of primary hypertension associated with ageing. Furthermore, the model provides the first consistent explanation of the physiological response to chronic stimulation of the baroreflex.

  14. Computer-controlled mechanical lung model for application in pulmonary function studies

    NARCIS (Netherlands)

    A.F.M. Verbraak (Anton); J.E.W. Beneken; J.M. Bogaard (Jan); A. Versprille (Adrian)

    1995-01-01

    textabstractA computer controlled mechanical lung model has been developed for testing lung function equipment, validation of computer programs and simulation of impaired pulmonary mechanics. The construction, function and some applications are described. The physical model is constructed from two

  15. The Next Generation ARC Middleware and ATLAS Computing Model

    International Nuclear Information System (INIS)

    Filipčič, Andrej; Cameron, David; Konstantinov, Aleksandr; Karpenko, Dmytro; Smirnova, Oxana

    2012-01-01

    The distributed NDGF Tier-1 and associated NorduGrid clusters are well integrated into the ATLAS computing environment but follow a slightly different paradigm than other ATLAS resources. The current paradigm does not divide the sites as in the commonly used hierarchical model, but rather treats them as a single storage endpoint and a pool of distributed computing nodes. The next generation ARC middleware with its several new technologies provides new possibilities in development of the ATLAS computing model, such as pilot jobs with pre-cached input files, automatic job migration between the sites, integration of remote sites without connected storage elements, and automatic brokering for jobs with non-standard resource requirements. ARC's data transfer model provides an automatic way for the computing sites to participate in ATLAS’ global task management system without requiring centralised brokering or data transfer services. The powerful API combined with Python and Java bindings can easily be used to build new services for job control and data transfer. Integration of the ARC core into the EMI middleware provides a natural way to implement the new services using the ARC components

  16. A Computer Model for Analyzing Volatile Removal Assembly

    Science.gov (United States)

    Guo, Boyun

    2010-01-01

A computer model simulates reacting gas/liquid two-phase flow processes in porous media. A typical process is the oxygen/wastewater flow in the Volatile Removal Assembly (VRA) in the Closed Environment Life Support System (CELSS) installed in the International Space Station (ISS). The volatile organics in the wastewater are combusted by oxygen gas to form clean water and carbon dioxide, which dissolves in the water phase. The model predicts the oxygen gas concentration profile in the reactor, which is an indicator of reactor performance. In this innovation, a mathematical model is included in the computer model for calculating the mass transfer from the gas phase to the liquid phase. The amount of mass transfer depends on several factors, including gas-phase concentration, distribution, and reaction rate. For a given reactor dimension, these factors depend on pressure and temperature in the reactor and on the composition and flow rate of the influent.
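The gas-to-liquid mass-transfer step mentioned above can be sketched with a standard two-film model driven by Henry's law. This is a hedged illustration, not the VRA model itself; the transfer coefficient `kla_per_s` is an assumed value, and the Henry constant is a typical room-temperature figure for oxygen in water.

```python
def oxygen_transfer_rate(p_o2_atm, c_liquid_mol_m3, kla_per_s=0.02,
                         henry_mol_m3_atm=1.3):
    """Volumetric O2 transfer rate (mol/m^3/s) from gas to liquid.

    Two-film model: flux is proportional to the gap between the Henry's-law
    saturation concentration and the actual dissolved concentration.
    """
    c_sat = henry_mol_m3_atm * p_o2_atm        # equilibrium concentration (Henry's law)
    return kla_per_s * (c_sat - c_liquid_mol_m3)

# 1 atm O2 over water holding 0.26 mol/m^3 dissolved oxygen (20% of saturation)
rate = oxygen_transfer_rate(p_o2_atm=1.0, c_liquid_mol_m3=0.26)
```

As the liquid approaches saturation the driving force, and hence the transfer rate, goes to zero, which is why the gas-phase concentration profile along the reactor is a useful performance indicator.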

  17. A stylized computational model of the head for the reference Japanese male

    International Nuclear Information System (INIS)

    Yamauchi, M.; Ishikawa, M.; Hoshi, M.

    2005-01-01

    Computational models of human anatomy, along with Monte Carlo radiation transport simulations, have been used by Snyder et al. [MIRD Pamphlet No. 5, revised (The Society of Nuclear Medicine, New York, 1978)], Cristy and Eckerman [ORNL/TM-8381/VI, Oak Ridge National Laboratory, Oak Ridge, TN (1987)] and Zubal et al. [Med. Phys. 21, 299-302 (1994)] to estimate internal organ doses from internal and external radiation sources. These were created using physiological data from Caucasoid subjects but not from other races. There is a need for research to determine whether the obvious differences from the Caucasoid anatomy make these models unsuitable for estimating the absorbed dose in other races such as the Mongoloid. We used the cranial region of the adult Japanese male to represent the Mongoloid race. This region contains organs that are highly sensitive to radiation. The cranial region of a physical phantom produced by KYOTO KAGAKU Co., LTD. using numerical data from a Japanese Reference Man [Tanaka, Nippon Acta. Radiol. 48, 509-513 (1988)] was used to supply the data for the geometry of a stylized computational model. Our computational model was constructed with equations rather than voxel-based, in order to deal with as small a number of parameters as possible in the computer simulation experiment. The accuracy of our computational model was checked by comparing simulated experimental results obtained with MCNP4C with actual doses measured with thermoluminescence dosimeters (TLDs) inside the physical phantom from which our computational model was constructed. The TLDs, whose margin of error is less than ±10%, were arranged at six positions. Co-60 was used as the radiation source. The irradiated dose was 2 Gy in terms of air kerma. In the computer simulation experiments, we used our computational model and Cristy's computational model, whose component data are those of the tissue substitute materials and of the human body as published in ICRU Report 46. The

  18. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. 
Analysis of this video data showed that the students' programming practices were highly influenced by

  19. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  20. Universal quantum computation by scattering in the Fermi–Hubbard model

    International Nuclear Information System (INIS)

    Bao, Ning; Hayden, Patrick; Salton, Grant; Thomas, Nathaniel

    2015-01-01

    The Hubbard model may be the simplest model of particles interacting on a lattice, but simulation of its dynamics remains beyond the reach of current numerical methods. In this article, we show that general quantum computations can be encoded into the physics of wave packets propagating through a planar graph, with scattering interactions governed by the fermionic Hubbard model. Therefore, simulating the model on planar graphs is as hard as simulating quantum computation. We give two different arguments, demonstrating that the simulation is difficult both for wave packets prepared as excitations of the fermionic vacuum, and for hole wave packets at filling fraction one-half in the limit of strong coupling. In the latter case, which is described by the t-J model, there is only reflection and no transmission in the scattering events, as would be the case for classical hard spheres. In that sense, the construction provides a quantum mechanical analog of the Fredkin–Toffoli billiard ball computer. (paper)

  1. Two-parametric model of electron beam in computational dosimetry for radiation processing

    International Nuclear Information System (INIS)

    Lazurik, V.M.; Lazurik, V.T.; Popov, G.; Zimek, Z.

    2016-01-01

Computer simulation of the irradiation of various materials with an electron beam (EB) can be applied to correct and control the performance of radiation processing installations. Electron beam energy measurement methods are described in the international standards, and the measurement results can be extended by computational dosimetry. The authors have developed a computational method for determining EB energy based on two-parametric fitting of a semi-empirical model for the depth-dose distribution initiated by a mono-energetic electron beam. Analysis of a number of experiments shows that the described method can effectively account for random displacements arising from the use of an aluminum wedge with a continuous strip of dosimetric film, and can minimize the uncertainty of the electron energy evaluation calculated from the experimental data. The two-parametric fitting method is proposed for determining the electron beam model parameters: E0, the energy of a mono-energetic and mono-directional electron source, and X0, the thickness of the aluminum layer located in front of the irradiated object. This provides baseline data on the characteristics of the electron beam, which can later be applied to computer modeling of the irradiation process. Model parameters defined in the international standards (such as Ep, the most probable energy, and Rp, the practical range) can be linked with the characteristics of the two-parametric model (E0, X0), which allows the electron irradiation process to be simulated. The data obtained from the semi-empirical model were checked against a set of experimental results. The proposed two-parametric model for electron beam energy evaluation, and the estimation of accuracy for computational dosimetry methods based on the developed model, are discussed. - Highlights: • Experimental and computational methods of electron energy evaluation. • Development
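The two-parametric fitting idea can be sketched as a least-squares search over (E0, X0): given a measured depth-dose curve, find the parameter pair whose model curve matches it best. The depth-dose shape below is a simple placeholder, not the semi-empirical model of the paper, and the range scaling and grid bounds are illustrative assumptions.

```python
def model_dose(depth, e0, x0):
    # Placeholder depth-dose shape: a parabola that vanishes at the range.
    # e0 sets the range (illustratively ~0.5 cm per MeV); x0 shifts the curve
    # as an upstream aluminium layer would.
    z = depth + x0
    r = 0.5 * e0
    return max(0.0, (1.0 + z / r) * (1.0 - z / r))   # = max(0, 1 - (z/r)^2)

def fit_two_params(depths, doses):
    """Brute-force least-squares fit over a (E0, X0) grid."""
    best = None
    for e0 in [i * 0.1 for i in range(50, 151)]:       # 5.0 .. 15.0 MeV
        for x0 in [j * 0.01 for j in range(0, 51)]:    # 0.00 .. 0.50 cm
            sse = sum((model_dose(d, e0, x0) - m) ** 2
                      for d, m in zip(depths, doses))
            if best is None or sse < best[0]:
                best = (sse, e0, x0)
    return best[1], best[2]

# synthetic "measurement" generated with E0 = 10.0 MeV, X0 = 0.2 cm
depths = [i * 0.25 for i in range(20)]
doses = [model_dose(d, 10.0, 0.2) for d in depths]
e0_hat, x0_hat = fit_two_params(depths, doses)
```

A practical implementation would replace the grid search with a gradient-based or Levenberg-Marquardt optimizer and the placeholder shape with the semi-empirical depth-dose model, but the fitting structure is the same.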

  2. Financial Regulation in an Agent Based Macroeconomic Model

    OpenAIRE

    Riccetti, Luca; Russo, Alberto; Mauro, Gallegati

    2013-01-01

    Starting from the agent-based decentralized matching macroeconomic model proposed in Riccetti et al. (2012), we explore the effects of banking regulation on macroeconomic dynamics. In particular, we study the overall credit exposure and the lending concentration towards a single counterparty, finding that the portfolio composition seems to be more relevant than the overall exposure for banking stability, even if both features are very important. We show that a too tight regulation is dangerou...

  3. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white. This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  4. Dynamical Trust and Reputation Computation Model for B2C E-Commerce

    Directory of Open Access Journals (Sweden)

    Bo Tian

    2015-10-01

Full Text Available Trust is one of the most important factors that influence the successful application of network service environments, such as e-commerce, wireless sensor networks, and online social networks. Computation models associated with trust and reputation have received special attention in both the computing and service science communities in recent years. In this paper, a dynamical computation model of reputation for B2C e-commerce is proposed. Firstly, concepts associated with trust and reputation are introduced, and a mathematical formula of trust for B2C e-commerce is given. Then a dynamical computation model of reputation is proposed based on the concept of trust and the relationship between trust and reputation. In the proposed model, typical varying processes of reputation in B2C e-commerce are discussed. Furthermore, iterative trust and reputation computation models are formulated via a set of difference equations based on a closed-loop feedback mechanism. Finally, a group of numerical simulation experiments is performed to illustrate the proposed model of trust and reputation. Experimental results show that the proposed model is effective in simulating the dynamical processes of trust and reputation for B2C e-commerce.
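An iterative reputation update driven by a difference equation with feedback can be sketched as below. The update rule, parameter names, and values are assumptions for illustration, not the paper's exact formulation: new feedback pulls reputation toward the latest transaction rating while old reputation slowly decays.

```python
def update_reputation(rep, rating, alpha=0.2, decay=0.02):
    """One discrete-time step of a toy reputation difference equation.

    rep    : current reputation in [0, 1]
    rating : latest transaction rating in [0, 1]
    alpha  : feedback gain pulling reputation toward the rating
    decay  : slow forgetting of accumulated reputation
    """
    return (1.0 - decay) * rep + alpha * (rating - rep)

rep = 0.5                                   # neutral starting reputation
for rating in [0.9, 0.9, 0.8, 1.0, 0.9]:   # a run of good transactions
    rep = update_reputation(rep, rating)
```

Iterating the closed loop, a run of good ratings raises reputation toward the rating level, while the decay term means reputation must be continually earned rather than held forever, which is the qualitative behavior such models aim to capture.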

  5. Getting computer models to communicate; Faire communiquer les modeles numeriques

    Energy Technology Data Exchange (ETDEWEB)

    Caremoli, Ch. [Electricite de France (EDF), 75 - Paris (France). Dept. Mecanique et Modeles Numeriques; Erhard, P. [Electricite de France (EDF), 75 - Paris (France). Dept. Physique des Reacteurs

    1999-07-01

    Today's computers have the processing power to deliver detailed and global simulations of complex industrial processes such as the operation of a nuclear reactor core. So should we be producing new, global numerical models to take full advantage of this new-found power? If so, it would be a long-term job. There is, however, another solution; to couple the existing validated numerical models together so that they work as one. (authors)

  6. Computer model for economic study of unbleached kraft paperboard production

    Science.gov (United States)

    Peter J. Ince

    1984-01-01

    Unbleached kraft paperboard is produced from wood fiber in an industrial papermaking process. A highly specific and detailed model of the process is presented. The model is also presented as a working computer program. A user of the computer program will provide data on physical parameters of the process and on prices of material inputs and outputs. The program is then...

  7. A Situative Space Model for Mobile Mixed-Reality Computing

    DEFF Research Database (Denmark)

    Pederson, Thomas; Janlert, Lars-Erik; Surie, Dipak

    2011-01-01

This article proposes a situative space model that links the physical and virtual realms and sets the stage for complex human-computer interaction defined by what a human agent can see, hear, and touch, at any given point in time.

  8. Computer-Aided Multiscale Modelling for Chemical Process Engineering

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul

    2007-01-01

Chemical processes are generally modeled through monoscale approaches, which, while not always adequate, serve a useful role in product-process design. In such cases, a multi-dimensional and multi-scale model-based approach is important in product-process development. A computer-aided framework...

  9. A Computational Model of Fraction Arithmetic

    Science.gov (United States)

    Braithwaite, David W.; Pyke, Aryn A.; Siegler, Robert S.

    2017-01-01

    Many children fail to master fraction arithmetic even after years of instruction, a failure that hinders their learning of more advanced mathematics as well as their occupational success. To test hypotheses about why children have so many difficulties in this area, we created a computational model of fraction arithmetic learning and presented it…

  10. Advanced computational modelling for drying processes – A review

    International Nuclear Information System (INIS)

    Defraeye, Thijs

    2014-01-01

Highlights: • Understanding the product dehydration process is a key aspect in drying technology. • Advanced modelling thereof plays an increasingly important role for developing next-generation drying technology. • Dehydration modelling should be more energy-oriented. • An integrated “nexus” modelling approach is needed to produce more energy-smart products. • Multi-objective process optimisation requires development of more complete multiphysics models. - Abstract: Drying is one of the most complex and energy-consuming chemical unit operations. R and D efforts in drying technology have skyrocketed in the past decades, as new drivers have emerged in this industry beyond securing prime product quality and high throughput, namely reduction of energy consumption and carbon footprint as well as improving food safety and security. Solutions are sought in optimising existing technologies or developing new ones which increase energy and resource efficiency, use renewable energy, recuperate waste heat and reduce product loss, and thus also the embodied energy therein. Novel tools are required to push such technological innovations and their subsequent implementation. In particular, computer-aided drying process engineering has large potential for developing next-generation drying technology, including more energy-smart and environmentally friendly products and dryer systems. This review paper deals with rapidly emerging advanced computational methods for modelling dehydration of porous materials, particularly foods. Drying is approached as a combined multiphysics, multiscale and multiphase problem. These advanced methods include computational fluid dynamics, several multiphysics modelling methods (e.g. conjugate modelling), multiscale modelling and modelling of material properties and the associated propagation of material property variability. Apart from the current challenges for each of these, future perspectives should be directed towards material property

  11. Computer Simulation (Microcultures): An Effective Model for Multicultural Education.

    Science.gov (United States)

    Nelson, Jorge O.

    This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…

  12. Spatial Structures and Regulation in Biological Systems

    DEFF Research Database (Denmark)

    Yde, Pernille

    , and the other is the spatial regulation of biological systems, here related to different aspects of the inflammatory response. All systems are studied using computational modelling and mathematical analysis. The first part of the thesis explores different protein aggregation scenarios. In Chapter 1, we consider...... a previously studied and very general aggregation model describing frangible linear filaments. This model is especially relevant for the growth of amyloid fibres, that have been related to a number of serious human diseases, and which are known to grow in an accelerated self-enhanced manner.We derive...... model of the tissue and show how coupled cells are able to function as an excitable medium and propagate waves of high cytokine concentration through the tissue. If the internal regulation in the cells is over-productive, the model predicts a continuous amplification of cytokines, which spans the entire...

  13. Cloud Computing, Tieto Cloud Server Model

    OpenAIRE

    Suikkanen, Saara

    2013-01-01

The purpose of this study is to find out what cloud computing is. To be able to make wise decisions when moving to the cloud or considering it, companies need to understand what the cloud consists of: which model suits the company best, what should be taken into account before moving to the cloud, what the cloud broker's role is, and what a SWOT analysis of the cloud shows. To be able to answer customer requirements and business demands, IT companies should develop and produce new service models. IT house T...

  14. Computing Models of M-type Host Stars and their Panchromatic Spectral Output

    Science.gov (United States)

    Linsky, Jeffrey; Tilipman, Dennis; France, Kevin

    2018-06-01

We have begun a program of computing state-of-the-art model atmospheres from the photospheres to the coronae of M stars that are the host stars of known exoplanets. For each model we are computing the emergent radiation at all wavelengths that are critical for assessing photochemistry and mass-loss from exoplanet atmospheres. In particular, we are computing the stellar extreme ultraviolet radiation that drives hydrodynamic mass loss from exoplanet atmospheres and is essential for determining whether an exoplanet is habitable. The model atmospheres are computed with the SSRPM radiative transfer/statistical equilibrium code developed by Dr. Juan Fontenla. The code solves for the non-LTE statistical equilibrium populations of 18,538 levels of 52 atomic and ion species and computes the radiation from all species (435,986 spectral lines) and about 20,000,000 spectral lines of 20 diatomic species. The first model computed in this program was for the modestly active M1.5 V star GJ 832 by Fontenla et al. (ApJ 830, 152 (2016)). We will report on a preliminary model for the more active M5 V star GJ 876 and compare this model and its emergent spectrum with GJ 832. In the future, we will compute and intercompare semi-empirical models and spectra for all of the stars observed with the HST MUSCLES Treasury Survey, the Mega-MUSCLES Treasury Survey, and additional stars including Proxima Cen and Trappist-1. This multiyear theory program is supported by a grant from the Space Telescope Science Institute.

  15. A collaborative computer auditing system under SOA-based conceptual model

    Science.gov (United States)

    Cong, Qiushi; Huang, Zuoming; Hu, Jibing

    2013-03-01

Some of the current challenges of computer auditing are the obstacles to retrieving, converting and translating data from different database schemas. During the last few years, many data exchange standards have been under continuous development, such as the Extensible Business Reporting Language (XBRL). These XML document standards can be used for data exchange among companies, financial institutions, and audit firms. However, for many companies it is still expensive and time-consuming to translate and provide XML messages with commercial application packages, because it is complicated and laborious to search and transform data from thousands of tables in ERP databases. How to transfer transaction documents between audit firms and their client companies, in support of continuous or real-time auditing, is an important topic. In this paper, a collaborative computer auditing system under an SOA-based conceptual model is proposed. By utilizing the widely used XML document standards and existing data transformation applications developed by different companies and software vendors, we can wrap these applications as commercial web services that can easily be implemented under the forthcoming application environment: service-oriented architecture (SOA). Under SOA environments, the multiagency mechanism will foster the maturity and popularity of data assurance services over the Internet. By wrapping data transformation components for heterogeneous databases or platforms, it will create new component markets composed of many software vendors and assurance service companies that provide data assurance services for audit firms, regulators or third parties.

  16. How computational models can help unlock biological systems.

    Science.gov (United States)

    Brodland, G Wayne

    2015-12-01

With computational models playing an ever-increasing role in the advancement of science, it is important that researchers understand what it means to model something; recognize the implications of the conceptual, mathematical and algorithmic steps of model construction; and comprehend what models can and cannot do. Here, we use examples to show that models can serve a wide variety of roles, including hypothesis testing, generating new insights, deepening understanding, suggesting and interpreting experiments, tracing chains of causation, doing sensitivity analyses, integrating knowledge, and inspiring new approaches. We show that models can bring together information of different kinds and do so across a range of length scales, as they do in multi-scale, multi-faceted embryogenesis models, some of which connect gene expression, the cytoskeleton, cell properties, tissue mechanics, morphogenetic movements and phenotypes. Models cannot replace experiments nor can they prove that particular mechanisms are at work in a given situation. But they can demonstrate whether or not a proposed mechanism is sufficient to produce an observed phenomenon. Although the examples in this article are taken primarily from the field of embryo mechanics, most of the arguments and discussion are applicable to any form of computational modelling. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  17. Computational Software to Fit Seismic Data Using Epidemic-Type Aftershock Sequence Models and Modeling Performance Comparisons

    Science.gov (United States)

    Chu, A.

    2016-12-01

Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work implements three of the homogeneous ETAS models described in Ogata (1998). With a model's log-likelihood function, my software finds the Maximum-Likelihood Estimates (MLEs) of the model's parameters to estimate the homogeneous background rate and the temporal and spatial parameters that govern triggering effects. The EM algorithm is employed for its advantages of stability and robustness (Veen and Schoenberg, 2008). My work also presents comparisons among the three models in robustness, convergence speed, and implementation from theory to computing practice. Up-to-date regional seismic data of seismically active areas such as Southern California and Japan are used to demonstrate the comparisons. Data analysis has been done using the computer languages Java and R. Java has the advantages of strong typing and fine-grained control over memory resources, while R offers numerous built-in functions for statistical computing. Comparisons are also made between the two programming languages in convergence and stability, computational speed, and ease of implementation. Issues that may affect convergence, such as spatial shapes, are discussed.
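The core objects being maximized in such software are the ETAS conditional intensity and its log-likelihood. The sketch below shows a minimal temporal-only version (the Omori-law kernel, not the full spatial-temporal models of the paper); the event times and parameter values are illustrative, and a real fit would maximize the log-likelihood over (mu, K, c, p), e.g. via EM as the abstract describes.

```python
import math

def etas_intensity(t, events, mu, K, c, p):
    """Temporal ETAS conditional intensity:
    lambda(t) = mu + sum over past events t_i < t of K / (t - t_i + c)^p."""
    return mu + sum(K / (t - ti + c) ** p for ti in events if ti < t)

def log_likelihood(events, T, mu, K, c, p):
    """Point-process log-likelihood on [0, T]: sum of log-intensities at the
    event times minus the integrated intensity over the window."""
    ll = sum(math.log(etas_intensity(ti, events, mu, K, c, p)) for ti in events)
    integral = mu * T
    for ti in events:  # closed-form integral of the Omori kernel (valid for p != 1)
        integral += K * ((T - ti + c) ** (1 - p) - c ** (1 - p)) / (1 - p)
    return ll - integral

events = [1.0, 1.2, 1.3, 5.0, 9.5]          # toy event times
ll = log_likelihood(events, T=10.0, mu=0.3, K=0.1, c=0.05, p=1.3)
```

Evaluating this function over a parameter grid, or wrapping it in a numerical optimizer, reproduces the MLE step described above in miniature.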

  18. Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments

    Science.gov (United States)

    Lane, Peter C. R.; Gobet, Fernand

    2013-03-01

    Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the `speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
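The non-domination test at the heart of algorithms such as the `speciated non-dominated sorting genetic algorithm' can be sketched as follows. Fitness tuples here are errors on two datasets, both to be minimized; the candidate scores are invented for illustration, not data from the paper.

```python
def dominates(a, b):
    """a dominates b if it is no worse on every objective and strictly better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Keep the candidates not dominated by any other candidate."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q != p)]

# candidate parameter sets scored on two experiments: (error on dataset 1, error on dataset 2)
scores = [(0.10, 0.30), (0.20, 0.20), (0.30, 0.10), (0.25, 0.35), (0.15, 0.25)]
front = pareto_front(scores)
```

In an evolutionary loop, selection favors members of earlier (better) non-dominated fronts, so the population drifts toward parameter sets that trade off fit across the different experimental datasets rather than overfitting any single one.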

  19. Emotion regulation strategies: procedure modeling of J. Gross and cultural activity approach

    Directory of Open Access Journals (Sweden)

    Elena I. Pervichko

    2015-03-01

Full Text Available The first part of this paper argues for a structural-dynamic model of emotion regulation within the theoretical and methodological framework of the cultural-activity paradigm, together with a psychologically grounded typology of emotion regulation strategies in normal and pathological conditions and the psychological mechanisms enabling the regulation of emotions. This conclusion is based on an analysis of the basic concepts and paradigms in which emotion regulation has been studied: the cognitive and psychoanalytic approaches, the concepts of emotional development and emotional intelligence, and the cultural-activity approach. The paper considers the process model of emotion regulation by J. Gross, identifies emotion regulation strategies, evaluates their effectiveness, and notes the possibilities and limitations of the model. A review of current research leads to the conclusion that how to label the wide range of regulatory strategies remains an open question. The author's definition of emotion regulation is given: emotion regulation is a set of mental processes, psychological mechanisms and regulatory strategies that people use to preserve the capacity for productive activity in situations of emotional stress, to ensure optimal control of impulses and emotions, and to maintain arousal at an optimal level. The second part of the paper provides a general description of emotion regulation strategies, an approach to their typology, and the psychological mechanisms of emotion regulation underlying this typology, i.e., the main elements of the structural-dynamic model of emotion regulation. The work shows the theoretical, methodological and empirical significance of signs and symbols and of personal reflection. A diagnostic system that makes it possible to empirically identify a wide range of emotion regulation strategies is proposed. The psychological mechanisms used by the subject to solve the problem of emotional

  20. GRAVTool, Advances on the Package to Compute Geoid Model path by the Remove-Compute-Restore Technique, Following Helmert's Condensation Method

    Science.gov (United States)

    Marotta, G. S.

    2017-12-01

Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astrogeodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and Global Geopotential Models (GGM), respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by integrating the different wavelengths, and that adjust these models to a local vertical datum. This research presents the advances in the package called GRAVTool for computing geoid models by the RCR technique, following Helmert's condensation method, and its application in a study area. The study area comprises the Federal District of Brazil, covering 6000 km² of undulating relief with heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example show a geoid model computed by the GRAVTool package, after analysis of the density, DTM and GGM values, that is closer to the reference values used in the study area. The accuracy of the computed model (σ = ±0.058 m, RMS = 0.067 m, maximum = 0.124 m and minimum = -0.155 m), using a density value of 2.702 g/cm³ ± 0.024 g/cm³, the DTM SRTM Void Filled 3 arc-second, and the GGM EIGEN-6C4 up to degree and order 250, matches the uncertainty (σ = ±0.073 m) of 26 randomly spaced points where the geoid was determined by geometric leveling supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ±0.076 m, RMS = 0.098 m, maximum = 0.320 m and minimum = -0.061 m).
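The remove-compute-restore bookkeeping can be sketched as follows. The `stokes` step here is a placeholder scaling standing in for Stokes's integration of the residual anomalies, and every number is illustrative, not a value from the study area.

```python
def rcr_geoid(observed_anomaly_mgal, ggm_anomaly_mgal, terrain_anomaly_mgal,
              ggm_geoid_m, terrain_geoid_m, stokes=lambda dg: 0.01 * dg):
    """Remove-compute-restore skeleton for one computation point.

    Remove:  subtract the long-wavelength (GGM) and short-wavelength (terrain)
             parts from the observed gravity anomaly.
    Compute: turn the residual anomaly into a residual geoid height
             (placeholder for Stokes's integral).
    Restore: add back the geoid contributions of the removed parts.
    """
    residual_anomaly = observed_anomaly_mgal - ggm_anomaly_mgal - terrain_anomaly_mgal
    residual_geoid = stokes(residual_anomaly)
    return ggm_geoid_m + terrain_geoid_m + residual_geoid

# illustrative inputs: 25 mGal observed, 20 mGal from the GGM, 3 mGal from terrain
n = rcr_geoid(25.0, 20.0, 3.0, ggm_geoid_m=-15.2, terrain_geoid_m=0.4)
```

The residual anomaly is small by construction, which is what makes the compute step numerically well behaved; most of the signal is carried by the removed-and-restored GGM and terrain terms.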

  1. Computationally-optimized bone mechanical modeling from high-resolution structural images.

    Directory of Open Access Journals (Sweden)

    Jeremy F Magland

Full Text Available Image-based mechanical modeling of the complex micro-structure of human bone has shown promise as a non-invasive method for characterizing bone strength and fracture risk in vivo. In particular, elastic moduli obtained from image-derived micro-finite element (μFE) simulations have been shown to correlate well with results obtained by mechanical testing of cadaveric bone. However, most existing large-scale finite-element simulation programs require significant computing resources, which hampers their use in common laboratory and clinical environments. In this work, we theoretically derive and computationally evaluate the resources needed to perform such simulations (in terms of computer memory and computation time), which depend on the number of finite elements in the image-derived bone model. A detailed description of our approach is provided, which is specifically optimized for μFE modeling of the complex three-dimensional architecture of trabecular bone. Our implementation includes domain decomposition for parallel computing, a novel stopping criterion, and a system for speeding up convergence by pre-iterating on coarser grids. The performance of the system is demonstrated on a desktop computer with dual quad-core Xeon 3.16 GHz CPUs and 40 GB of RAM. Models of the distal tibia derived from 3D in-vivo MR images of a patient, comprising 200,000 elements, required less than 30 seconds (and 40 MB of RAM) to converge. To illustrate the system's potential for large-scale μFE simulations, axial stiffness was estimated from high-resolution micro-CT images of the human proximal femur, a voxel array of 90 million elements, in seven hours of CPU time. In conclusion, the system described should enable image-based finite-element bone simulations in practical computation times on high-end desktop computers, with applications to laboratory studies and clinical imaging.
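Two of the ingredients named in the abstract, a residual-norm stopping criterion and pre-iteration on a coarser grid, can be sketched with a plain conjugate-gradient solver on a stand-in 1-D stiffness system. This is an illustrative toy under assumed names (`cg`, `poisson`), not the paper's element-by-element parallel μFE implementation:

```python
import numpy as np

def cg(A, b, x0, tol=1e-8, max_iter=2000):
    """Plain conjugate gradients for an SPD system K u = f, with a
    simple residual-norm stopping criterion (illustrative only)."""
    x = x0.copy()
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:      # stopping criterion
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def poisson(n):
    """1-D Poisson stiffness matrix and unit load, a small stand-in
    for the bone stiffness system."""
    h = 1.0 / (n + 1)
    A = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return A, np.ones(n)

# "Pre-iterating on a coarser grid": solve a half-resolution problem,
# interpolate the result, and use it as the fine-grid starting guess.
nc, nf = 31, 63
Ac, bc = poisson(nc)
Af, bf = poisson(nf)
xc = cg(Ac, bc, np.zeros(nc))
fine_x = np.linspace(0, 1, nf + 2)[1:-1]
coarse_x = np.linspace(0, 1, nc + 2)[1:-1]
xf = cg(Af, bf, np.interp(fine_x, coarse_x, xc))
```

For the unit load the exact continuum solution is u(x) = x(1 - x)/2, so the fine-grid result can be checked directly.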

  2. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1990-01-01

Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbation theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly, manpower-intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both the VMS and UNIX operating systems
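GRESS instruments FORTRAN code so that derivatives are propagated alongside values. The underlying idea, forward-mode automatic differentiation, can be sketched in a few lines with dual numbers; the response function below is a made-up toy, not a GRESS feature:

```python
class Dual:
    """Minimal forward-mode automatic-differentiation value: carries a
    value and its derivative through arithmetic (illustrative of the
    idea behind GRESS-style code differentiation, not GRESS itself)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def model(k):
    # toy response: R(k) = 3*k^2 + 2*k, so dR/dk = 6*k + 2
    return 3 * k * k + 2 * k

k = Dual(2.0, 1.0)                       # seed dk/dk = 1
R = model(k)                             # R.val = R(2), R.der = dR/dk at 2
sensitivity = (k.val / R.val) * R.der    # normalized first derivative
```

The normalized sensitivity (k/R)(dR/dk) is the relative-change form the abstract refers to.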

  3. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbation theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly, manpower-intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both the VMS and UNIX operating systems. (author). 9 refs, 1 tab

  4. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  5. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer… …-format and COM-objects are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks.

  6. The Self-Regulated Learning Model and Music Education

    Directory of Open Access Journals (Sweden)

    Maja Marijan

    2017-02-01

Full Text Available Self-regulation and self-regulated learning (SRL) are important features in music education. In this research, the self-regulated learning model is presented as a complex, multidimensional structure. SRL starts with self-regulation. Self-regulation is formed through interaction with the environment; thus self-learning, self-analysis, self-judgment, self-instruction, and self-monitoring are the main functions in the self-regulatory structure. Co-regulation is also needed, and helps self-regulation to be activated and monitored. In music education, co-regulation refers to the instructions that the teacher introduces in the lessons. These instructions have to enhance learning and develop regulation over emotions and over cognitive, auditory, and motor skills in students. Learning techniques and learning strategies are core components in music education. By adopting them, students become aware of their learning processes and of the actions, thoughts, feelings and behaviors that are involved in learning. It is suggested that every teaching methodology has to develop learning techniques, as well as metamemory and metacognition in students, in order for them to gain expertise. The author has directed her attention to every aspect that is believed to belong to SRL. There are not many articles on SRL in music education written by musicians, in comparison with those written by psychologists and neurologists. Therefore, the author suggests that this paper may encourage music teachers and performers to take up research on SRL. Such research would help music educational systems and teachers to develop and promote learning techniques and strategies. The results would show improvement in students' learning and self-regulation.

  7. A Generative Computer Model for Preliminary Design of Mass Housing

    Directory of Open Access Journals (Sweden)

    Ahmet Emre DİNÇER

    2014-05-01

Full Text Available Today, we live in what we call the "Information Age", an age in which information technologies are constantly being renewed and developed. Out of this has emerged a new approach called "Computational Design" or "Digital Design". In addition to significantly influencing all fields of engineering, this approach has come to play a similar role in all stages of the design process in the architectural field. In providing solutions for analytical problems in design, such as cost estimation, circulation-system evaluation and environmental effects, which are similar to engineering problems, this approach is also used in the evaluation, representation and presentation of traditionally designed buildings. With developments in software and hardware technology, it has evolved into studies of architectural design and production with digital tools, including the preliminary design stages. This paper presents a digital model which may be used in the preliminary stage of mass housing design with Cellular Automata, one of the generative design systems based on computational design approaches. This computational model, developed with scripts in 3ds Max software, has been applied to a site-plan design for mass housing, floor-plan organizations based on user preferences, and facade designs. By using the developed computer model, many alternative housing types can be rapidly produced. The interactive design tool of this computational model allows the user to transfer dimensional and functional housing preferences by means of the interface prepared for the model. The results of the study are discussed in the light of innovative architectural approaches.
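A generative cellular automaton of this kind iterates simple neighbourhood rules over a site grid to produce layout alternatives. A minimal sketch follows; the rule (a Game-of-Life-style survival/birth rule), grid size, and seed are illustrative assumptions, not the rule set implemented in the paper's 3ds Max scripts:

```python
import random

def step(grid):
    """One cellular-automaton generation over a site grid: a parcel
    survives with 2-3 built neighbours and is built on with exactly 3
    (an illustrative rule, not the paper's)."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            alive = grid[r][c]
            new[r][c] = 1 if (alive and n in (2, 3)) or (not alive and n == 3) else 0
    return new

random.seed(1)                            # seeded for a reproducible variant
site = [[random.randint(0, 1) for _ in range(8)] for _ in range(8)]
for _ in range(3):                        # iterate to generate a layout variant
    site = step(site)
```

Re-seeding or changing the rule yields further alternatives, which is the generative aspect the abstract describes.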

  8. Regulation models for district heating. Background report; Denmark; Reguleringsmodeller for fjernvarmen. Baggrundsrapport

    Energy Technology Data Exchange (ETDEWEB)

    2012-02-15

    The background report describes in detail the elements of the analysis: the present regulation, experiences from other countries and sectors, the aim of regulation, and detailed analysis of four regulation models. (LN)

  9. Inflammation, Self-Regulation, and Health: An Immunologic Model of Self-Regulatory Failure.

    Science.gov (United States)

    Shields, Grant S; Moons, Wesley G; Slavich, George M

    2017-07-01

    Self-regulation is a fundamental human process that refers to multiple complex methods by which individuals pursue goals in the face of distractions. Whereas superior self-regulation predicts better academic achievement, relationship quality, financial and career success, and lifespan health, poor self-regulation increases a person's risk for negative outcomes in each of these domains and can ultimately presage early mortality. Given its centrality to understanding the human condition, a large body of research has examined cognitive, emotional, and behavioral aspects of self-regulation. In contrast, relatively little attention has been paid to specific biologic processes that may underlie self-regulation. We address this latter issue in the present review by examining the growing body of research showing that components of the immune system involved in inflammation can alter neural, cognitive, and motivational processes that lead to impaired self-regulation and poor health. Based on these findings, we propose an integrated, multilevel model that describes how inflammation may cause widespread biobehavioral alterations that promote self-regulatory failure. This immunologic model of self-regulatory failure has implications for understanding how biological and behavioral factors interact to influence self-regulation. The model also suggests new ways of reducing disease risk and enhancing human potential by targeting inflammatory processes that affect self-regulation.

  10. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice, e.g., to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the degree of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements of clinical routine. In this focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models into the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present the latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications, and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body.

  11. A Perspective on Computational Human Performance Models as Design Tools

    Science.gov (United States)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  12. Computational Modelling of Piston Ring Dynamics in 3D

    Directory of Open Access Journals (Sweden)

    Dlugoš Jozef

    2014-12-01

Full Text Available Advanced computational models of a piston assembly at the level of virtual prototypes require a detailed description of piston ring behaviour. Under these requirements, the piston rings operate in regimes that cannot, in general, be simplified into an axisymmetric model. The piston and the cylinder liner do not have a perfectly round shape, mainly due to machining tolerances and external thermo-mechanical loads. If the ring cannot follow the liner deformations, a local loss of contact occurs, resulting in blow-by and increased consumption of lubricant oil in the engine. Current computational models are unable to capture such effects. The paper focuses on the development of a flexible 3D piston ring model based on the Timoshenko beam theory using the multibody system (MBS) approach. The MBS model is compared to a finite element method (FEM) solution.

  13. Computationally efficient statistical differential equation modeling using homogenization

    Science.gov (United States)

    Hooten, Mevin B.; Garlick, Martha J.; Powell, James A.

    2013-01-01

    Statistical models using partial differential equations (PDEs) to describe dynamically evolving natural systems are appearing in the scientific literature with some regularity in recent years. Often such studies seek to characterize the dynamics of temporal or spatio-temporal phenomena such as invasive species, consumer-resource interactions, community evolution, and resource selection. Specifically, in the spatial setting, data are often available at varying spatial and temporal scales. Additionally, the necessary numerical integration of a PDE may be computationally infeasible over the spatial support of interest. We present an approach to impose computationally advantageous changes of support in statistical implementations of PDE models and demonstrate its utility through simulation using a form of PDE known as “ecological diffusion.” We also apply a statistical ecological diffusion model to a data set involving the spread of mountain pine beetle (Dendroctonus ponderosae) in Idaho, USA.
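The "ecological diffusion" form referred to above, u_t = (μ(x)·u)_xx with spatially varying motility μ, can be discretized directly on a fine grid. A minimal explicit finite-difference sketch (an illustrative toy under an assumed no-flux boundary treatment, not the authors' statistical implementation or their homogenized change of support):

```python
import numpy as np

def ecological_diffusion_step(u, mu, dx, dt):
    """One explicit step of u_t = (mu(x) * u)_xx, the 'ecological
    diffusion' form, with mirror ghost nodes at the boundaries
    (illustrative discretization only)."""
    w = mu * u                            # diffuse the product mu*u, not u
    lap = np.empty_like(w)
    lap[1:-1] = (w[2:] - 2 * w[1:-1] + w[:-2]) / dx**2
    lap[0] = 2 * (w[1] - w[0]) / dx**2
    lap[-1] = 2 * (w[-2] - w[-1]) / dx**2
    return u + dt * lap

x = np.linspace(0, 1, 101)
dx = x[1] - x[0]
mu = 0.5 + 0.4 * np.sin(2 * np.pi * x)    # spatially varying motility
u = np.exp(-200 * (x - 0.5) ** 2)         # initial population bump
dt = dx**2 / (4 * mu.max())               # stable explicit time step
for _ in range(50):
    u = ecological_diffusion_step(u, mu, dx, dt)
```

Note that, unlike Fickian diffusion, the flux here acts on μ·u, so populations accumulate where motility is low; this is the property homogenization exploits when changing support.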

  14. The deterministic computational modelling of radioactivity

    International Nuclear Information System (INIS)

    Damasceno, Ralf M.; Barros, Ricardo C.

    2009-01-01

This paper describes a computational application (software) that models simple radioactive decay, decay to stable nuclei, and directly coupled decay chains with an upper limit of thirteen radioactive decays. An internal data bank with the decay constants of the various existing decays considerably facilitates use of the program by people who are not connected to the nuclear area and have no specialized knowledge of it. The paper presents numerical results for typical model problems.
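Chain decay of the kind the program models is governed by the Bateman equations; for a two-member chain A → B → (stable) the solution is closed-form. A sketch of that case (illustrative textbook physics, not the application's code; it assumes distinct decay constants):

```python
import math

def bateman_two(n0, lam1, lam2, t):
    """Analytic Bateman solution for a two-member chain A -> B -> stable,
    starting from n0 atoms of A and none of B (requires lam1 != lam2)."""
    n1 = n0 * math.exp(-lam1 * t)                       # remaining parent
    n2 = n0 * lam1 / (lam2 - lam1) * (
        math.exp(-lam1 * t) - math.exp(-lam2 * t))      # daughter in-growth
    return n1, n2

# Example: parent half-life 1 time unit, daughter half-life 5 time units.
n1, n2 = bateman_two(1000.0, math.log(2), math.log(2) / 5, 1.0)
```

After one parent half-life, half the parent remains and the rest is split between the daughter and the stable end product.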

  15. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in. Describing a complicated system abstractly with mathematical equations requires a careful choice of assumptions…
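As a flavour of the simple equation-based models such a book builds up, the standard steady-state M/M/1 queue results can be written down directly. These are textbook formulas, not taken from the book's own examples:

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 queue: utilization, mean number in system,
    and mean time in system (standard analytical results)."""
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    rho = lam / mu                # utilization
    L = rho / (1 - rho)           # mean number in system
    W = 1 / (mu - lam)            # mean response time; Little's law: L = lam * W
    return rho, L, W

# Example: jobs arrive at 0.5/s and are served at 1/s.
rho, L, W = mm1_metrics(0.5, 1.0)
```

Even this one-line model captures the characteristic blow-up of response time as utilization approaches 1.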

  16. Computational fluid dynamics modelling of displacement natural ventilation.

    OpenAIRE

    Ji, Yingchun

    2005-01-01

    Natural ventilation is widely recognised as contributing towards low-energy building design. The requirement to reduce energy usage in new buildings has rejuvenated interest in natural ventilation. This thesis deals with computer modelling of natural displacement ventilation driven either by buoyancy or buoyancy combined with wind forces. Two benchmarks have been developed using computational fluid dynamics (CFD) in order to evaluate the accuracy with which CFD is able to mo...

  17. The Social Regulation of Emotion: An Integrative, Cross-Disciplinary Model.

    Science.gov (United States)

    Reeck, Crystal; Ames, Daniel R; Ochsner, Kevin N

    2016-01-01

    Research in emotion regulation has largely focused on how people manage their own emotions, but there is a growing recognition that the ways in which we regulate the emotions of others also are important. Drawing on work from diverse disciplines, we propose an integrative model of the psychological and neural processes supporting the social regulation of emotion. This organizing framework, the 'social regulatory cycle', specifies at multiple levels of description the act of regulating another person's emotions as well as the experience of being a target of regulation. The cycle describes the processing stages that lead regulators to attempt to change the emotions of a target person, the impact of regulation on the processes that generate emotions in the target, and the underlying neural systems. Copyright © 2015. Published by Elsevier Ltd.

  18. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article. Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.
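The essence of model checking a safety property is exhaustive exploration of the reachable state space, reporting a counterexample trace when the property fails. A textbook sketch of explicit-state checking (illustrative of the general technique, not code from the article):

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explicit-state safety checking: breadth-first search of the
    reachable states, returning a shortest counterexample trace if some
    reachable state violates the invariant, else None."""
    frontier = deque([(initial, [initial])])
    visited = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if not invariant(state):
            return trace              # counterexample found
        for nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None                       # property holds on all reachable states

# Example: a counter modulo 4 never reaches 5 (holds) but does reach 3.
succ = lambda s: [(s + 1) % 4]
assert check_invariant(0, succ, lambda s: s != 5) is None
assert check_invariant(0, succ, lambda s: s != 3) == [0, 1, 2, 3]
```

Real model checkers add symbolic state representations and temporal-logic properties on top of this basic reachability scheme.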

  19. Computer-aided modelling template: Concept and application

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2015-01-01

    decomposition technique which identifies generic steps and workflow involved, the computer-aided template concept has been developed. This concept is implemented as a software tool, which provides a user-friendly interface for following the workflow steps and guidance through the steps providing additional......Modelling is an important enabling technology in modern chemical engineering applications. A template-based approach is presented in this work to facilitate the construction and documentation of the models and enable their maintenance for reuse in a wider application range. Based on a model...

  20. Computational Modeling of Ablation on an Irradiated Target

    Science.gov (United States)

    Mehmedagic, Igbal; Thangam, Siva

    2017-11-01

Computational modeling of pulsed nanosecond laser interaction with an irradiated metallic target is presented. The model formulation involves ablation of a metallic target irradiated by a pulsed high-intensity laser at normal atmospheric conditions. Computational findings based on effective representation and prediction of the heat transfer, melting and vaporization of the target material, as well as plume formation and expansion, are presented along with their relevance for the development of protective shields. In this context, the available results for a representative irradiation by a 1064 nm laser pulse are used to analyze the various ablation mechanisms, variable thermo-physical and optical properties, plume expansion and surface geometry. Funded in part by U. S. Army ARDEC, Picatinny Arsenal, NJ.

  1. Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling

    Science.gov (United States)

    Ormsbee, L.; Tufail, M.

    2005-12-01

The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems. These range from the development of data-driven models to optimal control strategies to assist in a more informed and intelligent decision-making process. Availability of data is critical to such applications, and having scarce data may lead to models that do not represent the response function over the entire domain. At the same time, too much data has a tendency to lead to over-constraining of the problem. This paper will describe the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The application includes the development of inductive models as substitutes for more complex process-based models to predict key water quality constituents (such as dissolved oxygen), and their use in an optimization framework for optimal load reductions. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through the use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.
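A genetic algorithm of the kind paired with inductive models here can be sketched minimally: tournament selection, averaging crossover, and Gaussian mutation over a bounded decision vector. Everything below (the function, its parameters, and the toy "target load-reduction" objective) is an illustrative assumption, not the Beargrass Creek solver:

```python
import random

def genetic_minimize(fitness, n_genes, pop_size=30, generations=60, seed=0):
    """Minimal real-coded genetic algorithm on [0, 1]^n_genes:
    tournament selection, averaging crossover, Gaussian mutation
    (an illustrative sketch, not the project's actual optimizer)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)              # binary tournament
            return a if fitness(a) < fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p, q = pick(), pick()
            child = [(x + y) / 2 for x, y in zip(p, q)]           # crossover
            child = [min(1.0, max(0.0, g + rng.gauss(0, 0.05)))   # mutation
                     for g in child]
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

# Toy surrogate objective: squared distance from a target reduction vector.
target = [0.2, 0.7, 0.4]
best = genetic_minimize(lambda g: sum((a - b) ** 2 for a, b in zip(g, target)), 3)
```

In the application described, the fitness function would instead be an inductive (e.g. neural-network) surrogate of water quality response.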

  2. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

In social science, outstanding results are yielded by advanced simulation methods based on state-of-the-art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistics process simulation and optimization, deeper knowledge extraction from big data, better understanding and prediction of social behaviour, and modelling health and environmental changes.

  3. Catabolite regulation analysis of Escherichia coli for acetate overflow mechanism and co-consumption of multiple sugars based on systems biology approach using computer simulation.

    Science.gov (United States)

    Matsuoka, Yu; Shimizu, Kazuyuki

    2013-10-20

It is quite important to understand the basic principles embedded in the main metabolism for the interpretation of fermentation data. For this, it may be useful to understand the regulation mechanism based on a systems biology approach. In the present study, we considered perturbation analysis together with computer simulation based on models which include the effects of global regulators on pathway activation for the main metabolism of Escherichia coli. The main focus is on the acetate overflow metabolism and the co-fermentation of multiple carbon sources. The perturbation analysis was first made to understand the nature of the feed-forward loop formed by the activation of Pyk by FDP (F1,6BP), and the feed-back loop formed by the inhibition of Pfk by PEP in the glycolysis. These, together with the effect of the transcription factor Cra as modulated by the FDP level, affected the glycolysis activity. The PTS (phosphotransferase system) acts as a feed-back system by repressing the glucose uptake rate in response to an increase in the glucose uptake rate. It was also shown that an increased PTS flux (or glucose consumption rate) causes the PEP/PYR ratio to decrease, and EIIA-P, Cya, and cAMP-Crp to decrease, where cAMP-Crp in turn represses the TCA cycle and more acetate is formed. This was further verified by detailed computer simulation. In the case of multiple carbon sources such as glucose and xylose, sequential utilization of the carbon sources was observed for the wild type, while co-consumption of multiple carbon sources with slow consumption rates was observed for the ptsG mutant by computer simulation, and this was verified by experiments. Moreover, the effect of a specific gene knockout such as Δpyk on the metabolic characteristics was also investigated based on the computer simulation. Copyright © 2013 Elsevier B.V. All rights reserved.
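Kinetic simulators of this kind compose rate laws with feedback terms and integrate them over time. A generic toy of substrate uptake under product (feedback) inhibition, Euler-integrated, gives the flavour; the rate law, parameter values, and names are illustrative assumptions, not the paper's E. coli model:

```python
def simulate_uptake(s0, vmax=1.0, km=0.5, ki=0.2, dt=0.01, steps=500):
    """Euler integration of a toy Michaelis-Menten uptake rate with
    product (feedback) inhibition: v = vmax*S/(Km+S) * 1/(1 + P/Ki).
    Illustrative of how kinetic terms compose, not the paper's model."""
    s, p = s0, 0.0
    for _ in range(steps):
        v = vmax * s / (km + s) / (1.0 + p / ki)   # inhibited uptake rate
        s = max(0.0, s - v * dt)                   # substrate consumed
        p += v * dt                                # product accumulates
    return s, p

# Start with 2.0 units of substrate; accumulating product throttles uptake.
s, p = simulate_uptake(2.0)
```

The accumulating product progressively slows consumption, the same qualitative shape as the feedback repression loops discussed in the abstract.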

  4. Computational fluid dynamic modelling of cavitation

    Science.gov (United States)

    Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

Models of sheet cavitation in cryogenic fluids are developed for use in Euler and Navier-Stokes codes. The models are based upon earlier potential-flow models but enable the cavity inception point, length, and shape to be determined as part of the computation. In the present paper, numerical solutions are compared with experimental measurements for both pressure distribution and cavity length. Comparisons between models are also presented. The CFD model provides a relatively simple modification to an existing code to enable cavitation performance predictions to be included. The analysis also has the added ability of incorporating thermodynamic effects of cryogenic fluids. Extensions of the current two-dimensional steady-state analysis to three dimensions and/or time-dependent flows are, in principle, straightforward, although geometrical issues become more complicated. Linearized models, however, offer promise of providing effective cavitation modeling in three dimensions. This analysis presents good potential for improved understanding of many phenomena associated with cavity flows.

  5. Computer modeling of inelastic wave propagation in porous rock

    International Nuclear Information System (INIS)

    Cheney, J.A.; Schatz, J.F.; Snell, C.

    1979-01-01

    Computer modeling of wave propagation in porous rock has several important applications. Among them are prediction of fragmentation and permeability changes to be caused by chemical explosions used for in situ resource recovery, and the understanding of nuclear explosion effects such as seismic wave generation, containment, and site hardness. Of interest in all these applications are the distance from the source to which inelastic effects persist and the amount of porosity change within the inelastic region. In order to study phenomena related to these applications, the Cam Clay family of models developed at Cambridge University was used to develop a similar model that is applicable to wave propagation in porous rock. That model was incorporated into a finite-difference wave propagation computer code SOC. 10 figures, 1 table
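In its purely elastic limit, a finite-difference wave-propagation code of this kind reduces to the classic explicit update for u_tt = c²·u_xx. A 1-D sketch of that update (illustrative only; SOC's inelastic material models, such as yield and pore collapse, go far beyond this):

```python
import numpy as np

def wave_step(u_prev, u_curr, c, dx, dt):
    """One explicit finite-difference step of the 1-D wave equation
    u_tt = c^2 * u_xx with fixed ends (illustrative elastic update)."""
    u_next = np.zeros_like(u_curr)
    r2 = (c * dt / dx) ** 2          # squared Courant number, must be <= 1
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    return u_next

x = np.linspace(0, 1, 201)
dx = x[1] - x[0]
c, dt = 1.0, 0.4 * dx                # CFL-stable time step
u0 = np.exp(-300 * (x - 0.5) ** 2)   # initial displacement pulse
u1 = u0.copy()                       # zero initial velocity
for _ in range(200):
    u0, u1 = u1, wave_step(u0, u1, c, dx, dt)
```

With zero initial velocity, the pulse splits into two half-amplitude pulses travelling in opposite directions; an inelastic code would insert its constitutive model between the strain and stress updates at each step.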

  6. OXYGEN PRESSURE REGULATOR DESIGN AND ANALYSIS THROUGH FINITE ELEMENT MODELING

    Directory of Open Access Journals (Sweden)

    Asterios KOSMARAS

    2017-05-01

Full Text Available Oxygen production centers produce oxygen at high pressure that needs to be reduced. A regulator for medical use in oxygen production centers is designed and analyzed in the current paper. This study aims to design a new oxygen pressure regulator and perform an analysis using Finite Element Modeling in order to evaluate its working principle. In the design procedure, the main elements and the operating principles of a pressure regulator are taken into account. The regulator is designed, and simulations are performed in order to assess the proposed design. Stress analysis results are presented for the main body of the regulator, as well as flow analysis to determine some important flow characteristics at the inlet and outlet of the regulator.

  7. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort for ER process optimization is made by using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and oftentimes problematic, the simulation results from these models are inherently uncertain.

  8. Regulation models for district heating. Main report; Denmark; Reguleringsmodeller for fjernvarmen. Hovedrapport

    Energy Technology Data Exchange (ETDEWEB)

    2012-02-15

    With regard to the choice of model for the regulation of district heating prices, the report points out that a detailed analysis of a cost+ model could be considered. Such an analysis could further define the extended right of heating companies to recoup investments, shape the possible recognition of opportunity costs and the fixed-cost allocation, and clarify the definition of necessary costs. The report also suggests that price-cap regulation or completely free pricing in the entire sector are hardly appropriate forms of regulation. The report's analysis clearly shows that the choice of price regulation in the heat sector affects incentives for investment, green conversion, etc. It also appears that the different regulatory models have very different advantages and disadvantages, and lessons learned from other sectors and from abroad show that changing price regulation rules can be a difficult and lengthy process with unintended consequences along the way. (LN)

  9. Computational Fluid Dynamics Modeling of Bacillus anthracis ...

    Science.gov (United States)

    Journal Article Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived from computed tomography (CT) or µCT images. Both models encompassed airways that extended from the external nose to the lung, with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation-exhalation breathing conditions using average species-specific minute volumes. Four different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Despite the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the upper conducting airways of the human at the same air concentration of anthrax spores. This greater deposition of spores in the upper airways of the human resulted in lower penetration and deposition in the tracheobronchial airways and the deep lung than that predicted

  10. Dynamic Model of Kaplan Turbine Regulating System Suitable for Power System Analysis

    OpenAIRE

    Zhao, Jie; Wang, Li; Liu, Dichen; Wang, Jun; Zhao, Yu; Liu, Tian; Wang, Haoyu

    2015-01-01

    Accurate modeling of Kaplan turbine regulating system is of great significance for grid security and stability analysis. In this paper, Kaplan turbine regulating system model is divided into the governor system model, the blade control system model, and the turbine and water diversion system model. The Kaplan turbine has its particularity, and the on-cam relationship between the wicket gate opening and the runner blade angle under a certain water head on the whole range was obtained by high-o...

  11. Impact of changing computer technology on hydrologic and water resource modeling

    OpenAIRE

    Loucks, D.P.; Fedra, K.

    1987-01-01

    The increasing availability of substantial computer power at relatively low costs and the increasing ease of using computer graphics, of communicating with other computers and data bases, and of programming using high-level problem-oriented computer languages, is providing new opportunities and challenges for those developing and using hydrologic and water resources models. This paper reviews some of the progress made towards the development and application of computer support systems designe...

  12. Computer-Aided Model Based Analysis for Design and Operation of a Copolymerization Process

    DEFF Research Database (Denmark)

    Lopez-Arenas, Maria Teresa; Sales-Cruz, Alfonso Mauricio; Gani, Rafiqul

    2006-01-01

    The advances in computer science and computational algorithms for process modelling, process simulation, numerical methods and design/synthesis algorithms make it advantageous and helpful to employ computer-aided modelling systems and tools for integrated process analysis. This is illustrated in this work, where two first-principles models have been investigated, through the computer-aided modeling system ICAS-MoT, with respect to design and operational issues for solution copolymerization reactors in general, and for the methyl methacrylate/vinyl acetate system in particular. Model 1 is taken from the literature and is commonly used for the low-conversion region, while Model 2 has... This will allow analysis of the process behaviour, contribute to a better understanding of the polymerization process, help to avoid unsafe conditions of operation, and support the development of operational and optimizing control strategies.

  13. Modeling baroreflex regulation of heart rate during orthostatic stress

    DEFF Research Database (Denmark)

    Olufsen, Mette; Tran, Hien T.; Ottesen, Johnny T.

    2006-01-01

    During orthostatic stress, arterial and cardiopulmonary baroreflexes play a key role in maintaining arterial pressure by regulating heart rate. This study presents a mathematical model that can predict the dynamics of heart rate regulation in response to postural change from sitting to standing. The model uses blood pressure measured in the finger as an input to model heart rate dynamics in response to changes in baroreceptor nerve firing rate, sympathetic and parasympathetic responses, the vestibulo-sympathetic reflex, and concentrations of norepinephrine and acetylcholine. We formulate an inverse... In healthy and hypertensive elderly people, the hysteresis loop shifts to higher blood pressure values and its area is diminished. Finally, for hypertensive elderly people the hysteresis loop is generally not closed, indicating that during postural change from sitting to standing, the blood pressure resettles...

  14. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    Science.gov (United States)

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure remains a challenge in the clinical setting, anew for each case. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and is able to predict global airflow quantities, as well as local tissue aeration and strains, for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, which allows us to simulate EIT images based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model; thus they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, the definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results against the patient measurements. First promising results

  15. Computational Models for Nonlinear Aeroelastic Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  16. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹ TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs

  17. Computer models for optimizing radiation therapy

    International Nuclear Information System (INIS)

    Duechting, W.

    1998-01-01

    The aim of this contribution is to outline how methods of systems analysis, control theory and modelling can be applied to simulate normal and malignant cell growth and to optimize cancer treatment such as radiation therapy. Based on biological observations and cell kinetic data, several types of models have been developed describing the growth of tumor spheroids and the cell renewal of normal tissue. The irradiation model is represented by the so-called linear-quadratic model, which describes the surviving fraction as a function of the dose. Based thereon, numerous simulation runs for different treatment schemes can be performed. Thus, it is possible to study the radiation effect on tumor and normal tissue separately. Finally, this method enables a computer-assisted recommendation for an optimal patient-specific treatment schedule prior to clinical therapy. (orig.) [de
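    The linear-quadratic model mentioned above has a simple closed form; the following sketch (with illustrative, assumed values of alpha and beta, not taken from this record) shows how the surviving fraction depends on the fractionation scheme:

```python
import math

def surviving_fraction(dose_per_fraction, n_fractions, alpha=0.3, beta=0.03):
    """Linear-quadratic model: S = exp(-n * (alpha*d + beta*d^2)).

    alpha (Gy^-1) and beta (Gy^-2) are tissue-dependent; the defaults
    here are illustrative values only, not from the cited work.
    """
    d = dose_per_fraction
    return math.exp(-n_fractions * (alpha * d + beta * d * d))

# Same total dose (60 Gy), two different fractionation schemes:
conventional = surviving_fraction(2.0, 30)   # 30 fractions of 2 Gy
hypofrac = surviving_fraction(4.0, 15)       # 15 fractions of 4 Gy
# Larger fractions kill more cells at the same total dose because of
# the quadratic (beta) term.
assert hypofrac < conventional
```

This is why the abstract notes that the radiation effect on tumor and normal tissue can be studied separately: each tissue has its own alpha/beta ratio, so the same schedule yields different surviving fractions.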

  18. Quantum computing with photons: introduction to the circuit model, the one-way quantum computer, and the fundamental principles of photonic experiments

    International Nuclear Information System (INIS)

    Barz, Stefanie

    2015-01-01

    Quantum physics has revolutionized our understanding of information processing and enables computational speed-ups that are unattainable using classical computers. This tutorial reviews the fundamental tools of photonic quantum information processing. The basics of theoretical quantum computing are presented and the quantum circuit model as well as measurement-based models of quantum computing are introduced. Furthermore, it is shown how these concepts can be implemented experimentally using photonic qubits, where information is encoded in the photons’ polarization. (tutorial)

  19. Dynamical Models for Computer Viruses Propagation

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Full Text Available Nowadays, digital computer systems and networks are the main engineering tools, used in the planning, design, operation, and control of buildings, transportation, machinery, businesses, and life-sustaining devices of all sizes. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to a decrease in the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on prior knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines do against diseases and are not able to prevent new infections based on the network state. Here, a trial on modeling the propagation dynamics of computer viruses relates them to other notable events occurring in the network, permitting the establishment of preventive policies in network management. Data on three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using data collected from other viruses that formerly infected the network.
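    The autoregressive identification step described above can be sketched as a least-squares fit over lagged values; the series, the AR order, and all numbers below are synthetic illustrations, not the data collected in the study:

```python
import numpy as np

# Synthetic "infection count" series: a bell-shaped outbreak plus noise.
rng = np.random.default_rng(0)
t = np.arange(60)
series = 100.0 * np.exp(-((t - 30) / 10.0) ** 2) + rng.normal(0, 1, t.size)

p = 3  # AR order, an assumed choice
# Identify y[k] ~ a_1*y[k-1] + ... + a_p*y[k-p] by least squares.
y = series[p:]
X = np.column_stack([series[p - j:len(series) - j] for j in range(1, p + 1)])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)   # a_1 .. a_p

# One-step-ahead forecast from the last p observations (most recent first).
one_step = coeffs @ series[-1:-p - 1:-1]
```

Once the coefficients are identified from one virus's data, the same recursion can be applied to forecast another outbreak's dynamics, which is the transfer the abstract describes.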

  20. Computational Modeling of Photonic Crystal Microcavity Single-Photon Emitters

    Science.gov (United States)

    Saulnier, Nicole A.

    Conventional cryptography is based on algorithms that are mathematically complex and difficult to solve, such as factoring large numbers. The advent of a quantum computer would render these schemes useless. As scientists work to develop a quantum computer, cryptographers are developing new schemes for unconditionally secure cryptography. Quantum key distribution has emerged as one of the potential replacements for classical cryptography. It relies on the fact that measurement of a quantum bit changes the state of the bit, making undetected eavesdropping impossible. Single polarized photons can be used as the quantum bits, such that a quantum system would in some ways mirror the classical communication scheme. A quantum key distribution system would include components that create, transmit and detect single polarized photons. The focus of this work is on the development of an efficient single-photon source. This source comprises a single quantum dot inside a photonic crystal microcavity. To better understand the physics behind the device, a computational model is developed. The model uses finite-difference time-domain methods to analyze the electromagnetic field distribution in photonic crystal microcavities. It uses an 8-band k · p perturbation theory to compute the energy band structure of the epitaxially grown quantum dots. We discuss a method that combines the results of these two calculations to determine the spontaneous emission lifetime of a quantum dot in bulk material or in a microcavity. The computational models developed in this thesis are used to identify and characterize microcavities for potential use in a single-photon source. The computational tools developed are also used to investigate novel photonic crystal microcavities that incorporate 1D distributed Bragg reflectors for vertical confinement. It is found that the spontaneous emission enhancement in the quasi-3D cavities can be significantly greater than in traditional suspended slab

  1. A climatological model for risk computations incorporating site- specific dry deposition influences

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.

    1991-07-01

    A gradient-flux dry deposition module was developed for use in a climatological atmospheric transport model, the Multimedia Environmental Pollutant Assessment System (MEPAS). The atmospheric pathway model computes long-term average contaminant air concentration and surface deposition patterns surrounding a potential release site, incorporating location-specific dry deposition influences. Gradient-flux formulations are used to incorporate site and regional data in the dry deposition module for this atmospheric sector-average climatological model. Application of these formulations provides an effective means of accounting for local surface roughness in deposition computations. Linkage to a risk computation module resulted in a need for separate regional and site-specific surface deposition computations. 13 refs., 4 figs., 2 tabs
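    Gradient-flux dry deposition schemes are commonly written as a resistance analogy in which surface roughness enters through the aerodynamic resistance; a minimal sketch under neutral stability (illustrative, not the actual MEPAS formulation; rb and rc are simply taken as given):

```python
import math

VON_KARMAN = 0.4  # von Karman constant

def deposition_velocity(u_star, z_ref, z0, rb, rc):
    """Deposition velocity (m/s) from the series-resistance analogy.

    u_star: friction velocity (m/s); z_ref: reference height (m);
    z0: surface roughness length (m); rb, rc: quasi-laminar and
    surface resistances (s/m), assumed known here.
    """
    # Aerodynamic resistance for a neutral surface layer; this is where
    # local surface roughness (z0) enters the computation.
    ra = math.log(z_ref / z0) / (VON_KARMAN * u_star)
    return 1.0 / (ra + rb + rc)

# Rougher surface (larger z0) -> smaller ra -> faster deposition:
smooth = deposition_velocity(0.3, 10.0, 0.001, rb=100.0, rc=200.0)
rough = deposition_velocity(0.3, 10.0, 0.5, rb=100.0, rc=200.0)
assert rough > smooth
```

This is the mechanism behind the abstract's point that the formulation accounts for local surface roughness in deposition computations.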

  2. T-tubule disruption promotes calcium alternans in failing ventricular myocytes: mechanistic insights from computational modeling.

    Science.gov (United States)

    Nivala, Michael; Song, Zhen; Weiss, James N; Qu, Zhilin

    2015-02-01

    In heart failure (HF), T-tubule (TT) disruption contributes to dyssynchronous calcium (Ca) release and impaired contraction, but its role in arrhythmogenesis remains unclear. In this study, we investigate the effects of TT disruption and other HF remodeling factors on Ca alternans in ventricular myocytes using computer modeling. A ventricular myocyte model with detailed spatiotemporal Ca cycling modeled by a coupled Ca release unit (CRU) network was used, in which the L-type Ca channels and the ryanodine receptor (RyR) channels were simulated by random Markov transitions. TT disruption, which removes the L-type Ca channels from the associated CRUs, results in "orphaned" RyR clusters and thus provides increased opportunity for spark-induced Ca sparks to occur. This effect combined with other HF remodeling factors promoted alternans by two distinct mechanisms: 1) for normal sarco-endoplasmic reticulum Ca ATPase (SERCA) activity, alternans was caused by both CRU refractoriness and coupling. The increased opportunity for spark-induced sparks by TT disruption combined with the enhanced CRU coupling by Ca elevation in the presence or absence of increased RyR leakiness facilitated spark synchronization on alternate beats to promote Ca alternans; 2) for down-regulated SERCA, alternans was caused by the sarcoplasmic reticulum (SR) Ca load-dependent mechanism, independent of CRU refractoriness. TT disruption and increased RyR leakiness shifted and steepened the SR Ca release-load relationship, which combines with down-regulated SERCA to promote Ca alternans. In conclusion, the mechanisms of Ca alternans for normal and down-regulated SERCA are different, and TT disruption promotes Ca alternans by both mechanisms, which may contribute to alternans at different stages of HF. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Computer-Aided Construction of Chemical Kinetic Models

    Energy Technology Data Exchange (ETDEWEB)

    Green, William H. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2014-12-31

    The combustion chemistry of even simple fuels can be extremely complex, involving hundreds or thousands of kinetically significant species. The most reasonable way to deal with this complexity is to use a computer not only to numerically solve the kinetic model, but also to construct the kinetic model in the first place. Because these large models contain so many numerical parameters (e.g. rate coefficients, thermochemistry) one never has sufficient data to uniquely determine them all experimentally. Instead one must work in “predictive” mode, using theoretical rather than experimental values for many of the numbers in the model, and as appropriate refining the most sensitive numbers through experiments. Predictive chemical kinetics is exactly what is needed for computer-aided design of combustion systems based on proposed alternative fuels, particularly for early assessment of the value and viability of proposed new fuels before those fuels are commercially available. This project was aimed at making accurate predictive chemical kinetics practical; this is a challenging goal which requires a range of science advances. The project spanned a wide range from quantum chemical calculations on individual molecules and elementary-step reactions, through the development of improved rate/thermo calculation procedures, the creation of algorithms and software for constructing and solving kinetic simulations, the invention of methods for model-reduction while maintaining error control, and finally comparisons with experiment. Many of the parameters in the models were derived from quantum chemistry calculations, and the models were compared with experimental data measured in our lab or in collaboration with others.

  4. New weighted sum of gray gases model applicable to Computational Fluid Dynamics (CFD) modeling of oxy-fuel combustion

    DEFF Research Database (Denmark)

    Yin, Chungen; Johansen, Lars Christian Riis; Rosendahl, Lasse

    2010-01-01

    A new weighted sum of gray gases model (WSGGM) is derived, which is applicable to computational fluid dynamics (CFD) modeling of both air-fuel and oxy-fuel combustion. First, a computer code is developed to evaluate the emissivity of any gas mixture at any condition by using the exponential wide band model (EWBM)...

  5. Computational Modeling of Micrometastatic Breast Cancer Radiation Dose Response

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Daniel L.; Debeb, Bisrat G. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Morgan Welch Inflammatory Breast Cancer Research Program and Clinic, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Thames, Howard D. [Department of Biostatistics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Woodward, Wendy A., E-mail: wwoodward@mdanderson.org [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Morgan Welch Inflammatory Breast Cancer Research Program and Clinic, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States)

    2016-09-01

    Purpose: Prophylactic cranial irradiation (PCI) involves giving radiation to the entire brain with the goals of reducing the incidence of brain metastasis and improving overall survival. Experimentally, we have demonstrated that PCI prevents brain metastases in a breast cancer mouse model. We developed a computational model to expand on and aid in the interpretation of our experimental results. Methods and Materials: MATLAB was used to develop a computational model of brain metastasis and PCI in mice. Model input parameters were optimized such that the model output would match the experimental number of metastases per mouse from the unirradiated group. An independent in vivo limiting-dilution experiment was performed to validate the model. The effect of whole-brain irradiation at different measurement points after tumor cells were injected was evaluated in terms of the incidence, number of metastases, and tumor burden and was then compared with the corresponding experimental data. Results: In the optimized model, the correlation between the number of metastases per mouse and the experimental fits was >95%. Our attempt to validate the model with a limiting dilution assay produced 99.9% correlation with respect to the incidence of metastases. The model accurately predicted the effect of whole-brain irradiation given 3 weeks after cell injection but substantially underestimated its effect when delivered 5 days after cell injection. The model further demonstrated that delaying whole-brain irradiation until the development of gross disease introduces a dose threshold that must be reached before a reduction in incidence can be realized. Conclusions: Our computational model of mouse brain metastasis and PCI correlated strongly with our experiments with unirradiated mice. The results further suggest that early treatment of subclinical disease is more effective than irradiating established disease.

  6. Mathematical model of accelerator output characteristics and their calculation on a computer

    International Nuclear Information System (INIS)

    Mishulina, O.A.; Ul'yanina, M.N.; Kornilova, T.V.

    1975-01-01

    A mathematical model of the output characteristics of a linear accelerator is described. The model is a system of differential equations. The presence of phase limitations is a specific feature of the problem formulation, which makes it possible to ensure higher simulation accuracy and to determine a capture coefficient. An algorithm is elaborated for computing the output characteristics based on the suggested mathematical model. The output characteristics of the accelerator are the capture coefficient, the coordinate expectation characterizing the average phase value of the beam particles, the coordinate expectation characterizing the average value of the reverse relative velocity of the beam particles, and the dispersions of these coordinates. Calculation methods for the accelerator output characteristics are described in detail. The computations have been performed on the BESM-6 computer, the characteristics computing time being 2 min 20 sec. The relative error of parameter computation averages 10⁻²

  7. Computational neural network regression model for Host based Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Gautam

    2016-09-01

    Full Text Available The current scenario of information gathering and storage in secure systems is a challenging task due to increasing cyber-attacks. There exist computational neural network techniques designed for intrusion detection systems, which provide security to a single machine and to an entire network's machines. In this paper, we have used two types of computational neural network models, namely, the Generalized Regression Neural Network (GRNN) model and the Multilayer Perceptron Neural Network (MPNN) model, for a host-based intrusion detection system using log files generated by a single personal computer. The simulation results show the correctly classified percentage of the normal and abnormal (intrusion) classes using a confusion matrix. On the basis of the results and discussion, we found that the Host based Intrusion Systems Model (HISM) significantly improved the detection accuracy while retaining a minimum false alarm rate.
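    A GRNN prediction is essentially a kernel-weighted average of training targets (the Nadaraya-Watson form); a minimal sketch, with toy feature vectors and labels standing in for the log-file features used in the paper:

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """GRNN output for one query point x.

    Each training pattern contributes its target, weighted by a
    Gaussian kernel of its distance to x; sigma is the smoothing
    parameter (an assumed value here).
    """
    d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))         # kernel weights
    return float(np.dot(w, y_train) / np.sum(w))

# Toy training set: 2-D feature vectors, label 0 = normal, 1 = intrusion.
X_train = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y_train = np.array([0.0, 0.0, 1.0, 1.0])

score = grnn_predict(X_train, y_train, np.array([0.85, 0.85]))
label = int(score > 0.5)   # threshold the regression output to classify
```

Thresholding the continuous GRNN output yields the normal/abnormal decision that is then tallied in a confusion matrix.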

  8. Defining epidemics in computer simulation models: How do definitions influence conclusions?

    Directory of Open Access Journals (Sweden)

    Carolyn Orbann

    2017-06-01

    Full Text Available Computer models have proven to be useful tools in studying epidemic disease in human populations. Such models are being used by a broader base of researchers, and it has become more important to ensure that descriptions of model construction and data analyses are clear and communicate important features of model structure. Papers describing computer models of infectious disease often lack a clear description of how the data are aggregated and whether or not non-epidemic runs are excluded from analyses. Given that there is no concrete quantitative definition of what constitutes an epidemic within the public health literature, each modeler must decide on a strategy for identifying epidemics during simulation runs. Here, an SEIR model was used to test how varying the cutoff for considering a run an epidemic changes potential interpretations of simulation outcomes. Varying the cutoff from 0% to 15% of the model population ever infected with the illness generated significant differences in the number of deaths and in timing variables. These results are important for those who use models to inform public health policy, where questions about the timing or implementation of interventions might be answered using findings from computer simulation models.
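    The cutoff-sensitivity question above can be illustrated with a minimal deterministic SEIR run classified under several cutoffs; all parameter values here are assumed for illustration and are not the paper's:

```python
# Simple deterministic SEIR model with daily Euler steps. beta is the
# transmission rate, sigma the incubation rate, gamma the recovery rate;
# all values below are illustrative assumptions.
def seir_ever_infected(beta, sigma=0.2, gamma=0.1, n=1000, days=365):
    s, e, i, r = n - 1.0, 0.0, 1.0, 0.0
    for _ in range(days):
        new_e = beta * s * i / n
        s, e, i, r = (s - new_e,
                      e + new_e - sigma * e,
                      i + sigma * e - gamma * i,
                      r + gamma * i)
    return (e + i + r) / n   # fraction of the population ever infected

frac = seir_ever_infected(beta=0.3)
# The same run is or is not "an epidemic" depending on the cutoff chosen:
flags = {cutoff: frac > cutoff for cutoff in (0.0, 0.05, 0.15)}
```

A run whose final size falls between two cutoffs is included in the analysis under one definition and excluded under the other, which is exactly the source of the differences in outcome statistics that the study reports.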

  9. Computational Models for Nonlinear Aeroelastic Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate a new and efficient computational method of modeling nonlinear aeroelastic systems. The...

  10. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural network (ANN) and fuzzy logic approaches are proven to be efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining both these approaches, and as a result, neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of this neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro fuzzy inference system (ANFIS) to hydrologic time series modeling, and is illustrated by an application to model the river flow of Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most of the time series modeling techniques. The results showed that the ANFIS forecasted flow series preserves the statistical properties of the original flow series. The model showed good performance in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, peak flow estimation etc. It was observed that the ANFIS model preserves the potential of the ANN approach fully, and eases the model building process.

  11. Grid computing in large pharmaceutical molecular modeling.

    Science.gov (United States)

    Claus, Brian L; Johnson, Stephen R

    2008-07-01

    Most major pharmaceutical companies have employed grid computing to expand their compute resources with the intention of minimizing additional financial expenditure. Historically, one of the issues restricting widespread utilization of the grid resources in molecular modeling is the limited set of suitable applications amenable to coarse-grained parallelization. Recent advances in grid infrastructure technology coupled with advances in application research and redesign will enable fine-grained parallel problems, such as quantum mechanics and molecular dynamics, which were previously inaccessible to the grid environment. This will enable new science as well as increase resource flexibility to load balance and schedule existing workloads.

  12. Human performance models for computer-aided engineering

    Science.gov (United States)

    Elkind, Jerome I. (Editor); Card, Stuart K. (Editor); Hochberg, Julian (Editor); Huey, Beverly Messick (Editor)

    1989-01-01

    This report discusses a topic important to the field of computational human factors: models of human performance and their use in computer-based engineering facilities for the design of complex systems. It focuses on a particular human factors design problem -- the design of cockpit systems for advanced helicopters -- and on a particular aspect of human performance -- vision and related cognitive functions. By focusing in this way, the authors were able to address the selected topics in some depth and develop findings and recommendations that they believe have application to many other aspects of human performance and to other design domains.

  13. A Parallel and Distributed Surrogate Model Implementation for Computational Steering

    KAUST Repository

    Butnaru, Daniel; Buse, Gerrit; Pfluger, Dirk

    2012-01-01

    ...of the input parameters. Such an exploration process is, however, not possible if the simulation is computationally too expensive. For these cases we present in this paper a scalable computational steering approach utilizing a fast surrogate model as substitute

  14. Modeling promoter grammars with evolving hidden Markov models

    DEFF Research Database (Denmark)

    Won, Kyoung-Jae; Sandelin, Albin; Marstrand, Troels Torben

    2008-01-01

    MOTIVATION: Describing and modeling biological features of eukaryotic promoters remains an important and challenging problem within computational biology. The promoters of higher eukaryotes in particular display a wide variation in regulatory features, which are difficult to model. Often several...... factors are involved in the regulation of a set of co-regulated genes. If so, promoters can be modeled with connected regulatory features, where the network of connections is characteristic for a particular mode of regulation. RESULTS: With the goal of automatically deciphering such regulatory structures......, we present a method that iteratively evolves an ensemble of regulatory grammars using a hidden Markov Model (HMM) architecture composed of interconnected blocks representing transcription factor binding sites (TFBSs) and background regions of promoter sequences. The ensemble approach reduces the risk...
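A minimal forward-algorithm sketch illustrates how an HMM of this kind scores a sequence. The two states, transition probabilities and emission probabilities below are hypothetical toy values, not the promoter grammar ensembles of the paper:

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: total probability of an observation sequence under an HMM."""
    # alpha[s] = P(obs so far, current state = s)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for symbol in obs[1:]:
        alpha = {
            s: sum(alpha[p] * trans_p[p][s] for p in states) * emit_p[s][symbol]
            for s in states
        }
    return sum(alpha.values())

# Hypothetical two-state model: a TFBS-like state and a background state.
states = ("tfbs", "bg")
start_p = {"tfbs": 0.1, "bg": 0.9}
trans_p = {"tfbs": {"tfbs": 0.7, "bg": 0.3}, "bg": {"tfbs": 0.1, "bg": 0.9}}
emit_p = {"tfbs": {"A": 0.4, "C": 0.1, "G": 0.4, "T": 0.1},
          "bg":   {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}}

p = forward("ACGT", states, start_p, trans_p, emit_p)
```

Because the start and transition distributions are properly normalized, the probabilities of all sequences of a given length sum to one, which gives a quick sanity check on any such implementation.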

  15. Computational Aerodynamic Modeling of Small Quadcopter Vehicles

    Science.gov (United States)

    Yoon, Seokkwan; Ventura Diaz, Patricia; Boyd, D. Douglas; Chan, William M.; Theodore, Colin R.

    2017-01-01

    High-fidelity computational simulations have been performed which focus on rotor-fuselage and rotor-rotor aerodynamic interactions of small quad-rotor vehicle systems. The three-dimensional unsteady Navier-Stokes equations are solved on overset grids using high-order accurate schemes, dual-time stepping, low Mach number preconditioning, and hybrid turbulence modeling. Computational results for isolated rotors are shown to compare well with available experimental data. Computational results in hover reveal the differences between a conventional configuration where the rotors are mounted above the fuselage and an unconventional configuration where the rotors are mounted below the fuselage. Complex flow physics in forward flight is investigated. The goal of this work is to demonstrate that understanding of interactional aerodynamics can be an important factor in design decisions regarding rotor and fuselage placement for next-generation multi-rotor drones.

  16. Computational fluid dynamics modelling in cardiovascular medicine.

    Science.gov (United States)

    Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2016-01-01

    This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges.

  17. What can be learned from computer modeling? Comparing expository and modeling approaches to teaching dynamic systems behavior

    NARCIS (Netherlands)

    van Borkulo, S.P.; van Joolingen, W.R.; Savelsbergh, E.R.; de Jong, T.

    2012-01-01

    Computer modeling has been widely promoted as a means to attain higher order learning outcomes. Substantiating these benefits, however, has been problematic due to a lack of proper assessment tools. In this study, we compared computer modeling with expository instruction, using a tailored assessment

  18. Ex Vivo Methods for Informing Computational Models of the Mitral Valve

    OpenAIRE

    Bloodworth, Charles H.; Pierce, Eric L.; Easley, Thomas F.; Drach, Andrew; Khalighi, Amir H.; Toma, Milan; Jensen, Morten O.; Sacks, Michael S.; Yoganathan, Ajit P.

    2016-01-01

    Computational modeling of the mitral valve (MV) has potential applications for determining optimal MV repair techniques and risk of recurrent mitral regurgitation. Two key concerns for informing these models are (1) sensitivity of model performance to the accuracy of the input geometry, and, (2) acquisition of comprehensive data sets against which the simulation can be validated across clinically relevant geometries. Addressing the first concern, ex vivo micro-computed tomography (microCT) wa...

  19. FCJ-131 Pervasive Computing and Prosopopoietic Modelling – Notes on computed function and creative action

    Directory of Open Access Journals (Sweden)

    Anders Michelsen

    2011-12-01

    Full Text Available This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT that have spread vertiginously since Mark Weiser coined the term ‘pervasive’, e.g., digitalised sensing, monitoring, effectuation, intelligence, and display. Whereas Weiser’s original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown’s (1997) terms, ‘invisible’, on the horizon, ’calm’, it also points to a much more important and slightly different perspective: that of creative action upon novel forms of artifice. Most importantly for this article, ubiquity and pervasive computing are seen to point to the continuous existence throughout the computational heritage since the mid-20th century of a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one into the other. It also indicates a generative creation that itself points to important issues of ontology with methodological implications for the design of computing. In this article these implications will be conceptualised as prosopopoietic modeling on the basis of Bernward Joerges' introduction of the classical rhetoric term of ’prosopopoeia’ into the debate on large technological systems. First, the paper introduces the paradoxical distinction/complicity by debating Gilbert Simondon’s notion of a ‘margin of indeterminacy’ vis-a-vis computing. Second, it debates the idea of prosopopoietic modeling, pointing to a principal role of the paradoxical distinction/complicity within the computational heritage in three cases: a. Prosopopoietic

  20. Predictive Behavior of a Computational Foot/Ankle Model through Artificial Neural Networks.

    Science.gov (United States)

    Chande, Ruchi D; Hargraves, Rosalyn Hobson; Ortiz-Robinson, Norma; Wayne, Jennifer S

    2017-01-01

    Computational models are useful tools to study the biomechanics of human joints. Their predictive performance is heavily dependent on bony anatomy and soft tissue properties. Imaging data provides anatomical requirements while approximate tissue properties are implemented from literature data, when available. We sought to improve the predictive capability of a computational foot/ankle model by optimizing its ligament stiffness inputs using feedforward and radial basis function neural networks. While the former demonstrated better performance than the latter per mean square error, both networks provided reasonable stiffness predictions for implementation into the computational model.
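As a rough illustration of the feedforward approach, the sketch below trains a tiny one-hidden-layer network by gradient descent to regress a scalar "stiffness" target. The architecture, synthetic data and learning rate are illustrative assumptions, not the authors' networks or their foot/ankle dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(64, 3))        # e.g. normalized anatomical inputs
y = (X @ np.array([2.0, -1.0, 0.5]))[:, None]  # synthetic "stiffness" targets

# One hidden layer with tanh activation, linear output.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def mse():
    h = np.tanh(X @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - y) ** 2))

loss_before = mse()
lr = 0.05
for _ in range(500):
    h = np.tanh(X @ W1 + b1)       # forward pass
    pred = h @ W2 + b2
    g = 2 * (pred - y) / len(X)    # gradient of MSE w.r.t. predictions
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = g @ W2.T * (1 - h ** 2)   # backprop through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
loss_after = mse()
```

In the paper's setting the trained network would supply optimized ligament stiffness values back to the finite element model rather than be used directly for prediction.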

  1. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns...

  2. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Science.gov (United States)

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

    Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computation models to obtain general conclusions that can provide useful guidance to construct more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have been earlier used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, final formulas of these algorithms were all converted to linear model in form, based on this finding we propose a new algorithm called the general weighted profile method and it yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
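The Jaccard coefficient highlighted above is straightforward to compute; the drug names and ADR profiles in this sketch are hypothetical examples, not data from the study:

```python
def jaccard(a, b):
    """Jaccard coefficient between two sets, e.g. the ADR profiles of two drugs."""
    a, b = set(a), set(b)
    if not a and not b:
        return 0.0   # convention: two empty profiles share no evidence
    return len(a & b) / len(a | b)

# Hypothetical ADR profiles: a high coefficient suggests the drugs may share
# further, as-yet-unobserved reactions.
drug_x = {"nausea", "headache", "rash"}
drug_y = {"nausea", "rash", "dizziness"}
sim = jaccard(drug_x, drug_y)   # 2 shared reactions out of 4 distinct ones
```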

  3. Phenomenological optical potentials and optical model computer codes

    International Nuclear Information System (INIS)

    Prince, A.

    1980-01-01

    An introduction to the Optical Model is presented. Starting with the purpose and nature of the physical problems to be analyzed, a general formulation and the various phenomenological methods of solution are discussed. This includes the calculation of observables based on assumed potentials such as local and non-local and their forms, e.g. Woods-Saxon, folded model etc. Also discussed are the various calculational methods and model codes employed to describe nuclear reactions in the spherical and deformed regions (e.g. coupled-channel analysis). An examination of the numerical solutions and minimization techniques associated with the various codes, is briefly touched upon. Several computer programs are described for carrying out the calculations. The preparation of input, (formats and options), determination of model parameters and analysis of output are described. The class is given a series of problems to carry out using the available computer. Interpretation and evaluation of the samples includes the effect of varying parameters, and comparison of calculations with the experimental data. Also included is an intercomparison of the results from the various model codes, along with their advantages and limitations. (author)
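For reference, the Woods-Saxon form mentioned above can be written down directly; the depth, radius and diffuseness values used here are illustrative defaults, not parameters from any particular analysis:

```python
import math

def woods_saxon(r, V0=50.0, R=4.0, a=0.6):
    """Woods-Saxon potential well: depth V0 (MeV), radius R and diffuseness a (fm).

    V(r) = -V0 / (1 + exp((r - R) / a)); the well is flat near the origin and
    falls to half depth at r = R over a surface thickness set by a.
    """
    return -V0 / (1.0 + math.exp((r - R) / a))
```

Varying V0, R and a against experimental observables is exactly the kind of parameter study the model codes described here automate.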

  4. Security Issues Model on Cloud Computing: A Case of Malaysia

    OpenAIRE

    Komeil Raisian; Jamaiah Yahaya

    2015-01-01

    With the development of cloud computing, many people's views of infrastructure architectures, software distribution and development models have changed significantly. Cloud computing is associated with a pioneering deployment architecture that builds on grid computing, utility computing and autonomic computing. The rapid transition towards it has heightened concerns about security, a critical issue for the effective adoption of cloud computing. From the security v...

  5. Computer Modeling of the Effects of Atmospheric Conditions on Sound Signatures

    Science.gov (United States)

    2016-02-01

    ARL-TR-7602, FEB 2016, US Army Research Laboratory: Computer Modeling of the Effects of Atmospheric Conditions on Sound Signatures, by Sarah Wagner, Science and Engineering Apprentice. Cited reference: Attenborough K. Sound propagation in the atmosphere. In: Rossing TD, editor. Springer Handbook of...

  6. The Effectiveness of Self-Regulated Learning Scaffolds on Academic Performance in Computer-Based Learning Environments: A Meta-Analysis

    Science.gov (United States)

    Zheng, Lanqin

    2016-01-01

    This meta-analysis examined research on the effects of self-regulated learning scaffolds on academic performance in computer-based learning environments from 2004 to 2015. A total of 29 articles met inclusion criteria and were included in the final analysis with a total sample size of 2,648 students. Moderator analyses were performed using a…

  7. An approximate fractional Gaussian noise model with computational cost

    KAUST Repository

    Sø rbye, Sigrunn H.; Myrvoll-Nilsen, Eirik; Rue, Haavard

    2017-01-01

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood
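The long-memory property of fGn comes from its autocovariance function, for which the standard closed form is γ(k) = σ²((k+1)^{2H} − 2k^{2H} + |k−1|^{2H})/2. The snippet below implements this textbook formula (standard background, not material from the truncated abstract):

```python
def fgn_autocov(k, H, sigma2=1.0):
    """Autocovariance of fractional Gaussian noise at integer lag k.

    H is the Hurst index (0 < H < 1); H = 0.5 recovers white noise,
    H > 0.5 gives positively correlated, long-memory increments.
    """
    k = abs(k)
    return 0.5 * sigma2 * (
        (k + 1) ** (2 * H) - 2 * k ** (2 * H) + abs(k - 1) ** (2 * H)
    )
```

Evaluating this covariance for all n lags is what makes exact likelihood fitting expensive for long series, motivating approximations like the one proposed here.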

  8. Approach to Computer Implementation of Mathematical Model of 3-Phase Induction Motor

    Science.gov (United States)

    Pustovetov, M. Yu

    2018-03-01

    This article discusses the development of a computer model of an induction motor based on the mathematical model in a three-phase stator reference frame. It uses an approach that allows two methods to be combined during preparation of the computer model: visual circuit programming (in the form of electrical schematics) and logical programming (in the form of block diagrams). The approach enables easy integration of the induction motor model as part of more complex models of electrical complexes and systems. The developed computer model gives the user access to the beginning and the end of the winding of each of the three phases of the stator and rotor. This property is particularly important when considering asymmetric modes of operation or when the motor is powered by special semiconductor converter circuitry.

  9. Computer Models and Automata Theory in Biology and Medicine

    CERN Document Server

    Baianu, I C

    2004-01-01

    The applications of computers to biological and biomedical problem solving goes back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout...

  10. Computational model of a whole tree combustor

    Energy Technology Data Exchange (ETDEWEB)

    Bryden, K.M.; Ragland, K.W. [Univ. of Wisconsin, Madison, WI (United States)

    1993-12-31

    A preliminary computational model has been developed for the whole tree combustor and compared to test results. In the simulation model presented hardwood logs, 15 cm in diameter are burned in a 4 m deep fuel bed. Solid and gas temperature, solid and gas velocity, CO, CO{sub 2}, H{sub 2}O, HC and O{sub 2} profiles are calculated. This deep, fixed bed combustor obtains high energy release rates per unit area due to the high inlet air velocity and extended reaction zone. The lowest portion of the overall bed is an oxidizing region and the remainder of the bed acts as a gasification and drying region. The overfire air region completes the combustion. Approximately 40% of the energy is released in the lower oxidizing region. The wood consumption rate obtained from the computational model is 4,110 kg/m{sup 2}-hr which matches well the consumption rate of 3,770 kg/m{sup 2}-hr observed during the peak test period of the Aurora, MN test. The predicted heat release rate is 16 MW/m{sup 2} (5.0*10{sup 6} Btu/hr-ft{sup 2}).
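A quick unit conversion confirms that the two heat-release figures quoted above are mutually consistent; only standard conversion factors are used, no new data:

```python
# Convert the reported 5.0e6 Btu/hr-ft^2 heat release rate to MW/m^2.
BTU_PER_HR_TO_W = 0.29307107   # 1 Btu/hr in watts
FT2_TO_M2 = 0.09290304         # 1 ft^2 in m^2

q_btu = 5.0e6                                       # Btu/hr-ft^2, as reported
q_mw = q_btu * BTU_PER_HR_TO_W / FT2_TO_M2 / 1e6    # -> approximately 15.8 MW/m^2
```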

  11. Computational 3-D Model of the Human Respiratory System

    Science.gov (United States)

    We are developing a comprehensive, morphologically-realistic computational model of the human respiratory system that can be used to study the inhalation, deposition, and clearance of contaminants, while being adaptable for age, race, gender, and health/disease status. The model ...

  12. Interactive computer graphics for bio-stereochemical modelling

    Indian Academy of Sciences (India)

    Proc. Indian Acad. Sci., Vol. 87 A (Chem. Sci.), No. 4, April 1978, pp. 95-113. Printed in India. Interactive computer graphics for bio-stereochemical modelling. Robert Rein, Shlomo Nir, Karen Haydock and Robert D. MacElroy. Department of Experimental Pathology, Roswell Park Memorial Institute, 666 Elm ...

  13. Simulation model of load balancing in distributed computing systems

    Science.gov (United States)

    Botygin, I. A.; Popov, V. N.; Frolov, S. G.

    2017-02-01

    The availability of high-performance computing, high speed data transfer over the network and widespread of software for the design and pre-production in mechanical engineering have led to the fact that at the present time the large industrial enterprises and small engineering companies implement complex computer systems for efficient solutions of production and management tasks. Such computer systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key models of research, but the system-wide problems of efficient distribution (balancing) of the computational load and accommodation input, intermediate and output databases are no less important. The main tasks of this balancing system are load and condition monitoring of compute nodes, and the selection of a node for transition of the user’s request in accordance with a predetermined algorithm. The load balancing is one of the most used methods of increasing productivity of distributed computing systems through the optimal allocation of tasks between the computer system nodes. Therefore, the development of methods and algorithms for computing optimal scheduling in a distributed system, dynamically changing its infrastructure, is an important task.
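The node-selection step described above can be sketched with a minimal least-loaded policy; the node names, load metric and the policy itself are illustrative assumptions, since the paper's actual selection algorithm is configurable:

```python
def pick_node(loads):
    """Select the target compute node for an incoming request:
    the node with the smallest current load."""
    return min(loads, key=loads.get)

# Hypothetical monitored load (fraction of capacity) per node.
nodes = {"node-a": 0.72, "node-b": 0.31, "node-c": 0.55}
target = pick_node(nodes)
nodes[target] += 0.05   # account for the newly assigned task
```

In a real balancer the `nodes` map would be refreshed by the monitoring subsystem rather than updated optimistically as here.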

  14. Computational brain models: Advances from system biology and future challenges

    Directory of Open Access Journals (Sweden)

    George E. Barreto

    2015-02-01

    Full Text Available Computational brain models focused on the interactions between neurons and astrocytes, modeled via metabolic reconstructions, are reviewed. The large body of experimental data provided by -omics techniques and advances in computational and data-management tools have been fundamental, for instance, in understanding the crosstalk between these cells, the key neuroprotective mechanisms mediated by astrocytes in specific metabolic scenarios (1), and the identification of biomarkers for neurodegenerative diseases (2,3). However, the modeling of these interactions demands a clear view of the metabolic and signaling pathways implicated, but most of them are controversial and are still under evaluation (4). Hence, to gain insight into the complexity of these interactions, a current view of the main pathways implicated in neuron-astrocyte communication has been compiled from recent experimental reports and reviews. Furthermore, target problems, limitations and main conclusions have been identified from metabolic models of the brain reported since 2010. Finally, key aspects to take into account in the development of a computational model of the brain, and topics that could be approached from a systems biology perspective in future research, are highlighted.

  15. Some Comparisons of Complexity in Dictionary-Based and Linear Computational Models

    Czech Academy of Sciences Publication Activity Database

    Gnecco, G.; Kůrková, Věra; Sanguineti, M.

    2011-01-01

    Roč. 24, č. 2 (2011), s. 171-182. ISSN 0893-6080. R&D Projects: GA ČR GA201/08/1744. Grant - others: CNR - AV ČR project 2010-2012 (XE) Complexity of Neural-Network and Kernel Computational Models. Institutional research plan: CEZ:AV0Z10300504. Keywords: linear approximation schemes * variable-basis approximation schemes * model complexity * worst-case errors * neural networks * kernel models. Subject RIV: IN - Informatics, Computer Science. Impact factor: 2.182, year: 2011

  16. Method of generating a computer readable model

    DEFF Research Database (Denmark)

    2008-01-01

    A method of generating a computer readable model of a geometrical object constructed from a plurality of interconnectable construction elements, wherein each construction element has a number of connection elements for connecting the construction element with another construction element. The method comprises encoding a first and a second one of the construction elements as corresponding data structures, each representing the connection elements of the corresponding construction element, and each of the connection elements having associated with it a predetermined connection type. The method further comprises determining a first connection element of the first construction element and a second connection element of the second construction element located in a predetermined proximity of each other; and retrieving connectivity information of the corresponding connection types of the first...
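The encoding described above can be sketched as simple data structures: each element carries connection elements with a position and a predetermined connection type, and a proximity test finds candidate pairs. All names, values and the distance rule here are illustrative assumptions, not the method's actual representation:

```python
from dataclasses import dataclass, field

@dataclass
class Connection:
    position: tuple   # (x, y, z) in the element's local frame
    ctype: str        # predetermined connection type, e.g. "stud" / "tube"

@dataclass
class Element:
    name: str
    connections: list = field(default_factory=list)

def close_pairs(a, b, offset, tol=0.1):
    """Return connection pairs of elements a and b lying within `tol`
    of each other, with b translated by `offset` relative to a's frame."""
    pairs = []
    for ca in a.connections:
        for cb in b.connections:
            d2 = sum((p - (q + o)) ** 2
                     for p, q, o in zip(ca.position, cb.position, offset))
            if d2 ** 0.5 <= tol:
                pairs.append((ca, cb))
    return pairs

# Two hypothetical bricks: a stud on top of one meets a tube under the other.
brick_a = Element("brick-a", [Connection((0.0, 0.0, 1.0), "stud")])
brick_b = Element("brick-b", [Connection((0.0, 0.0, 0.0), "tube")])
pairs = close_pairs(brick_a, brick_b, offset=(0.0, 0.0, 1.0))
```

A connectivity table keyed on the pair of connection types would then decide whether each candidate pair actually connects.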

  17. A survey on computational intelligence approaches for predictive modeling in prostate cancer

    OpenAIRE

    Cosma, G; Brown, D; Archer, M; Khan, M; Pockley, AG

    2017-01-01

    Predictive modeling in medicine involves the development of computational models which are capable of analysing large amounts of data in order to predict healthcare outcomes for individual patients. Computational intelligence approaches are suitable when the data to be modelled are too complex for conventional statistical techniques to process quickly and efficiently. These advanced approaches are based on mathematical models that have been especially developed for dealing with the uncertainty an...

  18. Recommendations for computer modeling codes to support the UMTRA groundwater restoration project

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, M.D. [Sandia National Labs., Albuquerque, NM (United States); Khan, M.A. [IT Corp., Albuquerque, NM (United States)

    1996-04-01

    The Uranium Mill Tailings Remediation Action (UMTRA) Project is responsible for the assessment and remedial action at the 24 former uranium mill tailings sites located in the US. The surface restoration phase, which includes containment and stabilization of the abandoned uranium mill tailings piles, has a specific termination date and is nearing completion. Therefore, attention has now turned to the groundwater restoration phase, which began in 1991. Regulated constituents in groundwater whose concentrations or activities exceed maximum contaminant levels (MCLs) or background levels at one or more sites include, but are not limited to, uranium, selenium, arsenic, molybdenum, nitrate, gross alpha, radium-226 and radium-228. The purpose of this report is to recommend computer codes that can be used to assist the UMTRA groundwater restoration effort. The report includes a survey of applicable codes in each of the following areas: (1) groundwater flow and contaminant transport modeling codes, (2) hydrogeochemical modeling codes, (3) pump and treat optimization codes, and (4) decision support tools. Following the survey of the applicable codes, specific codes that can best meet the needs of the UMTRA groundwater restoration program in each of the four areas are recommended.

  19. Recommendations for computer modeling codes to support the UMTRA groundwater restoration project

    International Nuclear Information System (INIS)

    Tucker, M.D.; Khan, M.A.

    1996-04-01

    The Uranium Mill Tailings Remediation Action (UMTRA) Project is responsible for the assessment and remedial action at the 24 former uranium mill tailings sites located in the US. The surface restoration phase, which includes containment and stabilization of the abandoned uranium mill tailings piles, has a specific termination date and is nearing completion. Therefore, attention has now turned to the groundwater restoration phase, which began in 1991. Regulated constituents in groundwater whose concentrations or activities exceed maximum contaminant levels (MCLs) or background levels at one or more sites include, but are not limited to, uranium, selenium, arsenic, molybdenum, nitrate, gross alpha, radium-226 and radium-228. The purpose of this report is to recommend computer codes that can be used to assist the UMTRA groundwater restoration effort. The report includes a survey of applicable codes in each of the following areas: (1) groundwater flow and contaminant transport modeling codes, (2) hydrogeochemical modeling codes, (3) pump and treat optimization codes, and (4) decision support tools. Following the survey of the applicable codes, specific codes that can best meet the needs of the UMTRA groundwater restoration program in each of the four areas are recommended

  20. "Let's get physical": advantages of a physical model over 3D computer models and textbooks in learning imaging anatomy.

    Science.gov (United States)

    Preece, Daniel; Williams, Sarah B; Lam, Richard; Weller, Renate

    2013-01-01

    Three-dimensional (3D) information plays an important part in medical and veterinary education. Appreciating complex 3D spatial relationships requires a strong foundational understanding of anatomy and mental 3D visualization skills. Novel learning resources have been introduced to anatomy training to achieve this. Objective evaluation of their comparative efficacies remains scarce in the literature. This study developed and evaluated the use of a physical model in demonstrating the complex spatial relationships of the equine foot. It was hypothesized that the newly developed physical model would be more effective for students to learn magnetic resonance imaging (MRI) anatomy of the foot than textbooks or computer-based 3D models. Third year veterinary medicine students were randomly assigned to one of three teaching aid groups (physical model; textbooks; 3D computer model). The comparative efficacies of the three teaching aids were assessed through students' abilities to identify anatomical structures on MR images. Overall mean MRI assessment scores were significantly higher in students utilizing the physical model (86.39%) compared with students using textbooks (62.61%) and the 3D computer model (63.68%) (P < 0.001), with no significant difference between the textbook and 3D computer model groups (P = 0.685). Student feedback was also more positive in the physical model group compared with both the textbook and 3D computer model groups. Our results suggest that physical models may hold a significant advantage over alternative learning resources in enhancing visuospatial and 3D understanding of complex anatomical architecture, and that 3D computer models have significant limitations with regards to 3D learning. © 2013 American Association of Anatomists.

  1. Multi-objective reverse logistics model for integrated computer waste management.

    Science.gov (United States)

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrated example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.
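The Monte Carlo treatment of uncertain waste quantities can be sketched as follows; the source means and standard deviations are hypothetical stand-ins, not the Delhi case-study data:

```python
import random

random.seed(42)

# Hypothetical (mean, std) computer-waste generation per source, tonnes/yr.
sources = [(10.0, 2.0), (25.0, 5.0), (8.0, 1.5)]

def simulate_total():
    """One Monte Carlo draw of the total waste quantity across all sources,
    truncated at zero since negative waste is physically meaningless."""
    return sum(max(0.0, random.gauss(mu, sd)) for mu, sd in sources)

samples = [simulate_total() for _ in range(10_000)]
mean_total = sum(samples) / len(samples)
```

Feeding such sampled quantities into the integer linear program yields a distribution of cost and risk outcomes rather than a single point estimate, which is how the model exposes the cost-risk tradeoff.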

  2. Computational models of the pulmonary circulation: Insights and the move towards clinically directed studies

    Science.gov (United States)

    Tawhai, Merryn H.; Clark, Alys R.; Burrowes, Kelly S.

    2011-01-01

    Biophysically-based computational models provide a tool for integrating and explaining experimental data, observations, and hypotheses. Computational models of the pulmonary circulation have evolved from minimal and efficient constructs used to study individual mechanisms that contribute to lung perfusion, to sophisticated multi-scale, multi-physics structure-based models that predict integrated structure-function relationships within a heterogeneous organ. This review considers the utility of computational models in providing new insights into the function of the pulmonary circulation, and their application in clinically motivated studies. We review mathematical and computational models of the pulmonary circulation according to their application, beginning with models that address questions in basic science and physiology and progressing to models that aim at clinical application. Looking forward, we discuss the relative merits and clinical relevance of computational models, the important features they still lack, and how they may ultimately be applied to further increase our understanding of the mechanisms at work in diseases of the pulmonary circulation. PMID:22034608
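    As a hedged, minimal illustration of the "minimal and efficient constructs" the review describes, the sketch below computes the total Poiseuille resistance of a symmetric bifurcating vessel tree; the dimensions, viscosity, and branching ratio are illustrative assumptions, not values from the review.

```python
import math

def poiseuille_resistance(radius, length, mu=3.5e-3):
    """Poiseuille resistance of a rigid tube (SI units; mu ~ blood viscosity)."""
    return 8 * mu * length / (math.pi * radius ** 4)

def tree_resistance(r0, l0, generations, ratio=0.79):
    """Total resistance of a symmetric bifurcating tree: generation g holds
    2**g identical vessels in parallel; radii and lengths shrink by `ratio`
    at each branching."""
    total, r, l = 0.0, r0, l0
    for g in range(generations):
        total += poiseuille_resistance(r, l) / (2 ** g)
        r *= ratio
        l *= ratio
    return total
```

    The strong fourth-power dependence on radius is why even reduced models of this kind capture how small changes in vessel caliber redistribute lung perfusion.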

  3. International Nuclear Model personal computer (PCINM): Model documentation

    International Nuclear Information System (INIS)

    1992-08-01

    The International Nuclear Model (INM) was developed to assist the Energy Information Administration (EIA), U.S. Department of Energy (DOE) in producing worldwide projections of electricity generation, fuel cycle requirements, capacities, and spent fuel discharges from commercial nuclear reactors. The original INM was developed, maintained, and operated on a mainframe computer system. In spring 1992, a streamlined version of INM was created for use on a microcomputer utilizing CLIPPER and PCSAS software. This new version is known as PCINM. This documentation is based on the new PCINM version. This document is designed to satisfy the requirements of several categories of users of the PCINM system including technical analysts, theoretical modelers, and industry observers. This document assumes the reader is familiar with the nuclear fuel cycle and each of its components. This model documentation contains four chapters and seven appendices. Chapter Two presents the model overview containing the PCINM structure and process flow, the areas for which projections are made, and input data and output reports. Chapter Three presents the model technical specifications showing all model equations, algorithms, and units of measure. Chapter Four presents an overview of all parameters, variables, and assumptions used in PCINM. The appendices present the following detailed information: variable and parameter listings, variable and equation cross reference tables, source code listings, file layouts, sample report outputs, and model run procedures. 2 figs

  4. Mathematical modeling and computational prediction of cancer drug resistance.

    Science.gov (United States)

    Sun, Xiaoqiang; Hu, Bin

    2017-06-23

    Diverse forms of resistance to anticancer drugs can lead to the failure of chemotherapy. Drug resistance is one of the most intractable issues for successfully treating cancer in current clinical practice. Effective clinical approaches that could counter drug resistance by restoring the sensitivity of tumors to the targeted agents are urgently needed. As numerous experimental results on resistance mechanisms have been obtained and a mass of high-throughput data has been accumulated, mathematical modeling and computational predictions using systematic and quantitative approaches have become increasingly important, as they can potentially provide deeper insights into resistance mechanisms, generate novel hypotheses or suggest promising treatment strategies for future testing. In this review, we first briefly summarize the current progress of experimentally revealed resistance mechanisms of targeted therapy, including genetic mechanisms, epigenetic mechanisms, posttranslational mechanisms, cellular mechanisms, microenvironmental mechanisms and pharmacokinetic mechanisms. Subsequently, we list several currently available databases and Web-based tools related to drug sensitivity and resistance. Then, we focus primarily on introducing some state-of-the-art computational methods used in drug resistance studies, including mechanism-based mathematical modeling approaches (e.g. molecular dynamics simulation, kinetic model of molecular networks, ordinary differential equation model of cellular dynamics, stochastic model, partial differential equation model, agent-based model, pharmacokinetic-pharmacodynamic model, etc.) and data-driven prediction methods (e.g. omics data-based conventional screening approach for node biomarkers, static network approach for edge biomarkers and module biomarkers, dynamic network approach for dynamic network biomarkers and dynamic module network biomarkers, etc.). Finally, we discuss several further questions and future directions for the use of
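    As a hedged illustration of one listed method class, the ordinary differential equation model of cellular dynamics, the sketch below Euler-integrates a two-population sensitive/resistant tumor model under continuous drug exposure; all rates and initial counts are illustrative assumptions, not values from the review.

```python
def simulate_resistance(s0=1e6, r0=1e2, growth=0.03, kill=0.06,
                        mut=1e-6, dt=0.1, days=120):
    """Euler sketch of a two-population ODE model of drug resistance:
    sensitive cells S grow at `growth` per day but are killed by the drug
    at `kill`; resistant cells R grow unchecked; mutation converts S to R."""
    s, r = s0, r0
    for _ in range(int(days / dt)):
        ds = (growth - kill - mut) * s
        dr = growth * r + mut * s
        s += ds * dt
        r += dr * dt
    return s, r

s_final, r_final = simulate_resistance()
```

    Even this minimal model reproduces the qualitative clinical pattern: the drug shrinks the sensitive population while the resistant clone expands, eventually dominating the tumor.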

  5. Modeling and Analysis of Power Processing Systems (MAPPS). Volume 2: Appendices

    Science.gov (United States)

    Lee, F. C.; Radman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.

    1980-01-01

    The computer programs and derivations generated in support of the modeling and design optimization program are presented. Programs for the buck regulator, boost regulator, and buck-boost regulator are described. The computer program for the design optimization calculations is presented. Constraints for the boost and buck-boost converters were derived. Derivations of state-space equations and transfer functions are presented. Computer listings for the converters are presented, and the input parameters are justified.
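    The state-space averaged converter models the report derives can be illustrated with a minimal sketch of an ideal buck regulator; the component values and duty cycle below are arbitrary assumptions, not the report's design parameters.

```python
def buck_averaged_step(iL, vC, d, vin, L=100e-6, C=220e-6, R=10.0, dt=1e-6):
    """One Euler step of the state-space averaged model of an ideal buck
    converter:  L * diL/dt = d*vin - vC ;  C * dvC/dt = iL - vC/R."""
    diL = (d * vin - vC) / L
    dvC = (iL - vC / R) / C
    return iL + diL * dt, vC + dvC * dt

# Let the averaged model settle: an ideal buck converges to vC = d * vin
iL, vC = 0.0, 0.0
for _ in range(200_000):
    iL, vC = buck_averaged_step(iL, vC, d=0.5, vin=12.0)
```

    The steady state (output voltage equal to duty cycle times input voltage, inductor current equal to load current) falls directly out of setting both derivatives to zero, which is the starting point for the transfer-function derivations the report lists.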

  6. The European computer model for optronic system performance prediction (ECOMOS)

    NARCIS (Netherlands)

    Kessler, S.; Bijl, P.; Labarre, L.; Repasi, E.; Wittenstein, W.; Bürsing, H.

    2017-01-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The

  7. Two-Language, Two-Paradigm Introductory Computing Curriculum Model and its Implementation

    OpenAIRE

    Zanev, Vladimir; Radenski, Atanas

    2011-01-01

    This paper analyzes difficulties with the introduction of object-oriented concepts in introductory computing education and then proposes a two-language, two-paradigm curriculum model that alleviates such difficulties. Our two-language, two-paradigm curriculum model begins with teaching imperative programming using the Python programming language, continues with teaching object-oriented computing using Java, and concludes with teaching object-oriented data structures with Java.

  8. Computational and Game-Theoretic Approaches for Modeling Bounded Rationality

    NARCIS (Netherlands)

    L. Waltman (Ludo)

    2011-01-01

    This thesis studies various computational and game-theoretic approaches to economic modeling. Unlike traditional approaches to economic modeling, the approaches studied in this thesis do not rely on the assumption that economic agents behave in a fully rational way. Instead, economic

  9. Computational modeling for fluid flow and interfacial transport

    CERN Document Server

    Shyy, Wei

    2006-01-01

    Practical applications and examples highlight this treatment of computational modeling for handling complex flowfields. A reference for researchers and graduate students of many different backgrounds, it also functions as a text for learning essential computation elements. Drawing upon his own research, the author addresses both macroscopic and microscopic features. He begins his three-part treatment with a survey of the basic concepts of finite difference schemes for solving parabolic, elliptic, and hyperbolic partial differential equations. The second part concerns issues related to computati
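    As a hedged illustration of the finite difference schemes surveyed in the first part, the sketch below advances the 1D heat equation (a parabolic PDE) with an explicit scheme; the grid and coefficient values are arbitrary, chosen only to satisfy the stability condition.

```python
def heat_step(u, alpha=0.1, dx=1.0, dt=1.0):
    """One explicit finite-difference step for the 1D heat equation
    u_t = alpha * u_xx, with fixed (Dirichlet) boundary values.
    Stable when r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    return ([u[0]]
            + [u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
               for i in range(1, len(u) - 1)]
            + [u[-1]])

u = [0.0] * 5 + [100.0] + [0.0] * 5   # initial hot spot in the middle
for _ in range(50):
    u = heat_step(u)
```

    The scheme diffuses the hot spot symmetrically toward the fixed boundaries; violating the stability bound on r would instead produce growing oscillations, one of the basic trade-offs such surveys emphasize.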

  10. Dynamical Trust and Reputation Computation Model for B2C E-Commerce

    OpenAIRE

    Bo Tian; Kecheng Liu; Yuanzhong Chen

    2015-01-01

    Trust is one of the most important factors influencing the successful application of network service environments, such as e-commerce, wireless sensor networks, and online social networks. Computational models of trust and reputation have received special attention in both the computer science and service science communities in recent years. In this paper, a dynamical computation model of reputation for B2C e-commerce is proposed. Firstly, conceptions associated with trust and reputation are...

  11. Computer Modeling of Platinum Reforming Reactors | Momoh ...

    African Journals Online (AJOL)

    This paper, instead of using a theoretical approach, considers a computer model as a means of assessing the reformate composition for three-stage fixed-bed reactors in a platforming unit. This is done by identifying the many possible hydrocarbon transformation reactions that are peculiar to the process unit, identify the ...

  12. Multi-resolution voxel phantom modeling: a high-resolution eye model for computational dosimetry.

    Science.gov (United States)

    Caracappa, Peter F; Rhodes, Ashley; Fiedler, Derek

    2014-09-21

    Voxel models of the human body are commonly used for simulating radiation dose with a Monte Carlo radiation transport code. Due to memory limitations, the voxel resolution of these computational phantoms is typically too large to accurately represent the dimensions of small features such as the eye. Recently reduced recommended dose limits for the lens of the eye, a radiosensitive tissue with a significant concern for cataract formation, have lent increased importance to understanding the dose to this tissue. A high-resolution eye model is constructed using physiological data for the dimensions of radiosensitive tissues, and combined with an existing set of whole-body models to form a multi-resolution voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of the radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole-body model are developed. The accuracy and performance of each method is compared against existing computational phantoms.

  13. Predictive Behavior of a Computational Foot/Ankle Model through Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Ruchi D. Chande

    2017-01-01

    Full Text Available Computational models are useful tools to study the biomechanics of human joints. Their predictive performance is heavily dependent on bony anatomy and soft tissue properties. Imaging data provide anatomical requirements, while approximate tissue properties are implemented from literature data, when available. We sought to improve the predictive capability of a computational foot/ankle model by optimizing its ligament stiffness inputs using feedforward and radial basis function neural networks. While the former demonstrated better performance than the latter in terms of mean square error, both networks provided reasonable stiffness predictions for implementation into the computational model.

  14. Improved object optimal synthetic description, modeling, learning, and discrimination by GEOGINE computational kernel

    Science.gov (United States)

    Fiorini, Rodolfo A.; Dacquino, Gianfranco

    2005-03-01

    GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-dimensional shape/texture optimal synthetic representation, description and learning, was presented at previous conferences. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space are presented here, together with a demo application. Progressive automatic model generation is discussed. GEOGINE can be used as an efficient computational kernel for fast, reliable application development and delivery, mainly in advanced biomedical engineering, biometrics, intelligent computing, target recognition, content-based image retrieval, and data mining. Ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" for reliably computing optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought of as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape invariant characteristics. In fact, any highly sophisticated application needing effective, robust object geometric/colour invariant attribute capture and parameterization, for reliable automated object learning and discrimination, can benefit from the GEOGINE progressive automated model generation computational kernel. Main operational advantages over previous

  15. Positioning graphical objects on computer screens: a three-phase model.

    Science.gov (United States)

    Pastel, Robert

    2011-02-01

    This experiment identifies and models phases during the positioning of graphical objects (called cursors in this article) on computer displays. The human-computer interaction community has traditionally used Fitts' law to model selection in graphical user interfaces, whereas human factors experiments have found the single-component Fitts' law inadequate to model positioning of real objects. Participants (N=145) repeatedly positioned variably sized square cursors within variably sized rectangular targets using computer mice. The times for the cursor to just touch the target, for the cursor to enter the target, and for participants to indicate positioning completion were observed. The positioning tolerances were varied from very precise and difficult to imprecise and easy. The time for the cursor to touch the target was proportional to the initial cursor-target distance. The time for the cursor to completely enter the target after touching was proportional to the logarithm of cursor size divided by target tolerance. The time for participants to indicate positioning after entering was inversely proportional to the tolerance. A three-phase model defined by regions (distant, proximate, and inside the target) was proposed and could model the positioning tasks. The three-phase model provides a framework for ergonomists to evaluate new positioning techniques and can explain their deficiencies. The model provides a means to analyze tasks and enhance interaction during positioning.
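    The three-phase structure described above can be sketched as a simple additive model; the coefficients below are illustrative placeholders, not the fitted regression values from the experiment.

```python
import math

def positioning_time(distance, cursor, target,
                     a=(0.10, 0.10, 0.05), b=(0.002, 0.15, 0.02)):
    """Hedged sketch of the three-phase positioning model (times in seconds,
    illustrative coefficients):
      travel:  proportional to the initial cursor-target distance
      entry:   proportional to log2(cursor size / tolerance + 1)
      confirm: inversely proportional to the tolerance"""
    tolerance = target - cursor            # free space around the cursor
    travel = a[0] + b[0] * distance
    entry = a[1] + b[1] * math.log2(cursor / tolerance + 1)
    confirm = a[2] + b[2] / tolerance
    return travel + entry + confirm
```

    Such a decomposition shows why a single-component Fitts' law underfits positioning: tightening the tolerance inflates the entry and confirmation phases even when the travel distance is unchanged.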

  16. Modeling Techniques for a Computational Efficient Dynamic Turbofan Engine Model

    Directory of Open Access Journals (Sweden)

    Rory A. Roberts

    2014-01-01

    Full Text Available A transient two-stream engine model has been developed. Individual component models developed exclusively in MATLAB/Simulink, including the fan, high-pressure compressor, combustor, high-pressure turbine, low-pressure turbine, plenum volumes, and exit nozzle, have been combined to investigate the behavior of a two-stream turbofan engine. Special attention has been paid to developing transient capabilities throughout the model, increasing physics fidelity, eliminating algebraic constraints, and reducing simulation time by enabling the use of advanced numerical solvers. Reducing computation time is paramount for conducting future aircraft system-level design trade studies and optimization. The new engine model is simulated for a fuel perturbation and a specified mission while tracking critical parameters. These results, as well as the simulation times, are presented. The new approach significantly reduces the simulation time.

  17. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Directory of Open Access Journals (Sweden)

    Qifan Kuang

    Full Text Available Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to obtain general conclusions that can provide useful guidance for constructing more effective models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods for predicting ADRs, by implementing and evaluating additional algorithms that were earlier used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that their final formulas could all be converted to a linear form; based on this finding, we propose a new algorithm, the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
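    As a hedged illustration of the Jaccard coefficient and the kind of linear, similarity-weighted scoring the abstract describes, the sketch below scores a hypothetical drug-ADR association; the feature sets and the exact weighting scheme are illustrative assumptions, not the paper's general weighted profile formula.

```python
def jaccard(a, b):
    """Jaccard coefficient between two feature sets."""
    a, b = set(a), set(b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def score_adr(query_feats, labeled_drugs):
    """Weighted-profile sketch: score a query drug's association with an
    ADR as a Jaccard-similarity-weighted vote over drugs with known labels."""
    sims = [(jaccard(query_feats, feats), has_adr)
            for feats, has_adr in labeled_drugs]
    total = sum(s for s, _ in sims)
    if total == 0:
        return 0.0
    return sum(s for s, has in sims if has) / total

# Hypothetical drugs described by target/substructure feature sets
labeled = [({"t1", "t2", "s1"}, True), ({"t3", "s2"}, False)]
risk = score_adr({"t1", "t2", "s2"}, labeled)
```

    The score is linear in the per-drug similarities, mirroring the observation that the compared algorithms all reduce to linear forms.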

  18. An efficient hysteresis modeling methodology and its implementation in field computation applications

    Energy Technology Data Exchange (ETDEWEB)

    Adly, A.A., E-mail: adlyamr@gmail.com [Electrical Power and Machines Dept., Faculty of Engineering, Cairo University, Giza 12613 (Egypt); Abd-El-Hafiz, S.K. [Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza 12613 (Egypt)

    2017-07-15

    Highlights: • An approach to simulate hysteresis while taking shape anisotropy into consideration. • Utilizing the ensemble of triangular sub-regions hysteresis models in field computation. • A novel tool capable of carrying out field computation while keeping track of hysteresis losses. • The approach may be extended to 3D tetrahedral sub-volumes. - Abstract: Field computation in media exhibiting hysteresis is crucial to a variety of applications such as magnetic recording processes and accurate determination of core losses in power devices. Recently, Hopfield neural networks (HNN) have been successfully configured to construct scalar and vector hysteresis models. This paper presents an efficient hysteresis modeling methodology and its implementation in field computation applications. The methodology is based on applying the integral equation approach to discretized triangular magnetic sub-regions. Within every triangular sub-region, hysteresis properties are realized using a 3-node HNN. Details of the approach and sample computation results are given in the paper.
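    The paper's hysteresis models are built from Hopfield neural networks; as a simpler, classical illustration of the scalar hysteresis behavior being captured, the sketch below sums an ensemble of Preisach-type relay hysterons. The thresholds and weights are arbitrary, and this is not the authors' HNN formulation.

```python
def relay(x, state, alpha, beta):
    """Single relay hysteron: switches up at x >= beta, down at x <= alpha,
    otherwise keeps its previous state (alpha < beta)."""
    if x >= beta:
        return 1
    if x <= alpha:
        return -1
    return state

def preisach(xs, relays):
    """Scalar Preisach-style sketch: output is a weighted sum of relays,
    each given as (alpha, beta, weight)."""
    states = [-1] * len(relays)
    out = []
    for x in xs:
        states = [relay(x, s, a, b)
                  for s, (a, b, _) in zip(states, relays)]
        out.append(sum(s * w for s, (_, _, w) in zip(states, relays)))
    return out

# The same input value (0.0) before and after an excursion to 1.0 yields
# different outputs: the ensemble remembers its history.
out = preisach([0.0, 1.0, 0.0], [(-0.5, 0.5, 1.0), (-0.2, 0.8, 1.0)])
```

    History dependence of this kind is exactly what a field solver must track per sub-region to account for hysteresis losses.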

  19. Cognitive Emotional Regulation Model in Human-Robot Interaction

    OpenAIRE

    Liu, Xin; Xie, Lun; Liu, Anqi; Li, Dan

    2015-01-01

    This paper integrated the Gross cognitive process into the HMM (hidden Markov model) emotional regulation method and implemented human-robot emotional interaction with facial expressions and behaviors. Here, energy was the psychological driving force of emotional transition in the cognitive emotional model. The input facial expression was translated into external energy by expression-emotion mapping. The robot's next emotional state was determined by the cognitive energy (the stimulus after cognition...

  20. Petri Net Modeling of Computer Virus Life Cycle | Ikekonwu ...

    African Journals Online (AJOL)

    Virus life cycle, which refers to the stages of development of a computer virus, is presented as a suitable area for the application of Petri nets. Petri nets, a powerful modeling tool in the field of dynamic system analysis, are applied to model the virus life cycle. Simulation of the derived model is also presented. The intention of ...
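    The application of Petri nets to a virus life cycle can be sketched with a minimal token-firing simulation; the places and transitions below are illustrative, not the paper's derived model.

```python
def enabled(marking, pre):
    """A transition is enabled when each input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume input tokens, produce output tokens."""
    assert enabled(marking, pre)
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical life-cycle net: dormant copy infects a host, replicates,
# then triggers its payload.
transitions = {
    "infect":    ({"dormant": 1, "host": 1}, {"infected": 1}),
    "replicate": ({"infected": 1}, {"infected": 1, "dormant": 1}),
    "trigger":   ({"infected": 1}, {"payload": 1}),
}
m = {"dormant": 1, "host": 2}
m = fire(m, *transitions["infect"])      # virus infects a host
m = fire(m, *transitions["replicate"])   # spawns a new dormant copy
m = fire(m, *transitions["trigger"])     # payload delivered
```

    Tokens make the stage occupancy explicit, which is what lets Petri-net analysis answer reachability questions such as whether a payload state can ever be reached from a given initial marking.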