WorldWideScience

Sample records for preliminary computer modeling

  1. Preliminary Phase Field Computational Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, Shenyang Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xu, Ke [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suter, Jonathan D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCloy, John S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Bradley R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using monocrystalline Fe (i.e., ferrite) films as a model system to develop and validate initial models, followed by polycrystalline Fe films, and then by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models based on the numerical solution of the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of systems large enough that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof of concept to investigate major-loop effects of single-crystal versus polycrystalline bulk iron and the effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single-crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in…
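
    For reference, the Landau-Lifshitz-Gilbert equation that such micromagnetic phase-field models solve numerically is commonly written in Gilbert form as follows (shown for orientation; the report's exact formulation may include additional terms):

```latex
\frac{\partial \mathbf{M}}{\partial t}
  = -\gamma\, \mathbf{M} \times \mathbf{H}_{\mathrm{eff}}
  + \frac{\alpha}{M_s}\, \mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t}
```

    Here M is the magnetization, H_eff the effective field (exchange, anisotropy, magnetostatic, and applied contributions), γ the gyromagnetic ratio, α the Gilbert damping constant, and M_s the saturation magnetization.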

  2. Modeling the complete Otto cycle: Preliminary version. [computer programming]

    Science.gov (United States)

    Zeleznik, F. J.; McBride, B. J.

    1977-01-01

    A description is given of the equations and the computer program being developed to model the complete Otto cycle. The program incorporates such important features as: (1) heat transfer, (2) finite combustion rates, (3) complete chemical kinetics in the burned gas, (4) exhaust gas recirculation, and (5) manifold vacuum or supercharging. Changes in thermodynamic, kinetic and transport data as well as model parameters can be made without reprogramming. Preliminary calculations indicate that: (1) chemistry and heat transfer significantly affect composition and performance, (2) there seems to be a strong interaction among model parameters, and (3) a number of cycles must be calculated in order to obtain steady-state conditions.
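
    To illustrate point (3), the sketch below iterates a highly idealized constant-volume (Otto) cycle, carrying residual gas from one cycle into the next until the cycle-start state converges. It is a toy stand-in for the NASA program described above; every parameter value and the simplified thermodynamics are assumptions.

```python
# Illustrative sketch (not the NASA code): iterate an idealized Otto cycle
# until the cycle-to-cycle state converges, mimicking the need to compute
# several cycles to reach steady-state conditions.
GAMMA = 1.3          # effective ratio of specific heats (assumed)
R_COMP = 8.0         # compression ratio (assumed)
F_RESIDUAL = 0.08    # residual-gas fraction retained each cycle (assumed)
Q_COMB = 2.5e6       # heat release per unit mass of charge, J/kg (assumed)
CV = 820.0           # specific heat at constant volume, J/(kg K) (assumed)

T_INTAKE = 320.0     # fresh-charge temperature, K (assumed)
t_start = T_INTAKE
for cycle in range(50):
    t2 = t_start * R_COMP ** (GAMMA - 1.0)      # isentropic compression
    t3 = t2 + Q_COMB / CV                       # constant-volume heat release
    t4 = t3 * R_COMP ** (1.0 - GAMMA)           # isentropic expansion
    # Mix residual gas at t4 with fresh charge to start the next cycle.
    t_next = F_RESIDUAL * t4 + (1.0 - F_RESIDUAL) * T_INTAKE
    if abs(t_next - t_start) < 1e-6:
        print(f"steady state after {cycle + 1} cycles, T_start = {t_next:.1f} K")
        break
    t_start = t_next
```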

  3. Developing ontological model of computational linear algebra - preliminary considerations

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Lirkov, I.

    2013-10-01

    The aim of this paper is to propose a method for applying ontologically represented domain knowledge to support Grid users. The work is presented in the context provided by the Agents in Grid system, which aims at development of an agent-semantic infrastructure for efficient resource management in the Grid. Decision support within the system should provide functionality beyond the existing Grid middleware, specifically, help the user to choose the optimal algorithm and/or resource to solve a problem from a given domain. The system assists the user in at least two situations. First, for users without in-depth knowledge of the domain, it should help them to select the method and the resource that (together) would best fit the problem to be solved (and match the available resources). Second, if the user explicitly indicates the method and the resource configuration, it should "verify" whether her choice is consistent with the expert recommendations (encapsulated in the knowledge base). Furthermore, one of the goals is to simplify the use of the selected resource to execute the job; i.e., provide a user-friendly method of submitting jobs, without requiring technical knowledge about the Grid middleware. To achieve these goals, an adaptable method of expert knowledge representation for the decision support system has to be implemented. The selected approach is to utilize ontologies and semantic data processing, supported by multicriterial decision making. As a starting point, the area of computational linear algebra was selected to be modeled; however, the paper presents a general approach that shall be easily extendable to other domains.

  4. A Preliminary Jupiter Model

    CERN Document Server

    Hubbard, W B

    2016-01-01

    In anticipation of new observational results for Jupiter's axial moment of inertia and gravitational zonal harmonic coefficients from the forthcoming Juno orbiter, we present a number of preliminary Jupiter interior models. We combine results from ab initio computer simulations of hydrogen-helium mixtures, including immiscibility calculations, with a new nonperturbative calculation of Jupiter's zonal harmonic coefficients, to derive a self-consistent model for the planet's external gravity and moment of inertia. We assume helium rain modified the interior temperature and composition profiles. Our calculation predicts zonal harmonic values to which measurements can be compared. Although some models fit the observed (pre-Juno) second- and fourth-order zonal harmonics to within their error bars, our preferred reference model predicts a fourth-order zonal harmonic whose absolute value lies above the pre-Juno error bars. This model has a dense core of about 12 Earth masses, and a hydrogen-helium-rich envelope with...

  5. A Preliminary Jupiter Model

    Science.gov (United States)

    Hubbard, W. B.; Militzer, B.

    2016-03-01

    In anticipation of new observational results for Jupiter's axial moment of inertia and gravitational zonal harmonic coefficients from the forthcoming Juno orbiter, we present a number of preliminary Jupiter interior models. We combine results from ab initio computer simulations of hydrogen-helium mixtures, including immiscibility calculations, with a new nonperturbative calculation of Jupiter's zonal harmonic coefficients, to derive a self-consistent model for the planet's external gravity and moment of inertia. We assume helium rain modified the interior temperature and composition profiles. Our calculation predicts zonal harmonic values to which measurements can be compared. Although some models fit the observed (pre-Juno) second- and fourth-order zonal harmonics to within their error bars, our preferred reference model predicts a fourth-order zonal harmonic whose absolute value lies above the pre-Juno error bars. This model has a dense core of about 12 Earth masses and a hydrogen-helium-rich envelope with approximately three times solar metallicity.

  6. Preliminary analysis of the MER magnetic properties experiment using a computational fluid dynamics model

    DEFF Research Database (Denmark)

    Kinch, K.M.; Merrison, J.P.; Gunnlaugsson, H.P.;

    2006-01-01

    Motivated by questions raised by the magnetic properties experiments on the NASA Mars Pathfinder and Mars Exploration Rover (MER) missions, we have studied in detail the capture of airborne magnetic dust by permanent magnets using a computational fluid dynamics (CFD) model supported by laboratory…

  7. A PRELIMINARY JUPITER MODEL

    Energy Technology Data Exchange (ETDEWEB)

    Hubbard, W. B. [Lunar and Planetary Laboratory, The University of Arizona, Tucson, AZ 85721 (United States); Militzer, B. [Department of Earth and Planetary Science, Department of Astronomy, University of California, Berkeley, CA 94720 (United States)

    2016-03-20

    In anticipation of new observational results for Jupiter's axial moment of inertia and gravitational zonal harmonic coefficients from the forthcoming Juno orbiter, we present a number of preliminary Jupiter interior models. We combine results from ab initio computer simulations of hydrogen–helium mixtures, including immiscibility calculations, with a new nonperturbative calculation of Jupiter's zonal harmonic coefficients, to derive a self-consistent model for the planet's external gravity and moment of inertia. We assume helium rain modified the interior temperature and composition profiles. Our calculation predicts zonal harmonic values to which measurements can be compared. Although some models fit the observed (pre-Juno) second- and fourth-order zonal harmonics to within their error bars, our preferred reference model predicts a fourth-order zonal harmonic whose absolute value lies above the pre-Juno error bars. This model has a dense core of about 12 Earth masses and a hydrogen–helium-rich envelope with approximately three times solar metallicity.
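
    For reference, the zonal harmonic coefficients J_2n referred to in these records parameterize the planet's external gravitational potential in the standard expansion (a textbook form, not specific to this paper):

```latex
V(r,\theta) = -\frac{GM}{r}\left[\,1 - \sum_{n=1}^{\infty} J_{2n}\left(\frac{a}{r}\right)^{2n} P_{2n}(\cos\theta)\right]
```

    where a is the equatorial radius, θ the colatitude, and P_2n the Legendre polynomials; for an axially symmetric planet in hydrostatic equilibrium only the even harmonics are nonzero, and Juno's measurements of J_2 and J_4 constrain interior models of the kind described above.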

  8. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present-day science and engineering. Raymond Turner provides a logical framework and foundation for the specification and design of specification languages, and uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications, from programming language semantics and specification languages, through to knowledge representation languages and formalisms for natural language semantics. They are al…

  9. Preliminary reference Earth model

    Science.gov (United States)

    Dziewonski, Adam M.; Anderson, Don L.

    1981-06-01

    A large data set consisting of about 1000 normal mode periods, 500 summary travel time observations, 100 normal mode Q values, mass and moment of inertia have been inverted to obtain the radial distribution of elastic properties, Q values and density in the Earth's interior. The data set was supplemented with a special study of 12 years of ISC phase data which yielded an additional 1.75 × 10^6 travel time observations for P and S waves. In order to obtain satisfactory agreement with the entire data set we were required to take into account anelastic dispersion. The introduction of transverse isotropy into the outer 220 km of the mantle was required in order to satisfy the shorter period fundamental toroidal and spheroidal modes. This anisotropy also improved the fit of the larger data set. The horizontal and vertical velocities in the upper mantle differ by 2-4%, both for P and S waves. The mantle below 220 km is not required to be anisotropic. Mantle Rayleigh waves are surprisingly sensitive to compressional velocity in the upper mantle. High S_n velocities, low P_n velocities and a pronounced low-velocity zone are features of most global inversion models that are suppressed when anisotropy is allowed for in the inversion. The Preliminary Reference Earth Model, PREM, and auxiliary tables showing fits to the data are presented.
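
    For orientation, the mass and moment-of-inertia constraints used in such radial inversions take the standard spherically symmetric form (a textbook statement, not quoted from the paper):

```latex
M = 4\pi \int_0^R \rho(r)\, r^2 \, dr, \qquad
I = \frac{8\pi}{3} \int_0^R \rho(r)\, r^4 \, dr
```

    so any candidate density profile ρ(r) must reproduce the Earth's observed mass and moment of inertia in addition to fitting the seismological data.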

  10. Preliminary assessment of facial soft tissue thickness utilizing three-dimensional computed tomography models of living individuals.

    Science.gov (United States)

    Parks, Connie L; Richard, Adam H; Monson, Keith L

    2014-04-01

    Facial approximation is the technique of developing a representation of the face from the skull of an unknown individual. Facial approximation relies heavily on average craniofacial soft tissue depths. For more than a century, researchers have employed a broad array of tissue depth collection methodologies, a practice which has resulted in a lack of standardization in craniofacial soft tissue depth research. To combat such methodological inconsistencies, Stephan and Simpson 2008 [15] examined and synthesized a large number of previously published soft tissue depth studies. Their comprehensive meta-analysis produced a pooled dataset of averaged tissue depths and a simplified methodology, which the researchers suggest be utilized as a minimum standard protocol for future craniofacial soft tissue depth research. The authors of the present paper collected craniofacial soft tissue depths using three-dimensional models generated from computed tomography scans of living males and females of four self-identified ancestry groups from the United States ranging in age from 18 to 62 years. This paper assesses the differences between: (i) the pooled mean tissue depth values from the sample utilized in this paper and those published by Stephan 2012 [21] and (ii) the mean tissue depth values of two demographically similar subsets of the sample utilized in this paper and those published by Rhine and Moore 1984 [16]. Statistical test results indicate that the tissue depths collected from the sample evaluated in this paper are significantly and consistently larger than those published by Stephan 2012 [21]. Although a lack of published variance data by Rhine and Moore 1984 [16] precluded a direct statistical assessment, a substantive difference was also concluded. Further, the dataset presented in this study is representative of modern American adults and is, therefore, appropriate for use in constructing contemporary facial approximations.
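
    As a minimal sketch of the kind of two-sample comparison reported above (with synthetic numbers; the paper's actual data and choice of tests may differ), Welch's t-test compares mean tissue depths at a landmark without assuming equal variances:

```python
# Hypothetical illustration: compare mean tissue depths at one landmark
# between a study sample and a published reference sample using
# Welch's t-test (does not assume equal variances).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
study_sample = rng.normal(loc=6.1, scale=1.2, size=40)      # mm, assumed
reference_sample = rng.normal(loc=5.4, scale=1.1, size=40)  # mm, assumed

t_stat, p_value = stats.ttest_ind(study_sample, reference_sample,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```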

  11. Preliminary ECLSS waste water model

    Science.gov (United States)

    Carter, Donald L.; Holder, Donald W., Jr.; Alexander, Kevin; Shaw, R. G.; Hayase, John K.

    1991-01-01

    A preliminary waste water model for input to the Space Station Freedom (SSF) Environmental Control and Life Support System (ECLSS) Water Processor (WP) has been generated for design purposes. Data have been compiled from various ECLSS tests and flight sample analyses. A discussion of the characterization of the waste streams comprising the model is presented, along with a discussion of the waste water model and the rationale for the inclusion of contaminants in their respective concentrations. The major objective is to establish a methodology for the development of a waste water model and to present the current state of that model.

  12. Motion analysis of total cervical disc replacements using computed tomography: Preliminary experience with nine patients and a model

    Energy Technology Data Exchange (ETDEWEB)

    Svedmark, Per (Div. of Orthopedics, Dept. of Molecular Medicine and Surgery, Karolinska Institutet, Stockholm (Sweden); Stockholm Spine Center, Lowenstromska Hospital, Stockholm (Sweden)), email: per.svedmark@spinecenter.se; Lundh, Fredrik; Olivecrona, Henrik (Div. of Orthopedics, Dept. of Molecular Medicine and Surgery, Karolinska Institutet, Stockholm (Sweden)); Nemeth, Gunnar (Capio group, Stockholm (Sweden)); Noz, Marilyn E. (Dept. of Radiology, New York Univ. School of Medicine, New York (United States)); Maguire Jr, Gerald Q. (School of Information and Communication Technology, Royal Inst. of Technology, Kista (Sweden)); Zeleznik, Michael P. (Saya Systems Inc., Salt Lake City (United States))

    2011-12-15

    Background. Cervical total disc replacement (CTDR) is an alternative to anterior fusion. It is therefore desirable to have an accurate in vivo measurement of prosthetic kinematics and an assessment of implant stability relative to the adjacent vertebrae. Purpose. To devise an in vivo CT-based method to analyze the kinematics of cervical total disc replacements (CTDR), specifically of two prosthetic components between two CT scans obtained under different conditions. Material and Methods. Nine patients with CTDR were scanned in flexion and extension of the cervical spine using a clinical CT scanner with a routine low-dose protocol. The flexion and extension CT volume data were spatially registered, and the kinematics of the two prosthetic components, an upper and a lower, was calculated and expressed as Euler angles and orthogonal linear translations relative to the upper component. For accuracy analysis, a cervical spine model incorporating the same disc replacement as used in the patients was also scanned and processed in the same manner. Results. Analysis of both the model and patients showed good repeatability, i.e. within 2 standard deviations of the mean using the 95% limits of agreement with no overlapping confidence intervals. The accuracy analysis showed that the median error was close to zero. Conclusion. The mobility of the cervical spine after total disc replacement can be effectively measured in vivo using CT. The method requires appropriate patient positioning and scan parameters to achieve suitable image quality.
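
    The sketch below illustrates how a rigid-body motion recovered by such a registration can be expressed as three Euler angles plus a translation and applied to points. It is purely illustrative: the angle convention, units, and numbers are assumptions, not values from the study.

```python
import numpy as np

def euler_zyx_to_matrix(rz, ry, rx):
    """Rotation matrix from Euler angles (radians), applied in z-y-x order."""
    cz, sz = np.cos(rz), np.sin(rz)
    cy, sy = np.cos(ry), np.sin(ry)
    cx, sx = np.cos(rx), np.sin(rx)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

# Hypothetical relative motion between the flexion and extension scans,
# expressed in the upper component's frame (angles and translation assumed).
R = euler_zyx_to_matrix(np.radians(2.0), np.radians(-1.0), np.radians(8.5))
t = np.array([0.4, -0.2, 1.1])  # mm, assumed translation

points_flexion = np.array([[10.0, 2.0, 5.0], [12.0, 1.5, 4.0]])  # mm
points_extension = points_flexion @ R.T + t  # apply rotation, then translate
print(points_extension)
```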

  13. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol…

  14. Modeling Trusted Computing

    Institute of Scientific and Technical Information of China (English)

    CHEN Shuyi; WEN Yingyou; ZHAO Hong

    2006-01-01

    In this paper, a formal approach based on predicate logic is proposed for representing and reasoning about trusted computing models. Predicates are defined to represent the characteristics of the objects and the relationships among these objects in a trusted system according to trusted computing specifications. Inference rules for the trust relation are given as well. With the proposed semantics, some trusted computing models are formalized and verified, which shows that predicate calculus provides a general and effective method for modeling and reasoning about trusted computing systems.

  15. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab]

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  16. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  17. Computational human body models

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Dommelen, J.A.W. van

    2005-01-01

    Computational human body models are widely used for automotive crash-safety research and design and as such have significantly contributed to a reduction of traffic injuries and fatalities. Currently crash simulations are mainly performed using models based on crash-dummies. However crash dummies dif…

  18. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo Lee

    2013-12-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.

  1. Computationally modeling interpersonal trust.

    Science.gov (United States)

    Lee, Jin Joo; Knox, W Bradley; Wormwood, Jolie B; Breazeal, Cynthia; Desteno, David

    2013-01-01

    We present a computational model capable of predicting (above human accuracy) the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to investigate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
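
    The sketch below shows the forward algorithm used to score a sequence of discretized cues under a hidden Markov model, the kind of sequence model these records describe. All parameter values are toy assumptions, not the authors' learned models.

```python
import numpy as np

# Toy HMM: 2 hidden states, 3 observable cue symbols (all values assumed).
pi = np.array([0.6, 0.4])            # initial state distribution
A = np.array([[0.7, 0.3],            # state transition probabilities
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],       # emission probabilities per state
              [0.1, 0.3, 0.6]])

def sequence_log_likelihood(obs):
    """Forward algorithm with scaling: returns log P(obs | model)."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()                  # rescale each step to avoid underflow
    log_like = np.log(c)
    alpha /= c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        log_like += np.log(c)
        alpha /= c
    return log_like

# Score one hypothetical cue sequence; comparing such scores across models
# trained on different trust levels is the gist of the classification step.
print(sequence_log_likelihood([0, 2, 1, 0]))
```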

  2. Computer-aided hepatic tumour ablation requirements and preliminary results

    CERN Document Server

    Voirin, David; Payan, Yohan; Amavizca, Miriam; Letoublon, Christian; Troccaz, Jocelyne

    2002-01-01

    Surgical resection of hepatic tumours is not always possible, since it depends on several factors, among them the tumour's location within the liver's functional segments. Alternative techniques consist in the local use of chemical or physical agents to destroy the tumour. Radio frequency and cryosurgical ablations are examples of such alternative techniques that may be performed percutaneously. This requires precise localisation of the tumour during ablation. Computer-assisted surgery tools may be used in conjunction with these new ablation techniques to improve their therapeutic efficiency, whilst they benefit from minimal invasiveness. This paper introduces the principles of a system for computer-assisted hepatic tumour ablation and describes preliminary experiments focusing on the evaluation of data registration. To keep close to conventional protocols, we consider registration of pre-operative CT or MRI data to intra-operative echographic data.

  3. Analysis of vector models in quantification of artifacts produced by standard prosthetic inlays in Cone-Beam Computed Tomography (CBCT)--a preliminary study.

    Science.gov (United States)

    Różyło-Kalinowska, Ingrid; Miechowicz, Sławomir; Sarna-Boś, Katarzyna; Borowicz, Janusz; Kalinowski, Paweł

    2014-11-17

    Cone-beam computed tomography (CBCT) is a relatively new, but highly efficient imaging method applied first in dentistry in 1998. However, the quality of the obtained slices depends among other things on artifacts generated by dental restorations as well as orthodontic and prosthetic appliances. The aim of the study was to quantify the artifacts produced by standard prosthetic inlays in CBCT images. The material consisted of 17 standard prosthetic inlays mounted in dental roots embedded in resin. The samples were examined by means of a large field of view CBCT unit, Galileos (Sirona, Germany), at 85 kV and 14 mAs. The analysis was performed using Able 3DDoctor software for data in the CT raster space as well as by means of Materialise Magics software for generated vector models (STL). The masks generated in the raster space included the area of the inlays together with image artifacts. The region of interest (ROI) of the raster space is a set of voxels from a selected range of Hounsfield units (109-3071). Ceramic inlay with zirconium dioxide (Cera Post) as well as epoxy resin inlay including silica fibers enriched with zirconium (Easy Post) produced the most intense artifacts. The smallest image distortions were created by titanium inlays, both passive (Harald Nordin) and active (Flexi Flange). Inlays containing zirconium generated the strongest artifacts, thus leading to the greatest distortions in the CBCT images. Carbon fiber inlay did not considerably affect the image quality.
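
    A minimal sketch of the raster-space ROI selection described here: thresholding a CT volume to the reported Hounsfield-unit window (a numpy stand-in for the commercial tools named above; the data are synthetic).

```python
import numpy as np

# Synthetic CT volume in Hounsfield units (the study used real CBCT data);
# integers(-1000, 3072) yields values in [-1000, 3071].
volume_hu = np.random.default_rng(1).integers(-1000, 3072, size=(64, 64, 64))

# ROI mask over the HU range reported in the study (109-3071): the inlay
# plus the bright streak artifacts it produces.
roi_mask = (volume_hu >= 109) & (volume_hu <= 3071)
print(int(roi_mask.sum()), "voxels in the inlay-plus-artifact ROI")
```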

  4. Analysis of Vector Models in Quantification of Artifacts Produced by Standard Prosthetic Inlays in Cone-Beam Computed Tomography (CBCT) – a Preliminary Study

    Directory of Open Access Journals (Sweden)

    Ingrid Różyło-Kalinowska

    2014-11-01

    Cone-beam computed tomography (CBCT) is a relatively new, but highly efficient imaging method applied first in dentistry in 1998. However, the quality of the obtained slices depends among other things on artifacts generated by dental restorations as well as orthodontic and prosthetic appliances. The aim of the study was to quantify the artifacts produced by standard prosthetic inlays in CBCT images. The material consisted of 17 standard prosthetic inlays mounted in dental roots embedded in resin. The samples were examined by means of a large field of view CBCT unit, Galileos (Sirona, Germany), at 85 kV and 14 mAs. The analysis was performed using Able 3DDoctor software for data in the CT raster space as well as by means of Materialise Magics software for generated vector models (STL). The masks generated in the raster space included the area of the inlays together with image artifacts. The region of interest (ROI) of the raster space is a set of voxels from a selected range of Hounsfield units (109-3071). Ceramic inlay with zirconium dioxide (Cera Post) as well as epoxy resin inlay including silica fibers enriched with zirconium (Easy Post) produced the most intense artifacts. The smallest image distortions were created by titanium inlays, both passive (Harald Nordin) and active (Flexi Flange). Inlays containing zirconium generated the strongest artifacts, thus leading to the greatest distortions in the CBCT images. Carbon fiber inlay did not considerably affect the image quality.

  5. Understanding Student Computational Thinking with Computational Modeling

    CERN Document Server

    Aiken, John M; Douglas, Scott S; Burk, John B; Scanlon, Erin M; Thoms, Brian D; Schatz, Michael F

    2012-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than obs...

  6. Computationally modeling interpersonal trust

    OpenAIRE

    Jin Joo Lee; Brad Knox; Jolie Baumann; Cynthia Breazeal; David DeSteno

    2013-01-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our pr...

  7. LHCb computing model

    CERN Document Server

    Frank, M; Pacheco, Andreu

    1998-01-01

    This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10^6 MIPS. This will be installed at the experiment and will be reused during non-data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10^6 MIPS) and physics analysis (0.5 × 10^6 MIPS): CPU resources may either be located at the physicist's home lab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu…

  8. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  9. Simulation of the preliminary General Electric SP-100 space reactor concept using the ATHENA computer code

    Science.gov (United States)

    Fletcher, C. D.

    The capability to perform thermal-hydraulic analyses of a space reactor using the ATHENA computer code is demonstrated. The fast reactor, liquid-lithium coolant loops, and lithium-filled heat pipes of the preliminary General Electric SP-100 design were modeled with ATHENA. Two demonstration transient calculations were performed simulating accident conditions. Calculated results are available for display using the Nuclear Plant Analyzer color graphics analysis tool in addition to traditional plots. ATHENA-calculated results appear reasonable, both for steady-state full-power conditions and for the two transients. This analysis represents the first known transient thermal-hydraulic simulation using an integral space reactor system model incorporating heat pipes.

  10. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  11. Preliminary Model of Porphyry Copper Deposits

    Science.gov (United States)

    Berger, Byron R.; Ayuso, Robert A.; Wynn, Jeffrey C.; Seal, Robert R., II

    2008-01-01

    The U.S. Geological Survey (USGS) Mineral Resources Program develops mineral-deposit models for application in USGS mineral-resource assessments and other mineral resource-related activities within the USGS as well as for nongovernmental applications. Periodic updates of models are published in order to incorporate new concepts and findings on the occurrence, nature, and origin of specific mineral deposit types. This update is a preliminary model of porphyry copper deposits that begins an update process of porphyry copper models published in USGS Bulletin 1693 in 1986. This update includes a greater variety of deposit attributes than were included in the 1986 model as well as more information about each attribute. It also includes an expanded discussion of geophysical and remote sensing attributes and tools useful in resource evaluations, a summary of current theoretical concepts of porphyry copper deposit genesis, and a summary of the environmental attributes of unmined and mined deposits.

  12. Computational modeling and preliminary iroN, fepA, and cirA gene expression in Salmonella Enteritidis under iron-deficiency-induced conditions.

    Science.gov (United States)

    Zárate-Bonilla, Lina J; Del Portillo, Patricia; Sáenz-Suárez, Homero; Gonzáles-Santos, Janneth; Barreto-Sampaio, George E; Poutou-Piñales, Raúl A; Rey, Andrés Felipe; Rey, Jairo Guillermo

    2014-01-01

    Salmonellosis outbreaks in Europe, the United States, and Latin America have been associated with contaminated food derivatives including meat from the poultry industry. Salmonella grown under iron-limiting conditions has the capability to increase the concentration of several iron-regulated outer-membrane proteins to augment acquisition of the metal. These proteins have been proved to have immunogenic properties. Our aim was to increase the relative expression of iroN, fepA, and cirA in a Salmonella Enteritidis domestic strain. Furthermore, we proposed a 3-dimensional structure model for each protein to predict and locate antigenic peptides. Our eventual objective is to produce an effective vaccine against regional avian salmonellosis. Two simple factorial designs were carried out to discriminate between 2 nitrogen sources and determine chelating-agent addition timing to augment relative gene expression. Two antigenic peptides located at the external face of each protein and 2 typical domains of iron-regulated outer-membrane proteins, plug and TonB-dep-Rec, were identified from the 3-dimensional models. Tryptone was selected as the best nitrogen source based on growth rate (μx = 0.36 h^-1) and biomass productivity (Px = 0.9 g·h^-1·L^-1) as determined by a general factorial design. Optimum timing for chelating-agent addition was in the middle of the log phase, which allowed relative expression at 4 h of culture. The increase in iroN, fepA, and cirA relative expression was favored by the length of the log phase and the addition of the chelating agent, which decreased chelating toxicity and enhanced the cell growth rate.
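
    For orientation, the specific growth rate quoted above is conventionally estimated from two exponential-phase biomass measurements (a standard microbiology relation, not taken from the paper):

```latex
\mu = \frac{\ln X_2 - \ln X_1}{t_2 - t_1}
```

    where X_1 and X_2 are biomass concentrations at times t_1 and t_2; a value of μ = 0.36 h^-1 corresponds to a doubling time of ln 2 / μ ≈ 1.9 h.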

  13. Plasticity modeling & computation

    CERN Document Server

    Borja, Ronaldo I

    2013-01-01

    There have been many excellent books written on the subject of plastic deformation in solids, but rarely can one find a textbook on this subject. “Plasticity Modeling & Computation” is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids. It adopts a simple narrative style that is not mathematically overbearing, and has been written to emulate a professor giving a lecture on this subject inside a classroom. Each section is written to provide a balance between the relevant equations and the explanations behind them. Where relevant, sections end with one or more exercises designed to reinforce the understanding of the “lecture.” Color figures enhance the presentation and make the book very pleasant to read. For professors planning to use this textbook for their classes, the contents are sufficient for Parts A and B that can be taught in sequence over a period of two semesters or quarters.

  14. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Hari Krovi

    2017-03-01

    I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  15. Modeling Malaysia's Energy System: Some Preliminary Results

    Directory of Open Access Journals (Sweden)

    Ahmad M. Yusof

    2011-01-01

    Problem statement: The current dynamic and fragile world energy environment necessitates the development of a new energy model that solely caters to analyzing Malaysia's energy scenarios. Approach: The model is a network flow model that traces the flow of energy carriers from their sources (import and mining) through conversion and transformation processes for the production of energy products to final destinations (energy demand sectors). The integration with the economic sectors is done exogenously by specifying the annual sectoral energy demand levels. The model in turn optimizes the energy variables for a specified objective function to meet those demands. Results: By minimizing inter-temporal petroleum product imports for the crude oil system, the annual extraction level of Tapis blend is projected at 579,600 barrels per day. Aggregate demand for petroleum products is projected to grow at 2.1% year^-1, while motor gasoline and diesel constitute 42% and 38% of the petroleum products demand mix, respectively, over the five-year planning period. Petroleum product imports are expected to grow at 6.0% year^-1. Conclusion: The preliminary results indicate that the model performs as expected. Thus other types of energy carriers such as natural gas, coal and biomass will be added to the energy system for the overall development of the Malaysia energy model.
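
    A minimal sketch of a network-flow energy model in the same spirit (all coefficients are invented for illustration; this is not the paper's model): a linear program that meets product demand from refined domestic crude plus imports while minimizing imports.

```python
# Toy energy network-flow LP (illustrative; all coefficients assumed):
# variables x = [domestic_crude, refined_products, product_imports].
from scipy.optimize import linprog

c = [0.0, 0.0, 1.0]                  # objective: minimize product imports
A_ub = [[1.0, 0.0, 0.0]]             # extraction limited by field capacity
b_ub = [600_000.0]                   # barrels/day, assumed
A_eq = [
    [1.0, -1.0, 0.0],                # refinery balance: crude in = products out
    [0.0, 1.0, 1.0],                 # demand balance: products + imports = demand
]
b_eq = [0.0, 650_000.0]              # product demand, barrels/day (assumed)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3, method="highs")
print(res.x)  # extraction runs at capacity; the 50,000 bbl/day gap is imported
```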

  16. Mathematical modeling of normal pharyngeal bolus transport: a preliminary study.

    Science.gov (United States)

    Chang, M W; Rosendall, B; Finlayson, B A

    1998-07-01

    Dysphagia (difficulty in swallowing) is a common clinical symptom associated with many diseases, such as stroke, multiple sclerosis, neuromuscular diseases, and cancer. Its complications include choking, aspiration, malnutrition, cachexia, and dehydration. The goal in dysphagia management is to provide adequate nutrition and hydration while minimizing the risk of choking and aspiration. It is important to advance the individual toward oral feeding in a timely manner to enhance the recovery of swallowing function and preserve the quality of life. Current clinical assessments of dysphagia are limited in providing adequate guidelines for oral feeding. Mathematical modeling of the fluid dynamics of pharyngeal bolus transport provides a unique opportunity for studying the physiology and pathophysiology of swallowing. Finite element analysis (FEA) is a special case of computational fluid dynamics (CFD). In CFD, the flow of a fluid in a space is modeled by covering the space with a grid and predicting how the fluid moves from grid point to grid point. FEA is capable of solving problems with complex geometries and free surfaces. A preliminary pharyngeal model has been constructed using FEA. This model incorporates literature-reported, normal, anatomical data with time-dependent pharyngeal/upper esophageal sphincter (UES) wall motion obtained from videofluorography (VFG). This time-dependent wall motion can be implemented as a moving boundary condition in the model. Clinical kinematic data can be digitized from VFG studies to construct and test the mathematical model. The preliminary model demonstrates the feasibility of modeling pharyngeal bolus transport, which, to our knowledge, has not been attempted before. This model also addresses the need and the potential for CFD in understanding the physiology and pathophysiology of the pharyngeal phase of swallowing. Improvements of the model are underway. Combining the model with individualized clinical data should potentially…
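
    For orientation, the equations such CFD models discretize over the grid are the incompressible Navier-Stokes and continuity equations (shown in generic form; the paper's formulation, e.g. its moving-boundary treatment, adds detail beyond this):

```latex
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right)
  = -\nabla p + \mu \nabla^{2}\mathbf{u}, \qquad \nabla\cdot\mathbf{u} = 0
```

    with u the velocity field, p the pressure, ρ the density, and μ the dynamic viscosity of the bolus.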

  17. Computational modeling of concrete flow

    DEFF Research Database (Denmark)

    Roussel, Nicolas; Geiker, Mette Rica; Dufour, Frederic

    2007-01-01

    This paper provides a general overview of the present status regarding computational modeling of the flow of fresh concrete. The computational modeling techniques that can be found in the literature may be divided into three main families: single fluid simulations, numerical modeling of discrete…

  18. A Novel Forensic Computing Model

    Institute of Scientific and Technical Information of China (English)

    XU Yunfeng; LU Yansheng

    2006-01-01

    According to the requirements of computer forensics and network forensics, a novel forensic computing model is presented, which exploits the XML/OEM/RM data models, data fusion technology, a forensic knowledge base, the inference mechanism of an expert system, and an evidence mining engine. This model has the advantages of flexibility and openness, so it can be widely used in mining evidence.

  19. In vivo bioprinting for computer- and robotic-assisted medical intervention: preliminary study in mice.

    Science.gov (United States)

    Keriquel, Virginie; Guillemot, Fabien; Arnault, Isabelle; Guillotin, Bertrand; Miraux, Sylvain; Amédée, Joëlle; Fricain, Jean-Christophe; Catros, Sylvain

    2010-03-01

    We present the first attempt to apply bioprinting technologies in the perspective of computer-assisted medical interventions. A workstation dedicated to high-throughput biological laser printing has been designed. Nano-hydroxyapatite (n-HA) was printed in the mouse calvaria defect model in vivo. Critical size bone defects were performed in OF-1 male mice calvaria with a 4 mm diameter trephine. Prior to laser printing experiments, the absence of inflammation due to laser irradiation onto mice dura mater was shown by means of magnetic resonance imaging. Procedures for in vivo bioprinting and results obtained using decalcified sections and x-ray microtomography are discussed. Although heterogeneous, these preliminary results demonstrate that in vivo bioprinting is possible. Bioprinting may prove to be helpful in the future for medical robotics and computer-assisted medical interventions.

  20. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    … development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines … In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse … Models are generated through the template in ICAS-MoT and translated into a model object. Once in ICAS-MoT, the model is numerically analyzed, solved and identified. A computer-aided modeling framework integrating systematic model derivation and development tools has been developed. It includes features for model …

  1. Computational modeling in biomechanics

    CERN Document Server

    Mofrad, Mohammad

    2010-01-01

    This book provides a glimpse of the diverse and important roles that modern computational technology is playing in various areas of biomechanics. It includes unique chapters on ab initio quantum mechanical, molecular dynamics, and scale-coupling methods…

  2. Computational modelling flow and transport

    NARCIS (Netherlands)

    Stelling, G.S.; Booij, N.

    1999-01-01

    Lecture notes CT wa4340. Derivation of equations using balance principles; numerical treatment of ordinary differential equations; time-dependent partial differential equations; the structure of a computer model: DUFLO; usage of numerical models.

  3. Mathematical Modeling and Computational Thinking

    Science.gov (United States)

    Sanford, John F.; Naidu, Jaideep T.

    2017-01-01

    The paper argues that mathematical modeling is the essence of computational thinking. Learning a computer language is a valuable assistance in learning logical thinking but of less assistance when learning problem-solving skills. The paper is third in a series and presents some examples of mathematical modeling using spreadsheets at an advanced…

  4. Tidal Response of Preliminary Jupiter Model

    OpenAIRE

    Wahl, Sean M.; Hubbard, William B.; Militzer, Burkhard

    2016-01-01

    In anticipation of improved observational data for Jupiter's gravitational field from the Juno spacecraft, we predict the static tidal response for a variety of Jupiter interior models based on ab initio computer simulations of hydrogen-helium mixtures. We calculate hydrostatic-equilibrium gravity terms using the non-perturbative concentric Maclaurin Spheroid (CMS) method that eliminates lengthy expansions used in the theory of figures. Our method captures terms arising from the coupled tidal...

  5. Computational models of discourse

    Energy Technology Data Exchange (ETDEWEB)

    Brady, M.; Berwick, R.C.

    1983-01-01

    This book presents papers on artificial intelligence and natural language. Topics considered include recognizing intentions from natural language utterances, cooperative responses from a portable natural language database query system, natural language generation as a computational problem, focusing in the comprehension of definite anaphora, and factors in forming discourse-dependent descriptions.

  6. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    … with them. As the required models may be complex and require multiple time and/or length scales, their development and application for product-process design is not trivial. Therefore, a systematic modeling framework can contribute by significantly reducing the time and resources needed for model development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines … In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse …

  7. Computational models of syntactic acquisition.

    Science.gov (United States)

    Yang, Charles

    2012-03-01

    The computational approach to syntactic acquisition can be fruitfully pursued by integrating results and perspectives from computer science, linguistics, and developmental psychology. In this article, we first review some key results in computational learning theory and their implications for language acquisition. We then turn to examine specific learning models, some of which exploit distributional information in the input while others rely on a constrained space of hypotheses, yet both approaches share a common set of characteristics to overcome the learning problem. We conclude with a discussion of how computational models connect with the empirical study of child grammar, making the case for computationally tractable, psychologically plausible and developmentally realistic models of acquisition. WIREs Cogn Sci 2012, 3:205-213. doi: 10.1002/wcs.1154 For further resources related to this article, please visit the WIREs website.

  8. A preliminary model of wheelchair service delivery.

    Science.gov (United States)

    Eggers, Sara L; Myaskovsky, Larissa; Burkitt, Kelly H; Tolerico, Michelle; Switzer, Galen E; Fine, Michael J; Boninger, Michael L

    2009-06-01

    To integrate and expand on previously published models of wheelchair service delivery, and provide a preliminary framework for developing more comprehensive, descriptive models of wheelchair service delivery for adults with spinal cord injury within the U.S. health care system. Literature review and a qualitative analysis of in-depth interviews. Not applicable. Ten academic, clinical, regulatory, and industry experts (Department of Veterans Affairs [VA] and non-VA) in wheelchair service delivery. Not applicable. Interviewees were asked to discuss the full range of variables and stakeholders involved in wheelchair service delivery, and to limit their scope to the provision of primary subsequent or replacement chairs (not backup chairs) to adults within the United States. Most experts we interviewed stressed that clients who require a wheelchair play a central role in the wheelchair service delivery process. Providers (including clinicians, rehabilitation engineers, and rehabilitation counselors) are also critical stakeholders. More so than in other health care settings, suppliers play an integral role in the provision of wheelchairs to clients and may significantly influence the appropriateness of the wheelchair provided. Suppliers often have a direct role in wheelchair service delivery through their interactions with the clinician and/or client. This model also identified a number of system-level factors (including facility administration and standards, policies, and regulations) that influence wheelchair service delivery and ultimately the appropriateness of the wheelchair provided. We developed a detailed, descriptive model of wheelchair service delivery that integrates the delivery process and device outcomes, and includes the patient-level, provider-level, and system-level factors that may directly influence those processes and outcomes. We believe that this detailed model can help clinicians and researchers describe and consider the complexities of wheelchair…

  9. The Architectural Designs of a Nanoscale Computing Model

    Directory of Open Access Journals (Sweden)

    Mary M. Eshaghian-Wilner

    2004-08-01

    A generic nanoscale computing model is presented in this paper. The model consists of a collection of fully interconnected nanoscale computing modules, where each module is a cube of cells made out of quantum dots, spins, or molecules. The cells dynamically switch between two states by quantum interactions among their neighbors in all three dimensions. This paper includes a brief introduction to the field of nanotechnology from a computing point of view and presents a set of preliminary architectural designs for fabricating the nanoscale model studied.
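
    As a classical stand-in for the module described above (a cube of two-state cells switching through neighbor interactions), the sketch below updates a 3-D grid by a simple majority rule over the six face neighbors. The rule, grid size, and periodic boundaries are assumptions for illustration; the paper's cells switch by quantum interactions.

```python
import numpy as np

rng = np.random.default_rng(42)
cells = rng.integers(0, 2, size=(8, 8, 8))  # one module: 8x8x8 two-state cells

def step(grid):
    """Each cell adopts the majority state of its six face neighbors.

    np.roll wraps around, giving periodic boundaries (an assumption).
    """
    neighbor_sum = np.zeros_like(grid)
    for axis in range(3):
        for shift in (-1, 1):
            neighbor_sum += np.roll(grid, shift, axis=axis)
    return (neighbor_sum > 3).astype(grid.dtype)  # 4+ of 6 neighbors in state 1

for _ in range(10):
    cells = step(cells)
print(cells.mean())  # fraction of cells in state 1 after settling
```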

  10. Sierra Toolkit computational mesh conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-03-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  11. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, the ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  12. Trust Models in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2008-01-01

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  13. Integrated Computational Model Development

    Science.gov (United States)

    2014-03-01

    [Abstract available only in fragments] ... 68.5%, 9.6% and 21.9%, respectively. The alloy density and Vickers microhardness were ρ = 8.23 ± 0.01 g/cm³ and Hv = 5288 ± 1 MPa. ... Techniques to mechanically test materials at smaller scales were developed to better inform the deformation models. ... An in situ microscale tension testing technique was adapted to enable microscale fatigue testing on tensile dog-bone specimens. ...

  14. Electromagnetic Physics Models for Parallel Computing Architectures

    Science.gov (United States)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Duhem, L.; Elvira, D.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2016-10-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next-generation detector simulation framework, has been designed to exploit both the vector capability of mainstream CPUs and the multi-threading capabilities of coprocessors including NVIDIA GPUs and the Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and the type of parallelization needed to achieve optimal performance. In this paper we describe the implementation of electromagnetic physics models developed for parallel computing architectures as part of the GeantV project. Results of preliminary performance evaluation and physics validation are presented as well.

  15. Tidal Response of Preliminary Jupiter Model

    CERN Document Server

    Wahl, Sean M; Militzer, Burkhard

    2016-01-01

    In anticipation of improved observational data for Jupiter's gravitational field from the Juno spacecraft, we predict the static tidal response for a variety of Jupiter interior models based on ab initio computer simulations of hydrogen-helium mixtures. We calculate hydrostatic-equilibrium gravity terms using the non-perturbative concentric Maclaurin Spheroid (CMS) method that eliminates lengthy expansions used in the theory of figures. Our method captures terms arising from the coupled tidal and rotational perturbations, which we find to be important for a rapidly-rotating planet like Jupiter. Our predicted static tidal Love number $k_2 = 0.5900$ is $\sim$10\% larger than previous estimates. The value is, as expected, highly correlated with the zonal harmonic coefficient $J_2$, and is thus nearly constant when plausible changes are made to interior structure while holding $J_2$ fixed at the observed value. We note that the predicted static $k_2$ might change due to Jupiter's dynamical response to the Galilean moons.

  16. Component Breakout Computer Model

    Science.gov (United States)

    1987-04-29

    [Abstract unrecoverable from OCR fragments. Legible citations: "Weapon Systems: A Policy Analysis," The Rand Graduate Institute, November 1983; Boger, D., "Statistical Models for Estimating Overhead Costs."]

  17. [Tuscan Chronic Care Model: a preliminary analysis].

    Science.gov (United States)

    Barbato, Angelo; Meggiolaro, Angela; Rossi, Luigi; Fioravanti, C; Palermita, F; La Torre, Giuseppe

    2015-01-01

    The aim of this study is to present a preliminary analysis of the efficacy and effectiveness of a model of care for the chronically ill (Chronic Care Model, CCM). The analysis took into account 106 territorial modules, 1,016 general practitioners and 1,228,595 patients. The diagnostic and therapeutic pathways activated (PDTA) involved four chronic conditions, selected according to prevalence and incidence in the Tuscany Region: diabetes mellitus (DM), heart failure (SC), chronic obstructive pulmonary disease (COPD) and stroke. Six epidemiological indicators of process and output were selected in order to measure the model of care before and after its application: adherence to the pathology-specific follow-up (use of clinical and laboratory indicators), average annual per-capita expenditure (in euros) for laboratory and instrumental diagnostic tests, average annual per-capita expenditure for specialist visits, hospitalization rate for diseases related to the main pathology, hospitalization rate for long-term complications, and rate of access to the emergency department (ED). Data were collected through the database; the differences before and after the intervention, and between exposed and unexposed, were analyzed with the "before-after (controlled and uncontrolled) studies" method. The impact of the intervention was calculated as the DD (difference of the differences). DM management showed increased adherence to follow-up (DD: +8.1%) and increased use of laboratory diagnostics (DD: +4.9 €/year per capita), less hospitalization for long-term complications and for endocrine-related diseases (DD: 5.8/1000 and +1.2/1000, respectively), and a smaller increase in ED access (DD: -1.6/1000), despite a slight increase in specialist visits (DD: +0.38 €/year per capita). The management of SC initially showed rising adherence to follow-up (DD: +2.3%), a decrease in specialist visits (DD: -1.03 €/year per capita), and fewer hospitalizations and ED accesses for exacerbations (DD: -4.4/1000 and DD: -6
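
    The "difference of the differences" statistic used throughout this abstract is simple to reproduce. A minimal sketch in Python, with hypothetical numbers rather than the study's data:

      # Difference-of-differences (DD) for a before-after controlled comparison.
      def difference_of_differences(pre_exposed, post_exposed,
                                    pre_control, post_control):
          """DD = (change in the exposed group) - (change in the control group)."""
          return (post_exposed - pre_exposed) - (post_control - pre_control)

      # Hypothetical follow-up adherence (%) before/after the CCM rollout:
      dd = difference_of_differences(pre_exposed=52.0, post_exposed=61.0,
                                     pre_control=50.0, post_control=51.0)
      print(f"DD = {dd:+.1f} percentage points")  # DD = +8.0 percentage points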

  18. Efficient Computational Model of Hysteresis

    Science.gov (United States)

    Shields, Joel

    2005-01-01

    A recently developed mathematical model of the output (displacement) versus the input (applied voltage) of a piezoelectric transducer accounts for hysteresis. For the sake of computational speed, the model is kept simple by neglecting the dynamic behavior of the transducer. Hence, the model applies to static and quasistatic displacements only. A piezoelectric transducer of the type to which the model applies is used as an actuator in a computer-based control system to effect fine position adjustments. Because the response time of the rest of such a system is usually much greater than that of a piezoelectric transducer, the model remains an acceptably close approximation for the purpose of control computations, even though the dynamics are neglected. The model (see Figure 1) represents an electrically parallel, mechanically series combination of backlash elements, each having a unique deadband width and output gain. The zeroth element in the parallel combination has zero deadband width and, hence, represents a linear component of the input/output relationship. The other elements, which have nonzero deadband widths, are used to model the nonlinear components of the hysteresis loop. The deadband widths and output gains of the elements are computed from experimental displacement-versus-voltage data. The hysteresis curve calculated by use of this model is piecewise linear beyond deadband limits.
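
    The parallel-backlash structure described above lends itself to a compact implementation (a Prandtl-Ishlinskii-style superposition of play operators). A minimal sketch; the deadband half-widths and gains below are illustrative placeholders, not values fitted from displacement-versus-voltage data:

      # Backlash (play) element: the output follows the input once the input
      # leaves the deadband around the previous output.
      def play_operator(u, y_prev, halfwidth):
          return max(u - halfwidth, min(u + halfwidth, y_prev))

      def hysteresis_output(voltage_history, halfwidths, gains):
          states = [0.0] * len(halfwidths)
          outputs = []
          for u in voltage_history:
              states = [play_operator(u, y, r) for y, r in zip(states, halfwidths)]
              outputs.append(sum(g * y for g, y in zip(gains, states)))
          return outputs

      # The zeroth element has zero deadband width (the linear component).
      displacement = hysteresis_output([0.0, 20.0, 40.0, 20.0, 0.0],
                                       halfwidths=[0.0, 5.0, 10.0],
                                       gains=[0.6, 0.3, 0.1])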

  19. Tidal Response of Preliminary Jupiter Model

    Science.gov (United States)

    Wahl, Sean M.; Hubbard, William B.; Militzer, Burkhard

    2016-11-01

    In anticipation of improved observational data for Jupiter's gravitational field from the Juno spacecraft, we predict the static tidal response for a variety of Jupiter interior models based on ab initio computer simulations of hydrogen-helium mixtures. We calculate hydrostatic-equilibrium gravity terms, using the non-perturbative concentric Maclaurin Spheroid method that eliminates lengthy expansions used in the theory of figures. Our method captures terms arising from the coupled tidal and rotational perturbations, which we find to be important for a rapidly rotating planet like Jupiter. Our predicted static tidal Love number, k_2 = 0.5900, is ~10% larger than previous estimates. The value is, as expected, highly correlated with the zonal harmonic coefficient J_2, and is thus nearly constant when plausible changes are made to the interior structure while holding J_2 fixed at the observed value. We note that the predicted static k_2 might change, due to Jupiter's dynamical response to the Galilean moons, and find reasons to argue that the change may be detectable, although we do not present here a theory of dynamical tides for highly oblate Jovian planets. An accurate model of Jupiter's tidal response will be essential for interpreting Juno observations and identifying tidal signals from effects of other interior dynamics of Jupiter's gravitational field.

  1. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of numerical modeling of dynamic problems are summed up in the article. These problems are characteristic of various areas of human activity, in particular of problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas-cleaning equipment, and modeling of biogas formation processes.

  2. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  3. A Preliminary Model of Infrared Image Generation for Exhaust Plume

    Directory of Open Access Journals (Sweden)

    Fei Mei

    2011-06-01

    Full Text Available Based on the irradiance calculation for all pixels on the focal plane array, a preliminary infrared imaging prediction model of an exhaust plume, which takes into account the geometrical and thermal resolution of the camera, was developed to understand the infrared characteristics of exhaust plumes. In order to compute the irradiance incident on each pixel, the gas radiation transfer path in the plume for the instantaneous field of view corresponding to the pixel was solved from the simultaneous equations of an enclosing cylinder covering the exhaust plume and the line of sight. The radiance of the transfer path was calculated from the radiative transfer equation for a nonscattering gas. The radiative properties of combustion needed in the equation were provided by the Malkmus model with the EM2C narrow-band database (25 cm-1). The pressure and species concentrations along the path were determined by CFD analysis. The relative irradiance intensity of each pixel was converted to color in the display according to gray-map and hot-map coding. Infrared images of the exhaust plume from a subsonic axisymmetric nozzle for different relative positions of the camera and the plume were predicted with the model. By changing parameters such as the FOV and spatial resolution, images from different imaging systems can be predicted.
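
    The per-pixel radiance computation reduces to marching the radiative transfer equation for a nonscattering gas along the line of sight. A hedged sketch of that idea: the segment absorption coefficients and temperatures would come from a narrow-band model and CFD fields, and the band source term below is a placeholder, not the paper's Malkmus implementation:

      import math

      def band_source(temperature_k):
          # Placeholder spectral source term; a real model evaluates the
          # Planck function over the narrow band of interest.
          return 5.0e-8 * temperature_k**4

      def path_radiance(kappas, temperatures, ds):
          """March from the far end of the path toward the camera:
          L_new = L_old * tau + B(T) * (1 - tau) for each segment."""
          radiance = 0.0
          for kappa, temp in zip(kappas, temperatures):
              tau = math.exp(-kappa * ds)  # segment transmissivity
              radiance = radiance * tau + band_source(temp) * (1.0 - tau)
          return radiance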

  4. Preliminary results of steel containment vessel model test

    Energy Technology Data Exchange (ETDEWEB)

    Luk, V.K.; Hessheimer, M.F. [Sandia National Labs., Albuquerque, NM (United States); Matsumoto, T.; Komine, K.; Arai, S. [Nuclear Power Engineering Corp., Tokyo (Japan); Costello, J.F. [Nuclear Regulatory Commission, Washington, DC (United States)

    1998-04-01

    A high pressure test of a mixed-scaled model (1:10 in geometry and 1:4 in shell thickness) of a steel containment vessel (SCV), representing an improved boiling water reactor (BWR) Mark II containment, was conducted on December 11--12, 1996 at Sandia National Laboratories. This paper describes the preliminary results of the high pressure test. In addition, the preliminary post-test measurement data and the preliminary comparison of test data with pretest analysis predictions are also presented.

  5. Computational models of adult neurogenesis

    Science.gov (United States)

    Cecchi, Guillermo A.; Magnasco, Marcelo O.

    2005-10-01

    Experimental results in recent years have shown that adult neurogenesis is a significant phenomenon in the mammalian brain. Little is known, however, about the functional role played by the generation and destruction of neurons in the context of an adult brain. Here, we propose two models where new projection neurons are incorporated. We show that in both models, using incorporation and removal of neurons as a computational tool, it is possible to achieve a higher computational efficiency than in purely static, synapse-learning-driven networks. We also discuss the implications for understanding the role of adult neurogenesis in specific brain areas like the olfactory bulb and the dentate gyrus.

  6. Computational Modeling of Simulation Tests.

    Science.gov (United States)

    1980-06-01

    [Abstract unrecoverable from OCR fragments. Legible metadata: "Computational Modeling of Simulation Tests," G. Leigh, W. Chown, B. Harrison, Eric H. Wang Civil Engineering Research Facility, University of New Mexico, Albuquerque, June 1980; cited references include Kinney, G. F. (McMillan, 1962) and Courant and Friedrichs.]

  7. Modeling groundwater flow on massively parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, S.F.; Falgout, R.D.; Fogwell, T.W.; Tompson, A.F.B.

    1994-12-31

    The authors will explore the numerical simulation of groundwater flow in three-dimensional heterogeneous porous media. An interdisciplinary team of mathematicians, computer scientists, hydrologists, and environmental engineers is developing a sophisticated simulation code for use on workstation clusters and MPPs. To date, they have concentrated on modeling flow in the saturated zone (single phase), which requires the solution of a large linear system. They will discuss their implementation of preconditioned conjugate gradient solvers. The preconditioners under consideration include simple diagonal scaling, s-step Jacobi, adaptive Chebyshev polynomial preconditioning, and multigrid. They will present some preliminary numerical results, including simulations of groundwater flow at the LLNL site. They will also demonstrate the code's scalability.
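
    The simplest preconditioner mentioned above, diagonal (Jacobi) scaling, fits into a few lines of conjugate gradients. A dense NumPy sketch of the idea, standing in for the distributed sparse solver the authors describe:

      import numpy as np

      def jacobi_pcg(A, b, tol=1e-8, max_iter=500):
          """Conjugate gradients preconditioned with M = diag(A)."""
          x = np.zeros_like(b)
          r = b - A @ x
          minv = 1.0 / np.diag(A)  # inverse of the diagonal preconditioner
          z = minv * r
          p = z.copy()
          rz = r @ z
          for _ in range(max_iter):
              Ap = A @ p
              alpha = rz / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              if np.linalg.norm(r) < tol:
                  break
              z = minv * r
              rz_new = r @ z
              p = z + (rz_new / rz) * p
              rz = rz_new
          return x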

  8. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  9. Finite element modelling of the tricuspid valve: A preliminary study.

    Science.gov (United States)

    Stevanella, Marco; Votta, Emiliano; Lemma, Massimo; Antona, Carlo; Redaelli, Alberto

    2010-12-01

    The incomplete efficacy of current surgical repair procedures of the tricuspid valve (TV) demands a deeper comprehension of physiological TV biomechanics. To this purpose, computational models can provide quantitative insight into the TV biomechanical response and allow analysing the role of each TV substructure. We present here a three-dimensional finite element model of the tricuspid valve that takes into account most of its peculiar features. Experimental measurements were performed on human and porcine valves to obtain a more detailed TV anatomical framework. To overcome the complete lack of information on leaflet mechanical properties, we performed a sensitivity analysis on the parameters of the adopted non-linear hyperelastic constitutive model, hypothesizing three different parameter sets for three significant collagen fibre distributions. Results showed that leaflets' motion and maximum principal stress distribution were almost insensitive to the different material parameters considered. The highest stresses (about 100 kPa) were located near the annulus of the anterior and septal leaflets, while the posterior leaflet experienced lower stresses (about 55 kPa); stresses at the commissures were nearly zero. Conversely, changes in constitutive parameters deeply affected the magnitude of leaflet strains, but not their overall pattern. Strains computed under the assumption that TV leaflet tissue is reinforced by a sparse and loosely arranged network of collagen fibres fitted the experimental data best, suggesting that this may be the actual microstructure of TV leaflets. In a long-term perspective, this preliminary study aims at providing a starting point for the development of a predictive tool to quantitatively evaluate TV diseases and surgical repair procedures.

  10. Pervasive Computing and Prosopopoietic Modelling

    DEFF Research Database (Denmark)

    Michelsen, Anders Ib

    2011-01-01

    ... into the other. It also indicates a generative creation that itself points to important issues of ontology with methodological implications for the design of computing. In this article these implications will be conceptualised as prosopopoietic modeling on the basis of Bernward Joerges’ introduction of the classical rhetoric term of ’prosopopoeia’ into the debate on large technological systems. First, the paper introduces the paradoxical distinction/complicity by debating Gilbert Simondon’s notion of a ‘margin of indeterminacy’ vis-a-vis computing. Second, it debates the idea of prosopopoietic modeling, pointing to a principal role of the paradoxical distinction/complicity within the computational heritage in three cases: a. Prosopopoietic aspects of John von Neumann’s First Draft of a Report on the EDVAC from 1945. b. Herbert Simon’s notion of simulation in The Science of the Artificial from the 1970s. c. ...

  11. Preliminary 2D numerical modeling of common granular problems

    Science.gov (United States)

    Wyser, Emmanuel; Jaboyedoff, Michel

    2017-04-01

    Granular studies have received increasing interest during the last decade. Many scientific investigations have successfully addressed the ubiquitous behavior of granular matter. We investigate liquid impacts onto granular beds, i.e. the influence of the packing and of the compaction-dilation transition. However, a physically-based model is still lacking to address complex microscopic features of the granular bed response during liquid impacts, such as the compaction-dilation transition or granular bed uplifts (Wyser et al., in review). We present our preliminary 2D numerical modeling based on the Discrete Element Method (DEM), using a nonlinear contact force law (the Hertz-Mindlin model) for disk-shaped particles. The algorithm is written in the C programming language. Our 2D model provides an analytical tool to address granular problems such as i) granular collapses and ii) static granular assembly problems. This provides a validation framework for our numerical approach by comparing our numerical results with previous laboratory experiments or numerical works. Inspired by the work of Warnett et al. (2014) and Staron & Hinch (2005), we studied i) the axisymmetric collapse of granular columns, addressing the scaling between the initial aspect ratio and the final runout distance. Our numerical results are in good agreement with the previous studies of Warnett et al. (2014) and Staron & Hinch (2005). ii) Reproducing static problems for regularly and randomly stacked particles provides a valid comparison to the results of Egholm (2007). Vertical and horizontal stresses within the assembly are nearly identical to the stresses obtained by Egholm (2007), thus demonstrating the consistency of our 2D numerical model. Our 2D numerical model is able to reproduce common granular case studies such as granular collapses or static problems. However, a sufficiently small timestep should be used to ensure good numerical consistency, resulting in higher computational time. The latter becomes critical
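
    The nonlinear contact law named above (Hertz-Mindlin) has a well-known normal component. A minimal sketch of the Hertzian normal force between two particles, with illustrative material constants; DEM codes commonly apply the spherical Hertz form to disks as well, which is assumed here:

      import math

      def hertz_normal_force(overlap, r1, r2,
                             e1=70e9, e2=70e9, nu1=0.3, nu2=0.3):
          """Normal force magnitude for geometric overlap > 0."""
          if overlap <= 0.0:
              return 0.0
          r_eff = r1 * r2 / (r1 + r2)  # effective contact radius
          e_eff = 1.0 / ((1 - nu1**2) / e1 + (1 - nu2**2) / e2)  # eff. modulus
          return (4.0 / 3.0) * e_eff * math.sqrt(r_eff) * overlap**1.5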

  12. Analyzing high energy physics data using database computing: Preliminary report

    Science.gov (United States)

    Baden, Andrew; Day, Chris; Grossman, Robert; Lifka, Dave; Lusk, Ewing; May, Edward; Price, Larry

    1991-01-01

    A proof-of-concept system is described for analyzing high energy physics (HEP) data using database computing. The system is designed to scale up to the size required for HEP experiments at the Superconducting SuperCollider (SSC) lab. These experiments will require collecting and analyzing approximately 10 to 100 million 'events' per year during proton colliding beam collisions. Each 'event' consists of a set of vectors with a total length of approx. one megabyte. This represents an increase of approx. 2 to 3 orders of magnitude in the amount of data accumulated by present HEP experiments. The system is called the HEPDBC System (High Energy Physics Database Computing System). At present, the Mark 0 HEPDBC System is completed, and can produce analysis of HEP experimental data approx. an order of magnitude faster than current production software on data sets of approx. 1 GB. The Mark 1 HEPDBC System is currently undergoing testing and is designed to analyze data sets 10 to 100 times larger.

  13. Computer Profiling Based Model for Investigation

    Directory of Open Access Journals (Sweden)

    Neeraj Choudhary

    2011-10-01

    Full Text Available Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a computer system. The computer profiling object model can be implemented so as to support automated analysis, providing an investigator with the information needed to decide whether manual analysis is required.
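
    The object model described above, objects with attributes and inter-relationships, can be pictured with a small sketch. The class and field names here are assumptions for illustration, not the paper's schema:

      from dataclasses import dataclass, field

      @dataclass
      class ProfileObject:
          kind: str                     # e.g. "user", "application", "file"
          attributes: dict = field(default_factory=dict)
          relations: list = field(default_factory=list)  # (label, ProfileObject)

          def relate(self, label, other):
              self.relations.append((label, other))

      user = ProfileObject("user", {"name": "alice", "last_login": "2011-09-30"})
      browser = ProfileObject("application", {"name": "firefox"})
      user.relate("executed", browser)  # a relationship an investigator can traverse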

  14. Hydronic distribution system computer model

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, J.W.; Strasser, J.J.

    1994-10-01

    A computer model of a hot-water boiler and its associated hydronic thermal distribution loop has been developed at Brookhaven National Laboratory (BNL). It is intended to be incorporated as a submodel in a comprehensive model of residential-scale thermal distribution systems developed at Lawrence Berkeley Laboratory (LBL). This will give the combined model the capability of modeling forced-air and hydronic distribution systems in the same house using the same supporting software. This report describes the development of the BNL hydronics model, initial results and internal consistency checks, and its intended relationship to the LBL model. A method of interacting with the LBL model that does not require physical integration of the two codes is described. This will provide capability now, with reduced up-front cost, as long as the number of runs required is not large.

  15. FORENSIC COMPUTING MODELS: TECHNICAL OVERVIEW

    Directory of Open Access Journals (Sweden)

    Gulshan Shrivastava

    2012-05-01

    Full Text Available In this paper, we introduce a digital forensics technique for reconstructing events or evidence after a crime has been committed through any digital device. The paper draws a clear distinction between computer forensics and digital forensics and gives a brief description of the classification of digital forensics. It also describes how the emergence of various digital forensic models helps digital forensic practitioners and examiners in doing digital forensics. Further, the merits and demerits of the models are discussed, and every major model is reviewed.

  16. Pervasive Computing and Prosopopoietic Modelling

    DEFF Research Database (Denmark)

    Michelsen, Anders Ib

    2011-01-01

    ... that have spread vertiginously since Mark Weiser coined the term ‘pervasive’, e.g., digitalised sensoring, monitoring, effectuation, intelligence, and display. Whereas Weiser’s original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown’s (1997) terms, ‘invisible’ ... into the other. It also indicates a generative creation that itself points to important issues of ontology with methodological implications for the design of computing. In this article these implications will be conceptualised as prosopopoietic modeling on the basis of Bernward Joerges’ introduction ..., pointing to a principal role of the paradoxical distinction/complicity within the computational heritage in three cases: a. Prosopopoietic aspects of John von Neumann’s First Draft of a Report on the EDVAC from 1945. b. Herbert Simon’s notion of simulation in The Science of the Artificial from the 1970s. c. ...

  17. Outline and Preliminary Evaluation of the Classical Digital Library Model.

    Science.gov (United States)

    MacCall, Steven L.; Cleveland, Ana D.; Gibson, Ian E.

    1999-01-01

    Outlines the classical digital library model, which is derived from traditional practices of library and information science professionals, as an alternative to the database retrieval model. Reports preliminary results from an evaluation study of library and information professionals and endusers involved with primary care medicine. (AEF)

  18. A preliminary model of the coma of 2060 Chiron

    Science.gov (United States)

    Boice, Daniel C.; Konno, I.; Stern, S. Alan; Huebner, Walter F.

    1992-01-01

    We have included gravity in our fluid dynamic model with chemical kinetics of dusty comet comae and applied it with two dust sizes to 2060 Chiron. A progress report on the model and preliminary results concerning gas/dust dynamics and chemistry is given.

  19. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class, where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  20. Cosmic logic: a computational model

    Science.gov (United States)

    Vanchurin, Vitaly

    2016-02-01

    We initiate a formal study of logical inferences in context of the measure problem in cosmology or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal that halts in finite time and immortal that runs forever. In context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.

  1. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
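
    For binary outputs, the second-order maximum noise entropy model described above is a logistic function of first- and second-order stimulus terms. A minimal sketch, with placeholder parameters standing in for values fitted to the input/output moments:

      import numpy as np

      def second_order_mne_response(s, a, h, J):
          """P(spike = 1 | stimulus s) with bias a, vector h, symmetric matrix J."""
          arg = a + h @ s + s @ J @ s
          return 1.0 / (1.0 + np.exp(-arg))

      s = np.array([0.5, -1.2])  # a point in the two relevant stimulus dimensions
      p = second_order_mne_response(s, a=-1.0,
                                    h=np.array([0.8, -0.2]),
                                    J=np.array([[0.3, 0.1],
                                                [0.1, -0.4]]))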

  2. Synthesis, preliminary bioevaluation and computational analysis of caffeic acid analogues.

    Science.gov (United States)

    Liu, Zhiqian; Fu, Jianjun; Shan, Lei; Sun, Qingyan; Zhang, Weidong

    2014-05-16

    A series of caffeic acid amides were designed, synthesized and evaluated for anti-inflammatory activity. Most of them exhibited promising anti-inflammatory activity against nitric oxide (NO) generation in murine macrophage RAW264.7 cells. A 3D pharmacophore model was created based on the biological results for further structural optimization. Moreover, prediction of the potential targets was also carried out using the PharmMapper server. These amide analogues represent a promising class of anti-inflammatory scaffolds for further exploration and target identification.

  3. Preliminary Multivariable Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multi-variable space telescope cost model. The validity of the previously published models is tested, cost estimating relationships which are and are not significant cost drivers are identified, and interrelationships between variables are explored.

  4. Cosmic Logic: a Computational Model

    CERN Document Server

    Vanchurin, Vitaly

    2015-01-01

    We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal that halts in finite time and immortal that runs forever. In context of eternal inflation this result implies...

  5. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  6. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology adoption theories, such as Diffusion of Innovations, the Technology Acceptance Model, and the Unified Theory of Acceptance and Use of Technology. Further on, a research model for identification of Cloud Computing Adoption factors from a business model perspective is presented. The following business model building...

  7. Modeling Malaysia's Energy System: Some Preliminary Results

    OpenAIRE

    Ahmad M. Yusof

    2011-01-01

    Problem statement: The current dynamic and fragile world energy environment necessitates the development of a new energy model that solely caters to analyzing Malaysia's energy scenarios. Approach: The model is a network flow model that traces the flow of energy carriers from their sources (import and mining) through conversion and transformation processes for the production of energy products to final destinations (energy demand sectors). The integration to the economic sectors is done exogenously...

  8. Model Identification and Computer Algebra

    Science.gov (United States)

    Bollen, Kenneth A; Bauldry, Shawn

    2010-10-07

    Multiequation models that contain observed or latent variables are common in the social sciences. To determine whether unique parameter values exist for such models, one needs to assess model identification. In practice analysts rely on empirical checks that evaluate the singularity of the information matrix evaluated at sample estimates of parameters. The discrepancy between estimates and population values, the limitations of numerical assessments of ranks, and the difference between local and global identification make this practice less than perfect. In this paper we outline how to use computer algebra systems (CAS) to determine the local and global identification of multiequation models with or without latent variables. We demonstrate a symbolic CAS approach to local identification and develop a CAS approach to obtain explicit algebraic solutions for each of the model parameters. We illustrate the procedures with several examples, including a new proof of the identification of a model for handling missing data using auxiliary variables. We present an identification procedure for Structural Equation Models that makes use of CAS and that is a useful complement to current methods.
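
    The local-identification check the authors describe can be sketched with a CAS: a model is locally identified at generic parameter values when the Jacobian of the implied moments with respect to the parameters has full column rank. A toy example in SymPy; the two-parameter factor model below is illustrative, not one of the paper's:

      import sympy as sp

      b, v = sp.symbols('b v', positive=True)  # loading and error variance
      # Implied moments of a one-factor, two-indicator model with unit factor
      # variance and equal loadings and error variances:
      moments = sp.Matrix([b**2 + v,   # var(x1)
                           b**2 + v,   # var(x2)
                           b**2])      # cov(x1, x2)
      jacobian = moments.jacobian([b, v])
      print(jacobian.rank())  # 2 -> locally identified at generic points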

  9. Los Alamos Center for Computer Security formal computer security model

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, J.S.; Hunteman, W.J.; Markin, J.T.

    1989-01-01

    This paper provides a brief presentation of the formal computer security model currently being developed at the Los Alamos Department of Energy (DOE) Center for Computer Security (CCS). The need to test and verify DOE computer security policy implementation first motivated this effort. The actual analytical model was a result of the integration of current research in computer security and previous modeling and research experiences. The model is being developed to define a generic view of the computer and network security domains, to provide a theoretical basis for the design of a security model, and to address the limitations of present formal mathematical models for computer security. The fundamental objective of computer security is to prevent the unauthorized and unaccountable access to a system. The inherent vulnerabilities of computer systems result in various threats from unauthorized access. The foundation of the Los Alamos DOE CCS model is a series of functionally dependent probability equations, relations, and expressions. The model is undergoing continued discrimination and evolution. We expect to apply the model to the discipline of the Bell and LaPadula abstract sets of objects and subjects. 6 refs.

  10. Modeling method and preliminary model of Asteroid Toutatis from Chang'E-2 optical images

    Science.gov (United States)

    Li, Xiang-Yu; Qiao, Dong

    2014-06-01

    Shape modeling is fundamental to the analysis of the dynamic environment and motion around an asteroid. Chang'E-2 successfully made a flyby of Asteroid 4179 Toutatis and obtained many high-resolution images during the mission. In this paper, the modeling method and a preliminary model of Asteroid Toutatis are discussed. First, the optical images obtained by Chang'E-2 are analyzed, and terrain and silhouette features in the images are described. Then, a modeling method based on the previous radar model and preliminary information from the optical images is proposed, and a preliminary polyhedron model of Asteroid Toutatis is established. Finally, the spherical harmonic coefficients of Asteroid Toutatis based on the polyhedron model are obtained, and some parameters of the model are analyzed and compared. Although the model proposed in this paper is only preliminary, this work offers a valuable reference for future high-resolution models.
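
    Once spherical harmonic coefficients are available, the exterior gravitational potential can be evaluated directly. A hedged sketch of the standard expansion (unnormalized coefficients are assumed; the constants and coefficient arrays are placeholders, not Toutatis values):

      import numpy as np
      from scipy.special import lpmn

      def exterior_potential(r, lat, lon, gm, r0, C, S):
          """V = (GM/r) * sum_{n,m} (r0/r)^n * P_nm(sin lat)
                 * (C_nm cos(m lon) + S_nm sin(m lon))."""
          nmax = C.shape[0] - 1
          P, _ = lpmn(nmax, nmax, np.sin(lat))  # P[m][n], unnormalized
          total = 0.0
          for n in range(nmax + 1):
              for m in range(n + 1):
                  total += (r0 / r)**n * P[m][n] * (C[n, m] * np.cos(m * lon)
                                                    + S[n, m] * np.sin(m * lon))
          return gm / r * total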

  11. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India, during December 17-19, 2015. The book is enriched with innovations in broad areas of research like computational modeling, computational intelligence and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing design, analysis and modeling of the aforementioned key areas.

  12. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  13. Computational modelling of SCC flow

    DEFF Research Database (Denmark)

    Geiker, Mette Rica; Thrane, Lars Nyholm; Szabo, Peter

    2005-01-01

    To benefit from the full potential of self-compacting concrete (SCC), prediction tools are needed for the form filling of SCC. Such tools should take into account the properties of the concrete, the shape and size of the structural element, the position of rebars, and the casting technique. Examples of computational models for the time-dependent flow behavior are given, and advantages and disadvantages of discrete particle and single fluid models are briefly described.

  14. Computer modeling of piezoresistive gauges

    Energy Technology Data Exchange (ETDEWEB)

    Nutt, G. L.; Hallquist, J. O.

    1981-08-07

    A computer model of a piezoresistive gauge subject to shock loading is developed. The time-dependent two-dimensional response of the gauge is calculated. The stress and strain components of the gauge are determined assuming elastic-plastic material properties. The model is compared with experiment for four cases: an ytterbium foil gauge in a PMMA medium subjected to a 0.5 GPa plane shock wave, where the gauge is presented to the shock with its flat surface both parallel and perpendicular to the front, and a similar comparison for a manganin foil subjected to a 2.7 GPa shock. The signals are also compared with a calibration equation derived with the gauge and medium properties accounted for, but with the assumption that the gauge is in stress equilibrium with the shocked medium.

  15. Towards the Epidemiological Modeling of Computer Viruses

    OpenAIRE

    Xiaofan Yang; Lu-Xing Yang

    2012-01-01

    Epidemic dynamics of computer viruses is an emerging discipline aiming to understand the way that computer viruses spread on networks. This paper is intended to establish a series of rational epidemic models of computer viruses. First, a close inspection of some common characteristics shared by all typical computer viruses clearly reveals the flaws of previous models. Then, a generic epidemic model of viruses, which is named the SLBS model, is proposed. Finally, diverse generalizations of the SLBS model are suggested.

  16. A preliminary deposit model for lithium brines

    Science.gov (United States)

    Bradley, Dwight; Munk, LeeAnn; Jochens, Hillary; Hynek, Scott; Labay, Keith A.

    2013-01-01

    This report is part of an effort by the U.S. Geological Survey to update existing mineral deposit models and to develop new ones. The global transition away from hydrocarbons toward energy alternatives increases demand for many scarce metals. Among these is lithium, a key component of lithium-ion batteries for electric and hybrid vehicles. Lithium brine deposits account for about three-fourths of the world’s lithium production. Updating an earlier deposit model, we emphasize geologic information that might directly or indirectly help in exploration for lithium brine deposits, or for assessing regions for mineral resource potential. Special attention is given to the best-known deposit in the world—Clayton Valley, Nevada, and to the giant Salar de Atacama, Chile.

  17. Preliminary Model of Acute Mountain Sickness Severity

    Science.gov (United States)

    2010-10-01

    [Abstract available only in fragments] ... variance, the Akaike information criterion (AIC) and Bayesian information criterion (BIC) were utilized in selecting the final model ... information and completed an Environmental Symptoms Questionnaire (ESQ). The ESQ assessed AMS severity using the validated AMS-Cerebral (AMS-C) factor ...
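
    The two model-selection criteria named in the fragment are one-liners. A minimal sketch with hypothetical log-likelihoods and parameter counts, not values from the study:

      import math

      def aic(log_likelihood, k):
          return 2 * k - 2 * log_likelihood

      def bic(log_likelihood, k, n):
          return k * math.log(n) - 2 * log_likelihood

      # Compare two candidate AMS-severity models fit to the same n observations
      # (numbers are illustrative; lower values are preferred):
      n = 308
      print(aic(-412.7, k=4), bic(-412.7, k=4, n=n))
      print(aic(-410.9, k=6), bic(-410.9, k=6, n=n))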

  18. Modeling and Testing of EVs - Preliminary Study and Laboratory Development

    DEFF Research Database (Denmark)

    Yang, Guang-Ya; Marra, Francesco; Nielsen, Arne Hejde

    2010-01-01

    Electric vehicles (EVs) are expected to play a key role in the future energy management system to stabilize both supply and consumption with the presence of a high penetration of renewable generation. A reasonably accurate model of the battery is a key element for the study of EV behavior and the grid impact at different geographical areas, as well as driving and charging patterns. An electric circuit model is deployed in this work to represent the electrical properties of a lithium-ion battery. This paper reports the preliminary modeling and validation work based on manufacturer data sheets and realistic tests, followed by suggestions towards a feasible battery model for further studies.

  19. Towards the Epidemiological Modeling of Computer Viruses

    Directory of Open Access Journals (Sweden)

    Xiaofan Yang

    2012-01-01

    Full Text Available Epidemic dynamics of computer viruses is an emerging discipline aiming to understand the way that computer viruses spread on networks. This paper is intended to establish a series of rational epidemic models of computer viruses. First, a close inspection of some common characteristics shared by all typical computer viruses clearly reveals the flaws of previous models. Then, a generic epidemic model of viruses, which is named as the SLBS model, is proposed. Finally, diverse generalizations of the SLBS model are suggested. We believe this work opens a door to the full understanding of how computer viruses prevail on the Internet.
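
    The compartment structure of the SLBS model (susceptible, latent, breaking-out, back to susceptible) integrates readily as a system of ODEs. A hedged sketch with one plausible rate structure and made-up rates; the paper's exact equations should be consulted for the real model:

      from scipy.integrate import solve_ivp

      def slbs(t, y, beta, alpha, gamma1, gamma2):
          S, L, B = y
          infect = beta * S * (L + B)  # latent and breaking-out hosts both infect
          dS = -infect + gamma1 * L + gamma2 * B  # cured hosts return to S
          dL = infect - (alpha + gamma1) * L      # latent viruses break out
          dB = alpha * L - gamma2 * B
          return [dS, dL, dB]

      sol = solve_ivp(slbs, (0.0, 200.0), [0.99, 0.01, 0.0],
                      args=(0.3, 0.1, 0.02, 0.05))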

  1. Quantum Computation Beyond the Circuit Model

    OpenAIRE

    Jordan, Stephen P.

    2008-01-01

    The quantum circuit model is the most widely used model of quantum computation. It provides both a framework for formulating quantum algorithms and an architecture for the physical construction of quantum computers. However, several other models of quantum computation exist which provide useful alternative frameworks for both discovering new quantum algorithms and devising new physical implementations of quantum computers. In this thesis, I first present necessary background material for a ge...

  2. Computational modeling of epithelial tissues.

    Science.gov (United States)

    Smallwood, Rod

    2009-01-01

    There is an extensive literature on the computational modeling of epithelial tissues at all levels from subcellular to whole tissue. This review concentrates on behavior at the individual cell to whole tissue level, and particularly on organizational aspects, and provides an indication of where information from other areas, such as the modeling of angiogenesis, is relevant. The skin, and the lining of all of the body cavities (lung, gut, cervix, bladder etc) are epithelial tissues, which in a topological sense are the boundary between inside and outside the body. They are thin sheets of cells (usually of the order of 0.5 mm thick) without extracellular matrix, have a relatively simple structure, and contain few types of cells. They have important barrier, secretory and transport functions, which are essential for the maintenance of life, so homeostasis and wound healing are important aspects of the behavior of epithelial tissues. Carcinomas originate in epithelial tissues.There are essentially two approaches to modeling tissues--to start at the level of the tissue (i.e., a length scale of the order of 1 mm) and develop generalized equations for behavior (a continuum approach); or to start at the level of the cell (i.e., a length scale of the order of 10 µm) and develop tissue behavior as an emergent property of cellular behavior (an individual-based approach). As will be seen, these are not mutually exclusive approaches, and they come in a variety of flavors.

  3. Linguistics Computation, Automatic Model Generation, and Intensions

    CERN Document Server

    Nourani, C F

    1994-01-01

    Techniques are presented for defining models of computational linguistics theories. The methods of generalized diagrams that were developed by this author for modeling artificial intelligence planning and reasoning are shown to be applicable to models of computation of linguistics theories. It is shown that for extensional and intensional interpretations, models can be generated automatically which assign meaning to computations of linguistics theories for natural languages. Keywords: Computational Linguistics, Reasoning Models, G-diagrams For Models, Dynamic Model Implementation, Linguistics and Logics For Artificial Intelligence

  4. Model dynamics for quantum computing

    Science.gov (United States)

    Tabakin, Frank

    2017-08-01

    A model master equation suitable for quantum computing dynamics is presented. In an ideal quantum computer (QC), a system of qubits evolves in time unitarily and, by virtue of their entanglement, interfere quantum mechanically to solve otherwise intractable problems. In the real situation, a QC is subject to decoherence and attenuation effects due to interaction with an environment and with possible short-term random disturbances and gate deficiencies. The stability of a QC under such attacks is a key issue for the development of realistic devices. We assume that the influence of the environment can be incorporated by a master equation that includes unitary evolution with gates, supplemented by a Lindblad term. Lindblad operators of various types are explored; namely, steady, pulsed, gate friction, and measurement operators. In the master equation, we use the Lindblad term to describe short time intrusions by random Lindblad pulses. The phenomenological master equation is then extended to include a nonlinear Beretta term that describes the evolution of a closed system with increasing entropy. An external Bath environment is stipulated by a fixed temperature in two different ways. Here we explore the case of a simple one-qubit system in preparation for generalization to multi-qubit, qutrit and hybrid qubit-qutrit systems. This model master equation can be used to test the stability of memory and the efficacy of quantum gates. The properties of such hybrid master equations are explored, with emphasis on the role of thermal equilibrium and entropy constraints. Several significant properties of time-dependent qubit evolution are revealed by this simple study.
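
    For reference, the Lindblad-form master equation the abstract builds on can be written (with \hbar = 1) as

        \frac{d\rho}{dt} = -i\,[H,\rho]
            + \sum_k \left( L_k \rho L_k^\dagger
            - \tfrac{1}{2}\left\{ L_k^\dagger L_k,\, \rho \right\} \right),

    where H generates the unitary gate evolution and the operators L_k model the steady, pulsed, gate-friction, and measurement intrusions; the nonlinear Beretta term described above is an additional contribution beyond this standard form.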

  5. A risk computation model for environmental restoration activities

    Energy Technology Data Exchange (ETDEWEB)

    Droppo, J.B. Jr.; Strenge, D.L.; Buck, J.W.

    1991-01-01

    A risk computation model useful in environmental restoration activities was developed for the US Department of Energy (DOE). This model, the Multimedia Environmental Pollutant Assessment System (MEPAS), can be used to evaluate effects of potential exposures over a broad range of regulatory issues including radioactive carcinogenic, nonradioactive carcinogenic, and noncarcinogenic effects. MEPAS integrates risk computation components. Release, transport, dispersion, deposition, exposure, and uptake computations are linked in a single system for evaluation of air, surface water, ground water, and overland flow transport. MEPAS uses standard computation approaches. Whenever available and appropriate, US Environmental Protection Agency guidance and models were used to facilitate compatibility and acceptance. MEPAS is a computational tool that can be used at several phases of an environmental restoration effort. At a preliminary stage in problem characterization, potential problems can be prioritized. As more data become available, MEPAS can provide an estimate of baseline risks or evaluate environmental monitoring data. In the feasibility stage, MEPAS can compute risk from alternative remedies. However, MEPAS is not designed to replace a detailed risk assessment of the selected remedy. For major problems, it will be appropriate to use a more detailed, risk computation tool for a detailed, site-specific evaluation of the selected remedy. 15 refs., 2 figs.

  6. Computational modeling of membrane proteins.

    Science.gov (United States)

    Koehler Leman, Julia; Ulmschneider, Martin B; Gray, Jeffrey J

    2015-01-01

    The determination of membrane protein (MP) structures has always trailed that of soluble proteins due to difficulties in their overexpression, reconstitution into membrane mimetics, and subsequent structure determination. The percentage of MP structures in the protein databank (PDB) has been at a constant 1-2% for the last decade. In contrast, over half of all drugs target MPs, only highlighting how little we understand about drug-specific effects in the human body. To reduce this gap, researchers have attempted to predict structural features of MPs even before the first structure was experimentally elucidated. In this review, we present current computational methods to predict MP structure, starting with secondary structure prediction, prediction of trans-membrane spans, and topology. Even though these methods generate reliable predictions, challenges such as predicting kinks or precise beginnings and ends of secondary structure elements are still waiting to be addressed. We describe recent developments in the prediction of 3D structures of both α-helical MPs as well as β-barrels using comparative modeling techniques, de novo methods, and molecular dynamics (MD) simulations. The increase of MP structures has (1) facilitated comparative modeling due to availability of more and better templates, and (2) improved the statistics for knowledge-based scoring functions. Moreover, de novo methods have benefited from the use of correlated mutations as restraints. Finally, we outline current advances that will likely shape the field in the forthcoming decade.

  7. Preliminary Computational Study for Future Tests in the NASA Ames 9 foot x 7 foot Wind Tunnel

    Science.gov (United States)

    Pearl, Jason M.; Carter, Melissa B.; Elmiligui, Alaa A.; Winski, Courtney S.; Nayani, Sudheer N.

    2016-01-01

    The NASA Advanced Air Vehicles Program, Commercial Supersonics Technology Project seeks to advance tools and techniques to make over-land supersonic flight feasible. In this study, preliminary computational results are presented for future tests in the NASA Ames 9 foot x 7 foot supersonic wind tunnel to be conducted in early 2016. Shock-plume interactions and their effect on pressure signature are examined for six model geometries. Near-field pressure signatures are assessed using the CFD code USM3D to model the proposed test geometries in free air. Additionally, results obtained using the commercial grid generation software Pointwise (registered trademark) are compared to results using VGRID, the NASA Langley Research Center in-house mesh generation program.

  8. Cupola Furnace Computer Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near-optimum operating conditions with just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an "Expert System" to permit optimization in real time. The program has been combined with "neural network" programs to effect very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the "Cupola Handbook", Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  9. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  10. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  11. Modeling the connection between development and evolution: Preliminary report

    Energy Technology Data Exchange (ETDEWEB)

    Mjolsness, E.; Reinitz, J. [Yale Univ., New Haven, CT (United States); Garrett, C.D. [Washington Univ., Seattle, WA (United States). Dept. of Computer Science; Sharp, D.H. [Los Alamos National Lab., NM (United States)

    1993-07-29

    In this paper we outline a model which incorporates development processes into an evolutionary framework. The model consists of three sectors describing development, genetics, and the selective environment. The formulation of models governing each sector uses dynamical grammars to describe processes in which state variables evolve in a quantitative fashion, and the number and type of participating biological entities can change. This program has previously been elaborated for development. Its extension to the other sectors of the model is discussed here and forms the basis for further approximations. A specific implementation of these ideas is described for an idealized model of the evolution of a multicellular organism. While this model does not describe an actual biological system, it illustrates the interplay of development and evolution. Preliminary results of numerical simulations of this idealized model are presented.

  12. Computational Modelling in Cancer: Methods and Applications

    Directory of Open Access Journals (Sweden)

    Konstantina Kourou

    2015-01-01

    Full Text Available Computational modelling of diseases is an emerging field, proven valuable for the diagnosis, prognosis and treatment of disease. Cancer is one of the diseases where computational modelling provides enormous advancements, allowing medical professionals to perform in silico experiments and gain insights prior to any in vivo procedure. In this paper, we review the most recent computational models that have been proposed for cancer. Well-known databases used for computational modelling experiments, as well as the various markup language representations, are discussed. In addition, recent state-of-the-art research studies related to tumour growth and angiogenesis modelling are presented.

  13. Biologically Inspired Visual Model With Preliminary Cognition and Active Attention Adjustment.

    Science.gov (United States)

    Qiao, Hong; Xi, Xuanyang; Li, Yinlin; Wu, Wei; Li, Fengfu

    2015-11-01

    Recently, many computational models have been proposed to simulate visual cognition process. For example, the hierarchical Max-Pooling (HMAX) model was proposed according to the hierarchical and bottom-up structure of V1 to V4 in the ventral pathway of primate visual cortex, which could achieve position- and scale-tolerant recognition. In our previous work, we have introduced memory and association into the HMAX model to simulate visual cognition process. In this paper, we improve our theoretical framework by mimicking a more elaborate structure and function of the primate visual cortex. We will mainly focus on the new formation of memory and association in visual processing under different circumstances as well as preliminary cognition and active adjustment in the inferior temporal cortex, which are absent in the HMAX model. The main contributions of this paper are: 1) in the memory and association part, we apply deep convolutional neural networks to extract various episodic features of the objects since people use different features for object recognition. Moreover, to achieve a fast and robust recognition in the retrieval and association process, different types of features are stored in separated clusters and the feature binding of the same object is stimulated in a loop discharge manner and 2) in the preliminary cognition and active adjustment part, we introduce preliminary cognition to classify different types of objects since distinct neural circuits in a human brain are used for identification of various types of objects. Furthermore, active cognition adjustment of occlusion and orientation is implemented to the model to mimic the top-down effect in human cognition process. Finally, our model is evaluated on two face databases CAS-PEAL-R1 and AR. The results demonstrate that our model exhibits its efficiency on visual recognition process with much lower memory storage requirement and a better performance compared with the traditional purely computational

  14. The Square Kilometre Array Science Data Processor. Preliminary compute platform design

    Science.gov (United States)

    Broekema, P. C.; van Nieuwpoort, R. V.; Bal, H. E.

    2015-07-01

    The Square Kilometre Array is a next-generation radio telescope, to be built in South Africa and Western Australia. It is currently in its detailed design phase, with procurement and construction scheduled to start in 2017. The SKA Science Data Processor is the high-performance computing element of the instrument, responsible for producing science-ready data. This is a major IT project, with the Science Data Processor expected to challenge the computing state of the art even in 2020. In this paper we introduce the preliminary Science Data Processor design and the principles that guide the design process, as well as the constraints on the design. We introduce a highly scalable and flexible system architecture capable of handling the SDP workload.

  15. Preliminary modeling of the TMI-2 accident with MELPROG-TRAC

    Energy Technology Data Exchange (ETDEWEB)

    Jenks, R.P.

    1988-01-01

    In support of Nuclear Regulatory Commission and Organization for Economic Cooperation and Development (OECD)-sponsored Three Mile Island-Unit 2 (TMI-2) Analysis Exercise studies, work has been performed to develop a simulation model of the TMI-2 plant for use with the integrated MELPROG-TRAC computer code. Numerous nuclear power plant simulation studies have been performed with the TRAC computer code in the past. Some of these addressed the TMI-2 accident or other hypothetical events at the TMI plant. In addition, studies have been previously performed with the MELPROG-TRAC code using Oconee-1 and Surry plant models. This paper describes the preliminary MELPROG-TRAC input model for severe accident analysis.

  16. Computational Models for Nonlinear Aeroelastic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  17. Model of computation for Fourier optical processors

    Science.gov (United States)

    Naughton, Thomas J.

    2000-05-01

    We present a novel and simple theoretical model of computation that captures what we believe are the most important characteristics of an optical Fourier transform processor. We use this abstract model to reason about the computational properties of the physical systems it describes. We define a grammar for our model's instruction language, and use it to write algorithms for well-known filtering and correlation techniques. We also suggest suitable computational complexity measures that could be used to analyze any coherent optical information processing technique, described with the language, for efficiency. Our choice of instruction language allows us to argue that algorithms describable with this model should have optical implementations that do not require a digital electronic computer to act as a master unit. Through simulation of a well known model of computation from computer theory we investigate the general-purpose capabilities of analog optical processors.
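
    As a concrete digital stand-in for the correlation algorithms such a model expresses, the numpy sketch below performs matched filtering in the Fourier plane, the operation a 4-f optical correlator computes; the scene, reference pattern, and array sizes are illustrative assumptions.

        import numpy as np

        # Fourier-plane matched filter: corr = IFFT( FFT(scene) * conj(FFT(ref)) ).
        rng = np.random.default_rng(0)
        scene = 0.1 * rng.random((128, 128))
        ref = np.zeros((128, 128))
        ref[:8, :8] = 1.0                 # illustrative 8x8 reference pattern
        scene[40:48, 90:98] += 1.0        # same pattern embedded in the scene

        corr = np.fft.ifft2(np.fft.fft2(scene) * np.conj(np.fft.fft2(ref))).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        print("correlation peak at", peak)   # expect (40, 90), the target offset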

  19. A computational model for feature binding

    Institute of Scientific and Technical Information of China (English)

    SHI ZhiWei; SHI ZhongZhi; LIU Xi; SHI ZhiPing

    2008-01-01

    The "Binding Problem" is an important problem across many disciplines, including psychology, neuroscience, computational modeling, and even philosophy. In this work, we proposed a novel computational model, Bayesian Linking Field Model, for feature binding in visual perception, by combining the idea of noisy neuron model, Bayesian method, Linking Field Network and competitive mechanism.Simulation Experiments demonstrated that our model perfectly fulfilled the task of feature binding in visual perception and provided us some enlightening idea for future research.

  20. Computational nanophotonics modeling and applications

    CERN Document Server

    Musa, Sarhan M

    2013-01-01

    This reference offers tools for engineers, scientists, biologists, and others working with the computational techniques of nanophotonics. It introduces the key concepts of computational methods in a manner that is easily digestible for newcomers to the field. The book also examines future applications of nanophotonics in the technical industry and covers new developments and interdisciplinary research in engineering, science, and medicine. It provides an overview of the key computational nanophotonics and describes the technologies with an emphasis on how they work and their key benefits.

  1. Preliminary Computational Fluid Dynamics (CFD) Simulation of EIIB Push Barge in Shallow Water

    Science.gov (United States)

    Beneš, Petr; Kollárik, Róbert

    2011-12-01

    This study presents a preliminary CFD simulation of an EIIb push barge in inland conditions using the CFD software Ansys Fluent. RANSE (Reynolds-Averaged Navier-Stokes Equation) methods are used for the viscous solution of the turbulent flow around the ship hull. Different RANSE methods are compared on ship resistance calculations in order to select the appropriate methods and discard the inappropriate ones. The study further describes the creation of a geometrical model that considers the exact water depth to vessel draft ratio in shallow water conditions, grid generation, the setup of the mathematical model in Fluent, and the evaluation of the simulation results.

  2. Two Classes of Models of Granular Computing

    Institute of Scientific and Technical Information of China (English)

    Daowu Pei

    2006-01-01

    This paper reviews a class of important models of granular computing which are induced by equivalence relations, by general binary relations, or by neighborhood systems, and proposes a class of models of granular computing which are induced by coverings of the given universe.

  3. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  5. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  6. Preliminary efficacy of a computer-delivered HIV prevention intervention for African American teenage females.

    Science.gov (United States)

    Klein, Charles H; Card, Josefina J

    2011-12-01

    This study translated SiHLE (Sisters Informing, Healing, Living, and Empowering), a 12-hour Centers for Disease Control and Prevention evidence-based group-level intervention for African American females 14-18 years of age, into a 2-hour computer-delivered individual-level intervention. A randomized controlled trial (n = 178) was conducted to examine the efficacy of the new Multimedia SiHLE intervention. Average condom-protected sex acts (proportion of vaginal sex acts with condoms, last 90 days) for sexually active participants receiving Multimedia SiHLE rose from M = 51% at baseline to M = 71% at 3-month follow-up (t = 2.06, p = .05); no statistically significant difference was found in the control group. Non-sexually active intervention group participants reported a significant increase in condom self-efficacy (t = 2.36, p = .02); no statistically significant difference was found in the control group. The study provides preliminary support for the efficacy of a computer-delivered adaptation of a proven HIV prevention program for African American teenage women. This is consistent with meta-analyses showing that computer-delivered interventions, which can often be disseminated at lower per-capita cost than human-delivered interventions, can influence HIV risk behaviors in a positive fashion.
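
    The significance tests reported above are ordinary paired comparisons of baseline against follow-up; a sketch of that analysis on hypothetical data (the arrays below are invented for illustration, not the study's data) is:

        import numpy as np
        from scipy import stats

        # Paired t-test: proportion of condom-protected acts per participant,
        # baseline vs. 3-month follow-up. These numbers are hypothetical.
        baseline = np.array([0.40, 0.55, 0.62, 0.35, 0.50, 0.48, 0.70, 0.45])
        followup = np.array([0.65, 0.70, 0.75, 0.55, 0.68, 0.66, 0.85, 0.60])

        t, p = stats.ttest_rel(followup, baseline)
        print(f"t = {t:.2f}, p = {p:.4f}")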

  7. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-06-01

    A computational approach used for subsurface explosion cratering was extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for the first computer simulation because it is one of the most thoroughly studied craters. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s, meteorite mass of 1.67 x 10^8 kg, with a corresponding kinetic energy of 1.88 x 10^16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation, a Tillotson equation-of-state description for iron and limestone was used with no shear strength. Results obtained for this preliminary calculation of the formation of Meteor Crater are in good agreement with field measurements. A color movie based on this calculation was produced using computer-generated graphics. 19 figures, 5 tables, 63 references.
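
    The quoted impact energy follows directly from the stated mass and velocity; as a quick check,

        E = \tfrac{1}{2} m v^2
          = \tfrac{1}{2}\,(1.67 \times 10^{8}\,\mathrm{kg})\,(1.5 \times 10^{4}\,\mathrm{m/s})^2
          \approx 1.88 \times 10^{16}\,\mathrm{J},

    which, at 4.184 x 10^15 J per megaton of TNT, is the abstract's 4.5 megatons.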

  8. Element-Based Computational Model

    Directory of Open Access Journals (Sweden)

    Conrad Mueller

    2012-02-01

    Full Text Available A variation on the data-flow model is proposed for developing parallel architectures. While the model is data-driven, it has significant differences from the data-flow model. The proposed model has an evaluation cycle of processing elements (encapsulated data) that is similar to the instruction cycle of the von Neumann model. The elements contain the information required to process them. The model is inherently parallel. An emulation of the model has been implemented. The objective of this paper is to motivate support for taking the research further. Using matrix multiplication as a case study, the element/data-flow based model is compared with the instruction-based model. This is done using complexity analysis followed by empirical testing to verify this analysis. The positive results are given as motivation for the research to be taken to the next stage - that is, implementing the model using FPGAs.

  9. Preliminary Study of Jerome Model and Horace Model

    Institute of Scientific and Technical Information of China (English)

    周静; 蒋秀丽

    2013-01-01

    Translation always plays a crucial role in cross-cultural communication and in the development of the history of human civilization. The author of this paper tries to make an analysis and contrast of two translation models: the Jerome model and the Horace model, pointing out that it is because of different circumstances that different translation models are used in translation.

  10. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    Science.gov (United States)

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  11. Computational modeling of lipoprotein metabolism

    NARCIS (Netherlands)

    Schalkwijk, Daniël Bernardus van

    2013-01-01

    This PhD thesis contains the following chapters. The first part, containing chapters 2 and 3, mainly concerns model development. Chapter 2 describes the development of a mathematical modeling framework within which different diagnostic models based on lipoprotein profiles can be developed, and a first

  12. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    2011-01-01

    This chapter considers the opportunities that are present in developing, extending and applying aspects of computer-aided modelling principles and practice. What are the best tasks to be done by modellers and what needs the application of CAPE tools? How do we efficiently develop model-based solutions? ...

  13. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students the possibility to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lectures more interactive and allow students the possibility to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes, taking advantage of basic scientific computation methods and tools.

  14. Preliminary Study of the Gravimetric Local Geoid Model in Jordan:

    Directory of Open Access Journals (Sweden)

    A. Al-Zoubi

    2007-06-01

    Full Text Available Recently, there has been increased interest in studying and defining local and regional geoid models worldwide, due to their importance in geodetic and geophysical applications. The use of the Global Positioning System (GPS) is growing internationally, yet the lack of any geoid model for Jordan has limited the use of GPS for geodetic applications there. Therefore, this work presents the preliminary results we propose for the gravimetric Jordanian geoid model (GeoJordan). The model is created using gravimetric data and the GravSoft program. The validation of the model is done using GPS measurements and precise levelling in the Amman area. A comparison with the global geopotential models OSU91A and EGM96, however, showed large discrepancies in the presented results. We also present the approach used to obtain orthometric heights from GPS ellipsoidal height measurements. The error margin obtained in this initial study of GeoJordan, after fitting the data with GPS/levelling measurements, is about 10 cm in the tested area, whereas the standard error of the created model is about 40 cm.
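
    The approach mentioned for deriving orthometric heights from GPS is the standard geoid-undulation relation

        H \approx h - N,

    where h is the GPS ellipsoidal height, N is the geoid undulation taken from the GeoJordan model, and H is the orthometric height; fitting N to GPS/levelling benchmarks is what reduces the error from roughly 40 cm to roughly 10 cm in the tested area.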

  15. Applying computer simulation models as learning tools in fishery management

    Science.gov (United States)

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  16. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows us to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  17. Visual and Computational Modelling of Minority Games

    OpenAIRE

    Robertas Damaševičius; Darius Ašeriškis

    2017-01-01

    The paper analyses the Minority Game and focuses on analysis and computational modelling of several variants (variable payoff, coalition-based and ternary voting) of Minority Game using the UAREI (User-Action-Rule-Entities-Interface) model. UAREI is a model for formal specification of software gamification, and the UAREI visual modelling language is a language used for graphical representation of game mechanics. The UAREI model also provides the embedded executable modelling framework to evaluate...

  18. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Science.gov (United States)

    2010-04-01

    Election with respect to life insurance reserves computed on preliminary term basis. (a) In general. Section 818(c) permits a life insurance company issuing contracts with respect to which the life...

  19. Macromolecular Chain at a Cellular Surface: a Computer Simulation Model

    Science.gov (United States)

    Xie, Jun; Pandey, Ras

    2001-06-01

    Computer simulations are performed to study the conformation and dynamics of a relatively large chain macromolecule at the surface of a model cell membrane - a preliminary attempt at an ultimately realistic model of a protein on a cell membrane. We use a discrete lattice of size Lx × L × L. The chain molecule of length Lc is modelled by consecutive nodes connected by bonds on the trail of a random walk with appropriate constraints such as excluded volume, energy-dependent configurational bias, etc. A Monte Carlo method is used to move chains via segmental dynamics, i.e., end-move, kink-jump, crank-shaft, reptation, etc. The membrane substrate is modelled by an ensemble of short chains on a flat surface. The large chain molecule is then driven toward the membrane by a field. We plan to examine the dynamics of the chain macromolecule, the spread of its density, and its conformation.

  20. Hierarchical Modeling and Robust Synthesis for the Preliminary Design of Large Scale Complex Systems

    Science.gov (United States)

    Koch, Patrick N.

    1997-01-01

    constructing partitioned response surfaces is developed to reduce the computational expense of experimentation for fitting models in a large number of factors. Noise modeling techniques are compared and recommendations are offered for the implementation of robust design when approximate models are sought. These techniques, approaches, and recommendations are incorporated within the method developed for hierarchical robust preliminary design exploration. This method as well as the associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system. The case study is developed in collaboration with Allison Engine Company, Rolls Royce Aerospace, and is based on the Allison AE3007 existing engine designed for midsize commercial, regional business jets. For this case study, the turbofan system-level problem is partitioned into engine cycle design and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation. The fan and low pressure turbine subsystems are also modeled, but in less detail. Given the defined partitioning, these subproblems are investigated independently and concurrently, and response surface models are constructed to approximate the responses of each. These response models are then incorporated within a commercial turbofan hierarchical compromise decision support problem formulation. Five design scenarios are investigated, and robust solutions are identified. The method and solutions identified are verified by comparison with the AE3007 engine. The solutions obtained are similar to the AE3007 cycle and configuration, but are better with respect to many of the requirements.
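
    To make the response-surface idea concrete, the sketch below fits a second-order polynomial response surface to sampled responses by least squares; the two factors and the synthetic response are assumptions for illustration, not the engine data.

        import numpy as np

        # Fit y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2 by
        # least squares to synthetic "experiment" samples.
        rng = np.random.default_rng(1)
        x1, x2 = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 50)
        y = (2.0 + 0.5 * x1 - 1.2 * x2 + 0.8 * x1**2 + 0.3 * x1 * x2
             + 0.05 * rng.normal(size=50))

        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(np.round(coef, 2))   # roughly [2.0, 0.5, -1.2, 0.8, 0.0, 0.3]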

  1. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application-specific models that are fit for purpose. There is a range of computer-aided modelling tools available that help to define ... a taxonomy of aspects around conservation, constraints and constitutive relations. Aspects of the ICAS-MoT toolbox are given to illustrate the functionality of a computer-aided modelling tool, which incorporates an interface to MS Excel.

  2. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.
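
    One of the simplest aspect-based similarities is set overlap on the biological entities two models reference; a minimal sketch (the annotation sets are made up for illustration) is:

        # Jaccard similarity over the sets of biological entities (e.g., GO or
        # ChEBI identifiers) referenced by two models. Sets are hypothetical.
        def jaccard(a: set, b: set) -> float:
            return len(a & b) / len(a | b) if (a or b) else 1.0

        model_a = {"GO:0006096", "CHEBI:17234", "CHEBI:15903"}
        model_b = {"GO:0006096", "CHEBI:17234", "CHEBI:4167"}
        print(f"entity similarity: {jaccard(model_a, model_b):.2f}")   # 0.50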

  3. Computational models for analyzing lipoprotein profiles

    NARCIS (Netherlands)

    Graaf, A.A. de; Schalkwijk, D.B. van

    2011-01-01

    At present, several measurement technologies are available for generating highly detailed concentration-size profiles of lipoproteins, offering increased diagnostic potential. Computational models are useful in aiding the interpretation of these complex datasets and making the data more accessible f

  4. Informing mechanistic toxicology with computational molecular models.

    Science.gov (United States)

    Goldsmith, Michael R; Peterson, Shane D; Chang, Daniel T; Transue, Thomas R; Tornero-Velez, Rogelio; Tan, Yu-Mei; Dary, Curtis C

    2012-01-01

    Computational molecular models of chemicals interacting with biomolecular targets provide toxicologists a valuable, affordable, and sustainable source of in silico molecular-level information that augments, enriches, and complements in vitro and in vivo efforts. From a molecular biophysical ansatz, we describe how 3D molecular modeling methods used to numerically evaluate the classical pair-wise potential at the chemical/biological interface can inform the mechanism of action and the dose-response paradigm of modern toxicology. With an emphasis on molecular docking, 3D-QSAR and pharmacophore/toxicophore approaches, we demonstrate how these methods can be integrated with chemoinformatic and toxicogenomic efforts into a tiered computational toxicology workflow. We describe generalized protocols in which 3D computational molecular modeling is used to enhance our ability to predict and model the most relevant toxicokinetic, metabolic, and molecular toxicological endpoints, thereby accelerating the computational toxicology-driven basis of modern risk assessment while providing a starting point for rational sustainable molecular design.
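
    The classical pair-wise potential these methods evaluate at the chemical/biological interface is typically a sum of terms such as the Lennard-Jones 12-6 potential; a minimal sketch (generic parameters, not a specific force field) is:

        import numpy as np

        # Lennard-Jones 12-6 energy summed over ligand/pocket atom pairs, the
        # kind of classical term a docking scorer evaluates. epsilon and sigma
        # are generic illustrative values; distances in angstroms.
        def lj_energy(coords_a, coords_b, epsilon=0.2, sigma=3.5):
            d = np.linalg.norm(coords_a[:, None, :] - coords_b[None, :, :], axis=-1)
            sr6 = (sigma / d) ** 6
            return np.sum(4.0 * epsilon * (sr6**2 - sr6))

        ligand = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
        pocket = np.array([[0.0, 4.0, 0.0], [1.5, 4.0, 0.0], [3.0, 4.0, 0.0]])
        print(f"interaction energy: {lj_energy(ligand, pocket):.3f}")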

  5. Computational fluid dynamics modeling in yarn engineering

    CSIR Research Space (South Africa)

    Patanaik, A

    2011-07-01

    Full Text Available This chapter deals with the application of computational fluid dynamics (CFD) modeling in reducing yarn hairiness during the ring spinning process and thereby “engineering” yarn with desired properties. Hairiness significantly affects the appearance...

  6. Computational Models for Nonlinear Aeroelastic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate a new and efficient computational method of modeling nonlinear aeroelastic systems. The...

  7. A new epidemic model of computer viruses

    Science.gov (United States)

    Yang, Lu-Xing; Yang, Xiaofan

    2014-06-01

    This paper addresses the epidemiological modeling of computer viruses. By incorporating the effect of removable storage media, considering the possibility of connecting infected computers to the Internet, and removing the conservative restriction on the total number of computers connected to the Internet, a new epidemic model is proposed. Unlike most previous models, the proposed model has no virus-free equilibrium and has a unique endemic equilibrium. With the aid of the theory of asymptotically autonomous systems as well as the generalized Poincaré-Bendixson theorem, the endemic equilibrium is shown to be globally asymptotically stable. By analyzing the influence of different system parameters on the steady number of infected computers, a collection of policies is recommended to prohibit the virus prevalence.

  8. Computer Model Locates Environmental Hazards

    Science.gov (United States)

    2008-01-01

    Catherine Huybrechts Burton founded San Francisco-based Endpoint Environmental (2E) LLC in 2005 while she was a student intern and project manager at Ames Research Center with NASA's DEVELOP program. The 2E team created the Tire Identification from Reflectance model, which algorithmically processes satellite images using turnkey technology to retain only the darkest parts of an image. This model allows 2E to locate piles of rubber tires, which often are stockpiled illegally and cause hazardous environmental conditions and fires.

  9. Preliminary deformation model for National Seismic Hazard map of Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Meilano, Irwan; Gunawan, Endra; Sarsito, Dina; Prijatna, Kosasih; Abidin, Hasanuddin Z. [Geodesy Research Division, Faculty of Earth Science and Technology, Institute of Technology Bandung (Indonesia); Susilo,; Efendi, Joni [Agency for Geospatial Information (BIG) (Indonesia)

    2015-04-24

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed from block rotation and strain accumulation functions in an elastic half-space. Deformation due to rigid body motion is estimated by rotating six tectonic blocks in Indonesia. The interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth or in combination with creeping in a shallower part. This research shows that rigid body motion dominates the deformation pattern, with magnitudes of more than 15 mm/year, except in the narrow areas near subduction zones and active faults where significant deformation reaches 25 mm/year.
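
    The rigid-body part of such a deformation model is an Euler-vector rotation, v = omega x r; the sketch below predicts the horizontal velocity of a site on a rotating block from an assumed (entirely hypothetical) Euler pole and rotation rate.

        import numpy as np

        # Surface velocity of a point on a rigid block: v = omega x r, with
        # omega the Euler vector (rad/yr) and r the site position. The pole,
        # rate, and site below are hypothetical, not the paper's values.
        R = 6.371e6  # mean Earth radius, m

        def ecef(lat_deg, lon_deg):
            lat, lon = np.radians([lat_deg, lon_deg])
            return R * np.array([np.cos(lat) * np.cos(lon),
                                 np.cos(lat) * np.sin(lon),
                                 np.sin(lat)])

        omega = (np.radians(0.3) / 1e6) * ecef(50.0, 95.0) / R  # 0.3 deg/Myr
        site = ecef(-2.0, 110.0)
        v = np.cross(omega, site) * 1e3          # m/yr -> mm/yr
        print(f"predicted site velocity: {np.linalg.norm(v):.1f} mm/yr")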

  10. Parallel computing in atmospheric chemistry models

    Energy Technology Data Exchange (ETDEWEB)

    Rotman, D. [Lawrence Livermore National Lab., CA (United States). Atmospheric Sciences Div.

    1996-02-01

    Studies of atmospheric chemistry are of high scientific interest, involve computations that are complex and intense, and require enormous amounts of I/O. Current supercomputer computational capabilities are limiting the studies of stratospheric and tropospheric chemistry and will certainly not be able to handle the upcoming coupled chemistry/climate models. To enable such calculations, the authors have developed a computing framework that allows computations on a wide range of computational platforms, including massively parallel machines. Because of the fast paced changes in this field, the modeling framework and scientific modules have been developed to be highly portable and efficient. Here, the authors present the important features of the framework and focus on the atmospheric chemistry module, named IMPACT, and its capabilities. Applications of IMPACT to aircraft studies will be presented.

  11. A Computational Framework for Realistic Retina Modeling.

    Science.gov (United States)

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas.

  12. Proceedings Fifth Workshop on Developments in Computational Models--Computational Models From Nature

    CERN Document Server

    Cooper, S Barry; 10.4204/EPTCS.9

    2009-01-01

    The special theme of DCM 2009, co-located with ICALP 2009, concerned Computational Models From Nature, with a particular emphasis on computational models derived from physics and biology. The intention was to bring together different approaches - in a community with a strong foundational background as proffered by the ICALP attendees - to create inspirational cross-boundary exchanges, and to lead to innovative further research. Specifically DCM 2009 sought contributions in quantum computation and information, probabilistic models, chemical, biological and bio-inspired ones, including spatial models, growth models and models of self-assembly. Contributions putting to the test logical or algorithmic aspects of computing (e.g., continuous computing with dynamical systems, or solid state computing models) were also very much welcomed.

  13. Computer Modeling of Direct Metal Laser Sintering

    Science.gov (United States)

    Cross, Matthew

    2014-01-01

    A computational approach to modeling direct metal laser sintering (DMLS) additive manufacturing process is presented. The primary application of the model is for determining the temperature history of parts fabricated using DMLS to evaluate residual stresses found in finished pieces and to assess manufacturing process strategies to reduce part slumping. The model utilizes MSC SINDA as a heat transfer solver with imbedded FORTRAN computer code to direct laser motion, apply laser heating as a boundary condition, and simulate the addition of metal powder layers during part fabrication. Model results are compared to available data collected during in situ DMLS part manufacture.
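
    The core of such a model is transient conduction with a moving laser heat source; the 1-D explicit finite-difference sketch below shows the idea (material constants and laser parameters are generic placeholders, and the actual model is 3-D with powder-layer addition and is solved in MSC SINDA, not in Python).

        import numpy as np

        # 1-D explicit conduction with a moving Gaussian laser source.
        k, rho, cp = 20.0, 7800.0, 500.0       # W/m-K, kg/m^3, J/kg-K (generic)
        alpha = k / (rho * cp)
        nx, dx = 200, 5e-5                     # 10 mm domain
        dt = 0.4 * dx**2 / alpha               # satisfies explicit stability limit
        q, speed = 2e12, 0.1                   # W/m^3 peak source, m/s scan speed

        T = np.full(nx, 300.0)                 # initial temperature, K
        x = np.arange(nx) * dx
        for step in range(400):
            spot = np.exp(-((x - speed * step * dt) / (4 * dx)) ** 2)
            lap = np.zeros_like(T)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
            T += dt * (alpha * lap + q * spot / (rho * cp))
        print(f"peak temperature: {T.max():.0f} K")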

  14. Computational Modeling of Culture's Consequences

    NARCIS (Netherlands)

    Hofstede, G.J.; Jonker, C.M.; Verwaart, T.

    2010-01-01

    This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, (b) knowledge acquisition based on a dimensional theory of culture, resulti

  15. Computational aspects of premixing modelling

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, D.F. [Sydney Univ., NSW (Australia). Dept. of Chemical Engineering; Witt, P.J.

    1998-01-01

    In the steam explosion research field there is currently considerable effort being devoted to the modelling of premixing. Practically all models are based on the multiphase flow equations, which treat the mixture as an interpenetrating continuum. Solution of these equations is non-trivial and a wide range of solution procedures are in use. This paper addresses some numerical aspects of this problem. In particular, we examine the effect of the differencing scheme for the convective terms and show that use of hybrid differencing can cause qualitatively wrong solutions in some situations. Calculations are performed for the Oxford tests, the BNL tests, a MAGICO test and to investigate various sensitivities of the solution. In addition, we show that use of a staggered grid can result in a significant error which leads to poor predictions of 'melt' front motion. A correction is given which leads to excellent convergence to the analytic solution. Finally, we discuss the issues facing premixing model developers and highlight the fact that model validation is hampered more by the complexity of the process than by numerical issues. (author)

  16. Visual and Computational Modelling of Minority Games

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2017-02-01

    Full Text Available The paper analyses the Minority Game and focuses on analysis and computational modelling of several variants (variable payoff, coalition-based and ternary voting) of Minority Game using the UAREI (User-Action-Rule-Entities-Interface) model. UAREI is a model for formal specification of software gamification, and the UAREI visual modelling language is a language used for graphical representation of game mechanics. The UAREI model also provides the embedded executable modelling framework to evaluate how the rules of the game will work for the players in practice. We demonstrate the flexibility of the UAREI model for modelling different variants of Minority Game rules for game design.

  17. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given, followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented, focusing on universality of the ac response in the extreme disorder limit. Finally, some important unsolved problems relating to hopping models for ac conduction are listed.
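
    A minimal version of such a simulation is nearest-neighbour hopping on a lattice with quenched random barriers and Arrhenius jump rates; the 1-D sketch below (with assumed parameters) estimates a mean-squared displacement.

        import numpy as np

        # Symmetric hopping on a 1-D ring with quenched random barriers:
        # a jump across barrier E is accepted with probability exp(-E/T).
        # Lattice size, temperature, and step counts are illustrative.
        rng = np.random.default_rng(2)
        n, T = 1000, 0.3
        barriers = rng.random(n)           # barrier between sites i and i+1

        pos = rng.integers(0, n, 200)      # 200 independent walkers
        disp = np.zeros(200)               # unwrapped displacements
        for _ in range(20000):
            step = np.where(rng.random(200) < 0.5, 1, -1)
            crossed = barriers[np.where(step > 0, pos, (pos - 1) % n)]
            accept = rng.random(200) < np.exp(-crossed / T)
            pos = (pos + step * accept) % n
            disp += step * accept
        print(f"mean-squared displacement: {np.mean(disp**2):.1f} sites^2")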

  18. Mechanistic models in computational social science

    Science.gov (United States)

    Holme, Petter; Liljeros, Fredrik

    2015-09-01

    Quantitative social science is not only about regression analysis or, in general, data inference. Computer simulations of social mechanisms have an over 60 years long history. They have been used for many different purposes—to test scenarios, to test the consistency of descriptive theories (proof-of-concept models), to explore emergent phenomena, for forecasting, etc. In this essay, we sketch these historical developments, the role of mechanistic models in the social sciences and the influences from the natural and formal sciences. We argue that mechanistic computational models form a natural common ground for social and natural sciences, and look forward to possible future information flow across the social-natural divide.

  19. Mechanistic Models in Computational Social Science

    CERN Document Server

    Holme, Petter

    2015-01-01

    Quantitative social science is not only about regression analysis or, in general, data inference. Computer simulations of social mechanisms have an over 60 years long history. They have been used for many different purposes -- to test scenarios, to test the consistency of descriptive theories (proof-of-concept models), to explore emerging phenomena, for forecasting, etc. In this essay, we sketch these historical developments, the role of mechanistic models in the social sciences and the influences from natural and formal sciences. We argue that mechanistic computational models form a natural common ground for social and natural sciences, and look forward to possible future information flow across the social-natural divide.

  20. Computational modeling of failure in composite laminates

    NARCIS (Netherlands)

    Van der Meer, F.P.

    2010-01-01

    There is no state of the art computational model that is good enough for predictive simulation of the complete failure process in laminates. Already on the single ply level controversy exists. Much work has been done in recent years in the development of continuum models, but these fail to predict t

  1. Computational Intelligence. Mortality Models for the Actuary

    NARCIS (Netherlands)

    Willemse, W.J.

    2001-01-01

    This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, this thesis deals with life insurance, where mortality modelling is important. Actuaries use ancient models (mortality laws) from the nineteenth century, for example Gompertz' and Makeham's laws.
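    The two mortality laws mentioned are one-line formulas; a small sketch with illustrative (assumed) parameter values:

    ```python
    import numpy as np

    def gompertz(x, B=5e-5, c=1.09):
        return B * c ** x                 # force of mortality at age x

    def makeham(x, A=5e-4, B=5e-5, c=1.09):
        return A + gompertz(x, B, c)      # Gompertz plus an age-independent term

    ages = np.arange(30, 91, 10)
    print(np.column_stack([ages, gompertz(ages), makeham(ages)]))
    ```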

  2. Generating computational models for serious gaming

    NARCIS (Netherlands)

    Westera, Wim

    2014-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  3. Do's and Don'ts of Computer Models for Planning

    Science.gov (United States)

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  5. Parallel Computing of Ocean General Circulation Model

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper discusses the parallel computing of the third-generation Ocean General Circulation Model (OGCM) from the State Key Laboratory of Numerical Modeling for Atmospheric Science and Geophysical Fluid Dynamics (LASG), Institute of Atmospheric Physics (IAP). Meanwhile, several optimization strategies for parallel computing of the OGCM (POGCM) on Scalable Shared Memory Multiprocessors (S2MP) are presented. Using the Message Passing Interface (MPI), we obtain super-linear speedup on an SGI Origin 2000 for the parallel OGCM (POGCM) after optimization.
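    The parallelization pattern behind such codes is typically a band-wise domain decomposition with halo exchange between neighbouring ranks. A toy mpi4py sketch of that pattern (an assumption for illustration; the original POGCM is a Fortran/MPI code):

    ```python
    # Run with e.g.: mpiexec -n 4 python halo_demo.py
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    local = np.full((4, 8), float(rank))        # this rank's latitude band
    up = rank - 1 if rank > 0 else MPI.PROC_NULL
    down = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    halo_top, halo_bot = np.empty(8), np.empty(8)
    comm.Sendrecv(local[0], dest=up, recvbuf=halo_bot, source=down)
    comm.Sendrecv(local[-1], dest=down, recvbuf=halo_top, source=up)
    # ...stencil updates on `local` would now use halo_top / halo_bot...
    ```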

  6. On the completeness of quantum computation models

    CERN Document Server

    Arrighi, Pablo

    2010-01-01

    The notion of computability is stable (i.e. independent of the choice of an indexing) over infinite-dimensional vector spaces provided they have a finite "tensorial dimension". Such vector spaces with a finite tensorial dimension permit us to define an absolute notion of completeness for quantum computation models and give a precise meaning to the Church-Turing thesis in the framework of quantum theory. (Extra keywords: quantum programming languages, denotational semantics, universality.)

  7. Security Management Model in Cloud Computing Environment

    OpenAIRE

    2016-01-01

    In the cloud computing environment, the number of cloud virtual machines (VMs) grows ever larger, so VM security and management face giant challenges. In order to address the security issues of the cloud computing virtualization environment, this paper presents an efficient and dynamic VM security management model based on state migration and scheduling; it studies the virtual machine security architecture and, based on AHP (Analytic Hierarchy Process), virtual machine de...

  8. Finite difference computing with exponential decay models

    CERN Document Server

    Langtangen, Hans Petter

    2016-01-01

    This text provides a very simple, initial introduction to the complete scientific computing pipeline: models, discretization, algorithms, programming, verification, and visualization. The pedagogical strategy is to use one case study – an ordinary differential equation describing exponential decay processes – to illustrate fundamental concepts in mathematics and computer science. The book is easy to read and only requires a command of one-variable calculus and some very basic knowledge about computer programming. Contrary to similar texts on numerical methods and programming, this text has a much stronger focus on implementation and teaches testing and software engineering in particular.
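    The book's running example is the ODE u' = -au, u(0) = I. A theta-rule solver in that spirit (theta = 0 gives Forward Euler, 0.5 Crank-Nicolson, 1 Backward Euler; variable names are conventional, not necessarily the book's):

    ```python
    import numpy as np

    def solver(I, a, T, dt, theta):
        """Integrate u' = -a*u, u(0) = I, with the theta rule."""
        Nt = int(round(T / dt))
        u = np.zeros(Nt + 1)
        u[0] = I
        for n in range(Nt):
            u[n + 1] = u[n] * (1 - (1 - theta) * a * dt) / (1 + theta * a * dt)
        return u, np.linspace(0, Nt * dt, Nt + 1)

    u, t = solver(I=1.0, a=2.0, T=4.0, dt=0.1, theta=0.5)
    print(abs(u - np.exp(-2.0 * t)).max())    # verify against the exact solution
    ```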

  9. A computational model of analogical reasoning

    Institute of Scientific and Technical Information of China (English)

    李波; 赵沁平

    1997-01-01

    A computational model of analogical reasoning is presented, which divides the analogical reasoning process into four subprocesses, i.e. reminding, elaboration, matching and transfer. For each subprocess, its role and the principles it follows are given. The model is discussed in detail, including salient-feature-based reminding, relevance-directed elaboration, an improved matching model and a transfer model. The advantages of this model are summarized based on the results of BHARS, an analogical reasoning system implemented with this model.

  10. A preliminary study on the short-term efficacy of chairside computer-aided design/computer-assisted manufacturing-generated posterior lithium disilicate crowns.

    Science.gov (United States)

    Reich, Sven; Fischer, Sören; Sobotta, Bernhard; Klapper, Horst-Uwe; Gozdowski, Stephan

    2010-01-01

    The purpose of this preliminary study was to evaluate the clinical performance of chairside-generated crowns over a preliminary time period of 24 months. Forty-one posterior crowns made of a machinable lithium disilicate ceramic for full-contour crowns were inserted in 34 patients using a chairside computer-aided design/computer-assisted manufacturing technique. The crowns were evaluated at baseline and after 6, 12, and 24 months according to modified United States Public Health Service criteria. After 2 years, all reexamined crowns (n = 39) were in situ; one abutment exhibited secondary caries and two abutments received root canal treatment. Within the limited observation period, the crowns revealed clinically satisfying results.

  11. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Full Text Available Abstract Background Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results The workshop, "ESF Exploratory Workshop on Computational disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  12. On the computational modeling of FSW processes

    OpenAIRE

    Agelet de Saracibar Bosch, Carlos; Chiumenti, Michèle; Santiago, Diego de; Cervera Ruiz, Miguel; Dialami, Narges; Lombera, Guillermo

    2010-01-01

    This work deals with the computational modeling and numerical simulation of Friction Stir Welding (FSW) processes. Here a quasi-static, transient, mixed stabilized Eulerian formulation is used. Norton-Hoff and Sheppard-Wright rigid thermoplastic material models have been considered. A product formula algorithm, leading to a staggered solution scheme, has been used. The model has been implemented into the in-house developed FE code COMET. Results obtained in the simulation of FSW process are c...

  13. An improved computational constitutive model for glass

    Science.gov (United States)

    Holmquist, Timothy J.; Johnson, Gordon R.; Gerlach, Charles A.

    2017-01-01

    In 2011, Holmquist and Johnson presented a model for glass subjected to large strains, high strain rates and high pressures. It was later shown that this model produced solutions that were severely mesh dependent, converging to a solution that was much too strong. This article presents an improved model for glass that uses a new approach to represent the interior and surface strength that is significantly less mesh dependent. This new formulation allows for the laboratory data to be accurately represented (including the high tensile strength observed in plate-impact spall experiments) and produces converged solutions that are in good agreement with ballistic data. The model also includes two new features: one that decouples the damage model from the strength model, providing more flexibility in defining the onset of permanent deformation; the other provides for a variable shear modulus that is dependent on the pressure. This article presents a review of the original model, a description of the improved model and a comparison of computed and experimental results for several sets of ballistic data. Of special interest are computed and experimental results for two impacts onto a single target, and the ability to compute the damage velocity in agreement with experiment data. This article is part of the themed issue 'Experimental testing and modelling of brittle materials at high strain rates'.

  14. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    This chapter considers the opportunities that are present in developing, extending and applying aspects of computer-aided modelling principles and practice. What are the best tasks to be done by modellers and what needs the application of CAPE tools? How do we efficiently develop model-based solutions to significant problems? The important issues of workflow and data flow are discussed together with fit-for-purpose model development. As well, the lack of tools around multiscale modelling provides opportunities for the development of efficient tools to address such challenges. The ability ... and opportunities are discussed for such systems.

  15. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. "… this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book." -Hsun-Hsien Chang, Computing Reviews, March 2012. "My favorite chapters were on dynamic linear models and vector AR and vector ARMA models." -William Seaver, Technometrics, August 2011. "… a very modern entry to the field of time-series modelling, with a rich reference list of the current literature."

  16. Computational algebraic geometry of epidemic models

    Science.gov (United States)

    Rodríguez Vega, Martín.

    2014-06-01

    Computational Algebraic Geometry is applied to the analysis of various epidemic models for Schistosomiasis and Dengue, both for the case without control measures and for the case where control measures are applied. The models were analyzed using the mathematical software Maple. Explicitly, the analysis is performed using Groebner bases, Hilbert dimension and Hilbert polynomials. These computational tools are included automatically in Maple. Each of these models is represented by a system of ordinary differential equations, and for each model the basic reproductive number (R0) is calculated. The effects of the control measures are observed through the changes in the algebraic structure of R0, the Groebner basis, the Hilbert dimension, and the Hilbert polynomials. It is hoped that the results obtained in this paper will be of importance for designing control measures against the epidemic diseases described. For future research, the use of algebraic epidemiology to analyze models for airborne and waterborne diseases is proposed.
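    The same workflow is easy to reproduce with open tools. A hedged sketch using SymPy's Groebner basis on the steady state of a plain SIR model with vital dynamics (an assumed stand-in, much simpler than the paper's Schistosomiasis and Dengue models):

    ```python
    import sympy as sp

    S, I, R = sp.symbols('S I R')
    # Illustrative numeric parameters: infection, recovery, birth/death, population.
    beta, gamma, mu, N = sp.Rational(1, 2), sp.Rational(1, 10), sp.Rational(1, 50), 1

    eqs = [mu * N - beta * S * I - mu * S,    # dS/dt = 0
           beta * S * I - (gamma + mu) * I,   # dI/dt = 0
           gamma * I - mu * R]                # dR/dt = 0

    G = sp.groebner(eqs, S, I, R, order='lex')
    print(G)                                  # triangular system
    print(sp.solve(list(G), [S, I, R]))       # disease-free + endemic equilibria
    # For this model R0 = beta*N/(gamma + mu) > 1, so an endemic state exists.
    ```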

  17. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state of the art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under the mechanical stimulus up to optimizing the performance of sports equipment, through the patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease.   The book will be of interest to researchers, Ph.D. students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, experimental analysis.

  18. Cost-based optimization of a nuclear reactor core design: a preliminary model

    Energy Technology Data Exchange (ETDEWEB)

    Sacco, Wagner F.; Alves Filho, Hermes [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico. Dept. de Modelagem Computacional]. E-mails: wfsacco@iprj.uerj.br; halves@iprj.uerj.br; Pereira, Claudio M.N.A. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil). Div. de Reatores]. E-mail: cmnap@ien.gov.br

    2007-07-01

    A new formulation of a nuclear core design optimization problem is introduced in this article. Originally, the optimization problem consisted in adjusting several reactor cell parameters, such as dimensions, enrichment and materials, in order to minimize the radial power peaking factor in a three-enrichment zone reactor, considering restrictions on the average thermal flux, criticality and sub-moderation. Here, we address the same problem using the minimization of the fuel and cladding materials costs as the objective function, and the radial power peaking factor as an operational constraint. This cost-based optimization problem is attacked by two metaheuristics, the standard genetic algorithm (SGA), and a recently introduced Metropolis algorithm called the Particle Collision Algorithm (PCA). The two algorithms are submitted to the same computational effort and their results are compared. As the formulation presented is preliminary, more elaborate models are also discussed (author)

  19. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    Directory of Open Access Journals (Sweden)

    V. T. Kalugin

    2015-01-01

    Full Text Available In this paper, the preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested with respect to a given configuration of the aircraft and airbrake, and the results were compared with the test data. For the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce unsteady loads acting on the tail fin, perforation of the airbrake was proposed.

  20. Analysis of a Cloud Computing-Based Incident Management Model

    Directory of Open Access Journals (Sweden)

    Anggi Sukamto

    2015-05-01

    Full Text Available Information technology support adopted by an organization requires management so that its use meets the goals for which the technology was deployed. One information technology service management framework that an organization can adopt is the Information Technology Infrastructure Library (ITIL). Service support is part of the ITIL process. In general, service support activities are carried out using technology that can be accessed through the internet. This condition leads to the concept of cloud computing. Cloud computing enables an agency or company to manage resources through the internet. The focus of this research is to analyze the processes and actors involved in service support, particularly in the incident management process, and to identify the potential for moving actors' roles to cloud computing services. Based on the analysis, the proposed cloud-based incident management model can be applied in any organization that already uses computer technology to support its operational activities. Keywords: Cloud computing, ITIL, Incident Management, Service Support, Service Desk.

  1. Computational fluid dynamics in three dimensional angiography: Preliminary hemodynamic results of various proximal geometry

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ha Youn; Park, Sung Tae; Bae, Won Kyoung; Goo, Dong Erk [Dept. of Radiology, Soonchunhyang University Hospital, Seoul (Korea, Republic of)

    2014-12-15

    We studied the influence of proximal geometry on the results of computational fluid dynamics (CFD). We made five models of different proximal geometry from the three-dimensional angiography of a 63-year-old woman with an intracranial aneurysm. CFD results were analyzed as peak systolic velocity (PSV) at the inlet and outlet, as well as the flow velocity profile at the proximal level of the internal carotid artery (ICA) aneurysm. The modified model of the cavernous segment with proximal tubing showed a faster PSV at the outlet than at the inlet. The PSVs at the outlets of the other models were slower than those at the inlets. The flow velocity profiles immediately proximal to the ICA aneurysm showed similar patterns in all models, suggesting that proximal vessel geometries can affect CFD results.

  2. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  3. Utilizing computer models for optimizing classroom acoustics

    Science.gov (United States)

    Hinckley, Jennifer M.; Rosenberg, Carl J.

    2002-05-01

    The acoustical conditions in a classroom play an integral role in establishing an ideal learning environment. Speech intelligibility is dependent on many factors, including speech loudness, room finishes, and background noise levels. The goal of this investigation was to use computer modeling techniques to study the effect of acoustical conditions on speech intelligibility in a classroom. This study focused on a simulated classroom which was generated using the CATT-acoustic computer modeling program. The computer was utilized as an analytical tool in an effort to optimize speech intelligibility in a typical classroom environment. The factors that were focused on were reverberation time, location of absorptive materials, and background noise levels. Speech intelligibility was measured with the Rapid Speech Transmission Index (RASTI) method.

  4. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  5. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    Full Text Available While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  6. Computer modeling of loudspeaker arrays in rooms

    Science.gov (United States)

    Schwenke, Roger

    2002-05-01

    Loudspeakers present a special challenge to computational modeling of rooms. When modeling a collection of noncorrelated sound sources, such as a group of musicians, coarse resolution power spectrum and directivities are sufficient. In contrast, a typical loudspeaker array consists of many speakers driven with the same signal, and are therefore almost completely correlated. This can lead to a quite complicated, but stable, pattern of spatial nulls and lobes which depends sensitively on frequency. It has been shown that, to model these interactions accurately, one must have loudspeaker data with 1 deg spatial resolution, 1/24 octave frequency resolution including phase. It will be shown that computer models at such a high resolution can in fact inform design decisions of loudspeaker arrays.
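    The sensitivity to phase is easy to demonstrate: summing just two coherent point sources already produces the frequency-dependent null/lobe structure described. A toy free-field calculation (an illustrative assumption, far simpler than full room modeling):

    ```python
    import numpy as np

    c = 343.0                                  # speed of sound, m/s

    def level_db(f, listener, sources):
        """SPL (relative dB) of coherently driven point sources at a listener."""
        k = 2 * np.pi * f / c
        p = 0j
        for s in sources:                      # correlated drive: complex sum
            r = np.linalg.norm(listener - s)
            p += np.exp(-1j * k * r) / r
        return 20 * np.log10(abs(p))

    sources = np.array([[0.0, 0.0], [0.5, 0.0]])   # two speakers 0.5 m apart
    listener = np.array([2.0, 1.0])
    for f in (200, 500, 1000, 2000):
        print(f, round(level_db(f, listener, sources), 1))
    ```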

  7. Computational models for synthetic marine infrared clutter

    Science.gov (United States)

    Constantikes, Kim T.; Zysnarski, Adam H.

    1996-06-01

    The next generation of ship defense missiles will need to engage stealthy, passive, sea-skimming missiles. Detection and guidance will occur against a background of sea surface and horizon which can present significant clutter problems for infrared seekers, particularly when targets are comparatively dim. We need a variety of sea clutter models: statistical image models for signal processing algorithm design, clutter occurrence models for systems effectiveness assessment, and constructive image models for synthesizing very large field-of-view (FOV) images with high spatial and temporal resolution. We have implemented and tested such a constructive model. First principle models of water waves and light transport provide a computationally intensive clutter model implemented as a raytracer. Our models include sea, sky, and solar radiance; reflectance; attenuating atmospheres; constructive solid geometry targets; target and water wave dynamics; and simple sensor image formation.

  8. Computational modelling for dry-powder inhalers

    NARCIS (Netherlands)

    Kröger, Ralf; Woolhouse, Robert; Becker, Michael; Wachtel, Herbert; de Boer, Anne; Horner, Marc

    2012-01-01

    Computational fluid dynamics (CFD) is a simulation tool used for modelling powder flow through inhalers to allow optimisation both of device design and drug powder. Here, Ralf Kröger, Consulting Senior CFD Engineer, ANSYS Germany GmbH; Marc Horner, Lead Technical Services Engineer, Healthcare, ANSYS

  9. Agent based computational model of trust

    NARCIS (Netherlands)

    A. Gorobets (Alexander); B. Nooteboom (Bart)

    2004-01-01

    This paper employs the methodology of Agent-Based Computational Economics (ACE) to investigate under what conditions trust can be viable in markets. The emergence and breakdown of trust is modeled in a context of multiple buyers and suppliers. Agents adapt their trust in a partner, the w

  10. Integer Programming Models for Computational Biology Problems

    Institute of Scientific and Technical Information of China (English)

    Giuseppe Lancia

    2004-01-01

    The recent years have seen an impressive increase in the use of Integer Programming models for the solution of optimization problems originating in Molecular Biology. In this survey, some of the most successful Integer Programming approaches are described, and a broad overview of application areas in modern Computational Molecular Biology is given.

  12. A Stochastic Dynamic Model of Computer Viruses

    Directory of Open Access Journals (Sweden)

    Chunming Zhang

    2012-01-01

    Full Text Available A stochastic computer virus spread model is proposed and its dynamic behavior is fully investigated. Specifically, we prove the existence and uniqueness of positive solutions, and the stability of the virus-free equilibrium and viral equilibrium by constructing Lyapunov functions and applying Ito's formula. Some numerical simulations are finally given to illustrate our main results.
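    A hedged sketch of this class of model: Euler-Maruyama integration of a stochastic SIS-type virus model (the paper's exact drift and diffusion terms are not reproduced here; this only illustrates the construction):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    beta, gamma, sigma = 0.5, 0.2, 0.1     # infection rate, cure rate, noise
    I, dt, T = 0.01, 0.01, 50.0            # infected fraction, step, horizon
    for _ in range(int(T / dt)):
        S = 1.0 - I                        # normalized population
        dW = rng.normal(0.0, np.sqrt(dt))  # Brownian increment
        I += (beta * S * I - gamma * I) * dt + sigma * S * I * dW
        I = min(max(I, 0.0), 1.0)          # keep the fraction physical
    print(I)    # one sample path; stability claims concern the ensemble
    ```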

  13. STEW: A Nonlinear Data Modeling Computer Program

    CERN Document Server

    Chen, H

    2000-01-01

    A nonlinear data modeling computer program, STEW, employing the Levenberg-Marquardt algorithm, has been developed to model the experimental {sup 239}Pu(n,f) and {sup 235}U(n,f) cross sections. This report presents results of the modeling of the {sup 239}Pu(n,f) and {sup 235}U(n,f) cross-section data. The calculation of the fission transmission coefficient is based on the double-humped-fission-barrier model of Bjornholm and Lynn. Incident neutron energies of up to 5 MeV are considered.

  14. STEW: A Nonlinear Data Modeling Computer Program

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H.

    2000-03-04

    A nonlinear data modeling computer program, STEW, employing the Levenberg-Marquardt algorithm, has been developed to model the experimental {sup 239}Pu(n,f) and {sup 235}U(n,f) cross sections. This report presents results of the modeling of the {sup 239}Pu(n,f) and {sup 235}U(n,f) cross-section data. The calculation of the fission transmission coefficient is based on the double-humped-fission-barrier model of Bjornholm and Lynn. Incident neutron energies of up to 5 MeV are considered.
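    The kind of fit STEW performs can be sketched with SciPy's Levenberg-Marquardt driver on synthetic data (the stand-in model function and data are assumptions; the real program fits fission cross sections with a barrier model):

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def model(p, E):
        a, b, c = p
        return a * np.exp(-b * E) + c          # illustrative stand-in model

    E = np.linspace(0.1, 5.0, 50)              # incident energy, MeV
    rng = np.random.default_rng(3)
    data = model([2.0, 0.8, 1.0], E) + 0.02 * rng.normal(size=E.size)

    res = least_squares(lambda p: model(p, E) - data, x0=[1.0, 1.0, 0.0],
                        method='lm')           # Levenberg-Marquardt
    print(res.x)
    ```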

  15. Evaluating computational models of cholesterol metabolism.

    Science.gov (United States)

    Paalvast, Yared; Kuivenhoven, Jan Albert; Groen, Albert K

    2015-10-01

    Regulation of cholesterol homeostasis has been studied extensively during the last decades. Many of the metabolic pathways involved have been discovered. Yet important gaps in our knowledge remain. For example, knowledge on intracellular cholesterol traffic and its relation to the regulation of cholesterol synthesis and plasma cholesterol levels is incomplete. One way of addressing the remaining questions is by making use of computational models. Here, we critically evaluate existing computational models of cholesterol metabolism that make use of ordinary differential equations, and address whether they use assumptions and make predictions in line with current knowledge on cholesterol homeostasis. Having studied the results described by the authors, we have also tested their models. This was done primarily by testing the effect of statin treatment in each model. Ten out of eleven models tested have made assumptions in line with current knowledge of cholesterol metabolism. Three out of the ten remaining models made correct predictions, i.e. predicting a decrease in plasma total and LDL cholesterol, or increased uptake of LDL, upon statin treatment. In conclusion, few models on cholesterol metabolism are able to pass a functional test. Apparently most models have not undergone the critical iterative systems biology cycle of validation. We expect modeling of cholesterol metabolism to go through many more model topologies and iterative cycles and welcome the increased understanding of cholesterol metabolism these are likely to bring.
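    The "functional test" idea is worth making concrete. A hedged toy version: lower the synthesis rate (a crude statin surrogate) in a one-compartment LDL model and check that steady-state LDL falls. The model is an assumption, far simpler than those the authors evaluate:

    ```python
    from scipy.integrate import solve_ivp

    def ldl(t, y, k_syn, k_clear):
        return [k_syn - k_clear * y[0]]        # production minus clearance

    for k_syn in (1.0, 0.6):                   # 0.6 ~ statin-inhibited synthesis
        sol = solve_ivp(ldl, (0, 100), [2.0], args=(k_syn, 0.5))
        print(k_syn, round(sol.y[0, -1], 3))   # steady state = k_syn / k_clear
    ```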

  16. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science, outstanding results are yielded by advanced simulation methods, based on state-of-the-art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extraction from big data, better understanding and prediction of social behaviour, and modelling of health and environment changes.

  17. Mechanistic models in computational social science

    OpenAIRE

    Petter Holme; Fredrik Liljeros

    2015-01-01

    Quantitative social science is not only about regression analysis or, in general, data inference. Computer simulations of social mechanisms have an over 60 years long history. They have been used for many different purposes—to test scenarios, to test the consistency of descriptive theories (proof-of-concept models), to explore emergent phenomena, for forecasting, etc. In this essay, we sketch these historical developments, the role of mechanistic models in the social sciences and the influenc...

  19. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    Science.gov (United States)

    Cao, G.

    2015-12-01

    All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem considering the complex dependence and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend the conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (Markov property), and the spatial dependence structure is described via a precision matrix. Recent study has shown that GMRFs are an efficient approximation to the commonly used Gaussian fields (e.g., Kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accounting for heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by corresponding link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method for modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
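    The computational appeal of GMRFs is the sparse precision matrix. A sketch of a first-order CAR precision matrix on a regular lattice, Q = tau(D - rho*W) with W the adjacency matrix and D its degree matrix (a common CAR parameterization; an assumption, not necessarily the authors' exact model):

    ```python
    import numpy as np
    import scipy.sparse as sp

    def car_precision(n, m, tau=1.0, rho=0.99):
        """Precision matrix of a CAR model on an n-by-m 4-neighbour grid."""
        def path(k):  # adjacency matrix of a length-k path graph
            ones = np.ones(k - 1)
            return sp.diags([ones, ones], [-1, 1])
        W = sp.kron(sp.eye(n), path(m)) + sp.kron(path(n), sp.eye(m))
        D = sp.diags(np.asarray(W.sum(axis=1)).ravel())
        return tau * (D - rho * W)

    Q = car_precision(10, 10)
    print(Q.shape, Q.nnz)   # sparsity is what makes GMRF computation fast
    ```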

  20. A Dualistic Model To Describe Computer Architectures

    Science.gov (United States)

    Nitezki, Peter; Engel, Michael

    1985-07-01

    The Dualistic Model for Computer Architecture Description uses a hierarchy of abstraction levels to describe a computer in arbitrary steps of refinement from the top of the user interface to the bottom of the gate level. In our Dualistic Model the description of an architecture may be divided into two major parts called "Concept" and "Realization". The Concept of an architecture on each level of the hierarchy is an Abstract Data Type that describes the functionality of the computer and an implementation of that data type relative to the data type of the next lower level of abstraction. The Realization on each level comprises a language describing the means of user interaction with the machine, and a processor interpreting this language in terms of the language of the lower level. The surface of each hierarchical level, the data type and the language express the behaviour of a machine at this level, whereas the implementation and the processor describe the structure of the algorithms and the system. In this model the Principle of Operation maps the object and computational structure of the Concept onto the structures of the Realization. Describing a system in terms of the Dualistic Model is therefore a process of refinement starting at a mere description of behaviour and ending at a description of structure. This model has proven to be a very valuable tool in exploiting the parallelism in a problem and it is very transparent in discovering the points where parallelism is lost in a special architecture. It has successfully been used in a project on a survey of Computer Architecture for Image Processing and Pattern Analysis in Germany.

  1. Preliminary modelling study of ice accretion on wind turbines

    DEFF Research Database (Denmark)

    Pedersen, Marie Cecilie; Yin, Chungen

    2014-01-01

    One of the main challenges associated with cold-climate wind energy is icing on wind turbines and a series of icing-induced problems such as production loss, blade fatigue and safety issues. Because of the difficulties with on-site measurements, simulations are often used to understand and predict icing events. In this paper, a new methodology for prediction of icing-induced production loss is proposed, from which the fundamentals of ice accretion on wind turbines can be better understood and the operational production losses can be more reliably predicted. Computational fluid dynamics (CFD) modelling of ice accretion on wind turbines is also performed for different ice events, resulting in a reliable framework for CFD-based ice accretion modelling which is one of the key elements in the new methodology.

  2. Preliminary results of a three-dimensional radiative transfer model

    Energy Technology Data Exchange (ETDEWEB)

    O'Hirok, W. [Univ. of California, Santa Barbara, CA (United States)

    1995-09-01

    Clouds act as the primary modulator of the Earth's radiation at the top of the atmosphere, within the atmospheric column, and at the Earth's surface. They interact with both shortwave and longwave radiation, but it is primarily in the case of shortwave where most of the uncertainty lies because of the difficulties in treating scattered solar radiation. To understand cloud-radiative interactions, radiative transfer models portray clouds as plane-parallel homogeneous entities to ease the computational physics. Unfortunately, clouds are far from being homogeneous, and large differences between measurement and theory point to a stronger need to understand and model cloud macrophysical properties. In an attempt to better comprehend the role of cloud morphology on the 3-dimensional radiation field, a Monte Carlo model has been developed. This model can simulate broadband shortwave radiation fluxes while incorporating all of the major atmospheric constituents. The model is used to investigate the cloud absorption anomaly, where cloud absorption measurements exceed theoretical estimates, and to examine the efficacy of ERBE measurements and cloud field experiments. 3 figs.
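    The Monte Carlo kernel behind such a model is compact. A miniature plane-parallel version (isotropic scattering for brevity; real cloud codes use a Henyey-Greenstein phase function, and this sketch is an illustration, not the model described):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    tau_cloud, omega0 = 8.0, 0.999          # optical depth, single-scatter albedo
    n_photon = 10_000
    reflected = transmitted = 0
    for _ in range(n_photon):
        tau, mu = 0.0, 1.0                  # optical-depth position, direction cosine
        while True:
            tau += mu * (-np.log(rng.random()))    # sample a free path
            if tau < 0.0:
                reflected += 1
                break
            if tau > tau_cloud:
                transmitted += 1
                break
            if rng.random() > omega0:              # absorbed
                break
            mu = 2.0 * rng.random() - 1.0          # isotropic scatter
    print(reflected / n_photon, transmitted / n_photon)
    ```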

  3. Processor core model for quantum computing.

    Science.gov (United States)

    Yung, Man-Hong; Benjamin, Simon C; Bose, Sougato

    2006-06-09

    We describe an architecture based on a processing "core," where multiple qubits interact perpetually, and a separate "store," where qubits exist in isolation. Computation consists of single qubit operations, swaps between the store and the core, and free evolution of the core. This enables computation using physical systems where the entangling interactions are "always on." Alternatively, for switchable systems, our model constitutes a prescription for optimizing many-qubit gates. We discuss implementations of the quantum Fourier transform, Hamiltonian simulation, and quantum error correction.

  4. Computer Model Of Fragmentation Of Atomic Nuclei

    Science.gov (United States)

    Wilson, John W.; Townsend, Lawrence W.; Tripathi, Ram K.; Norbury, John W.; Khan, Ferdous; Badavi, Francis F.

    1995-01-01

    High Charge and Energy Semiempirical Nuclear Fragmentation Model (HZEFRG1) computer program developed to be computationally efficient, user-friendly, physics-based program for generating data bases on fragmentation of atomic nuclei. Data bases generated used in calculations pertaining to such radiation-transport applications as shielding against radiation in outer space, radiation dosimetry in outer space, cancer therapy in laboratories with beams of heavy ions, and simulation studies for designing detectors for experiments in nuclear physics. Provides cross sections for production of individual elements and isotopes in breakups of high-energy heavy ions by combined nuclear and Coulomb fields of interacting nuclei. Written in ANSI FORTRAN 77.

  5. Synopsis of some preliminary computational studies related to unsaturated zone transport at Area G

    Energy Technology Data Exchange (ETDEWEB)

    Vold, E.

    1998-03-01

    Computational transport models are described with applications in three problem areas related to unsaturated zone moisture movement beneath Area G. These studies may be used to support the ongoing maintenance of the site Performance Assessment. The three areas include: a 1-D transient analysis with average tuff hydraulic properties in the near-surface region, with computed results compared to field data; the influence on near-surface transient moisture percolation due to realistic distributions in hydraulic properties derived statistically from the observed variance in the field data; and the west-to-east moisture flow in a 2-D steady geometry approximation of the Pajarito Plateau. Results indicate that a simple transient model for transport of moisture volume fraction fits the field data well when compared to a moisture pulse observed in the active disposal unit, pit 37. Using realistic infiltration boundary conditions for summer showers and for spring snow-melt conditions, the computed moisture pulses show significant propagation to less than 10-ft depth. Next, the hydraulic properties were varied on a 2-D grid using statistical distributions based on the field data means and variances for the hydraulic parameters. Near-surface transient percolation in these conditions shows a qualitatively realistic percolation with a spatially variable wave front moving into the tuff; however, the flow does not channel into preferred paths, suggesting that no fast paths form which could enhance transport of contaminants. Finally, moisture transport is modeled through an unsaturated 2-D slice representing the upper stratigraphic layers beneath Area G and a west-to-east cut of several miles, to examine possible lateral movement from the west, where percolation is assumed to be greater than at Area G. Results show some west-to-east moisture flux consistent with the assumed profile for the percolation boundary conditions.
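    For the first problem area, the essentials fit in a few lines: an explicit 1-D moisture calculation with a constant effective diffusivity (a strong simplification and an assumption; the report uses measured tuff hydraulic properties):

    ```python
    import numpy as np

    nz, dz, dt, D = 60, 0.5, 0.01, 0.05   # grid spacing (ft), step (d), diffusivity
    theta = np.full(nz, 0.05)             # initial moisture volume fraction
    theta[0] = 0.30                       # infiltration pulse held at the surface
    for _ in range(20_000):
        theta[1:-1] += D * dt / dz**2 * (theta[2:] - 2 * theta[1:-1] + theta[:-2])
        theta[-1] = theta[-2]             # zero-gradient bottom boundary
    front = np.argmax(theta < 0.06) * dz
    print(front, "ft: approximate depth of the wetting front")
    ```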

  6. Mechanistic models in computational social science

    Directory of Open Access Journals (Sweden)

    Petter Holme

    2015-09-01

    Full Text Available Quantitative social science is not only about regression analysis or, in general, data inference. Computer simulations of social mechanisms have an over 60 years long history. They have been used for many different purposes—to test scenarios, to test the consistency of descriptive theories (proof-of-concept models), to explore emergent phenomena, for forecasting, etc. In this essay, we sketch these historical developments, the role of mechanistic models in the social sciences and the influences from the natural and formal sciences. We argue that mechanistic computational models form a natural common ground for social and natural sciences, and look forward to possible future information flow across the social-natural divide.

  7. Computational modelling of evolution: ecosystems and language

    CERN Document Server

    Lipowski, Adam

    2008-01-01

    Recently, computational modelling became a very important research tool that enables us to study problems that for decades evaded scientific analysis. Evolutionary systems are certainly examples of such problems: they are composed of many units that might reproduce, diffuse, mutate, die, or in some cases, for example, communicate. These processes might be of some adaptive value, they influence each other and occur on various time scales. That is why such systems are so difficult to study. In this paper we briefly review some computational approaches, as well as our contributions, to the evolution of ecosystems and language. We start from Lotka-Volterra equations and the modelling of simple two-species prey-predator systems. Such systems are a canonical example for studying oscillatory behaviour in competitive populations. Then we describe various approaches to study long-term evolution of multi-species ecosystems. We emphasize the need to use models that take into account both ecological and evolutionary processe...

  8. Queuing theory models for computer networks

    Science.gov (United States)

    Galant, David C.

    1989-01-01

    A set of simple queuing theory models which can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. The impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed using them because of the lack of fine detail in the network traffic rates, traffic patterns, and the hardware used to implement the networks. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. This Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
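    Such models really are spreadsheet-simple. The workhorse is the M/M/1 mean response time (a standard formula; treating each channel as an M/M/1 queue is an assumption about the memo's models):

    ```python
    def mm1_response_time(arrival_rate, service_rate):
        """Mean time in system for an M/M/1 queue: 1 / (mu - lambda)."""
        if arrival_rate >= service_rate:
            raise ValueError("queue is unstable (utilization >= 1)")
        return 1.0 / (service_rate - arrival_rate)

    # e.g. 80 messages/s offered to a channel serving 100/s -> 0.05 s (50 ms)
    print(mm1_response_time(80.0, 100.0))
    ```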

  9. Computer Aided Design Modeling for Heterogeneous Objects

    CERN Document Server

    Gupta, Vikas; Tandon, Puneet

    2010-01-01

    Heterogeneous object design has been an active research area in recent years. Conventional CAD modeling approaches only provide the geometry and topology of the object, but do not contain any information with regard to the materials of the object and so cannot be used for the fabrication of heterogeneous objects (HO) through rapid prototyping. Current research focuses on computer-aided design issues in heterogeneous object design. A new CAD modeling approach is proposed to integrate the material information into geometric regions and thus model the material distributions in the heterogeneous object. Gradient references are used to represent complex-geometry heterogeneous objects which have simultaneous geometric intricacies and accurate material distributions. The gradient references help provide flexible manipulability and control of heterogeneous objects, which guarantees local control over gradient regions of the developed heterogeneous objects. A systematic approach on data flow, processing, computer visualizat...

  10. Rough – Granular Computing knowledge discovery models

    Directory of Open Access Journals (Sweden)

    Mohammed M. Eissa

    2016-11-01

    Full Text Available The medical domain has become one of the most important areas of research owing to the richness of the huge amounts of medical information about the symptoms of diseases and how to distinguish between them to reach a correct diagnosis. Knowledge discovery models play a vital role in the refinement and mining of medical indicators to help medical experts settle treatment decisions. This paper introduces four hybrid Rough-Granular Computing knowledge discovery models based on Rough Sets Theory, Artificial Neural Networks, Genetic Algorithms and Rough Mereology Theory. A comparative analysis of various knowledge discovery models that use different knowledge discovery techniques for data pre-processing, reduction, and data mining supports medical experts in extracting the main medical indicators, reducing misdiagnosis rates and improving decision-making for medical diagnosis and treatment. The proposed models utilized two medical datasets: a Coronary Heart Disease dataset and a Hepatitis C Virus dataset. The main purpose of this paper was to explore and evaluate the proposed models, based on the Granular Computing methodology, for knowledge extraction according to different evaluation criteria for the classification of medical datasets. Another purpose is to enhance the frame of KDD processes for supervised learning using the Granular Computing methodology.

  11. Computational Modeling of Vortex Generators for Turbomachinery

    Science.gov (United States)

    Chima, R. V.

    2002-01-01

    In this work computational models were developed and used to investigate applications of vortex generators (VGs) to turbomachinery. The work was aimed at increasing the efficiency of compressor components designed for the NASA Ultra Efficient Engine Technology (UEET) program. Initial calculations were used to investigate the physical behavior of VGs. A parametric study of the effects of VG height was done using 3-D calculations of isolated VGs. A body force model was developed to simulate the effects of VGs without requiring complicated grids. The model was calibrated using 2-D calculations of the VG vanes and was validated using the 3-D results. Then three applications of VGs to a compressor rotor and stator were investigated: 1) The results of the 3-D calculations were used to simulate the use of small casing VGs used to generate rotor preswirl or counterswirl. Computed performance maps were used to evaluate the effects of VGs. 2) The body force model was used to simulate large part-span splitters on the casing ahead of the stator. Computed loss buckets showed the effects of the VGs. 3) The body force model was also used to investigate the use of tiny VGs on the stator suction surface for controlling secondary flows. Near-surface particle traces and exit loss profiles were used to evaluate the effects of the VGs.

  12. Multidirectional Networks of Government Transparency: A Preliminary Model

    Directory of Open Access Journals (Sweden)

    Ahmad Subhan

    2016-11-01

    Full Text Available This article reviews literature at the theoretical level regarding two concepts, governance networks and government transparency, in order to search for theoretical linkages and to build an alternative framework that can support the implementation of public disclosure. The transparency agenda has been implemented in various forms at the international, national, and local levels. Indonesia has followed this trend with the implementation of the Public Information Disclosure Law since 2008. This enthusiasm is quite reasonable because transparency is believed to be one of the human rights principles, as well as a key to better governance that can help consolidate democracy, prevent corruption, strengthen legitimacy and improve efficiency. In order to maximize transparency, the government can use a network approach, because changes such as democratization, decentralization, and liberalization have placed the government in a position where no single actor manages state power without stakeholder participation. In this context, the government needs to build synergies with other institutions in a reciprocal relationship with all stakeholders. Therefore, adopting the theory of government networks can be one of the strategies to strengthen government transparency. The findings of this article indicate that applying government transparency requires developing networks in all directions: intragovernmental, intergovernmental and collaborative networks. These three types of network contrast with the popular belief that government transparency is merely a procedural activity directed at outside parties. A preliminary model in this article gives a more comprehensive overview of the arena of government transparency with multidirectional networks.

  13. Nursing opinion leadership: a preliminary model derived from philosophic theories of rational belief.

    Science.gov (United States)

    Anderson, Christine A; Whall, Ann L

    2013-10-01

    Opinion leaders are informal leaders who have the ability to influence others' decisions about adopting new products, practices or ideas. In the healthcare setting, the importance of translating new research evidence into practice has led to interest in understanding how opinion leaders could be used to speed this process. Despite continued interest, gaps in understanding opinion leadership remain. Agent-based models are computer models that have proven to be useful for representing dynamic and contextual phenomena such as opinion leadership. The purpose of this paper is to describe the work conducted in preparation for the development of an agent-based model of nursing opinion leadership. The aim of this phase of the model development project was to clarify basic assumptions about opinions, the individual attributes of opinion leaders and characteristics of the context in which they are effective. The process used to clarify these assumptions was the construction of a preliminary nursing opinion leader model, derived from philosophical theories about belief formation. © 2013 John Wiley & Sons Ltd.

  14. Sticker DNA computer model--PartⅡ:Application

    Institute of Scientific and Technical Information of China (English)

    XU Jin; LI Sanping; DONG Yafei; WEI Xiaopeng

    2004-01-01

    The sticker model is one of the basic models in DNA computing. The model is encoded with single- and double-stranded DNA molecules. It has the advantages that the operations require no strand extension and use no enzymes; what's more, the materials are reusable. Therefore, it has aroused the attention and interest of scientists in many fields. In this paper, we extend and improve the sticker model, which will be definitely beneficial to the construction of DNA computers. This paper is the second part of our series, and mainly focuses on the application of the sticker model. It consists of the following three sections: first, the matrix representation of the sticker model is presented; then a brief review of past research on graph and combinatorial optimization problems, such as the minimal set covering problem, the vertex covering problem, the Hamiltonian path or cycle problem, the maximal clique problem, the maximal independent set problem and the Steiner spanning tree problem, is given; finally, a DNA algorithm for the graph isomorphism problem, based on the sticker model, is presented.
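    The sticker model's basic operations (set, clear, separate) map directly onto bit manipulation, which is one reason a matrix representation is natural. A toy illustration (an assumption for exposition, not the paper's encoding):

    ```python
    def set_sticker(strand, k):            # anneal a sticker at bit position k
        return strand | (1 << k)

    def clear_sticker(strand, k):          # strip the sticker at bit position k
        return strand & ~(1 << k)

    def separate(tube, k):                 # split a tube on whether bit k is on
        on = [s for s in tube if (s >> k) & 1]
        off = [s for s in tube if not (s >> k) & 1]
        return on, off

    tube = [0b1010, 0b0110, 0b0001]
    print(separate(tube, 1))               # ([10, 6], [1]): bit 1 set vs clear
    ```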

  15. Computing the complexity for Schelling segregation models

    Science.gov (United States)

    Gerhold, Stefan; Glebsky, Lev; Schneider, Carsten; Weiss, Howard; Zimmermann, Burkhard

    2008-12-01

    The Schelling segregation models are "agent based" population models, where individual members of the population (agents) interact directly with other agents and move in space and time. In this note we study one-dimensional Schelling population models as finite dynamical systems. We define a natural notion of entropy which measures the complexity of the family of these dynamical systems. The entropy counts the asymptotic growth rate of the number of limit states. We find formulas and deduce precise asymptotics for the number of limit states, which enable us to explicitly compute the entropy.
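
    The following toy sketch treats a one-dimensional Schelling model as a finite dynamical system and counts the distinct limit states reached from random initial conditions. The update rule here (pairwise swaps of unhappy agents of opposite type) is a simplification assumed purely for illustration, not the paper's exact dynamics.

```python
import random

def unhappy(state, i, w=1):
    """An agent is unhappy if fewer than half its neighbours match it."""
    n = len(state)
    nbrs = [state[(i + d) % n] for d in range(-w, w + 1) if d != 0]
    return sum(1 for x in nbrs if x == state[i]) < len(nbrs) / 2

def step(state):
    s = list(state)
    bad = [i for i in range(len(s)) if unhappy(s, i)]
    zeros = [i for i in bad if s[i] == 0]
    ones = [i for i in bad if s[i] == 1]
    for i, j in zip(zeros, ones):     # pair up and swap unhappy agents
        s[i], s[j] = s[j], s[i]
    return tuple(s)

def limit_state(state, max_iter=100):
    for _ in range(max_iter):
        nxt = step(state)
        if nxt == state:              # fixed point: a "limit state"
            break
        state = nxt
    return state

random.seed(0)
limits = {limit_state(tuple(random.randint(0, 1) for _ in range(12)))
          for _ in range(2000)}
print(len(limits), "distinct limit states for n = 12")
```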

  16. Computer Modelling of 3D Geological Surface

    CERN Document Server

    Kodge, B G

    2011-01-01

    Geological surveying presently uses methods and tools for the computer modeling of 3D structures of the geological subsurface and geotechnical characterization, as well as geoinformation systems for the management and analysis of spatial data and their cartographic presentation. The objective of this paper is to present a 3D geological surface model of Latur district in Maharashtra state, India. The study proceeds through several processes, discussed in this paper, to generate and visualize an automated 3D geological surface model of the projected area.

  17. Computational Study of a Primitive Life Model

    Science.gov (United States)

    Andrecut, Mircea

    We present a computational study of a primitive life model. The calculation involves a discrete treatment of a partial differential equation, and some details of that problem are explained. We show that the investigated model is equivalent to a diffusively coupled logistic lattice. Bifurcation diagrams were calculated for different values of the control parameters. The diagrams show that the time dependence of the population of the investigated model exhibits transitions between ordered and chaotic behavior. We have also investigated pattern formation in this system.
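
    Because the model is equivalent to a diffusively coupled logistic lattice, its dynamics are easy to sketch. The fragment below (illustrative parameters, not the paper's) iterates such a lattice with periodic boundaries and samples long-run site values for several control parameters r, the raw material of a bifurcation diagram.

```python
import numpy as np

def iterate(x, r, eps, steps):
    """Diffusively coupled logistic lattice with periodic boundaries."""
    for _ in range(steps):
        fx = r * x * (1.0 - x)                       # local logistic map
        x = (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
    return x

rng = np.random.default_rng(1)
for r in (2.9, 3.3, 3.6, 3.9):                       # ordered -> chaotic
    x = iterate(rng.random(64), r, eps=0.1, steps=2000)
    print(f"r = {r}: site values span [{x.min():.3f}, {x.max():.3f}]")
```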

  18. Simulating lightning into the RAMS model: implementation and preliminary results

    Directory of Open Access Journals (Sweden)

    S. Federico

    2014-05-01

    Full Text Available This paper shows the results of a tailored version of a previously published methodology, designed to simulate lightning activity, implemented into the Regional Atmospheric Modeling System (RAMS). The method gives the flash density at the resolution of the RAMS grid scale, allowing for a detailed analysis of the evolution of simulated lightning activity. The system is applied in detail to two case studies that occurred over the Lazio region, in Central Italy. Simulations are compared with the lightning activity detected by the LINET network. The cases refer to two thunderstorms of different intensity. Results show that the model predicts both cases reasonably well and that the lightning activity is reproduced especially well for the more intense case. However, there are errors in the timing and positioning of the convection, whose magnitude depends on the case study and which are mirrored in timing and positioning errors of the lightning distribution. To assess the performance of the methodology objectively, standard scores are presented for four additional case studies. The scores show the ability of the methodology to simulate the daily lightning activity for different spatial scales and for two different minimum thresholds of flash number density. The performance decreases at finer spatial scales and for higher thresholds. The comparison of simulated and observed lightning activity is an immediate and powerful tool for assessing the model's ability to reproduce the intensity and evolution of the convection. This underlines the importance of computationally efficient lightning schemes, such as the one described in this paper, in forecast models.

  19. A computational-grid based system for continental drainage network extraction using SRTM digital elevation models

    Science.gov (United States)

    Curkendall, David W.; Fielding, Eric J.; Pohl, Josef M.; Cheng, Tsan-Huei

    2003-01-01

    We describe a new effort for the computation of elevation derivatives using the Shuttle Radar Topography Mission (SRTM) results. Jet Propulsion Laboratory's (JPL) SRTM has produced a near-global database of highly accurate elevation data. The scope of this database enables computing precise stream drainage maps and other derivatives on continental scales. We describe a computing architecture for this computationally very complex task, based on NASA's Information Power Grid (IPG), a distributed high-performance computing network built on the GLOBUS infrastructure. The SRTM data characteristics and the unique problems they present are discussed. A new algorithm for organizing the conventional extraction algorithms [1] into a cooperating parallel grid is presented as an essential component of adapting to the IPG computing structure. Preliminary results are presented for a Southern California test area, established for comparing SRTM and its results against those produced using the USGS National Elevation Data (NED) model.
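
    Conventional extraction algorithms of the kind cited above rest on a flow-direction kernel such as D8, in which each cell drains to its steepest-descent neighbour; the parallel grid version tiles this kernel across machines. A minimal serial sketch on a toy DEM (not SRTM data) follows.

```python
import numpy as np

# neighbour offsets; diagonal drops are divided by sqrt(2)
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_directions(dem):
    """D8: index of the steepest-descent neighbour, -1 for pits/edges."""
    rows, cols = dem.shape
    flow = np.full((rows, cols), -1)
    for r in range(rows):
        for c in range(cols):
            best, best_k = 0.0, -1
            for k, (dr, dc) in enumerate(OFFSETS):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = 1.4142135 if dr and dc else 1.0
                    drop = (dem[r, c] - dem[rr, cc]) / dist
                    if drop > best:
                        best, best_k = drop, k
            flow[r, c] = best_k
    return flow

dem = np.array([[9, 8, 7],
                [8, 5, 4],
                [7, 4, 1]], dtype=float)
print(d8_directions(dem))   # every cell drains toward the low corner
```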

  20. Computational Modeling of Pollution Transmission in Rivers

    Science.gov (United States)

    Parsaie, Abbas; Haghiabi, Amir Hamzeh

    2017-06-01

    Modeling of river pollution contributes to better management of water quality, and this leads to improvements in human health. The advection-dispersion equation (ADE) is the governing equation for pollutant transmission in rivers. Modeling pollution transmission involves numerically solving the ADE and estimating the longitudinal dispersion coefficient (LDC). In this paper, a novel approach is proposed for numerical modeling of pollution transmission in rivers: the finite volume method, as the numerical solver, is combined with an artificial neural network (ANN) as a soft computing technique. In this approach, the ANN's prediction of the LDC is used as an input parameter for the numerical solution of the ADE. To validate the model's performance on a real engineering problem, pollutant transmission in the Severn River was simulated. Comparison of the final model results with measured data from the Severn River showed that the model performs well. Predicting the LDC with the ANN model significantly improved the accuracy of the computer simulation of pollution transmission in the river.
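
    As a rough sketch of the numerical side, the fragment below advances the one-dimensional ADE with an explicit upwind step on a uniform grid (a finite-difference stand-in for the paper's finite volume scheme, with periodic boundaries for brevity). The constant D stands in for the ANN-predicted LDC; all values are illustrative.

```python
import numpy as np

def ade_step(C, u, D, dx, dt):
    """One explicit step of dC/dt + u dC/dx = D d2C/dx2 (u > 0)."""
    adv = -u * (C - np.roll(C, 1)) / dx                  # upwind advection
    disp = D * (np.roll(C, -1) - 2 * C + np.roll(C, 1)) / dx**2
    return C + dt * (adv + disp)

nx, dx, dt = 200, 10.0, 1.0
u, D = 0.5, 30.0                     # illustrative m/s and m^2/s
assert u * dt / dx <= 1 and 2 * D * dt / dx**2 <= 1     # stability check

C = np.zeros(nx)
C[20] = 100.0                        # pollutant pulse released at cell 20
for _ in range(500):
    C = ade_step(C, u, D, dx, dt)
print("peak concentration now near cell", int(C.argmax()))
```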

  1. Method of generating a computer readable model

    DEFF Research Database (Denmark)

    2008-01-01

    A method of generating a computer readable model of a geometrical object constructed from a plurality of interconnectable construction elements, wherein each construction element has a number of connection elements for connecting the construction element with another construction element. The method comprises encoding a first and a second one of the construction elements as corresponding data structures, each representing the connection elements of the corresponding construction element, and each of the connection elements having associated with it a predetermined connection type. The method...
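
    A minimal sketch of such an encoding, with hypothetical names and assumed details, might represent each construction element as a data structure holding its connection elements, each tagged with a predetermined connection type:

```python
from dataclasses import dataclass, field

@dataclass
class Connector:
    position: tuple        # (x, y, z) location on the element
    kind: str              # predetermined connection type, e.g. "stud"

@dataclass
class Element:
    name: str
    connectors: list = field(default_factory=list)

def compatible(a: Connector, b: Connector) -> bool:
    """Two connectors mate if their types are complementary."""
    pairs = {("stud", "tube"), ("tube", "stud")}
    return (a.kind, b.kind) in pairs

brick = Element("2x1 brick", [Connector((0, 0, 1), "stud"),
                              Connector((1, 0, 1), "stud"),
                              Connector((0, 0, 0), "tube")])
plate = Element("1x1 plate", [Connector((0, 0, 0), "tube")])
print(compatible(brick.connectors[0], plate.connectors[0]))   # True
```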

  2. Computer Modelling and Simulation for Inventory Control

    Directory of Open Access Journals (Sweden)

    G.K. Adegoke

    2012-07-01

    Full Text Available This study concerns the role of computer simulation as a device for conducting scientific experiments on inventory control. The stores function ties up a large share of physical assets and financial resources in a manufacturing outfit; there is therefore a need for efficient inventory control, since inventory control reduces the cost of production and thereby facilitates the effective and efficient accomplishment of an organization's production objectives. Some mathematical and statistical models were used to compute the Economic Order Quantity (EOQ). Test data were obtained from a manufacturing company and simulated. The results generated were used to predict a real-life situation and are presented and discussed. The language of implementation for the three models is Turbo Pascal, chosen for its capability, generality and flexibility as a scientific programming language.
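
    The EOQ such models compute follows the classical square-root formula, EOQ = sqrt(2DS/H); a short illustration with made-up figures (not the study's test data):

```python
from math import sqrt

def eoq(annual_demand, order_cost, holding_cost):
    """Order size minimising the sum of ordering and holding costs."""
    return sqrt(2 * annual_demand * order_cost / holding_cost)

D, S, H = 12000, 50.0, 2.4    # units/yr, cost per order, cost/unit/yr
q = eoq(D, S, H)
print(f"EOQ = {q:.0f} units, about {D / q:.1f} orders per year")
```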

  3. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  4. Preliminary development of the Active Colonoscopy Training Model

    Directory of Open Access Journals (Sweden)

    Choi J

    2011-06-01

    Full Text Available JungHun Choi1, Kale Ravindra1, Randolph Robert1, David Drozek2; 1Mechanical Engineering, Ohio University, Athens, OH, USA; 2College of Osteopathic Medicine, Ohio University, Athens, OH, USA. Abstract: Formal colonoscopy training requires a significant amount of time and effort. In particular, it requires actual patients for a realistic learning experience. The quality of colonoscopy training varies, and includes didactic courses and procedures proctored by skilled surgeons. A colonoscopy training model is occasionally used as part of the training, but its effect is limited because the training procedures are simple and tedious. To enhance the educational effect of the colonoscopy training model, the Active Colonoscopy Training Model (ACTM) has been developed. The ACTM is an interactive colonoscopy training device that recreates the environment of a real colonoscopy procedure as closely as possible. It comprises a configurable rubber colon, a human torso, sensors, a display, and the control unit. The ACTM provides audio and visual interaction to the trainee by monitoring important factors, such as the forces exerted by the distal tip and shaft of the colonoscope, the pressure needed to open up the lumen, and the localization of the distal tip. On the computer screen, the trainee can easily monitor the status of the colonoscopy, including the localization of the distal tip, maximum forces, pressure inside the colon, and surgery time. The forces between the rubber colon and the constraints inside the ACTM are measured, and a real-time display shows the results to the trainee. Pressure sensors check the pressure at different parts of the colon. Real-time localization of the distal tip lets the trainee operate more easily and confidently without introducing an additional device into the colonoscope. With the current need for colonoscopists and physicians, the ACTM can play an essential role in resolving the problems of the current

  5. A computer model of auditory stream segregation.

    Science.gov (United States)

    Beauvois, M W; Meddis, R

    1991-08-01

    A computer model is described which simulates some aspects of auditory stream segregation. The model emphasizes the explanatory power of simple physiological principles operating at a peripheral rather than a central level. The model consists of a multi-channel bandpass-filter bank with a "noisy" output and an attentional mechanism that responds selectively to the channel with the greatest activity. A "leaky integration" principle allows channel excitation to accumulate and dissipate over time. The model produces similar results to two experimental demonstrations of streaming phenomena, which are presented in detail. These results are discussed in terms of the "emergent properties" of a system governed by simple physiological principles. As such the model is contrasted with higher-level Gestalt explanations of the same phenomena while accepting that they may constitute complementary kinds of explanation.
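
    A stripped-down sketch of the leaky-integration idea, with illustrative constants rather than the published model's: per-channel excitation dissipates between tones, accumulates with each new tone, and attention goes to the currently most active channel.

```python
import numpy as np

def leaky_update(excitation, inputs, leak=0.8):
    """One time step: dissipate by `leak`, then add new channel input."""
    return leak * excitation + inputs

exc = np.zeros(2)                       # two frequency channels
for t in range(8):                      # alternating ABAB... tone sequence
    tone = np.array([1.0, 0.0]) if t % 2 == 0 else np.array([0.0, 1.0])
    exc = leaky_update(exc, tone)
    print(f"t={t} excitation={np.round(exc, 2)} "
          f"attends channel {int(np.argmax(exc))}")
```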

  6. A Neural Computational Model of Incentive Salience

    OpenAIRE

    Jun Zhang; Berridge, Kent C; Amy J Tindell; Kyle S Smith; J Wayne Aldridge

    2009-01-01

    Incentive salience is a motivational property with ‘magnet-like’ qualities. When attributed to reward-predicting stimuli (cues), incentive salience triggers a pulse of ‘wanting’ and an individual is pulled toward the cues and reward. A key computational question is how incentive salience is generated during a cue re-encounter, which combines both learning and the state of limbic brain mechanisms. Learning processes, such as temporal-difference models, provide one way for stimuli to acquire ca...

  7. AMAR: A Computational Model of Autosegmental Phonology

    Science.gov (United States)

    1993-10-01

    the 8th International Joint Conference on Artificial Intelligence. 683-5. Koskenniemi, K. 1984. A general computational model for word-form recognition... Massachusetts Institute of Technology, Artificial Intelligence Laboratory, AI-TR 1450, 545 Technology Square, Cambridge, Massachusetts 02139... reader a feel for the workings of AMAR, this chapter will begin with a very simple example based on an artificial tone language with only t

  8. Computational Biology: Modeling Chronic Renal Allograft Injury.

    Science.gov (United States)

    Stegall, Mark D; Borrows, Richard

    2015-01-01

    New approaches are needed to develop more effective interventions to prevent long-term rejection of organ allografts. Computational biology provides a powerful tool to assess the large amount of complex data that is generated in longitudinal studies in this area. This manuscript outlines how our two groups are using mathematical modeling to analyze predictors of graft loss using both clinical and experimental data and how we plan to expand this approach to investigate specific mechanisms of chronic renal allograft injury.

  9. Computational fluid dynamics modelling in cardiovascular medicine.

    Science.gov (United States)

    Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2016-01-01

    This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges.

  10. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in.Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti
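
    A taste of what such analytical models look like, as a generic example not taken from the book: the M/M/1 queue, where jobs arrive at rate lam and are served at rate mu, yields utilisation, mean population and mean response time in three lines (Little's law gives T = N / lam).

```python
def mm1(lam, mu):
    """Analytical M/M/1 queue model (requires lam < mu for stability)."""
    assert lam < mu, "unstable: arrival rate must be below service rate"
    rho = lam / mu               # utilisation
    n = rho / (1 - rho)          # mean number of jobs in system
    t = n / lam                  # mean response time (Little's law)
    return rho, n, t

rho, n, t = mm1(lam=8.0, mu=10.0)   # e.g. 8 req/s against 10 req/s capacity
print(f"utilisation={rho:.0%}, jobs in system={n:.1f}, response={t:.2f}s")
```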

  11. ADGEN: ADjoint GENerator for computer models

    Energy Technology Data Exchange (ETDEWEB)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of the responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated at a factor of 6.8 times that of the reference model for each response of interest, in contrast to a factor of roughly 3000 for determining these derivatives by parameter perturbations. The automation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming, and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs.
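
    The economics reported above come from the adjoint (reverse-mode) idea: record the forward calculation once, then one backward sweep yields derivatives with respect to all parameters, instead of one perturbed run per parameter. A toy illustration of the principle, unrelated to ADGEN's FORTRAN machinery:

```python
import math

def forward(p):
    """y = sum_i sin(p_i) * p_i, recording intermediates on a tape."""
    tape = [(math.sin(pi), math.cos(pi), pi) for pi in p]
    y = sum(s * pi for s, _, pi in tape)
    return y, tape

def adjoint(tape):
    """Backward sweep: dy/dp_i = sin(p_i) + p_i * cos(p_i) for all i."""
    return [s + pi * c for s, c, pi in tape]

p = [0.1, 0.5, 1.2, 2.0]
y, tape = forward(p)
grads = adjoint(tape)                 # all derivatives in one sweep

eps = 1e-6                            # check one entry by perturbation
y2, _ = forward([p[0] + eps] + p[1:])
print(grads[0], (y2 - y) / eps)       # both approximate dy/dp_0
```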

  12. Computational acoustic modeling of cetacean vocalizations

    Science.gov (United States)

    Gurevich, Michael Dixon

    A framework for computational acoustic modeling of hypothetical vocal production mechanisms in cetaceans is presented. As a specific example, a model of a proposed source in the larynx of odontocetes is developed. Whales and dolphins generate a broad range of vocal sounds, but the exact mechanisms they use are not conclusively understood. In the fifty years since it has become widely accepted that whales can and do make sound, how they do so has remained particularly confounding. Cetaceans' highly divergent respiratory anatomy, along with the difficulty of internal observation during vocalization have contributed to this uncertainty. A variety of acoustical, morphological, ethological and physiological evidence has led to conflicting and often disputed theories of the locations and mechanisms of cetaceans' sound sources. Computational acoustic modeling has been used to create real-time parametric models of musical instruments and the human voice. These techniques can be applied to cetacean vocalizations to help better understand the nature and function of these sounds. Extensive studies of odontocete laryngeal morphology have revealed vocal folds that are consistently similar to a known but poorly understood acoustic source, the ribbon reed. A parametric computational model of the ribbon reed is developed, based on simplified geometrical, mechanical and fluid models drawn from the human voice literature. The physical parameters of the ribbon reed model are then adapted to those of the odontocete larynx. With reasonable estimates of real physical parameters, both the ribbon reed and odontocete larynx models produce sounds that are perceptually similar to their real-world counterparts, and both respond realistically under varying control conditions. Comparisons of acoustic features of the real-world and synthetic systems show a number of consistencies. While this does not on its own prove that either model is conclusively an accurate description of the source, it

  13. Computational Design Modelling : Proceedings of the Design Modelling Symposium

    CERN Document Server

    Kilian, Axel; Palz, Norbert; Scheurer, Fabian

    2012-01-01

    This book publishes the peer-reviewed proceedings of the third Design Modelling Symposium Berlin. The conference constitutes a platform for dialogue on experimental practice and research within the field of computationally informed architectural design. More than 60 leading experts examine the computational processes within this field, aiming to develop a broader and less exotic building practice that bears more subtle but powerful traces of the complex tool set and approaches developed and studied over recent years. The outcome is new strategies for a reasonable and innovative implementation of digital potential in truly innovative and radical design, guided by responsibility towards both the processes and the consequences they initiate.

  14. Group training with healthy computing practices to prevent repetitive strain injury (RSI): a preliminary study.

    Science.gov (United States)

    Peper, Erik; Gibney, Katherine H; Wilson, Vietta E

    2004-12-01

    This pilot study investigated whether group training, in which participants become role models and coaches, would reduce discomfort as compared to a nontreatment control group. Sixteen experimental participants attended six weekly 2-hr group sessions of a Healthy Computing program, whereas 12 control participants received no training. None of the participants had reported symptoms to their supervisors or were receiving medical treatment for repetitive strain injury prior to the program. The program included training in ergonomic principles, psychophysiological awareness and control, sEMG practice at the workstation, and coaching coworkers. Using two-tailed t tests to analyze the data, the experimental group reported (1) a significant overall reduction in most body symptoms as compared to the control group and (2) a significant increase in positive work-style habits, such as taking breaks at the computer, as compared to the control group. This study suggests that employees could improve health and work-style patterns through a holistic training program delivered in a group format followed by individual practice.

  15. Preparing computers for affective communication: a psychophysiological concept and preliminary results.

    Science.gov (United States)

    Whang, Min Cheol; Lim, Joa Sang; Boucsein, Wolfram

    Despite rapid advances in technology, computers remain incapable of responding to human emotions. An exploratory study was conducted to find out what physiological parameters might be useful to differentiate among 4 emotional states, based on 2 dimensions: pleasantness versus unpleasantness and arousal versus relaxation. The 4 emotions were induced by exposing 26 undergraduate students to different combinations of olfactory and auditory stimuli, selected in a pretest from 12 stimuli by subjective ratings of arousal and valence. Changes in electroencephalographic (EEG), heart rate variability, and electrodermal measures were used to differentiate the 4 emotions. EEG activity separates pleasantness from unpleasantness only in the aroused but not in the relaxed domain, where electrodermal parameters are the differentiating ones. All three classes of parameters contribute to a separation between arousal and relaxation in the positive valence domain, whereas the latency of the electrodermal response is the only differentiating parameter in the negative domain. We discuss how such a psychophysiological approach may be incorporated into a systemic model of a computer responsive to affective communication from the user.

  16. Computer Modeling of Human Delta Opioid Receptor

    Directory of Open Access Journals (Sweden)

    Tatyana Dzimbova

    2013-04-01

    Full Text Available The development of selective agonists of the δ-opioid receptor, as well as models of ligand interaction with this receptor, is the subject of increased interest. In the absence of crystal structures of opioid receptors, 3D homology models with different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study are: (1) to choose, from recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group and their biological activity was evaluated. Several models of DOR were generated using different templates. All these models were evaluated by PROCHECK and MolProbity, and the relationship between docking data and in vitro results was determined. The best correlations for the tested models of DOR were found between the efficacy (erel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from the docking studies. A new model of DOR was generated and evaluated by different approaches. This model has a good GA341 value (0.99) from MODELLER, and good values from PROCHECK (92.6% of residues in most favored regions) and MolProbity (99.5% in favored regions). The scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with the erel of a series of enkephalin analogues calculated from in vitro experiments. This investigation thus allows us to suggest a reliable model of DOR. The newly generated model could be used for further in silico experiments and will enable faster and more correct design of selective and effective ligands for the δ-opioid receptor.

  17. Interlanguages and synchronic models of computation

    CERN Document Server

    Berka, Alexander Victor

    2010-01-01

    A novel language system has given rise to promising alternatives to standard formal and processor network models of computation. An interstring, linked with an abstract machine environment, shares sub-expressions, transfers data, and spatially allocates resources for the parallel evaluation of dataflow. Formal models called the a-Ram family are introduced, designed to support interstring programming languages (interlanguages). Distinct from dataflow, graph rewriting, and FPGA models, a-Ram instructions are bit level and execute in situ. They support sequential and parallel languages without the space/time overheads associated with the Turing Machine and λ-calculus, enabling massive programs to be simulated. The devices of one a-Ram model, called the Synchronic A-Ram, are fully connected and simpler than FPGA LUTs. A compiler for an interlanguage called Space has been developed for the Synchronic A-Ram. Space is MIMD, strictly typed, and deterministic. Barring memory allocation and compilation, modules are ref...

  18. A Neural Computational Model of Incentive Salience

    Science.gov (United States)

    Zhang, Jun; Berridge, Kent C.; Tindell, Amy J.; Smith, Kyle S.; Aldridge, J. Wayne

    2009-01-01

    Incentive salience is a motivational property with ‘magnet-like’ qualities. When attributed to reward-predicting stimuli (cues), incentive salience triggers a pulse of ‘wanting’ and an individual is pulled toward the cues and reward. A key computational question is how incentive salience is generated during a cue re-encounter, which combines both learning and the state of limbic brain mechanisms. Learning processes, such as temporal-difference models, provide one way for stimuli to acquire cached predictive values of rewards. However, empirical data show that subsequent incentive values are also modulated on the fly by dynamic fluctuation in physiological states, altering cached values in ways requiring additional motivation mechanisms. Dynamic modulation of incentive salience for a Pavlovian conditioned stimulus (CS or cue) occurs during certain states, without necessarily requiring (re)learning about the cue. In some cases, dynamic modulation of cue value occurs during states that are quite novel, never having been experienced before, and even prior to experience of the associated unconditioned reward in the new state. Such cases can include novel drug-induced mesolimbic activation and addictive incentive-sensitization, as well as natural appetite states such as salt appetite. Dynamic enhancement specifically raises the incentive salience of an appropriate CS, without necessarily changing that of other CSs. Here we suggest a new computational model that modulates incentive salience by integrating changing physiological states with prior learning. We support the model with behavioral and neurobiological data from empirical tests that demonstrate dynamic elevations in cue-triggered motivation (involving natural salt appetite, and drug-induced intoxication and sensitization). Our data call for a dynamic model of incentive salience, such as presented here. Computational models can adequately capture fluctuations in cue-triggered ‘wanting’ only by

  19. A neural computational model of incentive salience.

    Science.gov (United States)

    Zhang, Jun; Berridge, Kent C; Tindell, Amy J; Smith, Kyle S; Aldridge, J Wayne

    2009-07-01

    Incentive salience is a motivational property with 'magnet-like' qualities. When attributed to reward-predicting stimuli (cues), incentive salience triggers a pulse of 'wanting' and an individual is pulled toward the cues and reward. A key computational question is how incentive salience is generated during a cue re-encounter, which combines both learning and the state of limbic brain mechanisms. Learning processes, such as temporal-difference models, provide one way for stimuli to acquire cached predictive values of rewards. However, empirical data show that subsequent incentive values are also modulated on the fly by dynamic fluctuation in physiological states, altering cached values in ways requiring additional motivation mechanisms. Dynamic modulation of incentive salience for a Pavlovian conditioned stimulus (CS or cue) occurs during certain states, without necessarily requiring (re)learning about the cue. In some cases, dynamic modulation of cue value occurs during states that are quite novel, never having been experienced before, and even prior to experience of the associated unconditioned reward in the new state. Such cases can include novel drug-induced mesolimbic activation and addictive incentive-sensitization, as well as natural appetite states such as salt appetite. Dynamic enhancement specifically raises the incentive salience of an appropriate CS, without necessarily changing that of other CSs. Here we suggest a new computational model that modulates incentive salience by integrating changing physiological states with prior learning. We support the model with behavioral and neurobiological data from empirical tests that demonstrate dynamic elevations in cue-triggered motivation (involving natural salt appetite, and drug-induced intoxication and sensitization). Our data call for a dynamic model of incentive salience, such as presented here. Computational models can adequately capture fluctuations in cue-triggered 'wanting' only by incorporating

  20. A neural computational model of incentive salience.

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    2009-07-01

    Full Text Available Incentive salience is a motivational property with 'magnet-like' qualities. When attributed to reward-predicting stimuli (cues), incentive salience triggers a pulse of 'wanting' and an individual is pulled toward the cues and reward. A key computational question is how incentive salience is generated during a cue re-encounter, which combines both learning and the state of limbic brain mechanisms. Learning processes, such as temporal-difference models, provide one way for stimuli to acquire cached predictive values of rewards. However, empirical data show that subsequent incentive values are also modulated on the fly by dynamic fluctuation in physiological states, altering cached values in ways requiring additional motivation mechanisms. Dynamic modulation of incentive salience for a Pavlovian conditioned stimulus (CS or cue) occurs during certain states, without necessarily requiring (re)learning about the cue. In some cases, dynamic modulation of cue value occurs during states that are quite novel, never having been experienced before, and even prior to experience of the associated unconditioned reward in the new state. Such cases can include novel drug-induced mesolimbic activation and addictive incentive-sensitization, as well as natural appetite states such as salt appetite. Dynamic enhancement specifically raises the incentive salience of an appropriate CS, without necessarily changing that of other CSs. Here we suggest a new computational model that modulates incentive salience by integrating changing physiological states with prior learning. We support the model with behavioral and neurobiological data from empirical tests that demonstrate dynamic elevations in cue-triggered motivation (involving natural salt appetite, and drug-induced intoxication and sensitization). Our data call for a dynamic model of incentive salience, such as presented here. Computational models can adequately capture fluctuations in cue-triggered 'wanting' only by

  1. Computational Modeling of Large Wildfires: A Roadmap

    KAUST Repository

    Coen, Janice L.

    2010-08-01

    Wildland fire behavior, particularly that of large, uncontrolled wildfires, has not been well understood or predicted. Our methodology to simulate this phenomenon uses high-resolution dynamic models made of numerical weather prediction (NWP) models coupled to fire behavior models to simulate fire behavior. NWP models are capable of modeling very high resolution (< 100 m) atmospheric flows. The wildland fire component is based upon semi-empirical formulas for fireline rate of spread, post-frontal heat release, and a canopy fire. The fire behavior is coupled to the atmospheric model such that low level winds drive the spread of the surface fire, which in turn releases sensible heat, latent heat, and smoke fluxes into the lower atmosphere, feeding back to affect the winds directing the fire. These coupled dynamic models capture the rapid spread downwind, flank runs up canyons, bifurcations of the fire into two heads, and rough agreement in area, shape, and direction of spread at periods for which fire location data is available. Yet, intriguing computational science questions arise in applying such models in a predictive manner, including physical processes that span a vast range of scales, processes such as spotting that cannot be modeled deterministically, estimating the consequences of uncertainty, the efforts to steer simulations with field data ("data assimilation"), lingering issues with short term forecasting of weather that may show skill only on the order of a few hours, and the difficulty of gathering pertinent data for verification and initialization in a dangerous environment. © 2010 IEEE.
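
    A hedged sketch of the semi-empirical rate-of-spread idea: a no-wind base rate amplified by a wind factor resolved along the normal to the fire front, so the head of the fire runs much faster than the flanks. The functional form and coefficients below are illustrative placeholders, not those of the coupled model.

```python
import math

def rate_of_spread(r0, wind_speed, wind_dir, normal_dir, a=0.4, b=1.2):
    """r0: base spread rate (m/s); wind in m/s; directions in radians."""
    along = wind_speed * math.cos(wind_dir - normal_dir)
    phi_w = a * max(along, 0.0) ** b      # wind amplification factor
    return r0 * (1.0 + phi_w)

# head of the fire (front normal aligned with a 6 m/s wind) vs a flank
print(rate_of_spread(0.05, 6.0, 0.0, 0.0))           # fast head run
print(rate_of_spread(0.05, 6.0, 0.0, math.pi / 2))   # slow flank spread
```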

  2. DYNAMIC TASK PARTITIONING MODEL IN PARALLEL COMPUTING

    Directory of Open Access Journals (Sweden)

    Javed Ali

    2012-04-01

    Full Text Available Parallel computing systems compose task partitioning strategies in a true multiprocessing manner. Such systems share the algorithm and processing units as computing resources, which leads to highly capable interprocess communication. The main part of the proposed algorithm is the resource management unit, which performs task partitioning and co-scheduling. In this paper, we present a technique for integrated task partitioning and co-scheduling on a privately owned network. We focus on real-time and non-preemptive systems. A large variety of experiments have been conducted on the proposed algorithm using synthetic and real tasks. The goal of the computation model is to provide a realistic representation of the costs of programming. The results show the benefit of task partitioning. The main characteristics of our method are optimal scheduling and a strong link between partitioning, scheduling and communication. Some important models for task partitioning are also discussed in the paper. We target an algorithm for task partitioning that improves interprocess communication between tasks and uses the resources of the system in an efficient manner. The proposed algorithm contributes to minimizing the interprocess communication cost amongst the executing processes.

  3. Computer modeling for optimal placement of gloveboxes

    Energy Technology Data Exchange (ETDEWEB)

    Hench, K.W.; Olivas, J.D. [Los Alamos National Lab., NM (United States); Finch, P.R. [New Mexico State Univ., Las Cruces, NM (United States)

    1997-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex have presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.

  4. Computer model of tetrahedral amorphous diamond

    Science.gov (United States)

    Djordjević, B. R.; Thorpe, M. F.; Wooten, F.

    1995-08-01

    We computer-generate a model of amorphous diamond using the Wooten-Weaire method, with fourfold coordination everywhere. We investigate two models, one in which four-membered rings are allowed and one in which they are forbidden, each consisting of 4096 atoms. Starting from the perfect diamond crystalline structure, we first randomize the structure by introducing disorder through random bond switches at a sufficiently high temperature. Subsequently, the temperature is reduced in stages, and the topological and geometrical relaxation of the structure takes place using the Keating potential. After a long annealing process, a random network of comparatively low energy is obtained. We calculate the pair distribution function, mean bond angle, rms angular deviation, rms bond length, rms bond-length deviation, and ring statistics for the final relaxed structures. We minimize the total strain energy by adjusting the density of the sample. We compare our results with similar computer-generated models for amorphous silicon, and with experimental measurements of the structure factor for (predominantly tetrahedral) amorphous carbon.

  5. Computer Generated Cardiac Model For Nuclear Medicine

    Science.gov (United States)

    Hills, John F.; Miller, Tom R.

    1981-07-01

    A computer generated mathematical model of a thallium-201 myocardial image is described which is based on realistic geometric and physiological assumptions. The left ventricle is represented by an ellipsoid truncated by aortic and mitral valve planes. Initially, an image of a motionless left ventricle is calculated with the location, size, and relative activity of perfusion defects selected by the designer. The calculation includes corrections for photon attenuation by overlying structures and the relative distribution of activity within the tissues. Motion of the ventricular walls is simulated either by a weighted sum of images at different stages in the cardiac cycle or by a blurring function whose width varies with position. Camera and collimator blurring are estimated by the MTF of the system measured at a representative depth in a phantom. Statistical noise is added using a Poisson random number generator. The usefulness of this model is due to two factors: the a priori characterization of location and extent of perfusion defects and the strong visual similarity of the images to actual clinical studies. These properties should permit systematic evaluation of image processing algorithms using this model. The principles employed in developing this cardiac image model can readily be applied to the simulation of other nuclear medicine studies and to other medical imaging modalities including computed tomography, ultrasound, and digital radiography.
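
    The last step described, adding statistical noise with a Poisson random number generator, is straightforward to sketch: scale the noise-free image to the desired total counts and draw each pixel from a Poisson distribution with that mean (toy image and counts, purely illustrative).

```python
import numpy as np

def add_poisson_noise(image, total_counts, rng=None):
    """Replace each pixel by a Poisson variate with matching mean."""
    rng = rng or np.random.default_rng()
    expected = image * (total_counts / image.sum())
    return rng.poisson(expected)

img = np.ones((8, 8))
img[3:5, 3:5] = 0.3                  # low-activity "perfusion defect"
noisy = add_poisson_noise(img, total_counts=5000)
print(noisy)
```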

  6. COMMON PHASES OF COMPUTER FORENSICS INVESTIGATION MODELS

    Directory of Open Access Journals (Sweden)

    Yunus Yusoff

    2011-06-01

    Full Text Available The increase in criminal activities that use digital information as the means or target warrants a structured manner of dealing with them. Since 1984, when a formalized process was introduced, a great number of new and improved computer forensic investigation processes have been developed. In this paper, we review a few selected investigation processes that have been produced throughout the years and identify the commonly shared processes. The identification of these common processes should make it easier for new users to understand the processes and should also serve as the basic underlying concept for the development of new sets of processes. Based on the commonly shared processes, we propose a generic computer forensics investigation model, known as GCFIM.

  7. Spatiotemporal Context Modelling in Pervasive Context-Aware Computing Environment: A Logic Perspective

    Directory of Open Access Journals (Sweden)

    Darine Ameyed

    2016-04-01

    Full Text Available Pervasive context-aware computing is one of the topics that has received particular attention from researchers. Context itself is an important notion explored in many works discussing its acquisition, definition, modelling, reasoning and more. Given the permanent evolution of context-aware systems, context modelling is still a complex task, owing to the lack of an adequate, dynamic, formal and relevant context representation. This paper discusses various context modelling approaches and previous logic-based works. It also proposes a preliminary formal spatiotemporal context model based on first-order logic, derived from the structure of natural languages.

  8. 40 CFR 194.23 - Models and computer codes.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e.,...

  9. Computational modeling of a forward lunge

    DEFF Research Database (Denmark)

    Eriksen, Tine Alkjær; Wieland, Maja Rose; Andersen, Michael Skipper

    2012-01-01

    during forward lunging. Thus, the purpose of the present study was to establish a musculoskeletal model of the forward lunge to computationally investigate the complete mechanical force equilibrium of the tibia during the movement to examine the loading pattern of the cruciate ligaments. A healthy female...... was selected out of a group of healthy subjects, who all performed a forward lunge on a force platform, targeting a knee flexion angle of 90˚. Skin-markers were placed on anatomical landmarks on the subject and the movement was recorded by five video cameras. The three-dimensional kinematic data describing...... the forward lunge movement were extracted and used to develop a biomechanical model of the lunge movement. The model comprised two legs including femur, crus, rigid foot segments and the pelvis. Each leg had 35 independent muscle units, which were recruited according to a minimum fatigue criterion...

  10. Computer model for analyzing sodium cold traps

    Energy Technology Data Exchange (ETDEWEB)

    McPheeters, C C; Raue, D J

    1983-05-01

    A computer model was developed to simulate the processes that occur in sodium cold traps. The Model for Analyzing Sodium Cold Traps (MASCOT) simulates any desired configuration of mesh arrangements and dimensions and calculates pressure drops and flow distributions, temperature profiles, impurity concentration profiles, and impurity mass distributions. The calculated pressure drop as a function of impurity mass content determines the capacity of the cold trap. The accuracy of the model was checked by comparing calculated mass distributions with experimentally determined mass distributions from literature publications and with results from our own cold trap experiments. The comparisons were excellent in all cases. A parametric study was performed to determine which design variables are most important in maximizing cold trap capacity.

  11. A Graph Model for Imperative Computation

    CERN Document Server

    McCusker, Guy

    2009-01-01

    Scott's graph model is a lambda-algebra based on the observation that continuous endofunctions on the lattice of sets of natural numbers can be represented via their graphs. A graph is a relation mapping finite sets of input values to output values. We consider a similar model based on relations whose input values are finite sequences rather than sets. This alteration means that we are taking into account the order in which observations are made. This new notion of graph gives rise to a model of affine lambda-calculus that admits an interpretation of imperative constructs including variable assignment, dereferencing and allocation. Extending this untyped model, we construct a category that provides a model of typed higher-order imperative computation with an affine type system. An appropriate language of this kind is Reynolds's Syntactic Control of Interference. Our model turns out to be fully abstract for this language. At a concrete level, it is the same as Reddy's object spaces model, which was the first "...

  12. Computer-Based Reading Programs: A Preliminary Investigation of Two Parent Implemented Programs with Students At-Risk for Reading Failure

    Science.gov (United States)

    Pindiprolu, Sekhar S.; Forbush, David

    2009-01-01

    In 2000, National Reading Panelists (NRP) reported that computer delivered reading instruction has potential for promoting the reading skills of students at-risk for reading failure. However, panelists also noted a scarcity of data present in the literature on the effects of computer-based reading instruction. This preliminary investigation…

  13. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable to porous media as well as to fractured rock or mudstone, and to modelling salt and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t began more than 20 years ago. Since that time, significant advances have taken place in the requirements for safety assessment as well as in computer hardware. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision but offers a potential speed-up of several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/, whose development began in the early nineteen-nineties. However, UG has recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the

  14. Computational modeling of Li-ion batteries

    Science.gov (United States)

    Grazioli, D.; Magri, M.; Salvadori, A.

    2016-08-01

    This review focuses on energy storage materials modeling, with particular emphasis on Li-ion batteries. Theoretical and computational analyses not only provide a better understanding of the intimate behavior of actual batteries under operational and extreme conditions, but they may tailor new materials and shape new architectures in a complementary way to experimental approaches. Modeling can therefore play a very valuable role in the design and lifetime prediction of energy storage materials and devices. Batteries are inherently multi-scale, in space and time. The macro-structural characteristic lengths (the thickness of a single cell, for instance) are order of magnitudes larger than the particles that form the microstructure of the porous electrodes, which in turn are scale-separated from interface layers at which atomistic intercalations occur. Multi-physics modeling concepts, methodologies, and simulations at different scales, as well as scale transition strategies proposed in the recent literature are here revised. Finally, computational challenges toward the next generation of Li-ion batteries are discussed.

  15. Computational modeling of Li-ion batteries

    Science.gov (United States)

    Grazioli, D.; Magri, M.; Salvadori, A.

    2016-12-01

    This review focuses on energy storage materials modeling, with particular emphasis on Li-ion batteries. Theoretical and computational analyses not only provide a better understanding of the intimate behavior of actual batteries under operational and extreme conditions, but they may tailor new materials and shape new architectures in a complementary way to experimental approaches. Modeling can therefore play a very valuable role in the design and lifetime prediction of energy storage materials and devices. Batteries are inherently multi-scale, in space and time. The macro-structural characteristic lengths (the thickness of a single cell, for instance) are order of magnitudes larger than the particles that form the microstructure of the porous electrodes, which in turn are scale-separated from interface layers at which atomistic intercalations occur. Multi-physics modeling concepts, methodologies, and simulations at different scales, as well as scale transition strategies proposed in the recent literature are here revised. Finally, computational challenges toward the next generation of Li-ion batteries are discussed.

  16. Modeling Reality - How Computers Mirror Life

    Science.gov (United States)

    Bialynicki-Birula, Iwo; Bialynicka-Birula, Iwona

    2005-01-01

    The book Modeling Reality covers a wide range of fascinating subjects, accessible to anyone who wants to learn about the use of computer modeling to solve a diverse range of problems, but who does not possess a specialized training in mathematics or computer science. The material presented is pitched at the level of high-school graduates, even though it covers some advanced topics (cellular automata, Shannon's measure of information, deterministic chaos, fractals, game theory, neural networks, genetic algorithms, and Turing machines). These advanced topics are explained in terms of well known simple concepts: Cellular automata - Game of Life, Shannon's formula - Game of twenty questions, Game theory - Television quiz, etc. The book is unique in explaining in a straightforward, yet complete, fashion many important ideas, related to various models of reality and their applications. Twenty-five programs, written especially for this book, are provided on an accompanying CD. They greatly enhance its pedagogical value and make learning of even the more complex topics an enjoyable pleasure.

  17. COMPUTER MODELING OF EMBRYONIC MORTALITY AT CRIOCONSERVATION

    Directory of Open Access Journals (Sweden)

    Gorbunov,

    2016-08-01

    Full Text Available The purpose of the research was to determine, using the developed simulation model, how the heterogeneity of mammalian embryos and the effectiveness of the cryoconservation steps influence their viability. The model is based on analytical expressions that reflect the main causes of embryonic mortality during in vitro and in vivo cultivation, cryoconservation and embryo transplantation. The reduction in viability depends on a set of biological factors, such as the animal species, the state of donor and recipient, and the quality of the embryos, and on technological ones, such as the efficiency of the cryopreservation method and of the embryo transplantation. The computer experiment showed that the divergence of embryo viability under variations of the biological parameters ranges from 0 to 100%, whereas the efficiency index of the chosen technology has an inaccuracy of about 1%. A comparative analysis of alternative embryo cryopreservation technologies identified the stages with the greatest influence on efficiency: cryoprotectant use, the freezing regime, and in vitro and in vivo cultivation of the biological object. Computer modeling makes it possible to reduce many times over the spread of embryo-viability results obtained in different experiments, and thereby to reduce the time, monetary costs and slaughter of laboratory animals needed to obtain reliable results.
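
    One plausible reading of such analytical expressions, assumed here purely for illustration, is a multiplicative chain of stage-survival factors: overall viability is the product of the survival probabilities of the biological and technological stages. All figures below are invented.

```python
STAGES = {                      # hypothetical stage-survival factors
    "donor/embryo quality": 0.90,
    "cryoprotectant":       0.95,
    "freezing regime":      0.85,
    "in vitro culture":     0.92,
    "transfer":             0.88,
}

viability = 1.0
for stage, survival in STAGES.items():
    viability *= survival       # each stage multiplies overall viability
    print(f"after {stage:22s}: {viability:.3f}")
```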

  18. Computer-aided modeling framework – a generic modeling template

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    This work focuses on the development of a computer-aided modeling framework. The framework is a knowledge-based system that is built on a generic modeling language and structured on workflows for different modeling tasks. The overall objective is to support model developers and users to generate and test models systematically, efficiently and reliably. In this way, development of products and processes can be made faster, cheaper and more efficient. In this contribution, as part of the framework, a generic modeling template for the systematic derivation of problem specific models is presented. The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene.

  19. Optimization and mathematical modeling in computer architecture

    CERN Document Server

    Sankaralingam, Karu; Nowatzki, Tony

    2013-01-01

    In this book we give an overview of modeling techniques used to describe computer systems to mathematical optimization tools. We give a brief introduction to various classes of mathematical optimization frameworks with special focus on mixed integer linear programming which provides a good balance between solver time and expressiveness. We present four detailed case studies -- instruction set customization, data center resource management, spatial architecture scheduling, and resource allocation in tiled architectures -- showing how MILP can be used and quantifying by how much it outperforms t
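
    The book's formulations are not reproduced in the record, but a toy mixed integer linear program in the same spirit as the resource-allocation case studies can be sketched with the open-source PuLP front end to the CBC solver; all data below are made up.

        from pulp import LpProblem, LpMaximize, LpVariable, lpSum, value

        # Toy MILP: assign 4 jobs to 2 servers to maximize total benefit
        # under per-server capacity limits (illustrative data only).
        benefit  = {("j1", "s1"): 5, ("j1", "s2"): 4,
                    ("j2", "s1"): 6, ("j2", "s2"): 7,
                    ("j3", "s1"): 3, ("j3", "s2"): 5,
                    ("j4", "s1"): 8, ("j4", "s2"): 6}
        load     = {"j1": 2, "j2": 3, "j3": 1, "j4": 4}
        capacity = {"s1": 5, "s2": 5}

        x = {(j, s): LpVariable(f"x_{j}_{s}", cat="Binary") for (j, s) in benefit}
        prob = LpProblem("toy_allocation", LpMaximize)
        prob += lpSum(benefit[k] * x[k] for k in x)          # objective
        for j in load:                                       # each job placed once
            prob += lpSum(x[j, s] for s in capacity) == 1
        for s in capacity:                                   # capacity limits
            prob += lpSum(load[j] * x[j, s] for j in load) <= capacity[s]
        prob.solve()
        print([(j, s) for (j, s) in x if value(x[j, s]) > 0.5])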

  20. Dynamical Models for Computer Viruses Propagation

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Full Text Available Nowadays, digital computer systems and networks are the main engineering tools, being used in planning, design, operation, and control of all sizes of building, transportation, machinery, business, and life-maintaining devices. Consequently, computer viruses became one of the most important sources of uncertainty, contributing to decreased reliability of vital activities. A lot of antivirus programs have been developed, but they are limited to detecting and removing infections, based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, a trial on modeling the propagation dynamics of computer viruses relates it to other notable events occurring in the network, permitting the establishment of preventive policies in network management. Data from three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using the data collected from other viruses that formerly infected the network.
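
    A hedged sketch of the autoregressive identification step the abstract mentions: fit an AR(p) model to an infection-count series by least squares and forecast one step ahead. The series here is synthetic, not the paper's data.

        import numpy as np

        # Fit an AR(p) model to a daily infection-count series (synthetic).
        counts = np.array([1, 2, 4, 7, 12, 20, 31, 45, 60, 74, 85, 92, 96, 98], float)
        p = 3
        # each row holds p consecutive past values (oldest first)
        X = np.column_stack([counts[i:len(counts) - p + i] for i in range(p)])
        y = counts[p:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # AR coefficients
        forecast = counts[-p:] @ coef                  # one-step-ahead prediction
        print(f"one-step forecast: {forecast:.1f}")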

  1. Computational social dynamic modeling of group recruitment.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Lee, Marinna; Pickett, Marc; Turnley, Jessica Glicken (Sandia National Laboratories, Albuquerque, NM); Smrcka, Julianne D. (Sandia National Laboratories, Albuquerque, NM); Ko, Teresa H.; Moy, Timothy David (Sandia National Laboratories, Albuquerque, NM); Wu, Benjamin C.

    2004-01-01

    The Seldon software toolkit combines concepts from agent-based modeling and social science to create a computational social dynamic model of group recruitment. The underlying recruitment model is based on a unique three-level hybrid agent-based architecture that contains simple agents (level one), abstract agents (level two), and cognitive agents (level three). The uniqueness of this architecture begins with the abstract agents, which permit the model to include social concepts (gang) or institutional concepts (school) in a typical software simulation environment. The future addition of cognitive agents to the recruitment model will provide a unique entity that does not exist in any agent-based modeling toolkit to date. We use social networks to provide an integrated mesh within and between the different levels. This Java-based toolkit is used to analyze different social concepts based on initialization input from the user. The input alters a set of parameters used to influence the values associated with the simple agents, the abstract agents, and the interactions (simple agent-simple agent or simple agent-abstract agent) between these entities. The results of the phase-1 Seldon toolkit provide insight into how certain social concepts apply to the development of different scenarios for inner-city gang recruitment.

  2. Applying Intelligent Computing Techniques to Modeling Biological Networks from Expression Data

    Institute of Scientific and Technical Information of China (English)

    Wei-Po Lee; Kung-Cheng Yang

    2008-01-01

    Constructing biological networks is one of the most important issues in systems biology. However, constructing a network from data manually takes a considerable amount of time; therefore, an automated procedure is advocated. To automate the procedure of network construction, in this work we use two intelligent computing techniques, genetic programming and neural computation, to infer two kinds of network models that use continuous variables. To verify the presented approaches, experiments have been conducted, and the preliminary results show that both approaches can be used to infer networks successfully.

  3. Preliminary scenarios and nuclide transport models for low-and intermediate-level repository system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Youn Myoung; Han, Kyong Won; Hwang, Yong Soo; Kang, Chul Hyung

    2001-02-01

    Through this study, 11 scenarios with which nuclide release from a low- and intermediate-level radioactive waste repository could be simulated and assessed were selected, based upon the FEPs identified. For each scenario, practical methodologies as well as the mathematical models involved in modeling nuclide transport in various media are also proposed. Such methodologies can play a great role when a real repository system is constructed and operated in the very near future, and a real repository system is anticipated not to be very different from the repository system postulated in this study. Even though the relevant parameters associated with the various physical, geohydrological, and geochemical situations, and even with human society, show very complicated features, it is necessary to propose methodologies for a quantitative assessment of the performance of the repository, so that they can be used as a template from the practical point of view of preliminary safety assessment. The mathematical models proposed could easily be adopted by such common computer codes as, for example, MIMOSA and MASCOT-K.
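
    The report's governing equations are not quoted in the record, but nuclide transport models of this kind typically solve a one-dimensional advection-dispersion-decay equation of the generic textbook form (not necessarily the exact form used with MIMOSA or MASCOT-K):

        \frac{\partial C}{\partial t} = D\,\frac{\partial^{2} C}{\partial x^{2}} - v\,\frac{\partial C}{\partial x} - \lambda C

    where C is the nuclide concentration, D the dispersion coefficient, v the pore-water velocity, and \lambda the radioactive decay constant; sorption is commonly folded in through a retardation factor multiplying the time-derivative term.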

  4. Preliminary Thermal Hydraulic Analyses of the Conceptual Core Models with Tubular Type Fuel Assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Hee Taek; Park, Jong Hark; Park, Cheol

    2006-11-15

    A new research reactor (AHR, Advanced HANARO Reactor) based on the HANARO has been under conceptual development for the future needs of research reactors. A tubular type fuel was considered as one of the fuel options of the AHR. A tubular type fuel assembly has several curved fuel plates arranged with a constant small gap to build up cooling channels, which is very similar to an annular pipe with many layers. This report presents a preliminary analysis of the thermal hydraulic characteristics and safety margins for three conceptual core models using tubular fuel assemblies. Four design criteria, namely the fuel temperature, the ONB (Onset of Nucleate Boiling) margin, the minimum DNBR (Departure from Nucleate Boiling Ratio), and the OFIR (Onset of Flow Instability Ratio), were investigated over a range of core flow velocities under normal operating conditions. The primary coolant flow rate based on a conceptual core model was suggested as design information for the process design of the primary cooling system. A computational fluid dynamics analysis was also carried out to evaluate the coolant velocity distributions between tubular channels and the pressure drop characteristics of the tubular fuel assembly.

  5. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. A computer virus model is established. Through the analysis of the model, the disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
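
    The abstract does not give the governing equations, but a minimal SIS-type computer-virus model with inflow of new computers and removal of old ones can be sketched as follows; all rate constants are illustrative assumptions, not the paper's values.

        # Susceptible-infected model with connection rate b, disconnection
        # rate d, infection rate beta, and cure rate gamma (all assumed).
        b, d = 0.02, 0.02
        beta, gamma = 0.4, 0.1

        def derivs(S, I):
            dS = b - beta * S * I - d * S + gamma * I
            dI = beta * S * I - (d + gamma) * I
            return dS, dI

        R0 = beta * (b / d) / (d + gamma)   # basic reproduction number
        print(f"R0 = {R0:.2f} (endemic equilibrium exists iff R0 > 1)")

        S, I, dt = b / d, 1e-3, 0.01        # start near the disease-free state
        for _ in range(200000):             # forward-Euler integration
            dS, dI = derivs(S, I)
            S, I = S + dt * dS, I + dt * dI
        print(f"long-run infected fraction: {I:.3f}")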

  6. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    Energy Technology Data Exchange (ETDEWEB)

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-07-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate, and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on the models that must be used in these efforts. To identify which models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE, and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.

  8. Computational models of intergroup competition and warfare.

    Energy Technology Data Exchange (ETDEWEB)

    Letendre, Kenneth (University of New Mexico); Abbott, Robert G.

    2011-11-01

    This document reports on the research of Kenneth Letendre, the recipient of a Sandia Graduate Research Fellowship at the University of New Mexico. Warfare is an extreme form of intergroup competition in which individuals make extreme sacrifices for the benefit of their nation or other group to which they belong. Among animals, limited, non-lethal competition is the norm. It is not fully understood what factors lead to warfare. We studied the global variation in the frequency of civil conflict among countries of the world, and its positive association with variation in the intensity of infectious disease. We demonstrated that the burden of human infectious disease is an important predictor of the frequency of civil conflict, and we tested a causal model for this association based on the parasite-stress theory of sociality. We also investigated the organization of social foraging by colonies of harvester ants in the genus Pogonomyrmex, using both field studies and computer models.

  9. Computer modeling of thermoelectric generator performance

    Science.gov (United States)

    Chmielewski, A. B.; Shields, V.

    1982-01-01

    Features of the DEGRA 2 computer code for simulating the operation of a spacecraft thermoelectric generator are described. The code models the physical processes occurring during operation. Input variables include the thermoelectric couple geometry and composition, the thermoelectric materials' properties, interfaces and insulation in the thermopile, the heat source characteristics, the mission trajectory, and the generator electrical requirements. Time steps can be specified, and sublimation of the leg and hot shoe is accounted for, as are shorts between legs. Calculations are performed for conduction, Peltier, Thomson, and Joule heating; the cold junction can be adjusted for solar radiation; and the legs of the thermoelectric couple are segmented to enhance the approximation accuracy. A trial run covering 18 couple modules yielded data with 0.3% accuracy with regard to test data. The model has been successful with selenide materials, SiGe, and SiN4, with output of all critical operational variables.
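
    A back-of-envelope sketch of the couple-level energy balance a code of this kind evaluates, combining the Seebeck voltage with Peltier, conduction, and Joule terms; all material and geometry numbers below are assumptions, not DEGRA values.

        # Single thermoelectric couple under a matched resistive load.
        S = 400e-6      # Seebeck coefficient [V/K] (assumed)
        K = 0.05        # thermal conductance of the couple [W/K] (assumed)
        R = 0.01        # internal electrical resistance [ohm] (assumed)
        Th, Tc = 900.0, 500.0
        Rload = R       # matched load maximizes power for fixed R

        I = S * (Th - Tc) / (R + Rload)                    # loop current
        P = I**2 * Rload                                   # electrical output
        Qh = S * Th * I + K * (Th - Tc) - 0.5 * I**2 * R   # heat drawn at hot junction
        print(f"I = {I:.2f} A, P = {P:.2f} W, efficiency = {P / Qh:.1%}")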

  10. A computational model of motor neuron degeneration.

    Science.gov (United States)

    Le Masson, Gwendal; Przedborski, Serge; Abbott, L F

    2014-08-20

    To explore the link between bioenergetics and motor neuron degeneration, we used a computational model in which detailed morphology and ion conductance are paired with intracellular ATP production and consumption. We found that reduced ATP availability increases the metabolic cost of a single action potential and disrupts K+/Na+ homeostasis, resulting in a chronic depolarization. The magnitude of the ATP shortage at which this ionic instability occurs depends on the morphology and intrinsic conductance characteristic of the neuron. If ATP shortage is confined to the distal part of the axon, the ensuing local ionic instability eventually spreads to the whole neuron and involves fasciculation-like spiking events. A shortage of ATP also causes a rise in intracellular calcium. Our modeling work supports the notion that mitochondrial dysfunction can account for salient features of the paralytic disorder amyotrophic lateral sclerosis, including motor neuron hyperexcitability, fasciculation, and differential vulnerability of motor neuron subpopulations.

  11. Direct modeling for computational fluid dynamics

    Science.gov (United States)

    Xu, Kun

    2015-06-01

    All fluid dynamic equations are valid under their modeling scales, such as the particle mean free path and mean collision time scale of the Boltzmann equation and the hydrodynamic scale of the Navier-Stokes (NS) equations. The current computational fluid dynamics (CFD) focuses on the numerical solution of partial differential equations (PDEs), and its aim is to get the accurate solution of these governing equations. Under such a CFD practice, it is hard to develop a unified scheme that covers flow physics from kinetic to hydrodynamic scales continuously because there is no such governing equation which could make a smooth transition from the Boltzmann to the NS modeling. The study of fluid dynamics needs to go beyond the traditional numerical partial differential equations. The emerging engineering applications, such as air-vehicle design for near-space flight and flow and heat transfer in micro-devices, do require further expansion of the concept of gas dynamics to a larger domain of physical reality, rather than the traditional distinguishable governing equations. At the current stage, the non-equilibrium flow physics has not yet been well explored or clearly understood due to the lack of appropriate tools. Unfortunately, under the current numerical PDE approach, it is hard to develop such a meaningful tool due to the absence of valid PDEs. In order to construct multiscale and multiphysics simulation methods similar to the modeling process of constructing the Boltzmann or the NS governing equations, the development of a numerical algorithm should be based on the first principle of physical modeling. In this paper, instead of following the traditional numerical PDE path, we introduce direct modeling as a principle for CFD algorithm development. Since all computations are conducted in a discretized space with limited cell resolution, the flow physics to be modeled has to be done in the mesh size and time step scales. Here, the CFD is more or less a direct

  12. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  13. Capabilities of the ATHENA computer code for modeling the SP-100 space reactor concept

    Science.gov (United States)

    Fletcher, C. D.

    1985-09-01

    The capability to perform thermal-hydraulic analyses of an SP-100 space reactor was demonstrated using the ATHENA computer code. The preliminary General Electric SP-100 design was modeled using ATHENA. The model simulates the fast reactor, liquid-lithium coolant loops, and lithium-filled heat pipes of this design. Two ATHENA demonstration calculations were performed simulating accident scenarios. A mask for the SP-100 model and an interface with the Nuclear Plant Analyzer (NPA) were developed, allowing a graphic display of the calculated results on the NPA.

  14. Using Gender Schema Theory to Examine Gender Equity in Computing: a Preliminary Study

    Science.gov (United States)

    Agosto, Denise E.

    Women continue to constitute a minority of computer science majors in the United States and Canada. One possible contributing factor is that most Web sites, CD-ROMs, and other digital resources do not reflect girls' design and content preferences. This article describes a pilot study that considered whether gender schema theory can serve as a framework for investigating girls' Web site design and content preferences. Eleven 14- and 15-year-old girls participated in the study. The methodology included the administration of the Children's Sex-Role Inventory (CSRI), Web-surfing sessions, interviews, and data analysis using iterative pattern coding. On the basis of their CSRI scores, the participants were divided into feminine-high (FH) and masculine-high (MH) groups. Data analysis uncovered significant differences in the criteria the groups used to evaluate Web sites. The FH group favored evaluation criteria relating to graphic and multimedia design, whereas the MH group favored evaluation criteria relating to subject content. Models of the two groups' evaluation criteria are presented, and the implications of the findings are discussed.

  15. Documentation of the Ecological Risk Assessment Computer Model ECORSK.5

    Energy Technology Data Exchange (ETDEWEB)

    Anthony F. Gallegos; Gilbert J. Gonzales

    1999-06-01

    The FORTRAN77 ecological risk computer model--ECORSK.5--has been used to estimate the potential toxicity of surficial deposits of radioactive and non-radioactive contaminants to several threatened and endangered (T and E) species at the Los Alamos National Laboratory (LANL). These analyses to date include preliminary toxicity estimates for the Mexican spotted owl, the American peregrine falcon, the bald eagle, and the southwestern willow flycatcher. This work has been performed as required for the Record of Decision for the construction of the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility at LANL as part of the Environmental Impact Statement. The model is dependent on the use of the geographic information system and associated software--ARC/INFO--and has been used in conjunction with LANL's Facility for Information Management and Display (FIMAD) contaminant database. The integration of FIMAD data and ARC/INFO using ECORSK.5 allows the generation of spatial information from a gridded area of potential exposure called an Ecological Exposure Unit. ECORSK.5 was used to simulate exposures using a modified Environmental Protection Agency Quotient Method. The model can handle a large number of contaminants within the home range of T and E species. This integration results in the production of hazard indices which, when compared to risk evaluation criteria, estimate the potential for impact from consumption of contaminants in food and ingestion of soil. The assessment is considered a Tier-2 type of analysis. This report summarizes and documents the ECORSK.5 code, the mathematical models used in the development of ECORSK.5, and the input and other requirements for its operation. Other auxiliary FORTRAN 77 codes used for processing and graphing output from ECORSK.5 are also discussed. The reader may refer to reports cited in the introduction to obtain greater detail on past applications of ECORSK.5 and assumptions used in deriving model parameters.
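
    A hedged sketch of the quotient-method bookkeeping the report describes: a hazard quotient per contaminant (modeled dose over a toxicity reference value), summed into a hazard index and compared against a criterion; all values below are illustrative, not ECORSK.5 data.

        # Modified EPA Quotient Method, schematically.
        trv = {"Cs-137": 0.5, "Pb": 0.1, "Hg": 0.02}      # toxicity reference values (assumed)
        dose = {"Cs-137": 0.05, "Pb": 0.08, "Hg": 0.015}  # modeled intake via food + soil (assumed)

        hq = {c: dose[c] / trv[c] for c in trv}           # hazard quotients
        hi = sum(hq.values())                             # hazard index
        print(hq)
        print(f"HI = {hi:.2f}:", "potential impact" if hi > 1.0 else "below criterion")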

  16. Statistics, Computation, and Modeling in Cosmology

    Science.gov (United States)

    Jewell, Jeff; Guiness, Joe; SAMSI 2016 Working Group in Cosmology

    2017-01-01

    Current and future ground- and space-based missions are designed to not only detect, but map out with increasing precision, details of the universe from its infancy to the present day. As a result we are faced with the challenge of analyzing and interpreting observations from a wide variety of instruments to form a coherent view of the universe. Finding solutions to a broad range of challenging inference problems in cosmology is one of the goals of the “Statistics, Computation, and Modeling in Cosmology” working groups, formed as part of the year-long program on ‘Statistical, Mathematical, and Computational Methods for Astronomy’, hosted by the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation funded institute. Two application areas have emerged for focused development in the cosmology working group involving advanced algorithmic implementations of exact Bayesian inference for the Cosmic Microwave Background, and statistical modeling of galaxy formation. The former includes study and development of advanced Markov Chain Monte Carlo algorithms designed to confront challenging inference problems, including inference for spatial Gaussian random fields in the presence of sources of galactic emission (an example of a source separation problem). Extending these methods to future redshift survey data probing the nonlinear regime of large scale structure formation is also included in the working group activities. In addition, the working group is also focused on the study of ‘Galacticus’, a galaxy formation model applied to dark matter-only cosmological N-body simulations operating on time-dependent halo merger trees. The working group is interested in calibrating the Galacticus model to match statistics of galaxy survey observations, specifically stellar mass functions, luminosity functions, and color-color diagrams. The group will use subsampling approaches and fractional factorial designs to statistically and

  17. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems that involve several types of numerical computations. The computers considered in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216.

  18. Computational Models to Synthesize Human Walking

    Institute of Scientific and Technical Information of China (English)

    Lei Ren; David Howard; Laurence Kenney

    2006-01-01

    The synthesis of human walking is of great interest in biomechanics and biomimetic engineering due to its predictive capabilities and potential applications in clinical biomechanics, rehabilitation engineering and biomimetic robotics. In this paper, the various methods that have been used to synthesize human walking are reviewed from an engineering viewpoint. This involves a wide spectrum of approaches, from simple passive walking theories to large-scale computational models integrating the nervous, muscular and skeletal systems. These methods are roughly categorized under four headings: models inspired by the concept of a CPG (Central Pattern Generator), methods based on the principles of control engineering, predictive gait simulation using optimisation, and models inspired by passive walking theory. The shortcomings and advantages of these methods are examined, and future directions are discussed in the context of providing insights into the neural control objectives driving gait and improving the stability of the predicted gaits. Future advancements are likely to be motivated by improved understanding of neural control strategies and the subtle complexities of the musculoskeletal system during human locomotion. It is only a matter of time before predictive gait models become a practical and valuable tool in clinical diagnosis, rehabilitation engineering and robotics.
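
    The CPG concept named above can be illustrated with two coupled phase oscillators (one per leg) that settle into antiphase, a common minimal abstraction in gait modeling; the gains and rates below are assumptions, not values from the reviewed models.

        import numpy as np

        # Two phase oscillators coupled so they lock half a cycle apart.
        omega = 2 * np.pi * 1.0     # intrinsic stepping frequency, 1 Hz (assumed)
        k = 2.0                     # coupling gain (assumed)
        phi = np.array([0.0, 0.5])  # initial phases [rad]
        dt = 1e-3

        for _ in range(5000):
            # each oscillator is pulled toward antiphase with the other
            d0 = omega + k * np.sin(phi[1] - phi[0] - np.pi)
            d1 = omega + k * np.sin(phi[0] - phi[1] - np.pi)
            phi += dt * np.array([d0, d1])

        diff = (phi[1] - phi[0]) % (2 * np.pi)
        print(f"phase difference = {diff:.2f} rad (pi is antiphase)")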

  19. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming a representation of modeling methodology in computer science lessons. The necessity of studying computer modeling arises because the current trends of strengthening the general educational and worldview functions of computer science call for additional research of the…

  20. Preliminary study of the Gravimetric Local Geoid Model in Jordan: case study (GeoJordan Model)

    Directory of Open Access Journals (Sweden)

    A. Al-Zoubi

    2007-06-01

    Full Text Available Recently, there has been an increased interest in studying and defining local and regional geoid models worldwide, due to their importance in geodetic and geophysical applications. The use of the Global Positioning System (GPS) is growing internationally, yet the lack of a geoid model for Jordan has limited the use of GPS for geodetic applications in the country. This work aims to present the preliminary results that we propose for «The Gravimetric Jordanian Geoid Model (GeoJordan)». The model is created using gravimetric data and the GRAVSOFT program. The model is validated using GPS and precise levelling measurements in the Amman area. Moreover, we present a comparison using the global geopotential models OSU91A and EGM96, and the results showed great discrepancies. We also present the approach used to obtain the orthometric height from GPS ellipsoidal height measurements. We found that the error margin of GeoJordan after fitting the data with GPS/levelling measurements is about 10 cm in the tested area, whereas the standard error of the created model is about 40 cm.
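
    The GPS/levelling step described above reduces to the relation H = h - N between orthometric height H, ellipsoidal height h, and geoid undulation N, plus a fitted offset; the benchmark values in the sketch below are made up, not GeoJordan data.

        # Orthometric height from GPS heights and a geoid model,
        # with a constant-offset fit to levelled benchmarks.
        benchmarks = [  # (h_gps [m], N_model [m], H_levelled [m]) - illustrative
            (850.12, 18.40, 831.55),
            (912.77, 18.55, 894.05),
            (778.30, 18.21, 759.93),
        ]

        residuals = [H - (h - N) for h, N, H in benchmarks]
        offset = sum(residuals) / len(residuals)     # fitted constant shift
        print(f"mean offset = {offset:+.3f} m")

        def orthometric(h, N):
            return h - N + offset                    # fitted model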

  1. Final technical report for DOE Computational Nanoscience Project: Integrated Multiscale Modeling of Molecular Computing Devices

    Energy Technology Data Exchange (ETDEWEB)

    Cummings, P. T.

    2010-02-08

    This document reports the outcomes of the Computational Nanoscience Project, "Integrated Multiscale Modeling of Molecular Computing Devices". It includes a list of participants and publications arising from the research supported.

  2. Evaluation of Marine Corps Manpower Computer Simulation Model

    Science.gov (United States)

    2016-12-01

    Evaluation of Marine Corps Manpower Computer Simulation Model, Master's thesis by Eric S. Anderson, December 2016; thesis advisor: Arnold Buss, second reader: Neil Rowe. … overall end strength are maintained. To assist their mission, an agent-based computer simulation model was developed in the Java computer language.

  3. Computational Granular Dynamics Models and Algorithms

    CERN Document Server

    Pöschel, Thorsten

    2005-01-01

    Computer simulations not only belong to the most important methods for the theoretical investigation of granular materials, but also provide the tools that have enabled much of the expanding research by physicists and engineers. The present book is intended to serve as an introduction to the application of numerical methods to systems of granular particles. Accordingly, emphasis is placed on a general understanding of the subject rather than on the presentation of the latest advances in numerical algorithms. Although a basic knowledge of C++ is needed for the understanding of the numerical methods and algorithms in the book, it avoids usage of elegant but complicated algorithms to remain accessible for those who prefer to use a different programming language. While the book focuses more on models than on the physics of granular material, many applications to real systems are presented.
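
    A representative building block of such simulations (generic, not taken from the book) is the linear spring-dashpot normal contact force between two spheres:

        import numpy as np

        # Linear spring-dashpot normal contact between two spheres;
        # stiffness and damping values are illustrative assumptions.
        k_n, gamma_n = 1e4, 5.0

        def normal_force(x1, x2, v1, v2, r1, r2):
            d = x2 - x1
            dist = np.linalg.norm(d)
            overlap = r1 + r2 - dist
            if overlap <= 0.0:
                return np.zeros(3)       # no contact
            n = d / dist                 # unit normal from particle 1 to 2
            vn = np.dot(v2 - v1, n)      # relative normal velocity
            f = k_n * overlap - gamma_n * vn   # repulsion plus dissipation
            return -max(f, 0.0) * n      # force acting on particle 1

        print(normal_force(np.zeros(3), np.array([0.0, 0.0, 0.018]),
                           np.zeros(3), np.array([0.0, 0.0, -0.1]), 0.01, 0.01))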

  4. Method of generating a computer readable model

    DEFF Research Database (Denmark)

    2008-01-01

    A method of generating a computer readable model of a geometrical object constructed from a plurality of interconnectable construction elements, wherein each construction element has a number of connection elements for connecting the construction element with another construction element. The method comprises encoding a first and a second one of the construction elements as corresponding data structures, each representing the connection elements of the corresponding construction element, and each of the connection elements having associated with it a predetermined connection type. The method further comprises determining a first connection element of the first construction element and a second connection element of the second construction element located in a predetermined proximity of each other; and retrieving connectivity information of the corresponding connection types of the first…
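
    A hedged sketch of the data structures the method describes, with typed connection elements and a proximity-plus-compatibility check; the type names and tolerance are assumptions, not the patent's definitions.

        from dataclasses import dataclass, field

        @dataclass
        class Connector:
            position: tuple          # (x, y, z) in the element's frame
            ctype: str               # e.g. "stud" or "tube" (assumed type names)

        @dataclass
        class Element:
            name: str
            connectors: list = field(default_factory=list)

        COMPATIBLE = {("stud", "tube")}   # assumed connection-type rules

        def can_connect(a, b, tol=0.1):
            # connectors must be close together and of compatible types
            close = all(abs(p - q) <= tol for p, q in zip(a.position, b.position))
            ok = (a.ctype, b.ctype) in COMPATIBLE or (b.ctype, a.ctype) in COMPATIBLE
            return close and ok

        brick1 = Element("brick1", [Connector((0, 0, 1), "stud")])
        brick2 = Element("brick2", [Connector((0, 0, 1.05), "tube")])
        print(can_connect(brick1.connectors[0], brick2.connectors[0]))  # True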

  5. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  6. Gravothermal Star Clusters - Theory and Computer Modelling

    Science.gov (United States)

    Spurzem, Rainer

    2010-11-01

    In the George Darwin lecture, delivered to the British Royal Astronomical Society in 1960 by Viktor A. Ambartsumian, he wrote that the evolution of stellar systems can be described by the "dynamic evolution of a gravitating gas" complemented by "a statistical description of the changes in the physical states of stars". This talk will show how this physical concept has inspired theoretical modeling of star clusters in the following decades up to the present day. The application of principles of thermodynamics shows, as Ambartsumian argued in his 1960 lecture, that there is no stable state of equilibrium of a gravitating star cluster. The trend to local thermodynamic equilibrium is always disturbed by escaping stars (Ambartsumian), as well as by gravothermal and gravogyro instabilities, as was detected later. Here the state of the art of modeling the evolution of dense stellar systems based on principles of thermodynamics and statistical mechanics (Fokker-Planck approximation) will be reviewed. Recent progress including rotation and internal correlations (primordial binaries) is presented. The models have also very successfully been used to study dense star clusters around massive black holes in galactic nuclei and even (in a few cases) relativistic supermassive dense objects in centres of galaxies (here again briefly touching one of the many research fields of V.A. Ambartsumian). For the modern present time of high-speed supercomputing, where we are tackling direct N-body simulations of star clusters, we will show that such direct modeling supports and proves the concept of the statistical models based on the Fokker-Planck theory, and that both theoretical concepts and direct computer simulations are necessary to support each other and make scientific progress in the study of star cluster evolution.

  7. The difference between playing games with and without the computer: a preliminary view.

    Science.gov (United States)

    Antonietti, Alessandro; Mellone, Rosa

    2003-03-01

    The authors address the question of whether associations between video games and cognitive and metacognitive variables depend either on the features of the computer or on the content of the game that the computer allows one to play. An experiment to separate these two kinds of effects was carried out by using a traditional version and a computer-supported version of Pegopolis, a solitaire game. The two versions were exactly the same except that they were played by moving pieces either on a real board or on a virtual computer-presented board. The performance levels and strategies followed during the game by the 40 undergraduates who took part in the experiment were not significantly different in the real and virtual conditions. None of the participants transferred playing strategies or practice from one version of the game to the other. Scores were not affected by gender or by the studies pursued by participants, the habit of playing games in the traditional manner or playing video games, or intelligence. Retrospective reports did not support differences in the subjective experience between the two versions. Results showed that video games, when they do not make much use of the computer's special features, produce effects because of the situations they simulate rather than because of features of the computer itself.

  8. Investigating the role of combined acoustic-visual feedback in one-dimensional synchronous brain computer interfaces, a preliminary study

    Science.gov (United States)

    Gargiulo, Gaetano D; Mohamed, Armin; McEwan, Alistair L; Bifulco, Paolo; Cesarelli, Mario; Jin, Craig T; Ruffo, Mariano; Tapson, Jonathan; van Schaik, André

    2012-01-01

    Feedback plays an important role when learning to use a brain computer interface (BCI), particularly in the case of synchronous feedback that relies on interaction with the subject. In this preliminary study, we investigate the role of combined auditory-visual feedback during synchronous μ rhythm-based BCI sessions to help the subject remain focused on the selected imaginary task. This new combined feedback, now integrated within the general purpose BCI2000 software, has been tested on eight untrained and three trained subjects during a monodimensional left-right control task. In order to reduce the setup burden and maximize subject comfort, an electroencephalographic device suitable for dry electrodes that required no skin preparation was used. Quality and index of improvement were evaluated based on a personal self-assessment questionnaire from each subject and quantitative data based on subject performance. Results for this preliminary study show that the combined feedback was well tolerated by the subjects and improved performance in 75% of the naïve subjects compared with visual feedback alone. PMID:23152713

  9. Numerical Computation of a Continuous-thrust State Transition Matrix Incorporating Accurate Hardware and Ephemeris Models

    Science.gov (United States)

    Ellison, Donald; Conway, Bruce; Englander, Jacob

    2015-01-01

    A significant body of work exists showing that providing a nonlinear programming (NLP) solver with expressions for the problem constraint gradient substantially increases the speed of program execution and can also improve the robustness of convergence, especially for local optimizers. Calculation of these derivatives is often accomplished through the computation of the spacecraft's state transition matrix (STM). If the two-body gravitational model is employed, as is often done in the context of preliminary design, closed-form expressions for these derivatives may be provided. If a high-fidelity dynamics model is used, one that might include perturbing forces such as the gravitational effect of multiple third bodies and solar radiation pressure, then these STMs must be computed numerically. We present a method for the power hardware model and a full ephemeris model. An adaptive-step embedded eighth-order Dormand-Prince numerical integrator is discussed, and a method for the computation of the time-of-flight derivatives in this framework is presented. The use of these numerically calculated derivatives offers a substantial improvement over finite differencing in the context of a global optimizer. Specifically, the inclusion of these STMs into the low-thrust mission design tool chain in use at NASA Goddard Space Flight Center allows for an increased preliminary mission design cadence.
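
    A minimal sketch of numerically propagating an STM with an eighth-order Dormand-Prince integrator (SciPy's DOP853), here for unperturbed two-body dynamics only; the paper's models additionally include thrust hardware and full ephemerides.

        import numpy as np
        from scipy.integrate import solve_ivp

        mu = 398600.4418  # Earth's GM [km^3/s^2]

        def dynamics(t, z):
            r, v = z[:3], z[3:6]
            Phi = z[6:].reshape(6, 6)
            rn = np.linalg.norm(r)
            a = -mu * r / rn**3
            # Jacobian of the acceleration with respect to position
            G = mu * (3.0 * np.outer(r, r) / rn**5 - np.eye(3) / rn**3)
            A = np.zeros((6, 6))
            A[:3, 3:] = np.eye(3)
            A[3:, :3] = G
            dPhi = A @ Phi                      # variational equations
            return np.concatenate([v, a, dPhi.ravel()])

        # circular low orbit plus the identity as initial STM
        z0 = np.concatenate([[7000.0, 0, 0], [0, 7.546, 0], np.eye(6).ravel()])
        sol = solve_ivp(dynamics, (0.0, 3600.0), z0, method="DOP853",
                        rtol=1e-10, atol=1e-12)
        Phi_f = sol.y[6:, -1].reshape(6, 6)
        print("STM determinant (should stay ~1):", np.linalg.det(Phi_f))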

  10. A Granular Computing Model Based on Tolerance relation

    Institute of Scientific and Technical Information of China (English)

    WANG Guo-yin; HU Feng; HUANG Hai; WU Yu

    2005-01-01

    Granular computing is a new intelligent computing theory based on the partition of problem concepts. An important problem in Rough Set theory is how to process incomplete information systems directly. In this paper, a granular computing model based on a tolerance relation for processing incomplete information systems is developed. Furthermore, a criterion for attribute necessity is proposed in this model.
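
    The standard tolerance relation for incomplete information systems can be sketched directly: two objects are tolerant iff every attribute value matches or is missing on at least one side. The data below are illustrative, not from the paper.

        # Tolerance relation over an incomplete information table (None = missing).
        table = {
            "x1": (1, 2, None),
            "x2": (1, None, 3),
            "x3": (1, 2, 3),
            "x4": (2, 2, 3),
        }

        def tolerant(a, b):
            return all(p is None or q is None or p == q for p, q in zip(a, b))

        classes = {x: {y for y in table if tolerant(table[x], table[y])} for x in table}
        print(classes)   # e.g. x1 is tolerant with x2 and x3, but not x4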

  11. Application of computer simulated persons in indoor environmental modeling

    DEFF Research Database (Denmark)

    Topp, C.; Nielsen, P. V.; Sørensen, Dan Nørtoft

    2002-01-01

    Computer simulated persons are often applied when the indoor environment is modeled by computational fluid dynamics. The computer simulated persons differ in size, shape, and level of geometrical complexity, ranging from simple box or cylinder shaped heat sources to more humanlike models. Little...

  13. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  14. Psychological underpinnings of intrafamilial computer-mediated communication: a preliminary exploration of CMC uptake with parents and siblings.

    Science.gov (United States)

    Goby, Valerie Priscilla

    2011-06-01

    This preliminary study investigates the uptake of computer-mediated communication (CMC) with parents and siblings, an area on which no research appears to have been conducted. Given the lack of relevant literature, grounded theory methodology was used and online focus group discussions were conducted in an attempt to generate suitable hypotheses for further empirical studies. Codification of the discussion data revealed various categories of meaning, namely: a perceived inappropriateness of CMC with members of family of origin; issues relating to the family generational gap; the nature of the offline sibling/parent relationship; the non-viability of online affordances such as planned self-disclosure, deception, identity construction; and disinhibition in interactions with family-of-origin members. These themes could be molded into hypotheses to assess the psychosocial limitations of CMC and to determine if it can indeed become a ubiquitous alternative to traditional communication modes as some scholars have claimed.

  15. A Packet Routing Model for Computer Networks

    Directory of Open Access Journals (Sweden)

    O. Osunade

    2012-05-01

    Full Text Available The quest for reliable data transmission in today’s computer networks and internetworks forms the basis on which routing schemes need to be improved. The persistent increase in the size of internetworks leads to dwindling performance of the present routing algorithms, which are meant to provide optimal paths for forwarding packets from one network to another. A mathematical and analytical routing model framework is proposed to address the routing needs to a substantial extent. The model provides schemes typical of packet sources, a queuing system within a buffer, link and bandwidth allocation, and a time-based bandwidth generator for routing chunks of packets to their destinations. Principal to the choice of link are such design considerations as the least-congested link in a set of links, normalized throughput, mean delay, mean waiting time, and the priority of packets in a set of prioritized packets. These performance metrics were targeted, and the resultant outcome is a fair, load-balanced network.
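
    One of the design considerations named above, picking the least-congested link, can be sketched with the M/M/1 mean-delay formula as the congestion metric; the capacities and loads below are illustrative assumptions, not the paper's parameters.

        # link -> (service rate mu [pkt/s], current arrival rate lam [pkt/s])
        links = {
            "A": (1000.0, 900.0),
            "B": (800.0, 400.0),
            "C": (1200.0, 1100.0),
        }

        def mean_delay(mu, lam):
            if lam >= mu:
                return float("inf")      # saturated link
            return 1.0 / (mu - lam)      # M/M/1 mean time in system

        best = min(links, key=lambda l: mean_delay(*links[l]))
        print("forward on link", best)   # link B: lowest delay here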

  16. Computational modeling of acute myocardial infarction.

    Science.gov (United States)

    Sáez, P; Kuhl, E

    2016-01-01

    Myocardial infarction, commonly known as heart attack, is caused by reduced blood supply and damages the heart muscle because of a lack of oxygen. Myocardial infarction initiates a cascade of biochemical and mechanical events. In the early stages, cardiomyocyte death, wall thinning, collagen degradation, and ventricular dilation are the immediate consequences of myocardial infarction. In the later stages, collagenous scar formation in the infarcted zone and hypertrophy of the non-infarcted zone are auto-regulatory mechanisms to partly correct for these events. Here we propose a computational model for the short-term adaptation after myocardial infarction using the continuum theory of multiplicative growth. Our model captures the effects of cell death initiating wall thinning, and collagen degradation initiating ventricular dilation. Our simulations agree well with clinical observations in early myocardial infarction. They represent a first step toward simulating the progression of myocardial infarction with the ultimate goal to predict the propensity toward heart failure as a function of infarct intensity, location, and size.
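
    The continuum theory of multiplicative growth invoked above rests on a standard kinematic split of the deformation gradient (general to that literature, not quoted from this paper):

        F = \frac{\partial x}{\partial X} = F^{e}\, F^{g}

    where F^{g} is the stress-free growth part (here, wall thinning or dilation), F^{e} is the elastic part that restores compatibility, and stress is computed from F^{e} alone.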

  17. Computer modeling of complete IC fabrication process

    Science.gov (United States)

    Dutton, Robert W.

    1987-05-01

    The development of fundamental algorithms for process and device modeling as well as novel integration of the tools for advanced Integrated Circuit (IC) technology design is discussed. The development of the first complete 2D process simulator, SUPREM 4, is reported. The algorithms are discussed as well as application to local-oxidation and extrinsic diffusion conditions which occur in CMOS and BiCMOS technologies. The evolution of 1D (SEDAN) and 2D (PISCES) device analysis is discussed. The application of SEDAN to a variety of non-silicon technologies (GaAs and HgCdTe) is considered. A new multi-window analysis capability for PISCES which exploits Monte Carlo analysis of hot carriers has been demonstrated and used to characterize a variety of silicon MOSFET and GaAs MESFET effects. A parallel computer implementation of PISCES has been achieved using a Hypercube architecture. The PISCES program has been used for a range of important device studies including: latchup, analog switch analysis, MOSFET capacitance studies, and transient bipolar device analysis for ECL gates. The program is broadly applicable to RAM and BiCMOS technology analysis and design. In the analog switch technology area this research effort has produced a variety of important modeling advances.

  18. A Survey of Formal Models for Computer Security.

    Science.gov (United States)

    1981-09-30

    …presenting the individual models. The finite state machine model for computation views a computer system as a finite… top-level specification. The simplest description of the top-level model for DSU is given by Walker, et al. [36]. It is a finite state machine model, with…
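
    The finite-state-machine view of security can be made concrete with a Bell-LaPadula-style "no read up" invariant on states; the sketch below is generic, not one of the surveyed models.

        # States are sets of (subject, object, access) triples; a transition is
        # granted only if the resulting state still satisfies the invariant.
        LEVELS = {"unclassified": 0, "secret": 1, "topsecret": 2}
        state = set()
        clearance = {"alice": "topsecret", "bob": "unclassified"}
        classification = {"plans": "secret"}

        def request_read(subject, obj):
            # "no read up": the subject's clearance must dominate the
            # object's classification for the new state to be secure
            if LEVELS[clearance[subject]] >= LEVELS[classification[obj]]:
                state.add((subject, obj, "read"))
                return "granted"
            return "denied"

        print(request_read("alice", "plans"))  # granted
        print(request_read("bob", "plans"))    # denied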

  19. Investigating the role of combined acoustic-visual feedback in one-dimensional synchronous brain computer interfaces, a preliminary study

    Directory of Open Access Journals (Sweden)

    Gargiulo GD

    2012-09-01

    Full Text Available Gaetano D Gargiulo,1–3 Armin Mohamed,1 Alistair L McEwan,1 Paolo Bifulco,2 Mario Cesarelli,2 Craig T Jin,1 Mariano Ruffo,2 Jonathan Tapson,3 André van Schaik3; 1School of Electrical and Information Engineering, The University of Sydney, New South Wales, Australia; 2Dipartimento di Ingegneria Elettronica e delle Telecomunicazioni, "Federico II" University of Naples, Naples, Italy; 3BENS Laboratory, MARCS Institute, The University of Western Sydney, New South Wales, Australia. Abstract: Feedback plays an important role when learning to use a brain computer interface (BCI), particularly in the case of synchronous feedback that relies on interaction with the subject. In this preliminary study, we investigate the role of combined auditory-visual feedback during synchronous µ rhythm-based BCI sessions to help the subject remain focused on the selected imaginary task. This new combined feedback, now integrated within the general-purpose BCI2000 software, has been tested on eight untrained and three trained subjects during a monodimensional left-right control task. In order to reduce the setup burden and maximize subject comfort, an electroencephalographic device suitable for dry electrodes that required no skin preparation was used. Quality and index of improvement were evaluated based on a personal self-assessment questionnaire from each subject and quantitative data based on subject performance. Results for this preliminary study show that the combined feedback was well tolerated by the subjects and improved performance in 75% of the naïve subjects compared with visual feedback alone. Keywords: brain computer interface, dry electrodes, subject feedback

  20. Computer-mediated communication and the Gallaudet University community: a preliminary report.

    Science.gov (United States)

    Hogg, Nanette M; Lomicky, Carol S; Weiner, Stephen F

    2008-01-01

    The study examined the use of computer-mediated communication (CMC) among individuals involved in a conflict sparked by the appointment of an administrator as president-designate of Gallaudet University in 2006. CMC was defined as forms of communication used for transmitting (sharing) information through networks with digital devices. There were 662 survey respondents. Respondents reported overwhelmingly (98%) that they used CMC to communicate. Students and alumni reported CMC use in larger proportions than any other group. The favorite devices among all respondents were Sidekicks, stationary computers, and laptops. Half of all respondents also reported using some form of video device. Nearly all reported using e-mail; respondents also identified Web surfing, text messaging, and blogging as popular CMC activities. The authors plan another article reporting on computer and electronic technology use as a mechanism connecting collective identity to social movements.

  1. A semantic-web approach for modeling computing infrastructures

    NARCIS (Netherlands)

    M. Ghijsen; J. van der Ham; P. Grosso; C. Dumitru; H. Zhu; Z. Zhao; C. de Laat

    2013-01-01

    This paper describes our approach to modeling computing infrastructures. Our main contribution is the Infrastructure and Network Description Language (INDL) ontology. The aim of INDL is to provide technology independent descriptions of computing infrastructures, including the physical resources as w

  2. Computational and Modeling Strategies for Cell Motility

    Science.gov (United States)

    Wang, Qi; Yang, Xiaofeng; Adalsteinsson, David; Elston, Timothy C.; Jacobson, Ken; Kapustina, Maryna; Forest, M. Gregory

    A predictive simulation of the dynamics of a living cell remains a fundamental modeling and computational challenge. The challenge does not even make sense unless one specifies the level of detail and the phenomena of interest, whether the focus is on near-equilibrium or strongly nonequilibrium behavior, and on localized, subcellular, or global cell behavior. Therefore, choices have to be made clear at the outset, ranging from distinguishing between prokaryotic and eukaryotic cells, specificity within each of these types, whether the cell is "normal," whether one wants to model mitosis, blebs, migration, division, deformation due to confined flow as with red blood cells, and the level of microscopic detail for any of these processes. The review article by Hoffman and Crocker [48] is both an excellent overview of cell mechanics and an inspiration for our approach. One might be interested, for example, in duplicating the intricate experimental details reported in [43]: "actin polymerization periodically builds a mechanical link, the lamellipodium, connecting myosin motors with the initiation of adhesion sites, suggesting that the major functions driving motility are coordinated by a biomechanical process," or to duplicate experimental evidence of traveling waves in cells recovering from actin depolymerization [42, 35]. Modeling studies of lamellipodial structure, protrusion, and retraction behavior range from early mechanistic models [84] to more recent deterministic [112, 97] and stochastic [51] approaches with significant biochemical and structural detail. Recent microscopic-macroscopic models and algorithms for cell blebbing have been developed by Young and Mitran [116], which update cytoskeletal microstructure via statistical sampling techniques together with fluid variables. Alternatively, whole cell compartment models (without spatial details) of oscillations in spreading cells have been proposed [35, 92, 109] which show positive and negative feedback

  3. Engineering a thalamo-cortico-thalamic circuit on SpiNNaker: a preliminary study towards modelling sleep and wakefulness

    Directory of Open Access Journals (Sweden)

    Basabdatta Sen Bhattacharya

    2014-05-01

    Full Text Available We present a preliminary study of a thalamo-cortico-thalamic (TCT) implementation on SpiNNaker (Spiking Neural Network architecture), a brain-inspired hardware platform designed to incorporate the inherent biological properties of parallelism, fault tolerance and energy efficiency. These attributes make SpiNNaker an ideal platform for simulating biologically plausible computational models. Our focus in this work is to design a TCT framework that can be simulated on SpiNNaker to mimic dynamical behaviour similar to Electroencephalogram (EEG) time and power-spectra signatures in the sleep-wake transition. The scale of the model is minimised for simplicity in this proof-of-concept study; thus the total number of spiking neurons is approximately 1000 and represents a 'mini-column' of the thalamocortical tissue. All data on model structure, synaptic layout and parameters are inspired by previous studies and abstracted at a level that is appropriate to the aims of the current study as well as computationally suitable for model simulation on a small 4-chip SpiNNaker system. The initial results from selective deletion of synaptic connectivity parameters in the model show similarity with EEG time series characteristics of sleep and wakefulness. These observations provide a positive perspective and a basis for future implementation of a very large scale biologically plausible model of thalamo-cortico-thalamic interactivity: the essential brain circuit that regulates the biological sleep-wake cycle and associated EEG rhythms.
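
    A hedged sketch of the kind of spiking unit such SpiNNaker models are assembled from, a leaky integrate-and-fire neuron with generic textbook parameters (not the study's values):

        # Leaky integrate-and-fire neuron driven by a constant current.
        tau, v_rest, v_th, v_reset = 20.0, -65.0, -50.0, -65.0   # ms, mV
        R, I = 10.0, 2.0        # membrane resistance [MOhm], input current [nA]
        dt, T = 0.1, 200.0      # time step, duration [ms]

        v, spikes = v_rest, []
        for step in range(int(T / dt)):
            v += dt / tau * (-(v - v_rest) + R * I)   # membrane equation
            if v >= v_th:                             # threshold crossing -> spike
                spikes.append(step * dt)
                v = v_reset
        print(f"{len(spikes)} spikes, rate = {1000 * len(spikes) / T:.0f} Hz")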

  4. SIFT - A preliminary evaluation. [Software Implemented Fault Tolerant computer for aircraft control

    Science.gov (United States)

    Palumbo, D. L.; Butler, R. W.

    1983-01-01

    This paper presents the results of a performance evaluation of the SIFT computer system conducted in the NASA AIRLAB facility. The essential system functions are described and compared to both earlier design proposals and subsequent design improvements. The functions supporting fault tolerance are found to consume significant computing resources. With SIFT's specimen task load, scheduled at a 30-Hz rate, the executive tasks such as reconfiguration, clock synchronization and interactive consistency, require 55 percent of the available task slots. Other system overhead (e.g., voting and scheduling) use an average of 50 percent of each remaining task slot.

  5. QSAR models for anti-androgenic effect - a preliminary study

    DEFF Research Database (Denmark)

    Jensen, Gunde Egeskov; Nikolov, Nikolai Georgiev; Wedebye, Eva Bay;

    2011-01-01

    Three modelling systems (MultiCase (R), LeadScope (R) and MDL (R) QSAR) were used for construction of androgenic receptor antagonist models. There were 923-942 chemicals in the training sets. The models were cross-validated (leave-groups-out) with concordances of 77-81% and a specificity of 78... The suitability of the model for a particular application, balance of training sets, domain definition, and cut-offs for prediction interpretation should also be taken into account. Different descriptors in the modelling systems are illustrated with hydroxyflutamide and dexamethasone as examples (a non-steroid and a steroid).

  6. Preliminary Modeling and Simulation Study on Olfactory Cell Sensation

    Science.gov (United States)

    Zhou, Jun; Yang, Wei; Chen, Peihua; Liu, Qingjun; Wang, Ping

    2009-05-01

    This paper introduces a whole-cell model of the olfactory sensory neuron with concrete voltage-gated ionic channels, together with its simulation. Though there are many models of the olfactory sensory neuron and the olfactory bulb, it remains uncertain how they express the logic of olfactory information processing. In this article, an olfactory neural network model is also introduced. This model specifies the connections among neural ensembles of the olfactory system. The simulation results of the neural network model are consistent with observed olfactory biological characteristics such as the 1/f-type power spectrum and oscillations.

  7. Computational Models of Spreadsheet Development: Basis for Educational Approaches

    CERN Document Server

    Hodnigg, Karin; Mittermeir, Roland T

    2008-01-01

    Among the multiple causes of high error rates in spreadsheets, lack of proper training and of deep understanding of the computational model upon which spreadsheet computations rest might not be the least issue. The paper addresses this problem by presenting a didactical model focussing on cell interaction, thus exceeding the atomicity of cell computations. The approach is motivated by an investigation of how different spreadsheet systems handle certain computational issues implied by moving cells, copy-paste operations, or recursion.

  8. Integrating Computer Algebra Systems in Post-Secondary Mathematics Education: Preliminary Results of a Literature Review

    Science.gov (United States)

    Buteau, Chantal; Marshall, Neil; Jarvis, Daniel; Lavicza, Zsolt

    2010-01-01

    We present results of a literature review pilot study (326 papers) regarding the use of Computer Algebra Systems (CAS) in tertiary mathematics education. Several themes that have emerged from the review are discussed: diverse uses of CAS, benefits to student learning, issues of integration and mathematics learning, common and innovative usage of…

  9. A Solar Powered Wireless Computer Mouse: Design, Assembly and Preliminary Testing of 15 Prototypes

    NARCIS (Netherlands)

    van Sark, W.G.J.H.M.; Reich, N.H.; Alsema, E.A.; Netten, M.P.; Veefkind, M.; Silvester, S.; Elzen, B.; Verwaal, M.

    2007-01-01

    The concept and design of a solar powered wireless computer mouse has been completed, and 15 prototypes have been successfully assembled. After necessary cutting, the crystalline silicon cells show satisfactory efficiency: up to 14% when implemented into the mouse device. The implemented voltage ...

  10. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States). Dept. of Mathematics; Anderson, D.R. [Sandia National Labs., Albuquerque, NM (United States). WIPP Performance Assessments Department; Baker, B.L. [Technadyne Engineering Consultants, Albuquerque, NM (United States)] [and others]

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs.
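
    Conceptually, each compliance probability in the record above is a high-dimensional integral of a compliance indicator over the uncertain inputs, which is exactly what Monte Carlo integration estimates. The sketch below illustrates the idea only; the indicator function and sampling distribution are placeholders, not the SPM's coupled physics models.

        # Monte Carlo estimate of a compliance probability, i.e. the integral of
        # an indicator function over uncertain inputs. All details are placeholders.
        import numpy as np

        rng = np.random.default_rng(0)

        def complies(x):
            """Placeholder indicator: 'performance measure within its limit'."""
            return (x @ x) < 4.0

        n_samples, n_inputs = 10_000, 5
        samples = rng.standard_normal((n_samples, n_inputs))
        p_hat = np.mean([complies(x) for x in samples])
        se = np.sqrt(p_hat * (1.0 - p_hat) / n_samples)  # binomial standard error
        print(f"estimated compliance probability: {p_hat:.3f} +/- {se:.3f}")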

  11. Using computational simulation to aid in the prediction of socket fit: a preliminary study.

    Science.gov (United States)

    Lee, Winson C C; Zhang, Ming

    2007-10-01

    This study illustrates the use of computational analysis to predict prosthetic socket fit. A simple indentation test is performed by applying force to the residual limb of a trans-tibial amputee through an indenter until the subject perceives the onset of pain. Computational finite element (FE) analysis is then applied to evaluate the magnitude of pressure underlying the indenter that initiates pain (pain threshold pressure), and the pressure at the prosthetic socket-residual limb interface. The assessment of socket fit is examined by studying whether or not the socket-limb interface pressure exceeds the pain threshold pressure of the limb. Based on the computer-aided assessment, a new prosthetic socket is then fabricated and fitted to the amputee subject. Successful socket fit is achieved at the end of this process. The approach of using computational analysis to aid in assessing socket fit allows a more efficient evaluation and re-design of the socket even before the actual fabrication and fitting of the prosthetic socket. However, more thorough investigations are required before this approach can be widely used. A subsequent part of this paper discusses the limitations and suggests future research directions in this area.

  12. Elements of matrix modeling and computing with Matlab

    CERN Document Server

    White, Robert E

    2006-01-01

    As discrete models and computing have become more common, there is a need to study matrix computation and numerical linear algebra. Encompassing a diverse mathematical core, Elements of Matrix Modeling and Computing with MATLAB examines a variety of applications and their modeling processes, showing you how to develop matrix models and solve algebraic systems. Emphasizing practical skills, it creates a bridge from problems with two and three variables to more realistic problems that have additional variables. Elements of Matrix Modeling and Computing with MATLAB focuses on seven basic applications.

  13. A Preliminary Review on Three-Dimensional City Model

    Institute of Scientific and Technical Information of China (English)

    ZHOU Qiming; ZHANG Wenjiang

    2004-01-01

    In this paper a review of current research on three-dimensional city models (3DCM) is presented, and an alternative approach integrating the concepts and techniques of object-oriented methods and Computer Aided Design (CAD) is suggested. With this approach, urban spatial entities are extracted as objects and represented with primary 3D elements (node, edge, face and body) and their combinations. In light of the concept of an object, the method supports multiple representations of Levels of Detail (LOD). More importantly, topological relationships between objects are described so that 3D topological operations can be implemented.

  14. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science, and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. A cost modelling system for cloud computing

    OpenAIRE

    Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh

    2014-01-01

    An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while also reducing the cost of doing business. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible, reliable and cost effective. Cloud computing has made hosting and deployment of computing resources cheaper and easier, with no up-front charges but pay-per-use flexible payme...

  16. Modeling and Testing of EVs - Preliminary Study and Laboratory Development

    DEFF Research Database (Denmark)

    Yang, Guang-Ya; Marra, Francesco; Nielsen, Arne Hejde;

    2010-01-01

    Electric vehicles (EVs) are expected to play a key role in the future energy management system, stabilizing both supply and consumption in the presence of a high penetration of renewable generation. A reasonably accurate battery model is a key element for the study of EV behavior and the grid... tests, followed by suggestions towards a feasible battery model for further studies.

  17. Single-photon emission computed tomography in human immunodeficiency virus encephalopathy: A preliminary report

    Energy Technology Data Exchange (ETDEWEB)

    Masdeu, J.C.; Yudd, A.; Van Heertum, R.L.; Grundman, M.; Hriso, E.; O'Connell, R.A.; Luck, D.; Camli, U.; King, L.N. (St. Vincent's Medical Center, New York, NY (USA))

    1991-08-01

    Depression or psychosis in a previously asymptomatic individual infected with the human immunodeficiency virus (HIV) may be psychogenic, related to brain involvement by the HIV or both. Although prognosis and treatment differ depending on etiology, computed tomography (CT) and magnetic resonance imaging (MRI) are usually unrevealing in early HIV encephalopathy and therefore cannot differentiate it from psychogenic conditions. Thirty of 32 patients (94%) with HIV encephalopathy had single-photon emission computed tomography (SPECT) findings that differed from the findings in 15 patients with non-HIV psychoses and 6 controls. SPECT showed multifocal cortical and subcortical areas of hypoperfusion. In 4 cases, cognitive improvement after 6-8 weeks of zidovudine (AZT) therapy was reflected in amelioration of SPECT findings. CT remained unchanged. SPECT may be a useful technique for the evaluation of HIV encephalopathy.

  18. Cone-Beam Computed Tomography Evaluation of Mental Foramen Variations: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Mahnaz Sheikhi

    2015-01-01

    Background. The mental foramen is important in surgical operations on premolars because it transmits the mental nerves and vessels. This study evaluated variations of the mental foramen by cone-beam computed tomography in a selected Iranian population. Materials and Methods. A total of 180 cone-beam computed tomography projections were analyzed in terms of shape, size, direction, and horizontal and vertical position of the mental foramen on the right and left sides. Results. The most common shape was oval, the most common opening direction was posterior-superior, the most common horizontal position was in line with the second premolar, and the most common vertical position was apical to the adjacent dental root. The mean foramen diameter was 3.59 mm. Conclusion. In addition to the most common types of mental foramen, other variations exist, too. This reflects the significance of preoperative radiographic examinations, especially 3-dimensional images, to prevent nerve damage.

  19. Preliminary assessment of Tongue Drive System in medium term usage for computer access and wheelchair control.

    Science.gov (United States)

    Yousefi, Behnaz; Huo, Xueliang; Ghovanloo, Maysam

    2011-01-01

    Tongue Drive System (TDS) is a wireless, wearable assistive technology that enables individuals with severe motor impairments to access computers, drive wheelchairs, and control their environments using tongue motion. In this paper, we have evaluated the TDS performance as a computer input device using ISO9241-9 standard tasks for pointing and selecting, based on the well-known Fitts' law, and as a powered wheelchair controller through an obstacle course navigation task. Nine able-bodied subjects who already had tongue piercings participated in this trial over 5 sessions during 5 weeks, allowing us to study the TDS learning process and its current limiting factors. Subjects wore tongue rings made of titanium in the form of a barbell, with a small rare-earth magnetic tracer hermetically sealed inside the upper ball. Comparing the results between the 1st and 5th sessions showed that subjects' performance improved in all measures through the 5 sessions, demonstrating the effects of learning.

  1. The Influencing Factors of Computer Adoption in Agribusiness: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Sudaryanto

    2011-08-01

    This research investigates factors that influence the intention to adopt computers for business purposes, and their implications for managerial development. Semi-structured interviews and a courier-mailed survey were employed to collect the data. A conceptual framework and theoretical insights are presented based on a literature review and primary data collected from various East Java agribusinesses. To develop qualitative information on sample characteristics, cross-tabulation was employed. Logistic regression was used to test the research hypotheses. The findings show that the intention to adopt computers in East Java agribusiness is strongly influenced by managers aged 41+, education (TAFE/D3), and sales volume. This research has direct implications for agribusiness development across East Java and other provinces in Indonesia. It is expected to encourage other researchers to conduct similar research benchmarking against other developing countries. The complexity and wide range of the term agribusiness made the research methodology complicated.
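
    As a rough illustration of the analysis step, the sketch below fits a logistic regression of adoption intention on manager and firm covariates. The column layout and all data are invented placeholders, not the study's dataset; a research write-up would typically also report odds ratios and p-values (e.g., via statsmodels).

        # Logistic regression of adoption intention on manager/firm covariates,
        # in the spirit of the study above. All data here are invented placeholders.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # columns: manager aged 41+ (0/1), TAFE/D3 education (0/1), log sales volume
        X = np.array([[1, 1, 4.2], [0, 0, 2.1], [1, 0, 3.9], [1, 1, 4.8],
                      [0, 1, 3.6], [1, 1, 4.0], [0, 0, 1.9], [0, 1, 3.3]])
        y = np.array([1, 0, 1, 1, 0, 1, 0, 1])  # 1 = intends to adopt a computer

        clf = LogisticRegression().fit(X, y)
        print("coefficients:", clf.coef_[0], "intercept:", clf.intercept_[0])
        print("odds ratios:", np.exp(clf.coef_[0]))
        print("P(adopt | age 41+, TAFE/D3, log sales 4.0):",
              clf.predict_proba([[1, 1, 4.0]])[0, 1])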

  2. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  3. Wave-current interactions: model development and preliminary results

    Science.gov (United States)

    Mayet, Clement; Lyard, Florent; Ardhuin, Fabrice

    2013-04-01

    The coastal area concentrates many uses that require integrated management based on diagnostic and predictive tools to understand and anticipate the future of pollution from land or sea, and to learn more about natural hazards at sea or activity on the coast. Realistic modelling of coastal hydrodynamics needs to take into account various interacting processes, including tides, surges, and sea state (Wolf [2008]). These processes act at different spatial scales. Unstructured-grid models have shown the ability to satisfy these needs, given that a good mesh resolution criterion is used. We worked on adding a sea state forcing to a hydrodynamic circulation model. The sea state model is the unstructured version of WAVEWATCH III (Tolman [2008]), the version developed at IFREMER, Brest (Ardhuin et al. [2010]), and the hydrodynamic model is the 2D barotropic module of the unstructured-grid finite element model T-UGOm (Le Bars et al. [2010]). We chose the radiation stress approach (Longuet-Higgins and Stewart [1964]) to represent the effect of surface waves (wind waves and swell) in the barotropic model, as previously done by Mastenbroek et al. [1993] and others. We present here some validation of the model against academic cases: a 2D plane beach (Haas and Warner [2009]) and a simple bathymetric step with an analytic solution for waves (Ardhuin et al. [2008]). In a second part we present a realistic application in the Ushant Sea during an extreme event. References: Ardhuin, F., N. Rascle, and K. Belibassakis, Explicit wave-averaged primitive equations using a generalized Lagrangian mean, Ocean Modelling, 20 (1), 35-60, doi:10.1016/j.ocemod.2007.07.001, 2008. Ardhuin, F., et al., Semiempirical Dissipation Source Functions for Ocean Waves. Part I: Definition, Calibration, and Validation, J. Phys. Oceanogr., 40 (9), 1917-1941, doi:10.1175/2010JPO4324.1, 2010. Haas, K. A., and J. C. Warner, Comparing a quasi-3D to a full 3D nearshore circulation model: SHORECIRC and ...
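
    The radiation stress approach cited above has a closed form under linear wave theory; the sketch below evaluates the Longuet-Higgins and Stewart [1964] components for a single wave train. These are textbook formulas, not code from WAVEWATCH III or T-UGOm.

        # Radiation stress components for linear waves
        # (Longuet-Higgins and Stewart, 1964).
        import numpy as np

        def dispersion_k(omega, h, g=9.81):
            """Solve omega^2 = g k tanh(k h) for k by damped fixed-point iteration."""
            k = omega**2 / g  # deep-water first guess
            for _ in range(100):
                k = 0.5 * (k + omega**2 / (g * np.tanh(k * h)))
            return k

        def radiation_stress(H, T, h, theta, rho=1025.0, g=9.81):
            """Sxx, Syy, Sxy (N/m) for height H, period T, depth h, direction theta."""
            omega = 2.0 * np.pi / T
            k = dispersion_k(omega, h, g)
            E = rho * g * H**2 / 8.0                              # wave energy density
            n = 0.5 * (1.0 + 2.0 * k * h / np.sinh(2.0 * k * h))  # cg/c ratio
            Sxx = E * (n * (np.cos(theta)**2 + 1.0) - 0.5)
            Syy = E * (n * (np.sin(theta)**2 + 1.0) - 0.5)
            Sxy = E * n * np.sin(theta) * np.cos(theta)
            return Sxx, Syy, Sxy

        print(radiation_stress(H=2.0, T=10.0, h=15.0, theta=np.radians(20.0)))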

  4. Experiments and simulation models of a basic computation element of an autonomous molecular computing system.

    Science.gov (United States)

    Takinoue, Masahiro; Kiga, Daisuke; Shohda, Koh-Ichiroh; Suyama, Akira

    2008-10-01

    Autonomous DNA computers have been attracting much attention because of their ability to integrate into living cells. Autonomous DNA computers can process information through DNA molecules and their molecular reactions. We have already proposed an idea of an autonomous molecular computer with high computational ability, which is now named Reverse-transcription-and-TRanscription-based Autonomous Computing System (RTRACS). In this study, we first report an experimental demonstration of a basic computation element of RTRACS and a mathematical modeling method for RTRACS. We focus on an AND gate, which produces an output RNA molecule only when two input RNA molecules exist, because it is one of the most basic computation elements in RTRACS. Experimental results demonstrated that the basic computation element worked as designed. In addition, its behaviors were analyzed using a mathematical model describing the molecular reactions of the RTRACS computation elements. A comparison between experiments and simulations confirmed the validity of the mathematical modeling method. This study will accelerate construction of various kinds of computation elements and computational circuits of RTRACS, and thus advance the research on autonomous DNA computers.
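
    The flavor of such a mathematical model can be captured with lumped mass-action kinetics for a two-input AND gate, in which output RNA is synthesized only when both input RNAs are present. The reaction scheme and rate constants below are illustrative assumptions, not the published RTRACS kinetics.

        # Toy mass-action model of a molecular AND gate: the two inputs associate
        # into a complex that templates output synthesis. Rates are assumptions.
        from scipy.integrate import solve_ivp

        k_bind, k_cat, k_deg = 1e-2, 5e-2, 1e-3  # assumed rate constants

        def rhs(t, y):
            in1, in2, cplx, out = y
            v_bind = k_bind * in1 * in2   # inputs associate into a complex
            v_cat = k_cat * cplx          # complex templates output synthesis
            return [-v_bind, -v_bind, v_bind - k_deg * cplx, v_cat - k_deg * out]

        for inputs in [(0, 0), (1, 0), (0, 1), (1, 1)]:
            sol = solve_ivp(rhs, (0.0, 500.0), [*inputs, 0.0, 0.0], rtol=1e-8)
            print(f"inputs={inputs} -> output={sol.y[3, -1]:.3f}")  # nonzero only for (1, 1)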

  5. Infinite Time Cellular Automata: A Real Computation Model

    CERN Document Server

    Givors, Fabien; Ollinger, Nicolas

    2010-01-01

    We define a new transfinite-time model of computation, infinite time cellular automata. The model is shown to be as powerful as infinite time Turing machines, both on finite and infinite inputs, and thus inherits many of their properties. We then show how to simulate the canonical real computation model, BSS machines, with infinite time cellular automata in exactly ω steps.

  6. Learning Anatomy: Do New Computer Models Improve Spatial Understanding?

    Science.gov (United States)

    Garg, Amit; Norman, Geoff; Spero, Lawrence; Taylor, Ian

    1999-01-01

    Assesses desktop-computer models that rotate in virtual three-dimensional space. Compares spatial learning with a computer carpal-bone model horizontally rotating at 10-degree views with the same model rotating at 90-degree views. (Author/CCM)

  7. A simulation model of a star computer network

    CERN Document Server

    Gomaa, H

    1979-01-01

    A simulation model of the CERN (European Organization for Nuclear Research) SPS star computer network is described. The model concentrates on simulating the message handling computer, through which all messages in the network pass. The implementation of the model and its calibration are also described. (6 refs).

  8. The Combination of Lecture-Based Education and Computer-Assisted Learning (CAL) in the Preliminary Hospital Pharmacy Internship Course

    Directory of Open Access Journals (Sweden)

    Mohammad Charkhpour

    2014-12-01

    Introduction: Developments in the field of information technology have profoundly affected our educational system. The efficacy of Computer-Assisted Learning (CAL) has already been evaluated in medical education, but in this study we examined the efficacy of CAL in combination with lecture-based education. Methods: This quasi-experimental before-and-after study included 33 senior-year pharmacy students who had passed the preliminary hospital pharmacy internship course. Pre-test questionnaires were given to the students in order to examine their knowledge and attitudes. Then, three chemotherapy prescriptions were given to them. Pharmacology resources were also available virtually. At the end, students were asked to answer post-test questionnaires with questions based upon knowledge and attitude. Results: The mean knowledge score was 3.48±2.04 out of 20 before the intervention and 17.82±2.31 out of 20 after the intervention; the difference between pre-test and post-test scores was statistically significant (p<0.001). The mean attitude score of students was 42.48±15.59 (medium) before the intervention and 75.97±21.03 (high) after the intervention; this difference was also statistically significant (p<0.001). Conclusion: The combination of lecture-based education and computer-assisted learning improved senior pharmacy students' knowledge and attitude in the hospital pharmacy internship course.
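
    The before/after comparison reported above is the classic paired design; a minimal sketch of the corresponding analysis is shown below, with invented score vectors standing in for the study's data.

        # Paired comparison of pre- and post-test knowledge scores (invented data).
        import numpy as np
        from scipy import stats

        pre = np.array([3.0, 2.0, 5.0, 4.0, 3.0, 6.0, 2.0, 4.0])
        post = np.array([17.0, 16.0, 19.0, 18.0, 15.0, 20.0, 16.0, 18.0])

        t, p = stats.ttest_rel(post, pre)
        print(f"mean gain = {np.mean(post - pre):.2f}, t = {t:.2f}, p = {p:.4g}")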

  9. Graph Partitioning Models for Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Hendrickson, B.; Kolda, T.G.

    1999-03-02

    Calculations can naturally be described as graphs in which vertices represent computation and edges reflect data dependencies. By partitioning the vertices of a graph, the calculation can be divided among processors of a parallel computer. However, the standard methodology for graph partitioning minimizes the wrong metric and lacks expressibility. We survey several recently proposed alternatives and discuss their relative merits.
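
    The "wrong metric" point can be made concrete on a toy graph: partitioners minimize the edge cut, but the data actually communicated is closer to the number of distinct (vertex, foreign part) adjacencies. The example below is ours, not from the paper.

        # Edge cut vs. total communication volume for a toy 2-way vertex partition.
        from collections import defaultdict

        edges = [(0, 1), (0, 2), (0, 3), (1, 2), (3, 4), (2, 4)]
        part = {0: 0, 1: 0, 2: 1, 3: 1, 4: 1}

        edge_cut = sum(1 for u, v in edges if part[u] != part[v])

        neighbors = defaultdict(set)
        for u, v in edges:
            neighbors[u].add(v)
            neighbors[v].add(u)

        # each vertex sends its value once per *distinct* foreign part it touches
        comm_volume = sum(len({part[w] for w in neighbors[v]} - {part[v]})
                          for v in neighbors)

        print(f"edge cut = {edge_cut}, communication volume = {comm_volume}")  # 3 vs 4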

  10. Preliminaries to a Social-Semiotic Model of Communicative Action

    Directory of Open Access Journals (Sweden)

    Antonio SANDU

    2015-12-01

    The purpose of this article is to contribute to the elaboration of a social-semiotic model of social constructionism, synthesizing the theory of communicative action and social-constructionist semiotic theories, based on the postulation of a social universe as a network of communicative interdependencies developed on levels of reality. The interpretative model we propose conceptualizes the particularities of the sociological analysis of the transmodern society, seen as a knowledge-based society placed at the interference with the postmodern society, that of generalized permissiveness. The proposed model aims at a constructionist-fractalic analysis (of the deconstruction-reconstruction type) of the interpretative drift of social constructs under the influence of different constructive instances.

  11. X-ray phase computed tomography for nanoparticulated imaging probes and therapeutics: preliminary feasibility study

    Science.gov (United States)

    Tang, Xiangyang; Yang, Yi; Tang, Shaojie

    2011-03-01

    With the scientific progress in cancer biology, pharmacology and biomedical engineering, nano-biotechnology based imaging probes and therapeutic agents (namely probes/agents) - a form of theranostics - are among the strategic solutions bearing the hope for the cure of cancer. The key feature distinguishing the nanoparticulated probes/agents from their conventional counterparts is their targeting capability. A large surface-to-volume ratio in nanoparticulated probes/agents enables the accommodation of multiple targeting, imaging and therapeutic components to cope with intra- and inter-tumor heterogeneity. Most nanoparticulated probes/agents are synthesized with low atomic number materials, and thus their x-ray attenuation is very similar to that of biological tissues. However, their microscopic structures are very different, which may result in significant differences in their refractive properties. Recently, investigation of x-ray grating-based differential phase contrast (DPC) CT has demonstrated its advantages in differentiating low-atomic-number materials over conventional attenuation-based CT. We believe that a synergy of x-ray grating-based DPC CT and nanoparticulated imaging probes and therapeutic agents may play a significant role in extensive preclinical and clinical applications, or even become a modality for molecular imaging. Hence, we propose to image the refractive property of nanoparticulated imaging probes and therapeutic agents using x-ray grating-based DPC CT. In this work, we conduct a preliminary feasibility study with a focus on characterizing the contrast-to-noise ratio (CNR) and contrast-detail behavior of x-ray grating-based DPC CT. The obtained data may be instructive to the architecture design and performance optimization of x-ray grating-based DPC CT for imaging biomarker-targeted imaging probes and therapeutic agents, and even informative to the translation of preclinical research in theranostics into clinical applications.

  12. COMPUTER MODEL FOR ORGANIC FERTILIZER EVALUATION

    Directory of Open Access Journals (Sweden)

    Zdenko Lončarić

    2009-12-01

    ... seedlings with the highest mass and leaf area are produced using growing media with pH close to 6 and with EC lower than 2 dS m-1. It could be concluded that a conductivity of approx. 3 dS m-1 has an inhibitory effect on lettuce if pH is about 7 or higher. The computer model shows that raising pH and EC results in decreasing growth, which can be expressed as an increasing stress index. The lettuce height as a function of pH and EC is incorporated into the model as a stress function, showing an increase of lettuce height by lowering EC from 4 to 1 dS m-1 or pH from 7.4 to 6. The highest growing-media index (8.1) was determined for a mixture of composted pig manure and peat (1:1), and the lowest (2.3) for composted horse manure and peat (1:2).

  13. Preliminary mixed-layer model results for FIRE marine stratocumulus IFO conditions

    Science.gov (United States)

    Barlow, R.; Nicholls, S.

    1990-01-01

    Some preliminary results from the Turton and Nicholls mixed layer model using typical FIRE boundary conditions are presented. The model includes entrainment and drizzle parametrizations as well as interactive long and shortwave radiation schemes. A constraint on the integrated turbulent kinetic energy balance ensures that the model remains energetically consistent at all times. The preliminary runs were used to identify the potentially important terms in the heat and moisture budgets of the cloud layer, and to assess the anticipated diurnal variability. These are compared with typical observations from the C130. Sensitivity studies also revealed the remarkable stability of these cloud sheets: a number of negative feedback mechanisms appear to operate to maintain the cloud over an extended time period. These are also discussed. The degree to which such a modelling approach can be used to explain observed features, the specification of boundary conditions, and problems of interpretation in non-horizontally uniform conditions are also raised.

  14. RHF RELAP5 model and preliminary loss-of-offsite-power simulation results for LEU conversion

    Energy Technology Data Exchange (ETDEWEB)

    Licht, J. R. [Argonne National Laboratory (ANL), Argonne, IL (United States). Nuclear Engineering Div.; Bergeron, A. [Argonne National Laboratory (ANL), Argonne, IL (United States). Nuclear Engineering Div.; Dionne, B. [Argonne National Laboratory (ANL), Argonne, IL (United States). Nuclear Engineering Div.; Thomas, F. [Institut Laue-Langevin (ILL), Grenoble (Switzerland). RHF Reactor Dept.

    2014-08-01

    The purpose of this document is to describe the current state of the RELAP5 model for the Institut Laue-Langevin High Flux Reactor (RHF) located in Grenoble, France, and provide an update to the key information required to complete, for example, simulations for a loss of offsite power (LOOP) accident. A previous status report identified a list of 22 items to be resolved in order to complete the RELAP5 model. Most of these items have been resolved by ANL and the RHF team. Enough information was available to perform preliminary safety analyses and define the key items that are still required. Section 2 of this document describes the RELAP5 model of RHF. The final part of this section briefly summarizes previous model issues and resolutions. Section 3 of this document describes preliminary LOOP simulations for both HEU and LEU fuel at beginning of cycle conditions.

  15. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly leads users into the subject of computer graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments ...

  16. Editorial: Modelling and computational challenges in granular materials

    NARCIS (Netherlands)

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss ...

  17. Reduced computational models of serotonin synthesis, release, and reuptake.

    Science.gov (United States)

    Flower, Gordon; Wong-Lin, KongFatt

    2014-04-01

    Multiscale computational models can provide systemic evaluation and prediction of neuropharmacological drug effects. To date, little computational modeling work has been done to bridge from the intracellular to the neuronal circuit level. A complex model that describes the intracellular dynamics of the presynaptic terminal of a serotonergic neuron has been previously proposed. By systematically perturbing the model's components, we identify its slow and fast dynamical components; the reduced slow-mode or fast-mode models are significantly more efficient computationally, with accuracy that deviates little from the original model. The reduced fast-mode model is particularly suitable for incorporation into neurobiologically realistic spiking neuronal models, and hence for large-scale realistic computational simulations. We also develop user-friendly software based on the reduced models to allow scientists to rapidly test and predict neuropharmacological drug effects at a systems level.

  18. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Affective computing has great significance for fulfilling intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it handles simple emotions efficiently.

  19. Performance Models for Split-execution Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; McCaskey, Alex [ORNL; Schrock, Jonathan [ORNL; Seddiqi, Hadayat [ORNL; Britt, Keith A [ORNL; Imam, Neena [ORNL

    2016-01-01

    Split-execution computing leverages the capabilities of multiple computational models to solve problems, but splitting program execution across different computational models incurs costs associated with the translation between domains. We analyze the performance of a split-execution computing system developed from conventional and quantum processing units (QPUs) by using behavioral models that track resource usage. We focus on asymmetric processing models built using conventional CPUs and a family of special-purpose QPUs that employ quantum computing principles. Our performance models account for the translation of a classical optimization problem into the physical representation required by the quantum processor while also accounting for hardware limitations and conventional processor speed and memory. We conclude that the bottleneck in this split-execution computing system lies at the quantum-classical interface and that the primary time cost is independent of quantum processor behavior.

  1. Dynamical ice sheet model coupling with the GEOS-5 AGCM: A preliminary assessment

    Science.gov (United States)

    Cullather, R. I.; Zhao, B.; Nowicki, S.; Suarez, M. J.

    2013-12-01

    Dynamical ice sheet models (ISMs) have been developed to address well-known limitations in eustatic change prediction capabilities. Coupling ISMs to an atmospheric general circulation model (AGCM) is not straightforward, due in part to the extreme difference in spatial scales between the ISM mesh and the AGCM grid. In given locations, ISM element edge lengths may be a few km or less, while the AGCM typically has grid spacings on the order of 10s to 100s of km. The Goddard Earth Observing System Model, version 5 (GEOS-5) is a finite-volume AGCM and employs a cube-sphere atmospheric grid (nominally 1° resolution) and a catchment-based land surface scheme that operates on sub-grid scale areas, or tiles, that describe surface characteristics. The land surface component communicates with the atmosphere on a semi-implicit time step via the exchange grid. In this study, coupling between the AGCM and the ISM is facilitated through sub-grid scale land surface tiles that are defined for each element of the ISM. The ISM used here is the Ice Sheet System Model (ISSM) from NASA Jet Propulsion Laboratory and Univ. California at Irvine, which has been adapted for the Greenland Ice Sheet using the 72,320-element Schlagel mesh and using a shallow ice approximation. In the AGCM, land surface tiles are uniquely characterized through a simple downscaling parameterization of surface temperature within each AGCM grid box using a defined lapse rate. On each land surface tile, GEOS-5 employs an advanced snow hydrology model for improved representation of the ice sheet surface mass balance. This preliminary assessment focuses on the differences in the AGCM surface mass balance and surface temperature fields resulting from the downscaling measures employed in the context of an exchange grid, semi-implicit coupling with the atmosphere, and the response of the ISM. Differences in AGCM computational performance with the addition of Greenland tiles are examined, and comparative advantages of ...
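
    The tile-level temperature treatment described above is a lapse-rate correction from the grid-box mean elevation to each tile's elevation. The sketch below uses the standard-atmosphere rate of 6.5 K/km as an assumption; the rate actually configured in GEOS-5 may differ.

        # Lapse-rate downscaling of grid-box surface air temperature to tiles.
        def downscale_temperature(t_grid_K, z_grid_m, z_tile_m, lapse_K_per_m=0.0065):
            """Shift temperature from the grid mean elevation to the tile elevation."""
            return t_grid_K + lapse_K_per_m * (z_grid_m - z_tile_m)

        # a tile 800 m below the grid-box mean elevation comes out ~5.2 K warmer
        print(downscale_temperature(260.0, z_grid_m=2000.0, z_tile_m=1200.0))  # 265.2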

  2. In-tank fluid sloshing effects during earthquakes: A preliminary computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    Park, J.E.; Rezvani, M.A.

    1995-04-01

    Hundreds of underground radioactive waste storage tanks are located at Department of Energy (DOE) sites. At present, no technique for evaluating the pressure loads due to the impact of earthquake generated waves on the side walls and dome of the tanks is known if the wave breaks back on itself. This paper presents the results of two-dimensional Computational Fluid Dynamics (CFD) calculations of the motion of waves in a generic rectangular tank as the result of accelerations recorded during an earthquake. The advantages and limitations of this technique and methods for avoiding the limitations will be discussed.

  3. Applications of computer assisted surgery and medical robotics at the ISSSTE, México: preliminary results.

    Science.gov (United States)

    Mosso, José Luis; Pohl, Mauricio; Jimenez, Juan Ramon; Valdes, Raquel; Yañez, Oscar; Medina, Veronica; Arambula, Fernando; Padilla, Miguel Angel; Marquez, Jorge; Gastelum, Alfonso; Mosso, Alejo; Frausto, Juan

    2007-01-01

    We present the first results of four projects in the second phase of the Mexican project Computer Assisted Surgery and Medical Robotics, supported by the Mexican Science and Technology National Council (Consejo Nacional de Ciencia y Tecnología) under grant SALUD-2002-C01-8181. The projects are being developed by three universities (UNAM, UAM, ITESM), and the goal is to integrate a laboratory in a hospital of the ISSSTE to serve endoscopic surgeons, urologists, gastrointestinal endoscopists and neurosurgeons.

  4. IEA-ETSAP TIMES models in Denmark. Preliminary edition

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, P.E.

    2011-03-15

    This report presents the project 'Danish participation in IEA-ETSAP, Annex XI, 2008-2010', which continued the Danish participation in ETSAP under Annex XI 'JOint STudies for New And Mitigated Energy Systems (JOSTNAMES): Climate friendly, Secure and Productive Energy Systems'. The main activity has been semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). Contributions to these workshops have been based on various collaborative projects within the EU research programmes and the Danish Centre for Environment, Energy and Health (CEEH). In addition, the DTU Climate Centre at Risoe, founded in the autumn of 2008, has taken part in the ETSAP workshops and used the ETSAP model tools for projects, papers, and presentations, as well as for a Ph.D. project. (Author)

  5. Preliminary Functional-Structural Modeling on Poplar (Salicaceae)

    CERN Document Server

    Liu, Dongxiang; Letort, Véronique; Xing, Meijun; Gang, Yang; Huang, Xinyuan; Cao, Weiqun

    2010-01-01

    Poplar is one of the best fast-growing trees in the world, widely used for windbreaks and wood products. Although the architecture of poplar has a direct impact on its applications, it has not been described in previous poplar models, probably because of the difficulties raised by measurement, data processing and parameterization. In this paper, the functional-structural model GreenLab is calibrated using data from poplars of 3, 4, 5 and 6 years of age. The data were acquired by simplified measurement. The architecture was also simplified by classifying the branches into several types (physiological ages) using clustering analysis, which decreases the number of parameters. By multi-fitting the sampled data of each tree, the model parameters were identified and the plant architectures at different tree ages were simulated.

  6. Model for personal computer system selection.

    Science.gov (United States)

    Blide, L

    1987-12-01

    Successful computer software and hardware selection is best accomplished by following an organized approach such as the one described in this article. The first step is to decide what you want to be able to do with the computer. Secondly, select software that is user friendly, well documented, bug free, and that does what you want done. Next, select the computer, printer and other needed equipment from the group of machines on which the software will run. Key factors here are reliability and compatibility with other microcomputers in your facility. Lastly, select a reliable vendor who will provide good, dependable service in a reasonable time. The ability to correctly select computer software and hardware is a key skill needed by medical record professionals today and in the future. Professionals can make quality computer decisions by selecting software and systems that are compatible with other computers in their facility and that allow for future networking, ease of use, and adaptability for expansion as new applications are identified. The key to success is to not only provide for your present needs, but also to be prepared for rapid expansion and change in your computer usage as technology and your skills grow.

  7. Some vaccination strategies for the SEIR epidemic model. Preliminary results

    CERN Document Server

    De la Sen, M; Alonso-Quesada, S

    2011-01-01

    This paper presents a vaccination-based control strategy for a SEIR (susceptible plus infected plus infectious plus removed populations) propagation disease model. The model takes into account the total population size as a restraint on illness transmission, since its increase makes contacts among the susceptible and the infected more difficult. The control objective is the asymptotic tracking of the removed-by-immunity population to the total population, while simultaneously driving the remaining populations (i.e., susceptible plus infected plus infectious) asymptotically to zero.
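
    A minimal numerical rendering of the idea is an SEIR system with a constant-rate vaccination term moving susceptibles directly into the removed class; the paper's actual control law is a tracking-based feedback, and all parameter values below are invented.

        # Minimal SEIR model with a constant vaccination rate v (all values invented).
        from scipy.integrate import solve_ivp

        beta, sigma, gamma, v = 0.4, 1.0 / 5.0, 1.0 / 7.0, 0.01  # per-day rates

        def seir(t, y):
            S, E, I, R = y
            N = S + E + I + R
            dS = -beta * S * I / N - v * S     # infection plus vaccination
            dE = beta * S * I / N - sigma * E
            dI = sigma * E - gamma * I
            dR = gamma * I + v * S             # removed by recovery or immunity
            return [dS, dE, dI, dR]

        sol = solve_ivp(seir, (0.0, 300.0), [0.99, 0.0, 0.01, 0.0], rtol=1e-8)
        print(f"removed fraction at day 300: {sol.y[3, -1]:.3f}")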

  8. Computer modeling of a convective steam superheater

    Science.gov (United States)

    Trojan, Marcin

    2015-03-01

    A superheater generates superheated steam from the saturated steam leaving the evaporator. In the case of a pulverized-coal-fired boiler, even a relatively small amount of ash causes problems with ash fouling on the heating surfaces, including the superheaters. In the convection pass of the boiler, the flue gas temperature is lower and ash deposits can be loose or sintered. Ash fouling not only reduces heat transfer from the flue gas to the steam, but is also the cause of a higher pressure drop along the flue gas flow path. When the pressure drop is greater, the power consumed by the fan increases. If the superheater surfaces are covered with ash, then the steam temperature at the outlet of the superheater stages falls and the flow rates of the water injected into the attemperator should be reduced. There is also an increase in flue gas temperature after the different stages of the superheater. Consequently, this leads to a reduction in boiler efficiency. The paper presents the results of computational fluid dynamics simulations of the first-stage superheater of the OP-210M boiler using commercial software. The temperature distributions of the steam and flue gas along their flow paths are determined, together with the temperature of the tube walls and of the ash deposits. The calculated steam temperature is compared with measurement results. Knowledge of these temperatures is of great practical importance because it allows the grade of steel for a given superheater stage to be chosen. Using the developed model of the superheater to determine its degree of ash fouling in on-line mode, one can control the activation frequency of the steam sootblowers.

  9. Ablative Rocket Deflector Testing and Computational Modeling

    Science.gov (United States)

    Allgood, Daniel C.; Lott, Jeffrey W.; Raines, Nickey

    2010-01-01

    A deflector risk mitigation program was recently conducted at the NASA Stennis Space Center. The primary objective was to develop a database that characterizes the behavior of industry-grade refractory materials subjected to rocket plume impingement conditions commonly experienced on static test stands. The program consisted of short and long duration engine tests where the supersonic exhaust flow from the engine impinged on an ablative panel. Quasi time-dependent erosion depths and patterns generated by the plume impingement were recorded for a variety of different ablative materials. The erosion behavior was found to be highly dependent on the material's composition and corresponding thermal properties. For example, in the case of the HP CAST 93Z ablative material, the erosion rate actually decreased under continued thermal heating conditions due to the formation of a low thermal conductivity "crystallization" layer. The "crystallization" layer produced near the surface of the material provided effective insulation from the hot rocket exhaust plume. To gain further insight into the complex interaction of the plume with the ablative deflector, computational fluid dynamic modeling was performed in parallel to the ablative panel testing. The results from the current study demonstrated that locally high heating occurred due to shock reflections. These localized regions of shock-induced heat flux resulted in non-uniform erosion of the ablative panels. In turn, it was observed that the non-uniform erosion exacerbated the localized shock heating, causing eventual plume separation and reversed flow for long duration tests under certain conditions. Overall, the flow simulations compared very well with the available experimental data obtained during this project.

  10. Soft Computing Models in Industrial and Environmental Applications

    CERN Document Server

    Abraham, Ajith; Corchado, Emilio; 7th International Conference, SOCO’12

    2013-01-01

    This volume of Advances in Intelligent and Soft Computing contains accepted papers presented at SOCO 2012, held in the beautiful and historic city of Ostrava (Czech Republic) in September 2012. Soft computing represents a collection or set of computational techniques in machine learning, computer science and some engineering disciplines, which investigate, simulate, and analyze very complex issues and phenomena. After a thorough peer-review process, the SOCO 2012 International Program Committee selected 75 papers, which are published in these conference proceedings, representing an acceptance rate of 38%. In this edition a special emphasis was put on the organization of special sessions. Three special sessions were organized on relevant topics: Soft computing models for Control Theory & Applications in Electrical Engineering, Soft computing models for biomedical signals and data processing, and Advanced Soft Computing Methods in Computer Vision and Data Processing. The selecti...

  11. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets and increasing analysis times are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compare with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
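
    The workload is embarrassingly parallel: independent model builds over data chunks or hyperparameters map one-to-one onto cloud instances. The local stand-in below sketches that pattern with a process pool; the Amazon-side orchestration and the actual logP/Ames data are out of scope, so the dataset is synthetic.

        # Embarrassingly parallel model building, the pattern that maps directly
        # onto cloud instances. Synthetic data; orchestration is out of scope.
        from concurrent.futures import ProcessPoolExecutor

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def train_on_chunk(seed):
            rng = np.random.default_rng(seed)
            X = rng.standard_normal((2000, 20))             # placeholder descriptors
            y = X[:, 0] + rng.standard_normal(2000) > 0.0   # placeholder end point
            model = RandomForestClassifier(n_estimators=50, random_state=seed)
            return model.fit(X, y).score(X, y)

        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:
                print(list(pool.map(train_on_chunk, range(4))))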

  12. Approximating response time distributions in closed queueing network models of computer performance

    Energy Technology Data Exchange (ETDEWEB)

    Salza, S.; Lavenberg, S.S.

    1981-01-01

    Hierarchical decomposition methods for approximating response time distributions in certain closed queueing network models of computer performance are investigated. The methods investigated apply whenever part of a customer's response time consists of a geometrically distributed number of successive cycles within a subnetwork. The key step involves replacing the subnetwork with parallel exponential servers having queue-size dependent service rates. Results on thinning stochastic point processes are used to justify this replacement when the mean number of cycles is large. Preliminary numerical comparisons of the approximations with simulation results indicate that the approximations are quite accurate even when the mean number of cycles is small. 17 references.
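
    The structure being exploited, a response time that is a geometric sum of successive cycle times, is easy to simulate directly, giving a baseline against which such approximations can be checked. Exponential cycle times and the parameter values below are assumptions for illustration.

        # Response time as a geometric number of i.i.d. subnetwork cycle times.
        import numpy as np

        rng = np.random.default_rng(1)
        p_exit, mean_cycle, n = 0.2, 1.0, 100_000

        n_cycles = rng.geometric(p_exit, size=n)  # cycles until leaving the subnetwork
        resp = np.array([rng.exponential(mean_cycle, k).sum() for k in n_cycles])

        print(f"mean response time = {resp.mean():.2f} (exact: {mean_cycle / p_exit:.2f})")
        print(f"P[response > 10] = {(resp > 10.0).mean():.3f}")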

  13. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  14. The emerging role of cloud computing in molecular modelling.

    Science.gov (United States)

    Ebejer, Jean-Paul; Fulle, Simone; Morris, Garrett M; Finn, Paul W

    2013-07-01

    There is a growing recognition of the importance of cloud computing for large-scale and data-intensive applications. The distinguishing features of cloud computing and their relationship to other distributed computing paradigms are described, as are the strengths and weaknesses of the approach. We review the use made to date of cloud computing for molecular modelling projects and the availability of front ends for molecular modelling applications. Although the use of cloud computing technologies for molecular modelling is still in its infancy, we demonstrate its potential by presenting several case studies. Rapid growth can be expected as more applications become available and costs continue to fall; cloud computing can make a major contribution not just in terms of the availability of on-demand computing power, but could also spur innovation in the development of novel approaches that utilize that capacity in more effective ways. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Sticker DNA computer model--Part Ⅰ:Theory

    Institute of Scientific and Technical Information of China (English)

    XU Jin; DONG Yafei; WEI Xiaopeng

    2004-01-01

    The sticker model is one of the basic models of DNA computing. It is coded with single- and double-stranded DNA molecules, and it has the advantages that its operations require no strand extension and no enzymes; moreover, its materials are reusable. It has therefore attracted the attention and interest of scientists in many fields. In this paper, we systematically analyze the theory and applications of the model, summarize other scientists' contributions in this field, and propose our research results. This paper is the theoretical portion of work on the sticker model of DNA computing, and it introduces the basic model of sticker computing. Firstly, we systematically introduce the basic theories of classic models of sticker computing; secondly, we discuss the sticker system, an abstract computing model based on the sticker model and formal languages; finally, we extend and refine the model, presenting two types of models that are more extensive in application and more complete in theory than previous ones: the so-called k-bit sticker model and the full-message sticker DNA computing model.

  17. Behavior computing modeling, analysis, mining and decision

    CERN Document Server

    2012-01-01

    Includes six case studies on behavior applications. Presents new techniques for capturing behavior characteristics in social media. First dedicated source of references for the theory and applications of behavior informatics and behavior computing.

  18. Los Alamos CCS (Center for Computer Security) formal computer security model

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, J.S.; Hunteman, W.J. (Los Alamos National Lab., NM (USA))

    1989-01-01

    This paper provides a brief presentation of the formal computer security model currently being developed at the Los Alamos Department of Energy (DOE) Center for Computer Security (CCS). The initial motivation for this effort was the need to provide a method by which DOE computer security policy implementation could be tested and verified. The actual analytical model was a result of the integration of current research in computer security with previous modeling and research experience. The model is being developed to define a generic view of the computer and network security domains, to provide a theoretical basis for the design of a security model, and to address the limitations of present models. Formal mathematical models for computer security have been designed and developed in conjunction with attempts to build secure computer systems since the early 1970s. The foundation of the Los Alamos DOE CCS model is a series of functionally dependent probability equations, relations, and expressions. The mathematical basis appears to be justified and is undergoing continued discrimination and evolution. We expect to apply the model to the discipline of the Bell-LaPadula abstract sets of objects and subjects. 5 refs.

  19. Computing ordinary least-squares parameter estimates for the National Descriptive Model of Mercury in Fish

    Science.gov (United States)

    Donato, David I.

    2013-01-01

    A specialized technique is used to compute weighted ordinary least-squares (OLS) estimates of the parameters of the National Descriptive Model of Mercury in Fish (NDMMF) in less time using less computer memory than general methods. The characteristics of the NDMMF allow the two products X'X and X'y in the normal equations to be filled out in a second or two of computer time during a single pass through the N data observations. As a result, the matrix X does not have to be stored in computer memory and the computationally expensive matrix multiplications generally required to produce X'X and X'y do not have to be carried out. The normal equations may then be solved to determine the best-fit parameters in the OLS sense. The computational solution based on this specialized technique requires O(8p2+16p) bytes of computer memory for p parameters on a machine with 8-byte double-precision numbers. This publication includes a reference implementation of this technique and a Gaussian-elimination solver in preliminary custom software.
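
    To make the single-pass technique concrete, here is a minimal Python/NumPy sketch (an illustration, not the publication's reference implementation): the products X'X and X'y are accumulated observation by observation, so the matrix X is never held in memory, and the normal equations are then solved directly. Storage is dominated by the p-by-p product, consistent with the memory bound quoted above; the NDMMF's observation weights could be folded in by scaling each row and response.

    ```python
    import numpy as np

    def ols_single_pass(rows, p):
        """Accumulate the normal-equation products X'X and X'y in one pass
        over the observations, so X itself is never held in memory."""
        xtx = np.zeros((p, p))
        xty = np.zeros(p)
        for x_i, y_i in rows:                # x_i: length-p row of X, y_i: response
            xtx += np.outer(x_i, x_i)
            xty += x_i * y_i
        return np.linalg.solve(xtx, xty)     # solve (X'X) b = X'y

    # usage with synthetic data streamed row by row
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 3))
    y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=1000)
    print(ols_single_pass(zip(X, y), p=3))   # close to [1.5, -2.0, 0.5]
    ```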

  20. Use of Computational Modeling to Evaluate Hypotheses About the Molecular and Cellular Mechanisms of Bystander Effects

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Yuchao; Conolly, Rory B; Andersen, Melvin E.

    2006-11-21

    This report describes the development of a computational systems biology approach to evaluate the hypotheses of molecular and cellular mechanisms of adaptive response to low dose ionizing radiation. Our concept is that computational models of signaling pathways can be developed and linked to biologically based dose response models to evaluate the underlying molecular mechanisms which lead to adaptive response. For development of quantitatively accurate, predictive models, it will be necessary to describe tissues consisting of multiple cell types where the different types each contribute in their own way to the overall function of the tissue. Such a model will probably need to incorporate not only cell type-specific data but also spatial information on the architecture of the tissue and on intercellular signaling. The scope of the current model was more limited. Data obtained in a number of different biological systems were synthesized to describe a chimeric, “average” population cell. Biochemical signaling pathways involved in sensing of DNA damage and in the activation of cell cycle checkpoint controls and the apoptotic path were also included. As with any computational modeling effort, it was necessary to develop these simplified initial descriptions (models) that can be iteratively refined. This preliminary model is a starting point which, with time, can evolve to a level of refinement where large amounts of detailed biological information are synthesized and a capability for robust predictions of dose- and time-response behaviors is obtained.

  1. Analog models of computations \\& Effective Church Turing Thesis: Efficient simulation of Turing machines by the General Purpose Analog Computer

    CERN Document Server

    Pouly, Amaury; Graça, Daniel S

    2012-01-01

    Are analog models of computation more powerful than classical models of computation? From a series of recent papers, it is now clear that many realistic analog models of computation are provably equivalent to classical digital models of computation from a computability point of view. Take, for example, probably the most realistic model of analog computation, the General Purpose Analog Computer (GPAC) model from Claude Shannon, a model for differential analyzers, which are analog machines used from the 1930s to the early 1960s to solve various problems. It is now known that the functions computable by Turing machines are provably exactly those that are computable by the GPAC. This paper is about the next step: understanding whether this equivalence also holds at the complexity level. In this paper we show that realistic models of analog computation -- namely the General Purpose Analog Computer (GPAC) -- can simulate Turing machines in a computationally efficient manner. More concretely we show that, modulo...

  2. Vaccination strategies for SEIR models using feedback linearization. Preliminary results

    CERN Document Server

    De la Sen, M; Alonso-Quesada, S

    2011-01-01

    A linearization-based feedback-control strategy for an SEIR epidemic model is discussed. The vaccination objective is the asymptotic tracking of the removed-by-immunity population to the total population, while the remaining populations (i.e. susceptible plus infected plus infectious) simultaneously tend asymptotically to zero. The disease control policy is designed with a feedback linearization technique, which provides a general method for generating families of vaccination policies with a sound technical background.
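
    For readers unfamiliar with the underlying model, the sketch below integrates a standard SEIR system with a simple vaccination input that moves susceptibles into the removed-by-immunity class. It is a minimal illustration with hypothetical rates, not the paper's feedback-linearizing control law (which would compute v(t) from the system state rather than hold it constant).

    ```python
    import numpy as np
    from scipy.integrate import odeint

    # Hypothetical parameter values; the paper's actual model and gains differ.
    beta, sigma, gamma = 0.5, 0.2, 0.1   # transmission, incubation, recovery rates

    def seir(x, t, v):
        """Standard SEIR dynamics with a vaccination input v(t) moving
        susceptibles directly into the removed-by-immunity class R."""
        S, E, I, R = x
        N = S + E + I + R
        dS = -beta * S * I / N - v(t) * S
        dE = beta * S * I / N - sigma * E
        dI = sigma * E - gamma * I
        dR = gamma * I + v(t) * S
        return [dS, dE, dI, dR]

    t = np.linspace(0, 200, 1000)
    sol = odeint(seir, [0.9, 0.05, 0.05, 0.0], t, args=(lambda t: 0.05,))
    print(sol[-1])   # R approaches the total population as S, E, I decay
    ```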

  3. A Preliminary Evaluation of Bandwidth Allocation Model Dynamic Switching

    Directory of Open Access Journals (Sweden)

    Rafael F. Reale

    2014-06-01

    Full Text Available Bandwidth Allocation Models (BAMs) are used to define Bandwidth Constraints (BCs) on a per-class basis for MPLS/DS-TE networks, and they effectively define how network resources such as bandwidth are obtained and shared by applications. The proposed BAMs (MAM – Maximum Allocation Model, RDM – Russian Dolls Model, G-RDM – Generic RDM, and AllocTC-Sharing) attempt to optimize the use of bandwidth resources on a per-link basis, with different allocation and resource-sharing characteristics. As such, the adoption of distinct BAMs and/or changes in network resource demands (the network traffic profile) may result in different traffic allocation and operational behavior for distinct BAMs. This paper evaluates the resulting network characteristics (link utilization, preemption, and flow blocking) of using BAMs dynamically with different traffic scenarios; in brief, it investigates the dynamics of BAM switching under distinct traffic scenarios. The paper first presents the investigated BAMs in terms of their behavior and resource-allocation characteristics. Then, distinct BAMs are compared using different traffic scenarios in order to investigate the impact of dynamically changing the BAM configured in the network. Finally, the paper shows that the adoption of a dynamic BAM allocation strategy may benefit network operation in terms of link utilization, preemption, and flow blocking.

  4. A preliminary study of periodontitis and vascular calcification compound model

    Institute of Scientific and Technical Information of China (English)

    MENG Yun; DENG Jing; PAN Ke-qing

    2015-01-01

    Objective This experiment was designed to establish a compound model of chronic periodontitis and vascular calcification, so as to study the relation between periodontitis and vascular calcification. Methods Forty male Wistar rats were randomly divided into a control group (group C), a periodontitis group (group CP), a vascular calcification group (group VDN), and a compound group (group CP+VDN). Each group received the corresponding treatment to establish the animal model. Eight weeks later, all the rats were sacrificed and the following items were examined: inflammatory factors in serum were tested, and hematoxylin-eosin (HE) staining of vascular tissue was performed. Results Examination of periodontal tissue, serum, and vascular tissue showed that the animal models were successful. Histopathologic observation revealed obvious inflammation of periodontal tissue in groups CP and CP+VDN. By HE staining, the deposition of red mineralized nodules in groups VDN and CP+VDN was higher than in groups C and CP (P<0.05), and that in group CP+VDN was significantly higher than in group VDN (P<0.05). Animals in group CP+VDN showed a higher level of IL-1 in serum than those in groups CP, VDN, and C. Conclusion This study demonstrates that periodontitis has a promoting effect on vascular calcification.

  5. Preliminary study on enhancing waste management best practice model in Malaysia construction industry

    Science.gov (United States)

    Jamaludin, Amril Hadri; Karim, Nurulzatushima Abdul; Noor, Raja Nor Husna Raja Mohd; Othman, Nurulhidayah; Malik, Sulaiman Abdul

    2017-08-01

    Construction waste management (CWM) is the practice of minimizing and diverting construction waste, demolition debris, and land-clearing debris from disposal and redirecting recyclable resources back into the construction process. A best practice model means the best choice from a collection of practices, built for the purpose of construction waste management. The practice model can help contractors minimize waste before construction activities start. Minimizing wastage has a direct impact on the time, cost, and quality of a construction project. This paper focuses on a preliminary study to determine the factors of waste generation on construction sites and to identify the effectiveness of existing construction waste management practice in Malaysia. The paper also includes preliminary work on the planned research location, the data collection method, and the analysis to be done using the Analytical Hierarchy Process (AHP) to help develop a suitable waste management best practice model that can be used in the country.
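
    Since the record names the Analytical Hierarchy Process as the planned analysis tool, the following minimal Python sketch shows the core AHP computation: priority weights are taken as the normalized principal eigenvector of a pairwise-comparison matrix, and the consistency index CI = (lambda_max - n)/(n - 1) flags inconsistent judgments. The three-factor comparison matrix is a hypothetical stand-in, not data from the study.

    ```python
    import numpy as np

    def ahp_priorities(pairwise):
        """Derive AHP priority weights as the principal eigenvector of a
        pairwise-comparison matrix, plus the consistency index CI."""
        A = np.asarray(pairwise, dtype=float)
        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)                 # principal eigenvalue
        w = np.abs(vecs[:, k].real)
        w /= w.sum()                             # normalized priority vector
        ci = (vals[k].real - len(A)) / (len(A) - 1)
        return w, ci

    # hypothetical comparisons of three waste-generation factors
    A = [[1,   3,   5],
         [1/3, 1,   2],
         [1/5, 1/2, 1]]
    print(ahp_priorities(A))
    ```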

  6. Modelled and Observed Diurnal SST Signals: "SSTDV:R.EX.-IM.A.M." Project Preliminary Results

    DEFF Research Database (Denmark)

    Karagali, Ioanna; Høyer, Jacob; LeBorgne, Pierre

    2013-01-01

    This study presents some of the preliminary results from the ESA Support To Science Element (STSE) funded project on the Diurnal Variability of the Sea Surface Temperature, regarding its Regional Extent and Implications in Atmospheric Modelling (SSTDV:R.EX.–IM.A.M.). During this phase...... of the project, the focus is on the regional extent of diurnal variability. In particular, extensive sensitivity tests regarding the definition of SSTfound fields show that using only quality 5 SEVIRI data results in warmer foundation fields SSTfound, while there is an added ∼0.2 K variability when using multi...... Ocean Turbulence Model (GOTM) is applied. Preliminary results show that the initial temperature profiles may give a warmer start-up in the model, while the light extinction scheme is a controlling factor for the amplitude and vertical extent of the daily signal....

  7. Dissemination of computer skills among physicians: the infectious process model.

    Science.gov (United States)

    Quinn, F B; Hokanson, J A; McCracken, M M; Stiernberg, C M

    1984-08-01

    While the potential utility of computer technology in medicine is often acknowledged, little is known about the best methods for actually teaching physicians about computers. The current variability in physicians' computer fluency implies that there is no accepted minimum required level of computer skills for physicians. Special techniques are needed to instill these skills in physicians and to measure their effects within the medical profession. This hypothesis is suggested following the development of a specialized course for the new physician. In a population of physicians where medical computing usage was considered nonexistent, intense interest developed following exposure to a role model having strong credentials in both medicine and computer science. This produced an atmosphere in which there was a perceived benefit in being knowledgeable about medical computer usage. The subsequent increase in computer systems use was the result of the availability of resources and the development of computer skills that could be exchanged among the students and faculty. This growth in computer use is described using the parameters of an infectious process model. While other approaches may also be useful, the infectious process model permits the growth of medical computer usage to be quantitatively described, evaluates specific determinants of use patterns, and allows the future growth of computer utilization in medicine to be predicted.
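
    In its simplest form, the "infectious process model" the record invokes is logistic growth: new adopters arise in proportion to contacts between current users and not-yet-users. The following Python sketch (with a hypothetical cohort size and contact rate, not the paper's fitted parameters) shows the resulting S-shaped adoption curve.

    ```python
    import numpy as np

    def logistic_adoption(n0, total, rate, steps):
        """Logistic ('infectious') growth: new adopters arise from contact
        between current users and the remaining susceptible population."""
        n = np.empty(steps)
        n[0] = n0
        for t in range(1, steps):
            n[t] = n[t-1] + rate * n[t-1] * (total - n[t-1]) / total
        return n

    # hypothetical cohort: 2 initial users among 100 physicians
    print(logistic_adoption(n0=2, total=100, rate=0.4, steps=25).round(1))
    ```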

  8. Integrated Multiscale Modeling of Molecular Computing Devices

    Energy Technology Data Exchange (ETDEWEB)

    Jerzy Bernholc

    2011-02-03

    photolithography will some day reach a miniaturization limit, forcing designers of Si-based electronics to pursue increased performance by other means. Any other alternative approach would have the unenviable task of matching the ability of Si technology to pack more than a billion interconnected and addressable devices on a chip the size of a thumbnail. Nevertheless, the prospects of developing alternative approaches to fabricate electronic devices have spurred an ever-increasing pace of fundamental research. One of the promising possibilities is molecular electronics (ME), self-assembled molecular-based electronic systems composed of single-molecule devices in ultra dense, ultra fast molecular-sized components. This project focused on developing accurate, reliable theoretical modeling capabilities for describing molecular electronics devices. The participants in the project are given in Table 1. The primary outcomes of this fundamental computational science grant are publications in the open scientific literature. As listed below, 62 papers have been published from this project. In addition, the research has also been the subject of more than 100 invited talks at conferences, including several plenary or keynote lectures. Many of the goals of the original proposal were completed. Specifically, the multi-disciplinary group developed a unique set of capabilities and tools for investigating electron transport in fabricated and self-assembled nanostructures at multiple length and time scales.

  9. Studying an Eulerian Computer Model on Different High-performance Computer Platforms and Some Applications

    Science.gov (United States)

    Georgiev, K.; Zlatev, Z.

    2010-11-01

    The Danish Eulerian Model (DEM) is an Eulerian model for studying the transport of air pollutants on a large scale. Originally, the model was developed at the National Environmental Research Institute of Denmark. The model's computational domain covers Europe and neighbouring parts of the Atlantic Ocean, Asia, and Africa. If the DEM is to be applied using fine grids, its discretization leads to a huge computational problem, which implies that a model such as DEM must be run only on high-performance computer architectures. The implementation and tuning of such a complex large-scale model on each different computer is a non-trivial task. Here, comparison results are presented from running this model on different kinds of vector computers (CRAY C92A, Fujitsu, etc.), parallel computers with distributed memory (IBM SP, CRAY T3E, Beowulf clusters, Macintosh G4 clusters, etc.), parallel computers with shared memory (SGI Origin, SUN, etc.), and parallel computers with two levels of parallelism (IBM SMP, IBM BlueGene/P, clusters of multiprocessor nodes, etc.). The main idea in the parallel version of DEM is a domain partitioning approach. The effective use of the cache and hierarchical memories of modern computers is discussed, as well as the performance, speed-ups, and efficiency achieved. The parallel code of DEM, created using the MPI standard library, appears to be highly portable and shows good efficiency and scalability on different kinds of vector and parallel computers. Some important applications of the computer model output are briefly presented.

  10. Preliminary Modeling of Accident Tolerant Fuel Concepts under Accident Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, Kyle A.; Hales, Jason D.

    2016-12-01

    The catastrophic events that occurred at the Fukushima-Daiichi nuclear power plant in 2011 have led to widespread interest in research on alternative fuels and claddings that are proposed to be accident tolerant. Thus, the United States Department of Energy, through its NEAMS (Nuclear Energy Advanced Modeling and Simulation) program, has funded an Accident Tolerant Fuel (ATF) High Impact Problem (HIP). The ATF HIP is funded for a three-year period. The purpose of the HIP is to perform research into two potential accident tolerant concepts and provide an in-depth report to the Advanced Fuels Campaign (AFC) describing the behavior of the concepts, both of which are being considered for inclusion in a lead test assembly scheduled for placement into a commercial reactor in 2022. The initial focus of the HIP is on uranium silicide fuel and iron-chromium-aluminum (FeCrAl) alloy cladding. Utilizing the expertise of three national laboratory participants (INL, LANL, and ANL), a comprehensive multiscale approach to modeling is being used, including atomistic modeling, molecular dynamics, rate theory, phase-field, and fuel performance simulations. In this paper, we present simulations of two proposed accident tolerant fuel systems: U3Si2 fuel with Zircaloy-4 cladding, and UO2 fuel with FeCrAl cladding. The simulations investigate the fuel performance response of the proposed ATF systems under Loss of Coolant and Station Blackout conditions using the BISON code. Sensitivity analyses are completed using Sandia National Laboratories' DAKOTA software to determine which input parameters (e.g., fuel specific heat) have the greatest influence on the output metrics of interest (e.g., fuel centerline temperature). Early results indicate that each concept has significant advantages as well as areas of concern. Further work is required prior to formulating the proposition report for the Advanced Fuels Campaign.

  11. Preliminary modeling of BNCT beam tube on IRT in Sofia.

    Science.gov (United States)

    Belousov, S; Ilieva, K

    2009-07-01

    The technical design of the research reactor IRT in Sofia is in progress. It includes an arrangement for a BNCT facility for tumor treatment. Modeling of the geometry and material composition of the filter/collimator for the BNCT beam tube on the IRT has been carried out, following the beam tube configuration of the Massachusetts Institute of Technology Reactor [Harling et al., 2002. The fission converter-based epithermal neutron irradiation facility at the Massachusetts Institute of Technology Reactor. Nucl. Sci. Eng. 140, 223-240.] and taking into account the ability to include the tube in the IRT reactor geometry. The results of neutron and gamma transport calculations performed for the model show that the facility will be able to supply an epithermal neutron flux of about 5 x 10(9) n cm(-2)s(-1), with contamination from fast neutrons and gamma rays low enough to place it among the best facilities currently available. An optimization study has been performed for the beam collimator, following similar studies for the TAPIRO research reactor in Italy [Nava et al., 2005. Monte Carlo optimization of a BNCT facility for treating brain gliomas at the TAPIRO reactor. Radiat. Prot. Dosim. 116 (1-4), 475-481.].

  12. Modeling Workflow Management in a Distributed Computing System ...

    African Journals Online (AJOL)

    Modeling Workflow Management in a Distributed Computing System Using Petri Nets. ... who use it to share information more rapidly and to increase their productivity. ... Petri nets are an established tool for modelling and analyzing processes.

  13. Fugacity based modeling for pollutant fate and transport during floods. Preliminary results

    Science.gov (United States)

    Deda, M.; Fiorini, M.; Massabo, M.; Rudari, R.

    2010-09-01

    One of the concerns that arises during floods is whether wide-spreading chemical contamination is associated with the flooding. Many potential sources of toxic releases during floods exist in cities and rural areas; hydrocarbon fuel storage systems, distribution facilities, commercial chemical storage, and sewerage systems are only a few examples. When inundated, homes and vehicles can also be sources of toxic contaminants such as gasoline/diesel, detergents, and sewage. Hazardous substances released into the environment are transported and dispersed in complex environmental systems that include air, plants, soil, water, and sediment. Effective environmental models demand holistic modelling of the transport and transformation of the materials in the multimedia arena. Among these models, fugacity-based models are distribution-based models incorporating all environmental compartments and are based on steady-state fluxes of pollutants across compartment interfaces (Mackay, "Multimedia Environmental Models", 2001). They satisfy the primary objective of environmental chemistry, which is to forecast the concentrations of pollutants in the environment with respect to space and time. Multimedia fugacity-based models have been used to assess contaminant distribution at very different spatial and temporal scales; applications range from contaminant leaching to groundwater, runoff to surface water, partitioning in lakes and streams, and distribution at regional and even global scales. We developed a two-dimensional fugacity-based model for the fate and transport of chemicals during floods. The model has three modules: the first module estimates toxin emission rates during floods; the second module is the hydrodynamic model that simulates the flood water; and the third module simulates the dynamic distribution of chemicals in

  14. Overview of ASC Capability Computing System Governance Model

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott W. [Los Alamos National Laboratory

    2012-07-11

    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  15. Python for Scientific Computing Education: Modeling of Queueing Systems

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2014-01-01

    Full Text Available In this paper, we present a methodology for introducing scientific computing based on model-centered learning. We propose multiphase queueing systems as the basis for learning objects. We use Python and parallel programming to implement the models, and we present the computer code and the results of stochastic simulations.
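
    In the spirit of the record's model-centered approach, here is a minimal Python learning object of the kind described: a stochastic simulation of a single-server M/M/1 queue via Lindley's recurrence, with the simulated mean wait checked against the analytic value rho/(mu - lam). The multiphase systems in the paper generalize this single-phase sketch; parameters are illustrative.

    ```python
    import random

    def mm1_waits(lam, mu, n_customers, seed=1):
        """Event-driven M/M/1 simulation using Lindley's recurrence:
        W_{k+1} = max(0, W_k + S_k - A_{k+1})."""
        rng = random.Random(seed)
        waits, w = [], 0.0
        for _ in range(n_customers):
            waits.append(w)
            service = rng.expovariate(mu)        # service time S_k
            interarrival = rng.expovariate(lam)  # next interarrival A_{k+1}
            w = max(0.0, w + service - interarrival)
        return waits

    waits = mm1_waits(lam=0.8, mu=1.0, n_customers=100_000)
    # theory: mean wait in queue Wq = rho / (mu - lam), with rho = lam / mu
    print(sum(waits) / len(waits), 0.8 / (1.0 - 0.8))
    ```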

  16. World Knowledge in Computational Models of Discourse Comprehension

    Science.gov (United States)

    Frank, Stefan L.; Koppen, Mathieu; Noordman, Leo G. M.; Vonk, Wietske

    2008-01-01

    Because higher level cognitive processes generally involve the use of world knowledge, computational models of these processes require the implementation of a knowledge base. This article identifies and discusses 4 strategies for dealing with world knowledge in computational models: disregarding world knowledge, "ad hoc" selection, extraction from…

  17. Flow Through a Laboratory Sediment Sample by Computer Simulation Modeling

    Science.gov (United States)

    2006-09-07

    Flow through a laboratory sediment sample by computer simulation modeling. R.B. Pandey, Allen H. Reed, Edward Braithwaite, Ray Seyfarth, J.F. ...

  18. Preliminary insights into a model for mafic magma fragmentation

    Science.gov (United States)

    Edwards, Matt; Pioli, Laura; Andronico, Daniele; Cristaldi, Antonio; Scollo, Simona

    2017-04-01

    Fragmentation of mafic magmas remains a poorly understood process despite the common occurrence of low-viscosity explosive eruptions. In fact, it has commonly been overlooked based on the assumption that low-viscosity magmas have very limited explosivity and low potential to undergo brittle fragmentation. However, it is now known that highly explosive, ash-forming eruptions can be relatively frequent at several mafic volcanoes. Three questions arise from this: What is the specific fragmentation mechanism occurring in these eruptions? What are the primary factors controlling fragmentation efficiency? Can a link between eruption style and fragmentation efficiency be quantified? We addressed these questions by coupling theoretical observations and field analysis of the recent May 2016 eruption at Mount Etna volcano. Within this complex 10-day event, three paroxysmal episodes of pulsating basaltic lava jets alternating with small lava flows were recorded from a vent within the Voragine crater. The associated plumes deposited tephra along narrow axes to the east and south-east. Sampling was done on the deposits associated with the first two plumes and the third one. We briefly characterise the May 2016 eruption by assessing plume height, eruption phases, total erupted masses, and fallout boundaries, and by comparing them to previous eruptions. We also analyse the total grain-size distribution (TGSD) of the scoria particles formed in the jets. Conventional methods for obtaining grain-size and total distributions of an eruption are based on mass, though, and provide limited information on fragmentation. For this reason, the TGSD was assessed by coupling particle-analyser data and conventional sieving data, so as to assess both particle-size and number-of-particle distributions with better precision. This allowed for more accurate testing of several existing models describing the shape of the TGSD. Coupled further with observations on eruption dynamics and eruption

  19. Many-Task Computing Tools for Multiscale Modeling

    OpenAIRE

    Katz, Daniel S.; Ripeanu, Matei; Wilde, Michael

    2011-01-01

    This paper discusses the use of many-task computing tools for multiscale modeling. It defines multiscale modeling and places different examples of it on a coupling spectrum, discusses the Swift parallel scripting language, describes three multiscale modeling applications that could use Swift, and then talks about how the Swift model is being extended to cover more of the multiscale modeling coupling spectrum.

  20. Modeling of enterprise information systems implementation: a preliminary investigation

    Science.gov (United States)

    Yusuf, Yahaya Y.; Abthorpe, M. S.; Gunasekaran, Angappa; Al-Dabass, D.; Onuh, Spencer

    2001-10-01

    The business enterprise has never been in greater need of agility, and the current trend will continue unabated well into the future. It is now recognized that an information system is both the foundation of and a necessary condition for increased responsiveness. A successful implementation of Enterprise Resource Planning (ERP) can help a company move towards delivering on its competitive objectives, as it enables suppliers to reach out to customers beyond the borders of the traditional market defined by geography. The cost of implementation, even when it is successful, can be significant. Bearing in mind the potential strategic benefits, it is important that the implementation project be managed effectively. To this end, a project cost model against which to benchmark ongoing project expenditure versus activities completed is proposed in this paper.

  1. Dynamic density functional theory of solid tumor growth: Preliminary models

    Directory of Open Access Journals (Sweden)

    Arnaud Chauviere

    2012-03-01

    Full Text Available Cancer is a disease that can be seen as a complex system whose dynamics and growth result from nonlinear processes coupled across wide ranges of spatio-temporal scales. The current mathematical modeling literature addresses issues at various scales, but the development of theoretical methodologies capable of bridging gaps across scales needs further study. We present a new theoretical framework based on Dynamic Density Functional Theory (DDFT), extended for the first time to the dynamics of living tissues by accounting for cell density correlations, different cell types, phenotypes, and cell birth/death processes, in order to provide a biophysically consistent description of processes across the scales. We present an application of this approach to tumor growth.

  2. Preliminary characterization and modeling of SMA-based textile composites

    Science.gov (United States)

    Masuda, Arata; Ni, Qing-Qing; Sone, Akira; Zhang, Run-Xin; Yamamura, Takahiko

    2004-07-01

    In this paper, we conduct a feasibility study to investigate the future potential of textile composites with shape memory alloys (SMAs). Two different types of SMA-based textile composites are presented. First, a composite plate with an embedded woven SMA layer is fabricated, and its stiffness-tuning capability is evaluated by impact vibration tests. The results are not favorable, but they may be improved by increasing the volume fraction of SMA and by controlling the prestrain more accurately during the lamination process. The modeling and analysis methodology for woven SMA-based composites is briefly discussed. Then, the possibility of textile composites with SMA stitching is discussed, which is expected to give the composites multiple functions, such as tunable stiffness, shape control, and sensing capability, selectively distributed on demand.

  3. Pervasive Computing Location-aware Model Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    PU Fang; CAI Hai-bin; CAO Qi-ying; SUN Dao-qing; LI Tong

    2008-01-01

    In order to integrate heterogeneous location-aware systems into a pervasive computing environment, a novel ontology-based pervasive computing location-aware model is presented. A location-aware model ontology (LMO) is constructed. The location-aware model has the capabilities of sharing knowledge, reasoning, and dynamically adjusting the usage policies of services through a unified semantic location manner. Finally, the working process of the proposed location-aware model is explained by an application scenario.

  4. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    Science.gov (United States)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors, and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data is always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
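
    To illustrate the distributional analysis described above, the sketch below fits a log-normal model to an arrival-like sample and an exponential model to a deletion-like sample using scipy.stats, then checks each fit with a Kolmogorov-Smirnov test. The data here are synthetic stand-ins; the study itself fits counts parsed from operational vulnerability feeds.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # synthetic stand-in data with the shapes the paper reports
    arrivals = rng.lognormal(mean=2.0, sigma=0.6, size=500)
    deletions = rng.exponential(scale=5.0, size=500)

    # fit the distributions identified as best-characterizing each process
    shape, loc, scale = stats.lognorm.fit(arrivals, floc=0)
    loc_e, scale_e = stats.expon.fit(deletions, floc=0)

    # goodness of fit via Kolmogorov-Smirnov
    print(stats.kstest(arrivals, 'lognorm', args=(shape, loc, scale)))
    print(stats.kstest(deletions, 'expon', args=(loc_e, scale_e)))
    ```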

  5. Computational intelligence applications in modeling and control

    CERN Document Server

    Vaidyanathan, Sundarapandian

    2015-01-01

    The development of computational intelligence (CI) systems was inspired by observable and imitable aspects of the intelligent activity of human beings and nature. The essence of systems based on computational intelligence is to process and interpret data of various natures, so CI is strictly connected with the growth of available data as well as with the capabilities for processing them, mutually supportive factors. The developed theories of computational intelligence were quickly applied in many fields of engineering, data analysis, forecasting, biomedicine, and others. They are used in image and sound processing and identification, signal processing, multidimensional data visualization, steering of objects, analysis of lexicographic data, requesting systems in banking, diagnostic systems, expert systems, and many other practical implementations. This book consists of 16 contributed chapters by subject experts who are specialized in the various topics addressed in this book. The special chapters have been brought ...

  6. Attacker Modelling in Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Papini, Davide

    Within the last five to ten years we have experienced an incredible growth of ubiquitous technologies, which has allowed for improvements in several areas, including energy distribution and management, health care services, border surveillance, secure monitoring and management of buildings......, localisation services and many others. These technologies can be classified under the name of ubiquitous systems. The term Ubiquitous System dates back to 1991, when Mark Weiser at Xerox PARC Lab first referred to it in writing. He envisioned a future where computing technologies would have been melted...... in with our everyday life. This future is visible to everyone nowadays: terms like smartphone, cloud, sensor, network, etc. are widely known and used in our everyday life. But what about the security of such systems? Ubiquitous computing devices can be limited in terms of energy, computing power and memory...

  7. A DNA based model for addition computation

    Institute of Scientific and Technical Information of China (English)

    GAO Lin; YANG Xiao; LIU Wenbin; XU Jin

    2004-01-01

    Much effort has been made to solve computing problems by using DNA, an organic simulating method, which in some cases is preferable to the current electronic computer. However, no one at present has proposed an effective and applicable method to solve the addition problem with a molecular algorithm, due to the difficulty in solving the carry problem, which can be easily solved by the hardware of an electronic computer. In this article, we solve this problem by employing two kinds of DNA strings: one is called the result-and-operation string, while the other is named the carrier. The result-and-operation string contains some carry information of its own and denotes the ultimate result, while the carrier is just for carrying use. The significance of this algorithm lies in its original coding, its fairly easy steps to follow, and its feasibility under current molecular biological technology.

  8. A computational model of the human hand 93-ERI-053

    Energy Technology Data Exchange (ETDEWEB)

    Hollerbach, K.; Axelrod, T.

    1996-03-01

    The objectives of the Computational Hand Modeling project were to prove the feasibility of applying the Laboratory's NIKE3D finite element code to orthopaedic problems. Because of the great complexity of anatomical structures and the nonlinearity of their behavior, we focused on a subset of joints of the hand and lower extremity and developed algorithms to model their behavior. The algorithms developed here solve fundamental problems in computational biomechanics and can be expanded to describe any other joints of the human body. This kind of computational modeling had never successfully been attempted before, due in part to a lack of biomaterials data and a lack of computational resources. With the computational resources available at the National Laboratories and the collaborative relationships we have established with experimental and other modeling laboratories, we have been in a position to pursue our innovative approach to biomechanical and orthopaedic modeling.

  9. The one-way quantum computer - a non-network model of quantum computation

    CERN Document Server

    Raussendorf, R; Briegel, H J; Raussendorf, Robert; Browne, Daniel E.; Briegel, Hans J.

    2001-01-01

    A one-way quantum computer works by only performing a sequence of one-qubit measurements on a particular entangled multi-qubit state, the cluster state. No non-local operations are required in the process of computation. Any quantum logic network can be simulated on the one-way quantum computer. On the other hand, the network model of quantum computation cannot explain all ways of processing quantum information possible with the one-way quantum computer. In this paper, two examples of the non-network character of the one-way quantum computer are given. First, circuits in the Clifford group can be performed in a single time step. Second, the realisation of a particular circuit --the bit-reversal gate-- on the one-way quantum computer has no network interpretation. (Submitted to J. Mod. Opt, Gdansk ESF QIT conference issue.)

  10. A Swarm Intelligence Based Model for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ahmed S. Salama

    2015-01-01

    Full Text Available Mobile Computing (MC) provides multiple services and many advantages for millions of users across the world over the internet. Millions of business customers have leveraged cloud computing services through mobile devices to get what is called Mobile Cloud Computing (MCC). MCC aims at using cloud computing techniques for the storage and processing of data on mobile devices, thereby reducing their limitations. This paper proposes an architecture for a Swarm Intelligence Based Mobile Cloud Computing Model (SIBMCCM), a model that uses a proposed Parallel Particle Swarm Optimization (PPSO) algorithm to reduce the access time for mobile cloud computing services, which support different e-commerce models, and to better secure communication through the mobile cloud and the mobile commerce transactions.
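
    The record does not spell out its PPSO algorithm, so the sketch below shows plain (serial) particle swarm optimization in Python as a baseline: per-iteration velocity updates pull each particle toward its personal best and the global best. Because each particle's update within a step is independent, the loop over particles is the natural place to parallelize, which is presumably what the proposed PPSO exploits. All hyperparameters are illustrative.

    ```python
    import numpy as np

    def pso(f, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        """Plain particle swarm optimization: velocities are pulled toward
        each particle's personal best and the swarm's global best."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5, 5, (n, dim))          # positions
        v = np.zeros((n, dim))                    # velocities
        pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
        g = pbest[pbest_val.argmin()].copy()      # global best
        for _ in range(iters):
            r1, r2 = rng.random((n, dim)), rng.random((n, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = x + v
            vals = np.apply_along_axis(f, 1, x)
            better = vals < pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
            g = pbest[pbest_val.argmin()].copy()
        return g, f(g)

    print(pso(lambda p: (p ** 2).sum(), dim=3))   # minimum at the origin
    ```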

  11. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with applications in several fields of engineering, such as automation, biomedical, chemical, civil, electrical, electronic, geophysical, and mechanical engineering, in a multidisciplinary approach. Authors from five countries and 16 different research centers contribute their expertise in both the fundamentals and applications to real problems, based upon their strong backgrounds in modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools, and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics, and applied computation, where tools such as mathematical and computational modeling, numerical methods, and computational intelligence are applied to the solution of real problems.

  12. Application of Computed Tomography Virtual Noncontrast Spectral Imaging in Evaluation of Hepatic Metastases: A Preliminary Study

    Institute of Scientific and Technical Information of China (English)

    Shi-Feng Tian; Ai-Lian Liu; Jing-Hong Liu; Mei-Yu Sun; He-Qing Wang; Yi-Jun Liu

    2015-01-01

    Objective: The objective was to qualitatively and quantitatively evaluate hepatic metastases using computed tomography (CT) virtual noncontrast (VNC) spectral imaging in a retrospective analysis. Methods: Forty patients with hepatic metastases underwent CT scans including a conventional true noncontrast (TNC) scan and tri-phasic contrast-enhanced dual-energy spectral scans in the hepatic arterial, portal venous, and equilibrium phases. The tri-phasic spectral CT images were used to obtain three groups of VNC images, in the arterial (VNCa), venous (VNCv), and equilibrium (VNCe) phases, by the material decomposition process using water and iodine as a base material pair. The image quality and the contrast-to-noise ratio (CNR) of metastases in the four groups were compared with ANOVA analysis. The metastasis detection rates of the four nonenhanced image groups were calculated and compared using the Chi-square test. Results: There were no significant differences in image quality among the TNC, VNCa, and VNCv images (P > 0.05). The quality of the VNCe images was significantly worse than that of the other three groups (P < 0.05). The mean CNR of metastases in the TNC, VNCa, VNCv, and VNCe images was 1.86, 2.42, 1.92, and 1.94, respectively; the mean CNR in the VNCa images was significantly higher than in the other three groups (P < 0.05), while no statistically significant difference was observed among the VNCv, VNCe, and TNC images (P > 0.05). The metastasis detection rates of the four nonenhanced groups showed no statistically significant difference (P > 0.05). Conclusions: The quality of VNCa and VNCv images is comparable to that of TNC images, and the metastasis detection rate in VNC images is similar to that in TNC images. VNC images obtained from the arterial phase show metastases more clearly; thus, VNCa imaging may be a surrogate for TNC imaging in hepatic metastasis diagnosis.

  13. CHOREO: An Interactive Computer Model for Dance.

    Science.gov (United States)

    Savage, G. J.; Officer, J. M.

    1978-01-01

    Establishes the need for literacy in dance; and describes two dance notation systems: the Massine notation method, and the Labanotation method. The use of interactive computer graphics as a tool for both learning and interpreting dance notation is introduced. (Author/VT)

  14. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. In this perspective, the new categories of services it introduces will slowly replace many types of computational resources currently used, and grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses for large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  15. An integrated introduction to computer graphics and geometric modeling

    CERN Document Server

    Goldman, Ronald

    2009-01-01

    … this book may be the first book on geometric modelling that also covers computer graphics. In addition, it may be the first book on computer graphics that integrates a thorough introduction to 'freedom' curves and surfaces and to the mathematical foundations for computer graphics. … the book is well suited for an undergraduate course. … The entire book is very well presented and obviously written by a distinguished and creative researcher and educator. It certainly is a textbook I would recommend. …-Computer-Aided Design, 42, 2010… Many books concentrate on computer programming and soon beco

  16. Enabling Interoperation of High Performance, Scientific Computing Applications: Modeling Scientific Data with the Sets & Fields (SAF) Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    Miller, M C; Reus, J F; Matzke, R P; Arrighi, W J; Schoof, L A; Hitt, R T; Espen, P K; Butler, D M

    2001-02-07

    This paper describes the Sets and Fields (SAF) scientific data modeling system, a revolutionary approach to the interoperation of high-performance scientific computing applications based upon rigorous, math-oriented data modeling principles. Previous technologies have required all applications to use the same data structures and/or meshes to represent scientific data, or have led to an ever-expanding set of incrementally different data structures and/or meshes. SAF addresses this problem by providing a small set of mathematical building blocks--sets, relations, and fields--out of which a wide variety of scientific data can be characterized. Applications literally model their data by assembling these building blocks. A short historical perspective, a conceptual model, and an overview of SAF, along with preliminary results from its use in a few ASCI codes, are discussed.

  17. Assessing Internet addiction using the parsimonious Internet addiction components model - a preliminary study [forthcoming]

    OpenAIRE

    Kuss, DJ; Shorter, GW; Van Rooij, AJ; Griffiths, MD; Schoenmakers, T.

    2014-01-01

    Internet usage has grown exponentially over the last decade. Research indicates that excessive Internet use can lead to symptoms associated with addiction. To date, assessment of potential Internet addiction has varied regarding populations studied and instruments used, making reliable prevalence estimations difficult. To overcome the present problems a preliminary study was conducted testing a parsimonious Internet addiction components model based on Griffiths’ addiction components (2005), i...

  18. SmartShadow models and methods for pervasive computing

    CERN Document Server

    Wu, Zhaohui

    2013-01-01

    SmartShadow: Models and Methods for Pervasive Computing offers a new perspective on pervasive computing with SmartShadow, which is designed to model a user as a personality "shadow" and to model pervasive computing environments as user-centric dynamic virtual personal spaces. Just like human beings' shadows in the physical world, it follows people wherever they go, providing them with pervasive services. The model, methods, and software infrastructure for SmartShadow are presented, and an application for smart cars is also introduced. The book can serve as a valuable reference work for resea

  19. Computer-aided design–computer-aided engineering associative feature-based heterogeneous object modeling

    Directory of Open Access Journals (Sweden)

    Jikai Liu

    2015-12-01

    Full Text Available Conventionally, heterogeneous object modeling methods have paid limited attention to the concurrent modeling of geometry design and material composition distribution. A procedural method was normally employed to generate the geometry first and then determine the heterogeneous material distribution, which ignores their mutual influence. Additionally, limited capability has been established for modeling irregular material composition distributions with strong local discontinuities. This article overcomes these limitations by developing the computer-aided design–computer-aided engineering associative feature-based heterogeneous object modeling method. Level set functions are applied to model the geometry within the computer-aided design module, which enables complex geometry modeling. A finite element mesh is applied to store the local material compositions within the computer-aided engineering module, which allows any local discontinuities. The associative feature concept then builds the correspondence relationship between these modules. Additionally, a level set geometry and material optimization method is developed to concurrently generate the geometry and material information that fills the contents of the computer-aided design–computer-aided engineering associative feature model. Micro-geometry is investigated as well, instead of only the local material composition. A few cases are studied to prove the effectiveness of this new heterogeneous object modeling method.
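
    A minimal sketch of the article's two-module split, under simplifying assumptions: a level set function on a grid plays the CAD role (the geometry is its zero sublevel set), while a nodal array plays the CAE role, storing a material composition field that may be discontinuous. The circle geometry and step-like material distribution are hypothetical stand-ins, not the paper's examples.

    ```python
    import numpy as np

    # grid over the design domain
    xs, ys = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))

    # CAD side: signed-distance level set of a circle of radius 0.6
    phi = np.sqrt(xs**2 + ys**2) - 0.6
    inside = phi <= 0                      # geometry = zero sublevel set

    # CAE side: material volume fraction stored per node; a step across
    # x = 0 shows that strong local discontinuities are representable
    material = np.where(xs < 0, 0.2, 0.9) * inside
    print(inside.sum(), material.max())
    ```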

  20. Computer modeling of ORNL storage tank sludge mobilization and mixing

    Energy Technology Data Exchange (ETDEWEB)

    Terrones, G.; Eyler, L.L.

    1993-09-01

    This report presents and analyzes the results of computer modeling of the mixing and mobilization of sludge in horizontal, cylindrical storage tanks using submerged liquid jets. The computer modeling uses the TEMPEST computational fluid dynamics program. The horizontal, cylindrical storage tank configuration is similar to that of the Melton Valley Storage Tanks (MVST) at Oak Ridge National Laboratory (ORNL). The MVST tank contents exhibit non-homogeneous, non-Newtonian rheology characteristics. The eventual goals of the simulations are to determine under what conditions sludge mobilization using submerged liquid jets is feasible in tanks of this configuration, and to estimate the mixing times required to approach homogeneity of the contents of the tanks.

  1. Macro—Dataflow Computational Model and Its Simulation

    Institute of Scientific and Technical Information of China (English)

    孙昱东; 谢志良

    1990-01-01

    This paper discusses the relationship between parallelism granularity and system overhead of dataflow computer systems, and indicates that a trade-off between them should be determined to obtain optimal efficiency of the overall system. On the basis of this discussion, a macro-dataflow computational model is established to exploit task-level parallelism. Working as a macro-dataflow computer, an Experimental Distributed Dataflow Simulation System (EDDSS) is developed to examine the effectiveness of the macro-dataflow computational model.

  2. Computational modeling in melanoma for novel drug discovery.

    Science.gov (United States)

    Pennisi, Marzio; Russo, Giulia; Di Salvatore, Valentina; Candido, Saverio; Libra, Massimo; Pappalardo, Francesco

    2016-06-01

    There is a growing body of evidence highlighting the applications of computational modeling in the field of biomedicine. It has recently been applied to the in silico analysis of cancer dynamics. In the era of precision medicine, this analysis may allow the discovery of new molecular targets useful for the design of novel therapies and for overcoming resistance to anticancer drugs. According to its molecular behavior, melanoma represents an interesting tumor model in which computational modeling can be applied. Melanoma is an aggressive tumor of the skin with a poor prognosis for patients with advanced disease as it is resistant to current therapeutic approaches. This review discusses the basics of computational modeling in melanoma drug discovery and development. Discussion includes the in silico discovery of novel molecular drug targets, the optimization of immunotherapies and personalized medicine trials. Mathematical and computational models are gradually being used to help understand biomedical data produced by high-throughput analysis. The use of advanced computer models allowing the simulation of complex biological processes provides hypotheses and supports experimental design. The research in fighting aggressive cancers, such as melanoma, is making great strides. Computational models represent the key component to complement these efforts. Due to the combinatorial complexity of new drug discovery, a systematic approach based only on experimentation is not possible. Computational and mathematical models are necessary for bringing cancer drug discovery into the era of omics, big data and personalized medicine.

  3. Integrating Numerical Computation into the Modeling Instruction Curriculum

    CERN Document Server

    Caballero, Marcos D; Aiken, John M; Douglas, Scott S; Scanlon, Erin M; Thoms, Brian; Schatz, Michael F

    2012-01-01

    We describe a way to introduce physics high school students with no background in programming to computational problem-solving experiences. Our approach builds on the great strides made by the Modeling Instruction reform curriculum. This approach emphasizes the practices of "Developing and using models" and "Computational thinking" highlighted by the NRC K-12 science standards framework. We taught 9th-grade students in a Modeling-Instruction-based physics course to construct computational models using the VPython programming environment. Numerical computation within the Modeling Instruction curriculum provides coherence among the curriculum's different force and motion models, links the various representations which the curriculum employs, and extends the curriculum to include real-world problems that are inaccessible to a purely analytic approach.
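
    The computational models referred to here are typically short update loops. As an illustration (in plain Python rather than VPython, so it runs without the 3D environment), the sketch below shows the Euler-Cromer pattern students write in such a course: compute the net force from the model, update the velocity, then update the position. The falling-object scenario and all values are hypothetical.

    ```python
    # Euler-Cromer update loop for an object falling from rest
    m, g, dt = 0.5, 9.8, 0.01          # mass (kg), gravity (m/s^2), time step (s)
    pos, vel, t = 100.0, 0.0, 0.0      # initial height (m), velocity, time

    while pos > 0:
        force = -m * g                 # the "model": net force on the object
        vel += (force / m) * dt        # update velocity from the net force
        pos += vel * dt                # update position from the new velocity
        t += dt

    print(f"hit the ground at t = {t:.2f} s")   # analytic: sqrt(2*100/9.8) ~ 4.52 s
    ```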

  4. Models of parallel computation :a survey and classification

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yunquan; CHEN Guoliang; SUN Guangzhong; MIAO Qiankun

    2007-01-01

    In this paper, the state-of-the-art parallel computational model research is reviewed. We introduce various models that were developed during the past decades. According to their targeted architecture features, especially memory organization, we classify these parallel computational models into three generations. These models and their characteristics are discussed based on this three-generation classification. We believe that with the ever-increasing speed gap between the CPU and memory systems, incorporating non-uniform memory hierarchies into computational models will become unavoidable. With the emergence of multi-core CPUs, the parallelism hierarchy of current computing platforms becomes more and more complicated, and describing this complicated parallelism hierarchy in future computational models becomes more and more important. A semi-automatic toolkit that can extract model parameters and their values on real computers could reduce the complexity of model analysis, thus allowing more complicated models with more parameters to be adopted. Hierarchical memory and hierarchical parallelism will be two very important features that should be considered in future model design and research.

  5. Computer Modeling for Optical Waveguide Sensors.

    Science.gov (United States)

    1987-12-15

    [Scanned report-form residue; recoverable subject terms: optical waveguide sensors, computer modeling.] The resultant probe beam transmission may be plotted as a function of changes in the refractive index of the surrounding fluid medium, for all angles of incidence about the critical angle θcr. N in equation (3) is a function of θ through a Snell's-law relation; the scanned equation itself is unrecoverable.
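
    The fragment above reduces to the standard critical-angle relation from Snell's law. As a minimal illustration (the guide and fluid indices below are hypothetical values, not taken from the report):

        import numpy as np

        def critical_angle_deg(n_guide, n_fluid):
            # Total internal reflection occurs for incidence angles
            # above arcsin(n_fluid / n_guide), measured from the normal.
            return np.degrees(np.arcsin(n_fluid / n_guide))

        # Example: glass waveguide (n = 1.50) surrounded by water (n = 1.33)
        print(critical_angle_deg(1.50, 1.33))  # ~62.5 degrees

    As the fluid index rises toward the guide index, the critical angle grows and more of the probe beam escapes, which is the transmission-versus-index behavior the abstract describes.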

  6. Operation of the computer model for microenvironment atomic oxygen exposure

    Science.gov (United States)

    Bourassa, R. J.; Gillis, J. R.; Gruenbaum, P. E.

    1995-01-01

    A computer model for microenvironment atomic oxygen exposure has been developed to extend atomic oxygen modeling capability to include shadowing and reflections. The model uses average exposure conditions established by the direct exposure model and extends the application of these conditions to treat surfaces of arbitrary shape and orientation.

  7. Introduction to computation and modeling for differential equations

    CERN Document Server

    Edsberg, Lennart

    2008-01-01

    An introduction to scientific computing for differential equations. Introduction to Computation and Modeling for Differential Equations provides a unified and integrated view of numerical analysis, mathematical modeling in applications, and programming to solve differential equations, which is essential in problem-solving across many disciplines, such as engineering, physics, and economics. This book successfully introduces readers to the subject through a unique "Five-M" approach: Modeling, Mathematics, Methods, MATLAB, and Multiphysics. This approach facilitates a thorough understanding of h

  8. Computational modeling of induced emotion using GEMS

    NARCIS (Netherlands)

    Aljanaki, Anna; Wiering, Frans; Veltkamp, Remco

    2014-01-01

    Most researchers in the automatic music emotion recognition field focus on the two-dimensional valence and arousal model. This model, however, does not account for the whole diversity of emotions expressible through music. Moreover, in many cases it might be important to model induced (felt) emotion, r

  9. Computational neurorehabilitation: modeling plasticity and learning to predict recovery.

    Science.gov (United States)

    Reinkensmeyer, David J; Burdet, Etienne; Casadio, Maura; Krakauer, John W; Kwakkel, Gert; Lang, Catherine E; Swinnen, Stephan P; Ward, Nick S; Schweighofer, Nicolas

    2016-01-01

    Despite progress in using computational approaches to inform medicine and neuroscience in the last 30 years, there have been few attempts to model the mechanisms underlying sensorimotor rehabilitation. We argue that a fundamental understanding of neurologic recovery, and as a result accurate predictions at the individual level, will be facilitated by developing computational models of the salient neural processes, including plasticity and learning systems of the brain, and integrating them into a context specific to rehabilitation. Here, we therefore discuss Computational Neurorehabilitation, a newly emerging field aimed at modeling plasticity and motor learning to understand and improve movement recovery of individuals with neurologic impairment. We first explain how the emergence of robotics and wearable sensors for rehabilitation is providing data that make development and testing of such models increasingly feasible. We then review key aspects of plasticity and motor learning that such models will incorporate. We proceed by discussing how computational neurorehabilitation models relate to the current benchmark in rehabilitation modeling - regression-based, prognostic modeling. We then critically discuss the first computational neurorehabilitation models, which have primarily focused on modeling rehabilitation of the upper extremity after stroke, and show how even simple models have produced novel ideas for future investigation. Finally, we conclude with key directions for future research, anticipating that soon we will see the emergence of mechanistic models of motor recovery that are informed by clinical imaging results and driven by the actual movement content of rehabilitation therapy as well as wearable sensor-based records of daily activity.

  10. Markov Graph Model Computation and Its Application to Intrusion Detection

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A Markov model is usually selected as the base model of user actions in an intrusion detection system (IDS). However, the performance of the IDS depends on the state space of the Markov model, and it degrades as the space dimension grows. Here, the Markov Graph Model (MGM) is proposed to handle this issue. A specification of the model is described, and several methods for probability computation with the MGM are presented. Based on the MGM, algorithms for building the user model and predicting user actions are presented, and the performance of these algorithms, such as computational complexity, prediction accuracy, and the storage requirement of the MGM, is analyzed.
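
    A minimal sketch of the kind of first-order Markov user model the abstract builds on (the MGM itself is a graph extension; the code below shows only the baseline idea, with a hypothetical action log):

        from collections import defaultdict

        def estimate_transitions(actions):
            # Maximum-likelihood transition probabilities from an action log.
            counts = defaultdict(lambda: defaultdict(int))
            for a, b in zip(actions, actions[1:]):
                counts[a][b] += 1
            return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
                    for a, nxt in counts.items()}

        def predict_next(model, current):
            # Predict the most probable next action, or None if unseen.
            nxt = model.get(current)
            return max(nxt, key=nxt.get) if nxt else None

        log = ['login', 'ls', 'cd', 'ls', 'cd', 'ls', 'cat', 'logout']
        model = estimate_transitions(log)
        print(predict_next(model, 'cd'))  # 'ls'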

  11. Computational technology of multiscale modeling the gas flows in microchannels

    Science.gov (United States)

    Podryga, V. O.

    2016-11-01

    The work is devoted to modeling gas mixture flows in engineering microchannels when the computational domain spans many scales. A computational technology using the multiscale approach, combining macro- and microscopic models, is presented. At the macrolevel, the nature of the flow and the external influences on it are considered, with the system of quasigasdynamic equations selected as the model. At the microlevel, the gasdynamic parameters are corrected and the boundary conditions are determined; Newton's equations and the molecular dynamics method are selected as the numerical model. Different algorithm types used to implement the multiscale modeling are considered, and the results of the model problems for the separate stages are given.

  12. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Directory of Open Access Journals (Sweden)

    Jiunn-Woei Lian PhD

    2017-01-01

    Full Text Available The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  13. Establishing a Cloud Computing Success Model for Hospitals in Taiwan.

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  14. Mobile Cloud Computing: A Comparison of Application Models

    CERN Document Server

    Kovachev, Dejan; Klamma, Ralf

    2011-01-01

    Cloud computing is an emerging concept combining many fields of computing. The foundation of cloud computing is the delivery of services, software and processing capacity over the Internet, reducing cost, increasing storage, automating systems, decoupling service delivery from the underlying technology, and providing flexibility and mobility of information. However, the actual realization of these benefits is far from being achieved for mobile applications and opens many new research questions. In order to better understand how to facilitate the building of mobile cloud-based applications, we have surveyed existing work in mobile computing through the prism of cloud computing principles. We give a definition of mobile cloud computing and provide an overview of the results from this review, in particular, models of mobile cloud applications. We also highlight research challenges in the area of mobile cloud computing. We conclude with recommendations for how this better understanding of mobile cloud computing can ...

  15. Numerical computations and mathematical modelling with infinite and infinitesimal numbers

    CERN Document Server

    Sergeyev, Yaroslav D

    2012-01-01

    Traditional computers work with finite numbers. Situations where the usage of infinite or infinitesimal quantities is required are studied mainly theoretically. In this paper, a recently introduced computational methodology (not related to non-standard analysis) is used to work with finite, infinite, and infinitesimal numbers numerically. This can be done on a new kind of computer - the Infinity Computer - able to work with all these types of numbers. The new computational tools both make it possible to execute computations of a new type and open new horizons for creating new mathematical models where a computational usage of infinite and/or infinitesimal numbers can be useful. A number of numerical examples showing the potential of the new approach and dealing with divergent series, limits, probability theory, linear algebra, and calculation of volumes of objects consisting of parts of different dimensions are given.

  16. The Validation of Computer-based Models in Engineering: Some Lessons from Computing Science

    Directory of Open Access Journals (Sweden)

    D. J. Murray-Smith

    2001-01-01

    Full Text Available Questions of the quality of computer-based models and the formal processes of model testing, involving internal verification and external validation, are usually given only passing attention in engineering reports and in technical publications. However, such models frequently provide a basis for analysis methods, design calculations or real-time decision-making in complex engineering systems. This paper reviews techniques used for external validation of computer-based models and contrasts the somewhat casual approach which is usually adopted in this field with the more formal approaches to software testing and documentation recommended for large software projects. Both activities require intimate knowledge of the intended application, a systematic approach and considerable expertise and ingenuity in the design of tests. It is concluded that engineering degree courses dealing with modelling techniques and computer simulation should put more emphasis on model limitations, testing and validation.

  17. Supersonic Retro-Propulsion Experimental Design for Computational Fluid Dynamics Model Validation

    Science.gov (United States)

    Berry, Scott A.; Laws, Christopher T.; Kleb, W. L.; Rhode, Matthew N.; Spells, Courtney; McCrea, Andrew C.; Truble, Kerry A.; Schauerhamer, Daniel G.; Oberkampf, William L.

    2011-01-01

    The development of supersonic retro-propulsion, an enabling technology for heavy payload exploration missions to Mars, is the primary focus of the present paper. A new experimental model, intended to provide computational fluid dynamics model validation data, was recently designed for the Langley Research Center Unitary Plan Wind Tunnel Test Section 2. Pre-test computations were instrumental for sizing and refining the model, over the Mach number range of 2.4 to 4.6, such that tunnel blockage and internal flow separation issues would be minimized. A 5-in diameter 70-deg sphere-cone forebody, which accommodates up to four 4:1 area ratio nozzles, followed by a 10-in long cylindrical aftbody was developed for this study based on the computational results. The model was designed to allow for a large number of surface pressure measurements on the forebody and aftbody. Supplemental data included high-speed Schlieren video and internal pressures and temperatures. The run matrix was developed to allow for the quantification of various sources of experimental uncertainty, such as random errors due to run-to-run variations and bias errors due to flow field or model misalignments. Some preliminary results and observations from the test are presented, although detailed analyses of the data and uncertainties are still ongoing.

  18. Computational Psychometrics for Modeling System Dynamics during Stressful Disasters

    Directory of Open Access Journals (Sweden)

    Pietro Cipresso

    2017-08-01

    Full Text Available Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.

  19. Robust speech features representation based on computational auditory model

    Institute of Scientific and Technical Information of China (English)

    LU Xugang; JIA Chuan; DANG Jianwu

    2004-01-01

    A speech signal processing and feature extraction method based on a computational auditory model is proposed. The computational model is based on psychological and physiological knowledge and digital signal processing methods. For each stage of the hearing perception system, there is a corresponding computational model to simulate its function, and based on this model speech features are extracted, with features at a different level extracted in each stage. A further processing step for the primary auditory spectrum, based on lateral inhibition, is proposed to extract much more robust speech features. All these features can be regarded as internal representations of speech stimulation in the hearing system. Robust speech recognition experiments were conducted to test the robustness of the features. Results show that the representations based on the proposed computational auditory model are robust representations of speech signals.
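
    Lateral inhibition of the kind mentioned for the primary auditory spectrum is commonly implemented as a center-surround convolution across frequency channels; a minimal sketch (the kernel weights are illustrative, not those of the paper):

        import numpy as np

        def lateral_inhibition(spectrum, kernel=(-0.25, -0.5, 1.5, -0.5, -0.25)):
            # spectrum: 2-D array, time frames x frequency channels.
            # Each frame is sharpened by a center-surround kernel that
            # boosts a channel and suppresses its neighbors.
            k = np.asarray(kernel)
            return np.stack([np.convolve(frame, k, mode='same')
                             for frame in spectrum])

        frames = np.abs(np.random.randn(10, 64))  # stand-in auditory spectrum
        robust = lateral_inhibition(frames)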

  20. Performance Predictable ServiceBSP Model for Grid Computing

    Institute of Scientific and Technical Information of China (English)

    TONG Weiqin; MIAO Weikai

    2007-01-01

    This paper proposes a performance prediction model for the grid computing model ServiceBSP to support the development of high-quality applications in grid environments. In the ServiceBSP model, agents carrying computing tasks are dispatched to the local domains of the selected computation services. Using an integer programming (IP) approach, the Service Selection Agent selects the computation services with globally optimized QoS (quality of service). The performance of a ServiceBSP application can then be predicted according to the performance prediction model, based on the QoS of the selected services. The prediction model can help users analyze their applications and improve them by optimizing the factors that affect performance. Experiments show that the Service Selection Agent provides ServiceBSP users with satisfactory application QoS.
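
    ServiceBSP inherits the cost structure of the classic BSP model, in which each superstep is charged the maximum local work plus communication and synchronization terms; a minimal sketch of that baseline prediction (the paper's QoS-weighted extension is not reproduced here, and the numbers are illustrative):

        def superstep_cost(work, h_relation, g, l):
            # Classic BSP: max local computation + g * max messages + latency l.
            return max(work) + g * max(h_relation) + l

        def predicted_time(supersteps, g, l):
            # supersteps: list of (work-per-process, messages-per-process) pairs.
            return sum(superstep_cost(w, h, g, l) for w, h in supersteps)

        steps = [([100, 120, 90], [10, 8, 12]), ([200, 180, 210], [5, 7, 6])]
        print(predicted_time(steps, g=2.0, l=50.0))  # 468.0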

  1. Transforming High School Physics with Modeling and Computation

    CERN Document Server

    Aiken, John M

    2013-01-01

    The Engage to Excel (PCAST) report, the National Research Council's Framework for K-12 Science Education, and the Next Generation Science Standards all call for transforming the physics classroom into an environment that teaches students real scientific practices. This work describes the early stages of one such attempt to transform a high school physics classroom. Specifically, a series of model-building and computational modeling exercises were piloted in a ninth grade Physics First classroom. Student use of computation was assessed using a proctored programming assignment, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Student views on computation and its link to mechanics were assessed with a written essay and a series of think-aloud interviews. This pilot study demonstrates computation's ability to connect scientific practice to the high school science classroom.
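
    A minimal VPython sketch of the kind of proctored exercise described, stepping a baseball forward with the momentum principle (the mass, launch velocity, and drag-free assumption are illustrative, not the assignment's actual parameters):

        from vpython import sphere, vector, rate

        ball = sphere(pos=vector(0, 1, 0), radius=0.1, make_trail=True)
        m = 0.145                    # baseball mass in kg
        v = vector(10, 10, 0)        # initial velocity in m/s
        g = vector(0, -9.8, 0)       # gravitational field

        dt = 0.01
        while ball.pos.y >= 0:
            rate(100)                     # limit animation to 100 steps/s
            Fnet = m * g                  # net force: gravity only
            v = v + (Fnet / m) * dt       # update velocity (momentum principle)
            ball.pos = ball.pos + v * dt  # update position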

  2. Model identification in computational stochastic dynamics using experimental modal data

    Science.gov (United States)

    Batou, A.; Soize, C.; Audebert, S.

    2015-01-01

    This paper deals with the identification of a stochastic computational model using experimental eigenfrequencies and mode shapes. In the presence of randomness, it is difficult to construct a one-to-one correspondence between the results provided by the stochastic computational model and the experimental data because of the random mode crossing and veering phenomena that may occur from one realization to another. In this paper, this correspondence is constructed by introducing an adapted transformation for the computed modal quantities. The transformed computed modal quantities can then be compared with the experimental data in order to identify the parameters of the stochastic computational model. The methodology is applied to a booster pump of thermal units for which experimental modal data have been measured on several sites.

  3. Computational modeling of shallow geothermal systems

    CERN Document Server

    Al-Khoury, Rafid

    2011-01-01

    A Step-by-step Guide to Developing Innovative Computational Tools for Shallow Geothermal Systems Geothermal heat is a viable source of energy and its environmental impact in terms of CO2 emissions is significantly lower than conventional fossil fuels. Shallow geothermal systems are increasingly utilized for heating and cooling of buildings and greenhouses. However, their utilization is inconsistent with the enormous amount of energy available underneath the surface of the earth. Projects of this nature are not getting the public support they deserve because of the uncertainties associated with

  4. Computational modeling in cognitive science: a manifesto for change.

    Science.gov (United States)

    Addyman, Caspar; French, Robert M

    2012-07-01

    Computational modeling has long been one of the traditional pillars of cognitive science. Unfortunately, the computer models of cognition being developed today have not kept up with the enormous changes that have taken place in computer technology and, especially, in human-computer interfaces.  For all intents and purposes, modeling is still done today as it was 25, or even 35, years ago. Everyone still programs in his or her own favorite programming language, source code is rarely made available, accessibility of models to non-programming researchers is essentially non-existent, and even for other modelers, the profusion of source code in a multitude of programming languages, written without programming guidelines, makes it almost impossible to access, check, explore, re-use, or continue to develop. It is high time to change this situation, especially since the tools are now readily available to do so. We propose that the modeling community adopt three simple guidelines that would ensure that computational models would be accessible to the broad range of researchers in cognitive science. We further emphasize the pivotal role that journal editors must play in making computational models accessible to readers of their journals.

  5. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  6. Environmental Barrier Coating (EBC) Durability Modeling; An Overview and Preliminary Analysis

    Science.gov (United States)

    Abdul-Aziz, A.; Bhatt, R. T.; Grady, J. E.; Zhu, D.

    2012-01-01

    A study outlining a fracture mechanics based model being developed to investigate crack growth and spallation of environmental barrier coatings (EBC) under thermal cycling conditions is presented. A description of the current plan, a model to estimate thermal residual stresses in the coating, and preliminary fracture mechanics concepts for studying crack growth in the coating are also discussed. A road map for modeling the life and durability of the EBC and the results of finite element analysis (FEA) models developed for predicting thermal residual stresses and the cracking behavior of the coating are presented. Initial assessment and preliminary results showed that developing a comprehensive EBC life prediction model incorporating EBC cracking, degradation, and spalling mechanisms under the stress and temperature gradients typically seen in turbine components is difficult. This is basically due to the thermal expansion mismatch between the sub-layers of the EBC and between the EBC and the substrate, the diffusion of moisture and oxygen through the coating, and the densification of the coating under operating conditions; in addition, due to foreign object damage, the EBC can crack and spall from the substrate, causing oxidation and recession and reducing the design life of the EBC-coated substrate.

  7. A preliminary model to avoid the overestimation of sample size in bioequivalence studies.

    Science.gov (United States)

    Ramírez, E; Abraira, V; Guerra, P; Borobia, A M; Duque, B; López, J L; Mosquera, B; Lubomirov, R; Carcas, A J; Frías, J

    2013-02-01

    Often the only data available in the literature for sample size estimation in bioequivalence studies is intersubject variability, which tends to result in overestimation of sample size. In this paper, we propose a preliminary model of intrasubject variability based on intersubject variability for Cmax and AUC data from randomized, crossover bioequivalence (BE) studies. From 93 Cmax and 121 AUC data sets from test-reference comparisons that fulfilled BE criteria, we calculated intersubject variability for the reference formulation and intrasubject variability from ANOVA. Linear and exponential models (y = a(1 - e^(-bx))), weighted by the inverse of the variance, were fitted to predict the intrasubject variability from the intersubject variability. To validate the model, we calculated the cross-validation coefficient on data from 30 new BE studies. The models fit very well (R^2 = 0.997 and 0.990 for Cmax and AUC, respectively) and the cross-validation correlations were 0.847 for Cmax and 0.572 for AUC. This preliminary model allows the intrasubject variability to be estimated from the intersubject variability for sample size calculation purposes in BE studies. This approximation provides an opportunity for sample size reduction, avoiding unnecessary exposure of healthy volunteers. Further modelling studies are desirable to confirm these results, especially at the higher end of the intersubject variability range.
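
    The weighted exponential fit described can be reproduced with standard tools; a minimal sketch (the data points below are hypothetical, and weighting by the inverse of the variance enters through curve_fit's sigma argument):

        import numpy as np
        from scipy.optimize import curve_fit

        def model(x, a, b):
            # y = a * (1 - exp(-b * x))
            return a * (1.0 - np.exp(-b * x))

        inter_cv = np.array([10., 15., 20., 30., 40.])   # intersubject CV (%)
        intra_cv = np.array([5., 7., 9., 12., 14.])      # intrasubject CV (%)
        variances = np.array([1.0, 1.2, 1.5, 2.0, 2.5])  # per-point variances

        # sigma = sqrt(variance) makes the fit inverse-variance weighted
        popt, _ = curve_fit(model, inter_cv, intra_cv,
                            sigma=np.sqrt(variances), p0=(20.0, 0.05))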

  8. A Computational Trust Model for Collaborative Ventures

    Directory of Open Access Journals (Sweden)

    Weigang Wang

    2012-01-01

    Full Text Available Problem statement: The conceptual notion of trust and its underlying computational methods have been an important issue for researchers in electronic communities. While independent trust evaluation is suitable in certain circumstances, such a unilateral process falls short of supporting mutual evaluation between partners. Perceived reputation, the depth and breadth of trust, Trust Perception (TP), Repeat Collaborators at a Threshold (RCT), and a collective trust index (c index) have all been defined to specify the optimal trust criteria. Approach: By taking the evaluator's own trust level as a threshold to identify compatible partners, a mutual balance between excess and deficiency in trust is achieved. Since the number of repeated collaborations, which signifies retested confidence, is more straightforward to capture than manually provided feedback ratings, we developed computational definitions for the above-mentioned concepts. Results and Conclusion: The results from experiments based on the eBay dataset show that the c index can be used to classify PowerSellers into normally distributed and comprehensible categories that facilitate mutual evaluation.
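
    The abstract does not spell out how the c index is computed; if, as the name suggests, it follows the h-index pattern applied to repeat collaborations, a sketch would look like this (an assumed form, purely illustrative):

        def c_index(repeat_counts):
            # Largest c such that at least c partners have each collaborated
            # with the seller at least c times (h-index-style definition).
            counts = sorted(repeat_counts, reverse=True)
            c = 0
            for rank, n in enumerate(counts, start=1):
                if n >= rank:
                    c = rank
            return c

        print(c_index([9, 7, 5, 3, 2, 1]))  # 3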

  9. Spreading of a chain macromolecule onto a cell membrane by a computer simulation Model

    Science.gov (United States)

    Xie, Jun; Pandey, Ras

    2002-03-01

    Computer simulations are performed to study the conformation and dynamics of a relatively large chain macromolecule at the surface of a model membrane, a preliminary step toward an ultimately realistic model of a protein on a cell membrane. We use a discrete lattice of size Lx × L × L. The chain molecule of length Lc is modeled by consecutive nodes connected by bonds on the trail of a random walk, with appropriate constraints such as excluded volume and energy-dependent configurational bias. A Monte Carlo method is used to move the chains via segmental dynamics, i.e., end moves, kink jumps, crank-shaft moves, reptation, etc. The membrane substrate is formed by self-assembly of biased short chains on a substrate. The large chain molecule is then driven toward the membrane by a field. We investigate the dynamics of the chain macromolecule, the spread of its density, and its conformation.
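
    A minimal sketch of one of the segmental moves mentioned, an end move on a 2-D square lattice with an excluded-volume check (the published model is 3-D and adds energy-dependent bias; this only illustrates the move-acceptance idea):

        import random

        STEPS = [(1, 0), (-1, 0), (0, 1), (0, -1)]

        def end_move(chain):
            # Try to relocate the last monomer to a random site adjacent
            # to its bonded neighbor, rejecting overlaps (excluded volume).
            neighbor = chain[-2]
            dx, dy = random.choice(STEPS)
            trial = (neighbor[0] + dx, neighbor[1] + dy)
            if trial not in chain:
                chain[-1] = trial

        chain = [(i, 0) for i in range(6)]   # straight 6-monomer chain
        for _ in range(1000):
            end_move(chain)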

  10. Frictional sliding in layered rock model: Preliminary experiments. Yucca Mountain Site Characterization Project

    Energy Technology Data Exchange (ETDEWEB)

    Perry, K.E. Jr.; Buescher, B.J.; Anderson, D.; Epstein, J.S. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-09-01

    An important aspect of determining the suitability of Yucca Mountain as a possible nuclear waste repository is understanding the mechanical behavior of jointed rock masses. To this end we have studied the frictional sliding between simulated rock joints in the laboratory using the technique of phase shifting moire interferometry. The models were made from stacks of Lexan plates and contained a central hole to induce slip between the plates when the models were loaded in compression. These preliminary results confirm the feasibility of the approach and show a clear evolution of slip as a function of load.

  11. Computational Modeling of Turbulent Spray Combustion

    NARCIS (Netherlands)

    Ma, L.

    2016-01-01

    The objective of the research presented in this thesis is development and validation of predictive models or modeling approaches of liquid fuel combustion (spray combustion) in hot-diluted environments, known as flameless combustion or MILD combustion. The goal is to combine good physical insight,

  12. Computational Modeling of Turbulent Spray Combustion

    NARCIS (Netherlands)

    Ma, L.

    2016-01-01

    The objective of the research presented in this thesis is development and validation of predictive models or modeling approaches of liquid fuel combustion (spray combustion) in hot-diluted environments, known as flameless combustion or MILD combustion. The goal is to combine good physical insight, a

  13. Aespoe Pillar Stability Experiment. Final coupled 3D thermo-mechanical modeling. Preliminary particle mechanical modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wanne, Toivo; Johansson, Erik; Potyondy, David [Saanio and Riekkola Oy, Helsinki (Finland)

    2004-02-01

    SKB is planning to perform a large-scale pillar stability experiment called APSE (Aespoe Pillar Stability Experiment) at the Aespoe HRL. The study is focused on understanding and controlling progressive rock failure in hard crystalline rock and damage caused by high stresses. The elastic thermo-mechanical modeling was carried out in three dimensions, because of the complex test geometry and in-situ stress tensor, using the finite-difference modeling software FLAC3D. Cracking and damage formation were modeled in the area of interest (the pillar between two large-scale holes) in two dimensions using the Particle Flow Code (PFC), which is based on particle mechanics. FLAC and PFC were coupled to minimize the computer resources and the computing time. According to the modeling, the initial temperature rises from 15 deg C to about 65 deg C in the pillar area during the 120-day heating period. The rising temperature induces stresses in the pillar area through thermal expansion; by the end of the heating period, the stresses have increased by about 33%, from the excavation-induced maximum stress of 150 MPa to 200 MPa. The FLAC3D model identified the regions where the crack initiation stress was exceeded; these extended about two meters down the hole wall and can be considered the areas where damage may occur during the in-situ test. When the other hole is pressurized with a 0.8 MPa confining pressure, about 5 MPa more stress is needed to damage the rock than without confining pressure, which makes the damaged area somewhat smaller. High compressive stresses, in addition to some tensile stresses, might induce acoustic emission (AE) activity in the upper part of the hole from the very beginning of the test; these are thus potential areas where AE activity may be detected. Monitoring, such as acoustic emission measurement, will be performed during test execution. The 2D coupled PFC-FLAC modeling indicated that

  14. A computational model of cardiovascular physiology and heart sound generation.

    Science.gov (United States)

    Watrous, Raymond L

    2009-01-01

    A computational model of the cardiovascular system is described which provides a framework for implementing and testing quantitative physiological models of heart sound generation. The lumped-parameter cardiovascular model can be solved for the hemodynamic variables on which the heart sound generation process is built. Parameters of the cardiovascular model can be adjusted to represent various normal and pathological conditions, and the acoustic consequences of those adjustments can be explored. The combined model of the physiology of cardiovascular circulation and heart sound generation has promise for application in teaching, training and algorithm development in computer-aided auscultation of the heart.
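
    Lumped-parameter circulation models of this kind reduce to small ODE systems; a two-element Windkessel is the textbook example (the parameter values and inflow waveform below are illustrative, not the paper's model):

        import numpy as np
        from scipy.integrate import solve_ivp

        R, C = 1.0, 1.5          # peripheral resistance, arterial compliance
        T, T_sys = 0.8, 0.24     # cardiac period and systolic duration (s)

        def inflow(t):
            # Half-sinusoid ejection during systole, zero during diastole.
            phase = t % T
            return 300.0 * np.sin(np.pi * phase / T_sys) if phase < T_sys else 0.0

        def dPdt(t, P):
            # Two-element Windkessel: C dP/dt = Q(t) - P / R
            return [(inflow(t) - P[0] / R) / C]

        sol = solve_ivp(dPdt, (0.0, 8.0), [80.0], max_step=0.002)

    The pressure trace settles into a periodic systolic rise and exponential diastolic decay, the kind of hemodynamic variable on which a heart sound generation stage can then be built.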

  15. Functional Analysis and Preliminary Specifications for a Single Integrated Central Computer System for Secondary Schools and Junior Colleges. A Feasibility and Preliminary Design Study. Interim Report.

    Science.gov (United States)

    Computation Planning, Inc., Bethesda, MD.

    A feasibility analysis of a single integrated central computer system for secondary schools and junior colleges finds that a central computing facility capable of serving 50 schools with a total enrollment of 100,000 students is feasible at a cost of $18 per student per year. The recommended system is a multiprogrammed-batch operation. Preliminary…

  16. Computational model for Halorhodopsin photocurrent kinetics

    Science.gov (United States)

    Bravo, Jaime; Stefanescu, Roxana; Talathi, Sachin

    2013-03-01

    Optogenetics is a rapidly developing novel optical stimulation technique that employs light activated ion channels to excite (using channelrhodopsin (ChR)) or suppress (using halorhodopsin (HR)) impulse activity in neurons with high temporal and spatial resolution. This technique holds enormous potential to externally control activity states in neuronal networks. The channel kinetics of ChR and HR are well understood and amenable to mathematical modeling. Significant progress has been made in recent years to develop models for ChR channel kinetics. To date, however, there is no model to mimic photocurrents produced by HR. Here, we report the first model developed for HR photocurrents, based on a four-state model of the HR photocurrent kinetics. The model provides an excellent fit (root-mean-square error of 3.1862 × 10^-4) to an empirical profile of experimentally measured HR photocurrents. In combination, mathematical models for ChR and HR photocurrents can provide effective means to design and test light based control systems to regulate neural activity, which in turn may have implications for the development of novel light based stimulation paradigms for brain disease control. I would like to thank the University of Florida and the Physics Research Experience for Undergraduates (REU) program, funded through NSF DMR-1156737. This research was also supported through start-up funds provided to Dr. Sachin Talathi.
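
    A hedged sketch of what a four-state photocurrent kinetics model looks like in practice: a linear ODE system over two closed and two open states, with a light-dependent transition rate (the scheme and every rate constant below are illustrative assumptions, not the fitted values from this work):

        import numpy as np
        from scipy.integrate import solve_ivp

        def dydt(t, y, light):
            C1, O1, O2, C2 = y
            u = light(t)  # normalized irradiance, 0 or 1 here
            # Hypothetical rate constants (1/ms); light drives C1 -> O1.
            k_on, k12, k21, k1c, k2c, kcc = 0.5 * u, 0.10, 0.05, 0.20, 0.05, 0.02
            return [-k_on * C1 + k1c * O1 + kcc * C2,
                     k_on * C1 - (k1c + k12) * O1 + k21 * O2,
                     k12 * O1 - (k21 + k2c) * O2,
                     k2c * O2 - kcc * C2]

        light = lambda t: 1.0 if 100.0 <= t <= 300.0 else 0.0  # light pulse
        sol = solve_ivp(dydt, (0.0, 500.0), [1.0, 0.0, 0.0, 0.0],
                        args=(light,), max_step=1.0)
        current = -(sol.y[1] + 0.5 * sol.y[2])  # I ~ -g*(O1 + gamma*O2)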

  17. COMPUTATION MODELING OF TCDD DISRUPTION OF B CELL TERMINAL DIFFERENTIATION

    Science.gov (United States)

    In this study, we established a computational model describing the molecular circuit underlying B cell terminal differentiation and how TCDD may affect this process by impinging upon various molecular targets.

  18. The Next Generation ARC Middleware and ATLAS Computing Model

    CERN Document Server

    Filipcic, A; The ATLAS collaboration; Smirnova, O; Konstantinov, A; Karpenko, D

    2012-01-01

    The distributed NDGF Tier-1 and associated Nordugrid clusters are well integrated into the ATLAS computing model but follow a slightly different paradigm than other ATLAS resources. The current strategy does not divide the sites as in the commonly used hierarchical model, but rather treats them as a single storage endpoint and a pool of distributed computing nodes. The next generation ARC middleware with its several new technologies provides new possibilities in development of the ATLAS computing model, such as pilot jobs with pre-cached input files, automatic job migration between the sites, integration of remote sites without connected storage elements, and automatic brokering for jobs with non-standard resource requirements. ARC's data transfer model provides an automatic way for the computing sites to participate in ATLAS' global task management system without requiring centralised brokering or data transfer services. The powerful API combined with Python and Java bindings can easily be used to build new ...

  19. PETRI NET MODELING OF COMPUTER VIRUS LIFE CYCLE

    African Journals Online (AJOL)

    Dr Obe

    Dynamic system analysis is applied to model the virus life cycle. Simulation of the ... Successful applications of Petri nets include distributed database systems and communication protocols ...

  20. An analysis of symbolic linguistic computing models in decision making

    Science.gov (United States)

    Rodríguez, Rosa M.; Martínez, Luis

    2013-01-01

    It is common for experts involved in complex real-world decision problems to use natural language to express their knowledge in uncertain frameworks. Language is inherently vague, hence probabilistic decision models are not very suitable in such cases. Therefore, other tools, such as fuzzy logic and fuzzy linguistic approaches, have been successfully used to model and manage such vagueness. The use of linguistic information implies operating with that type of information, i.e., processes of computing with words (CWW). Different schemes have been proposed to deal with these processes, and diverse symbolic linguistic computing models have been introduced to accomplish the linguistic computations. In this paper, we overview the relationship between decision making and CWW, and focus on symbolic linguistic computing models that have been widely used in linguistic decision making, analysing whether all of them can be considered inside the CWW paradigm.
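
    One of the widely used symbolic models in this literature is the 2-tuple linguistic representation, which keeps computations symbolic while avoiding information loss by pairing a label index with a rounding remainder. A minimal sketch (the label set and inputs are illustrative):

        def to_two_tuple(beta):
            # Translate beta in [0, g] into (label index, symbolic remainder),
            # with the remainder alpha in [-0.5, 0.5).
            i = int(round(beta))
            return i, beta - i

        def to_beta(i, alpha):
            # Inverse translation back to a numeric value.
            return i + alpha

        labels = ['none', 'low', 'medium', 'high', 'perfect']  # g = 4
        # Aggregate three assessments by averaging their label indices.
        beta = sum([1, 2, 2]) / 3.0
        i, alpha = to_two_tuple(beta)
        print(labels[i], round(alpha, 3))  # medium -0.333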

  1. Hybrid Computational Model for High-Altitude Aeroassist Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A hybrid continuum/noncontinuum computational model will be developed for analyzing the aerodynamics and heating on aeroassist vehicles. Unique features of this...

  2. Computational modelling in materials at the University of the North

    CSIR Research Space (South Africa)

    Ngoepe, PE

    2005-09-01

    Full Text Available The authors review computational modelling studies in materials resulting from the National Research Foundation-Royal Society collaboration. Initially, investigations were confined to transport and defect properties in fluorine and oxygen ion...

  3. Recent Applications of Hidden Markov Models in Computational Biology

    Institute of Scientific and Technical Information of China (English)

    Khar Heng Choo; Joo Chuan Tong; Louxin Zhang

    2004-01-01

    This paper examines recent developments and applications of Hidden Markov Models (HMMs) to various problems in computational biology, including multiple sequence alignment, homology detection, protein sequences classification, and genomic annotation.
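
    Most of the applications listed rest on the same core computations; the forward algorithm, which evaluates the likelihood of a sequence under an HMM, is a representative sketch (toy two-state parameters, with numerical scaling omitted for brevity):

        import numpy as np

        def forward_likelihood(obs, pi, A, B):
            # pi: initial state probs (N,); A: transition matrix (N, N);
            # B: emission probs (N, M); obs: sequence of symbol indices.
            alpha = pi * B[:, obs[0]]
            for o in obs[1:]:
                alpha = (alpha @ A) * B[:, o]
            return alpha.sum()

        pi = np.array([0.6, 0.4])
        A = np.array([[0.7, 0.3], [0.4, 0.6]])
        B = np.array([[0.5, 0.5], [0.1, 0.9]])
        print(forward_likelihood([0, 1, 1], pi, A, B))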

  4. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  5. Towards diagnostic model calibration and evaluation: Approximate Bayesian computation

    NARCIS (Netherlands)

    Vrugt, J.A.; Sadegh, M.

    2013-01-01

    The ever increasing pace of computational power, along with continued advances in measurement technologies and improvements in process understanding has stimulated the development of increasingly complex hydrologic models that simulate soil moisture flow, groundwater recharge, surface runoff, root w
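
    The core of approximate Bayesian computation is simple enough to sketch: draw parameters from the prior, simulate, and keep draws whose simulated data fall close to the observations (all distributions and the tolerance below are illustrative):

        import numpy as np

        def abc_rejection(observed, simulate, prior_sample, distance,
                          eps, n_draws=10000):
            # Rejection ABC: accept theta when the simulation is within eps.
            accepted = []
            for _ in range(n_draws):
                theta = prior_sample()
                if distance(simulate(theta), observed) < eps:
                    accepted.append(theta)
            return np.array(accepted)

        rng = np.random.default_rng(0)
        obs = 2.0  # observed summary statistic
        post = abc_rejection(
            obs,
            simulate=lambda th: rng.normal(th, 1.0),
            prior_sample=lambda: rng.uniform(-5.0, 5.0),
            distance=lambda a, b: abs(a - b),
            eps=0.2)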

  6. Thole's interacting polarizability model in computational chemistry practice

    NARCIS (Netherlands)

    deVries, AH; vanDuijnen, PT; Zijlstra, RWJ; Swart, M

    1997-01-01

    Thole's interacting polarizability model to calculate molecular polarizabilities from interacting atomic polarizabilities is reviewed and its major applications in computational chemistry are illustrated. The applications include prediction of molecular polarizabilities, use in classical expressions

  7. ON GLOBAL STABILITY OF A NONRESIDENT COMPUTER VIRUS MODEL

    Institute of Scientific and Technical Information of China (English)

    Yoshiaki MUROYA; Huaixing LI; Toshikazu KUNIYA

    2014-01-01

    In this paper, we establish new sufficient conditions for the infected equilibrium of a nonresident computer virus model to be globally asymptotically stable. Our results extend two kind of known results in recent literature.

  8. Hybrid Computational Model for High-Altitude Aeroassist Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort addresses a need for accurate computational models to support aeroassist and entry vehicle system design over a broad range of flight conditions...

  9. Emotion in Music: representation and computational modeling

    NARCIS (Netherlands)

    Aljanaki, A.|info:eu-repo/dai/nl/34570956X

    2016-01-01

    Music emotion recognition (MER) deals with music classification by emotion using signal processing and machine learning techniques. Emotion ontology for music is not well established yet. Musical emotion can be conceptualized through various emotional models: categorical, dimensional, or

  10. Computational Models for Analysis of Illicit Activities

    DEFF Research Database (Denmark)

    Nizamani, Sarwat

    Numerous illicit activities happen in our society, which, from time to time affect the population by harming individuals directly or indirectly. Researchers from different disciplines have contributed to developing strategies to analyze such activities, in order to help law enforcement agents....... These models include a model for analyzing evolution of terrorist networks; a text classification model for detecting suspicious text and identification of suspected authors of anonymous emails; and a semantic analysis model for news reports, which may help analyze the illicit activities in certain area...... with location and temporal information. For the network evolution, the hierarchical agglomerative clustering approach has been applied to terrorist networks as case studies. The networks' evolutions show that how individual actors who are initially isolated from each other are converted in small groups, which...

  11. Computational model of cellular metabolic dynamics

    DEFF Research Database (Denmark)

    Li, Yanjun; Solomon, Thomas; Haus, Jacob M

    2010-01-01

    : intracellular metabolite concentrations and patterns of glucose disposal. Model variations were simulated to investigate three alternative mechanisms to explain insulin enhancements: Model 1 (M.1), simple mass action; M.2, insulin-mediated activation of key metabolic enzymes (i.e., hexokinase, glycogen synthase......Identifying the mechanisms by which insulin regulates glucose metabolism in skeletal muscle is critical to understanding the etiology of insulin resistance and type 2 diabetes. Our knowledge of these mechanisms is limited by the difficulty of obtaining in vivo intracellular data. To quantitatively...... distinguish significant transport and metabolic mechanisms from limited experimental data, we developed a physiologically based, multiscale mathematical model of cellular metabolic dynamics in skeletal muscle. The model describes mass transport and metabolic processes including distinctive processes...

  12. Computer modelling of granular material microfracturing

    CSIR Research Space (South Africa)

    Malan, DF

    1995-08-15

    Full Text Available Microscopic observations indicate that intra- and transgranular fracturing are ubiquitous processes in the damage of rock fabrics. Extensive modelling of intergranular fracturing has been carried out previously using the distinct-element approach...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  14. Fish-Friendly Hydropower Turbine Development & Deployment: Alden Turbine Preliminary Engineering and Model Testing

    Energy Technology Data Exchange (ETDEWEB)

    Foust, J. [Voith Hydro, Inc., York, PA (USA); Hecker, G. [Alden Research Laboratory, Inc., Holden, MA (USA); Li, S. [Alden Research Laboratory, Inc., Holden, MA (USA); Allen, G. [Alden Research Laboratory, Inc., Holden, MA (USA)

    2011-10-01

    The Alden turbine was developed through the U.S. Department of Energy's (DOE's) former Advanced Hydro Turbine Systems Program (1994-2006) and, more recently, through the Electric Power Research Institute (EPRI) and the DOE's Wind & Water Power Program. The primary goal of the engineering study described here was to provide a commercially competitive turbine design that would yield fish passage survival rates comparable to or better than the survival rates of bypassing or spilling flow. Although the turbine design was performed for site conditions corresponding to 92 ft (28 m) net head and a discharge of 1500 cfs (42.5 cms), the design can be modified for additional sites with differing operating conditions. During the turbine development, design modifications were identified for the spiral case, distributor (stay vanes and wicket gates), runner, and draft tube to improve turbine performance while maintaining features for high fish passage survival. Computational results for pressure change rates and shear within the runner passage were similar in the original and final turbine geometries, while predicted minimum pressures were higher for the final turbine. The final turbine geometry and resulting flow environments are expected to further enhance the fish passage characteristics of the turbine. Computational results for the final design were shown to improve turbine efficiencies by over 6% at the selected operating condition when compared to the original concept. Prior to the release of the hydraulic components for model fabrication, finite element analysis calculations were conducted for the stay vanes, wicket gates, and runner to verify that structural design criteria for stress and deflections were met. A physical model of the turbine was manufactured and tested with data collected for power and efficiency, cavitation limits, runaway speed, axial and radial thrust, pressure pulsations, and wicket gate torque. All parameters were observed to fall

  15. Computational modelling of the impact of AIDS on business.

    Science.gov (United States)

    Matthews, Alan P

    2007-07-01

    An overview of computational modelling of the impact of AIDS on business in South Africa, with a detailed description of the AIDS Projection Model (APM) for companies, developed by the author, and suggestions for further work. Computational modelling of the impact of AIDS on business in South Africa requires modelling of the epidemic as a whole, and of its impact on a company. This paper gives an overview of epidemiological modelling, with an introduction to the Actuarial Society of South Africa (ASSA) model, the most widely used such model for South Africa. The APM produces projections of HIV prevalence, new infections, and AIDS mortality on a company, based on the anonymous HIV testing of company employees, and projections from the ASSA model. A smoothed statistical model of the prevalence test data is computed, and then the ASSA model projection for each category of employees is adjusted so that it matches the measured prevalence in the year of testing. FURTHER WORK: Further techniques that could be developed are microsimulation (representing individuals in the computer), scenario planning for testing strategies, and models for the business environment, such as models of entire sectors, and mapping of HIV prevalence in time and space, based on workplace and community data.
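
    The matching step the abstract describes, adjusting a projected prevalence curve so it agrees with the prevalence measured in the testing year, can be sketched as a simple rescaling (the APM's smoothed statistical model is more involved, and the figures below are hypothetical):

        def adjust_projection(projection, measured, test_year):
            # Scale a category's projected prevalence series so that it
            # matches the prevalence measured in the year of testing.
            scale = measured / projection[test_year]
            return {year: p * scale for year, p in projection.items()}

        assa = {2005: 0.12, 2006: 0.13, 2007: 0.14}  # hypothetical projection
        print(adjust_projection(assa, measured=0.10, test_year=2006))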

  16. Formal modelling techniques in human-computer interaction

    NARCIS (Netherlands)

    Haan, de G.; Veer, van der G.C.; Vliet, van J.C.

    1991-01-01

    This paper is a theoretical contribution, elaborating the concept of models as used in Cognitive Ergonomics. A number of formal modelling techniques in human-computer interaction will be reviewed and discussed. The analysis focusses on different related concepts of formal modelling techniques in hum

  17. Theoretic computing model of combustion process of asphalt smoke

    Institute of Scientific and Technical Information of China (English)

    HUANG Rui; CHAI Li-yuan; HE De-wen; PENG Bing; WANG Yun-yan

    2005-01-01

    Based on data and methods provided in the research literature, a discretized mathematical model of the combustion process of asphalt smoke is established through theoretical analysis. Through computer programming, the dynamic combustion process of asphalt smoke is calculated to simulate an experimental model. The computed results show that the temperature and the concentration of asphalt smoke influence its burning temperature in an approximately linear manner. The quantity of fuel consumed to ignite the asphalt smoke must be determined from these two factors.

  18. A Situative Space Model for Mobile Mixed-Reality Computing

    DEFF Research Database (Denmark)

    Pederson, Thomas; Janlert, Lars-Erik; Surie, Dipak

    2011-01-01

    This article proposes a situative space model that links the physical and virtual realms and sets the stage for complex human-computer interaction defined by what a human agent can see, hear, and touch, at any given point in time.

  19. Language acquisition and implication for language change: A computational model.

    OpenAIRE

    Clark, Robert A.J.

    1997-01-01

    Computer modeling techniques, when applied to language acquisition problems, give an often unrealized insight into the diachronic change that occurs in language over successive generations. This paper shows that using assumptions about language acquisition to model successive generations of learners in a computer simulation can have a drastic effect on the long term changes that occur in a language. More importantly, it shows that slight changes in the acquisition

  20. Cascade recursion models of computing the temperatures of underground layers

    Institute of Scientific and Technical Information of China (English)

    HAN; Liqun; BI; Siwen; SONG; Shixin

    2006-01-01

    An RBF neural network was used to construct computational models of the temperatures of different underground layers, using ground-surface parameters and the temperatures of various underground layers. Because series recursion models also enable researchers to use above-ground surface parameters to compute the temperatures of different underground layers, this method provides a new way of using thermal infrared remote sensing to monitor the suture zones of large blocks and to investigate thermal anomalies in geologic structures.
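
    A minimal sketch of the RBF-network regression underlying such layer-temperature models, fitting output weights by least squares over Gaussian basis functions (the inputs and centers are illustrative placeholders for surface parameters and layer temperatures):

        import numpy as np

        def rbf_design(X, centers, width):
            # Gaussian RBF features: exp(-||x - c||^2 / width^2).
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            return np.exp(-(d / width) ** 2)

        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, 1.0, size=(200, 3))     # surface parameters
        y = np.sin(X @ np.array([3.0, 1.0, 2.0]))    # stand-in layer temperature

        centers = X[rng.choice(len(X), 20, replace=False)]
        Phi = rbf_design(X, centers, width=0.5)
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # output-layer weights
        y_hat = rbf_design(X, centers, 0.5) @ w      # model predictions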

  1. Mathematical and computational modeling in biology at multiple scales

    OpenAIRE

    Tuszynski, Jack A; Winter, Philip; White, Diana; Tseng, Chih-Yuan; Sahu, Kamlesh K.; Gentile, Francesco; Spasevska, Ivana; Omar, Sara Ibrahim; Nayebi, Niloofar; Churchill, Cassandra DM; Klobukowski, Mariusz; El-Magd, Rabab M Abou

    2014-01-01

    A variety of topics are reviewed in the area of mathematical and computational modeling in biology, covering the range of scales from populations of organisms to electrons in atoms. The use of maximum entropy as an inference tool in the fields of biology and drug discovery is discussed. Mathematical and computational methods and models in the areas of epidemiology, cell physiology and cancer are surveyed. The technique of molecular dynamics is covered, with special attention to force fields f...

  2. Models for the Discrete Berth Allocation Problem: A Computational Comparison

    DEFF Research Database (Denmark)

    Buhrkal, Katja; Zuglian, Sara; Røpke, Stefan

    In this paper we consider the problem of allocating arriving ships to discrete berth locations at container terminals. This problem is recognized as one of the most important processes for any container terminal. We review and describe the three main models of the discrete dynamic berth allocation problem, improve the performance of one model, and, through extensive numerical tests, compare all models from a computational perspective. The results indicate that a generalized set-partitioning model outperforms all other existing models.

  3. Models for the discrete berth allocation problem: A computational comparison

    DEFF Research Database (Denmark)

    Buhrkal, Katja Frederik; Zuglian, Sara; Røpke, Stefan;

    2011-01-01

    In this paper we consider the problem of allocating arriving ships to discrete berth locations at container terminals. This problem is recognized as one of the most important processes for any container terminal. We review and describe three main models of the discrete dynamic berth allocation problem, improve the performance of one model, and, through extensive numerical tests, compare all models from a computational perspective. The results indicate that a generalized set-partitioning model outperforms all other existing models.

  4. Computer Models and Automata Theory in Biology and Medicine

    CERN Document Server

    Baianu, I C

    2004-01-01

    The applications of computers to biological and biomedical problem solving go back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout

  5. Improved Computational Model of Grid Cells Based on Column Structure

    Institute of Scientific and Technical Information of China (English)

    Yang Zhou; Dewei Wu; Weilong Li; Jia Du

    2016-01-01

    To simulate the firing pattern of biological grid cells, this paper presents an improved computational model of grid cells based on column structure. In this model, the displacement along different directions is processed by a modulus operation, and the obtained remainder is associated with the firing rate of the grid cell. Compared with the original model, the improvements are that the base of the modulus operation is changed and the firing rate in the firing field is encoded by a Gaussian-like function. Simulation validates that the firing pattern generated by the improved computational model is more consistent with biological characteristics than the original model. Moreover, although the firing pattern is degraded by cumulative positioning error, the computational model can still generate the regular hexagonal firing pattern when the real-time positioning results are corrected.
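
    The modulus idea in this record is easy to illustrate in one dimension: displacement is wrapped by the grid spacing and the remainder drives a Gaussian-like firing bump (the 1-D simplification and all parameters are illustrative; the paper's model is built on a column structure):

        import numpy as np

        def firing_rate(x, spacing, phase, sigma):
            # Wrap position by the grid spacing (modulus operation) and
            # measure the circular distance to the preferred phase.
            r = (x - phase) % spacing
            d = np.minimum(r, spacing - r)
            return np.exp(-(d / sigma) ** 2)  # Gaussian-like tuning

        x = np.linspace(0.0, 3.0, 301)
        rate = firing_rate(x, spacing=0.5, phase=0.1, sigma=0.08)
        # 'rate' peaks every 0.5 m, mimicking periodic grid-cell firing fields.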

  6. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  7. Computational model of miniature pulsating heat pipes.

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, Mario J.; Givler, Richard C.

    2013-01-01

    The modeling work described herein represents the Sandia National Laboratories (SNL) portion of a collaborative three-year project with Northrop Grumman Electronic Systems (NGES) and the University of Missouri to develop an advanced thermal ground-plane (TGP): a planar device that delivers heat from a source to an ambient environment with high efficiency. Work at all three institutions was funded by DARPA/MTO; Sandia was funded under DARPA/MTO project number 015070924. This is the final report on this project for SNL. This report presents a numerical model of a pulsating heat pipe, a device employing a two-phase (liquid and its vapor) working fluid confined in a closed-loop channel etched/milled into a serpentine configuration in a solid metal plate. The device delivers heat from an evaporator (hot zone) to a condenser (cold zone). This new model includes key physical processes important to the operation of flat plate pulsating heat pipes (e.g. dynamic bubble nucleation, evaporation and condensation), together with conjugate heat transfer with the solid portion of the device. The model qualitatively and quantitatively predicts performance characteristics and metrics, as demonstrated by favorable comparisons with experimental results on similar configurations. Application of the model also corroborated many previous performance observations with respect to key parameters such as heat load, fill ratio and orientation.

  8. Computational model of miniature pulsating heat pipes

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, Mario J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Givler, Richard C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-01-01

    The modeling work described herein represents the Sandia National Laboratories (SNL) portion of a collaborative three-year project with Northrop Grumman Electronic Systems (NGES) and the University of Missouri to develop an advanced thermal ground-plane (TGP): a planar device that delivers heat from a source to an ambient environment with high efficiency. Work at all three institutions was funded by DARPA/MTO; Sandia was funded under DARPA/MTO project number 015070924. This is the final report on this project for SNL. This report presents a numerical model of a pulsating heat pipe, a device employing a two-phase (liquid and its vapor) working fluid confined in a closed-loop channel etched/milled into a serpentine configuration in a solid metal plate. The device delivers heat from an evaporator (hot zone) to a condenser (cold zone). This new model includes key physical processes important to the operation of flat plate pulsating heat pipes (e.g. dynamic bubble nucleation, evaporation and condensation), together with conjugate heat transfer with the solid portion of the device. The model qualitatively and quantitatively predicts performance characteristics and metrics, as demonstrated by favorable comparisons with experimental results on similar configurations. Application of the model also corroborated many previous performance observations with respect to key parameters such as heat load, fill ratio and orientation.

  9. Computational quantum chemistry and adaptive ligand modeling in mechanistic QSAR.

    Science.gov (United States)

    De Benedetti, Pier G; Fanelli, Francesca

    2010-10-01

    Drugs are adaptive molecules. They realize this peculiarity by generating different ensembles of prototropic forms and conformers that depend on the environment. Among the impressive array of available computational drug discovery technologies, quantitative structure-activity relationship approaches that rely on computational quantum chemistry descriptors are the most appropriate to model adaptive drugs. Indeed, computational quantum chemistry descriptors are able to account for the variation of the intramolecular interactions of the training compounds, which reflect their adaptive intermolecular interaction propensities. This enables the development of causative, interpretive and reasonably predictive quantitative structure-activity relationship models, and, hence, sound chemical information directed toward drug design and discovery.

  10. Global Stability of an Epidemic Model of Computer Virus

    Directory of Open Access Journals (Sweden)

    Xiaofan Yang

    2014-01-01

    Full Text Available With the rapid popularization of the Internet, computers can enter or leave the Internet increasingly frequently. In fact, no antivirus software can detect and remove all sorts of computer viruses. This implies that viruses would persist on the Internet. To better understand the spread of computer viruses in these situations, a new propagation model is established and analyzed. The unique equilibrium of the model is shown to be globally asymptotically stable, in accordance with reality. A parameter analysis of the equilibrium is also conducted.
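
    The abstract does not reproduce the paper's equations, so as a hedged stand-in the sketch below integrates a generic SIS-type virus model with nodes joining and leaving the network, which exhibits the kind of globally attracting endemic equilibrium described above.

    ```python
    # A hedged stand-in for the paper's model: a generic SIS-type system with
    # computers joining (Lambda) and leaving (mu) the network; all rates are
    # illustrative, not the paper's.
    from scipy.integrate import solve_ivp

    Lambda, mu, beta, gamma = 5.0, 0.05, 0.4, 0.2

    def rhs(t, y):
        S, I = y                      # susceptible and infected computers
        N = S + I
        dS = Lambda - beta * S * I / N + gamma * I - mu * S
        dI = beta * S * I / N - gamma * I - mu * I
        return [dS, dI]

    sol = solve_ivp(rhs, (0, 400), [99.0, 1.0])
    print("infected at t=400:", sol.y[1, -1])  # settles at the endemic equilibrium
    ```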

  11. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    ......-friendly system, which will make the model development process easier and faster and provide the way for unified and consistent model documentation. The modeller can use the template for their specific problem or to extend and/or adopt a model. This is based on the idea of model reuse, which emphasizes the use...... and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer...... aided methods and tools, that include procedures to perform model translation, model analysis, model verification/validation, model solution and model documentation; 4) model transfer – export/import to/from other application for further extension and application – several types of formats, such as XML...

  12. Computer models to study uterine activation at labour.

    Science.gov (United States)

    Sharp, G C; Saunders, P T K; Norman, J E

    2013-11-01

    Improving our understanding of the initiation of labour is a major aim of modern obstetric research, in order to better diagnose and treat pregnant women in which the process occurs abnormally. In particular, increased knowledge will help us identify the mechanisms responsible for preterm labour, the single biggest cause of neonatal morbidity and mortality. Attempts to improve our understanding of the initiation of labour have been restricted by the inaccessibility of gestational tissues to study during pregnancy and at labour, and by the lack of fully informative animal models. However, computer modelling provides an exciting new approach to overcome these restrictions and offers new insights into uterine activation during term and preterm labour. Such models could be used to test hypotheses about drugs to treat or prevent preterm labour. With further development, an effective computer model could be used by healthcare practitioners to develop personalized medicine for patients on a pregnancy-by-pregnancy basis. Very promising work is already underway to build computer models of the physiology of uterine activation and contraction. These models aim to predict changes and patterns in uterine electrical excitation during term labour. There have been far fewer attempts to build computer models of the molecular pathways driving uterine activation and there is certainly scope for further work in this area. The integration of computer models of the physiological and molecular mechanisms that initiate labour will be particularly useful.

  13. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    Science.gov (United States)

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  14. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  15. Computational social network modeling of terrorist recruitment.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Turnley, Jessica Glicken (Sandia National Laboratories, Albuquerque, NM); Smrcka, Julianne D. (Sandia National Laboratories, Albuquerque, NM); Ko, Teresa H.; Moy, Timothy David (Sandia National Laboratories, Albuquerque, NM); Wu, Benjamin C.

    2004-10-01

    The Seldon terrorist model represents a multi-disciplinary approach to developing organization software for the study of terrorist recruitment and group formation. The need to incorporate aspects of social science added a significant contribution to the vision of the resulting Seldon toolkit. The unique addition of an abstract agent category provided a means for capturing social concepts like cliques, mosques, etc. in a manner that represents their social conceptualization and not simply as physical or economic institutions. This paper provides an overview of the Seldon terrorist model developed to study the formation of cliques, which are used as the major recruitment entity for terrorist organizations.

  16. Computational modelling of buckling of woven fabrics

    CSIR Research Space (South Africa)

    Anandjiwala, RD

    2006-02-01

    Full Text Available ... generalized model of a plain woven fabric and subsequently for modifying Huang’s extension analysis. Although Kang et al. have utilized Huang’s bilinearity in their model, there is an obvious inconsistency in applying classical beam theory to the textile problem... couple which influences the behaviour of textile materials such as yarns and fabrics. This implies that M_a = 0 and B = B*. Substituting these values into Equations (4) to (16) yields equations similar to those for the buckling of a strut...

  17. Scratch as a computational modelling tool for teaching physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language adapted to primary and secondary students, is being used more and more in schools, as it offers students and teachers the opportunity to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.
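
    Scratch programs are block-based and cannot be quoted directly here; as a rough Python equivalent, the sketch below mirrors the sort of model a physics class might assemble from Scratch blocks: a falling ball advanced in small time steps.

    ```python
    # A Python stand-in for a typical Scratch physics model
    # ("forever: change speed by -g*dt; change y by speed*dt").
    dt, g = 0.02, 9.8         # time step (s) and gravity (m/s^2)
    y, vy, t = 10.0, 0.0, 0.0 # initial height, vertical speed, elapsed time

    while y > 0:
        vy -= g * dt          # update speed from acceleration
        y += vy * dt          # update position from speed
        t += dt
    print(f"the ball reaches the ground after about {t:.2f} s")
    ```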

  18. Assessment of weld thickness loss in offshore pipelines using computed radiography and computational modeling

    Energy Technology Data Exchange (ETDEWEB)

    Correa, S.C.A. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Centro Universitario Estadual da Zona Oeste - UEZO, Avenida Manuel Caldeira de Alvarenga, 1203, 23070-200, Rio de Janeiro, RJ (Brazil)], E-mail: scorrea@con.ufrj.br; Souza, E.M. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Departamento de Geologia/IGEO, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Oliveira, D.F. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Silva, A.X. [PEN/COPPE-DNC/Poli-CT, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear, COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Marinho, C.; Camerini, C.S. [CENPES/PDEP/TMEC/PETROBRAS, Ilha do Fundao, Cidade Universitaria, 21949-900, Rio de Janeiro, RJ (Brazil)

    2009-10-15

    In order to guarantee the structural integrity of oil plants, it is crucial to monitor the amount of weld thickness loss in offshore pipelines. However, in spite of its relevance, this parameter is very difficult to determine, due to both the large diameter of most pipes and the complexity of the multi-variable system involved. In this study, a computational model based on the Monte Carlo MCNPX code is combined with computed radiography to estimate the weld thickness loss in large-diameter offshore pipelines. Results show that computational modeling is a powerful tool to estimate intensity variations in radiographic images generated by weld thickness variations, and it can be combined with computed radiography to assess weld thickness loss in offshore and subsea pipelines.
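
    The underlying physics can be illustrated with a one-line calculation: under monoenergetic attenuation, I = I0 exp(-mu t), so a thickness change follows from an intensity ratio. The attenuation coefficient and counts below are illustrative; the MCNPX model in the paper additionally accounts for spectra and scatter.

    ```python
    # Illustrative only: monoenergetic Beer-Lambert attenuation links the
    # intensity ratio between two image regions to a thickness difference.
    import math

    mu = 0.55                        # attenuation coefficient (1/cm), assumed for steel
    I_ref, I_meas = 1000.0, 1180.0   # counts behind nominal vs. thinned weld

    loss_cm = math.log(I_meas / I_ref) / mu   # thinner metal -> higher intensity
    print(f"estimated weld thickness loss: {10 * loss_cm:.2f} mm")
    ```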

  19. Computational Modeling of Fluorescence Loss in Photobleaching

    DEFF Research Database (Denmark)

    Hansen, Christian Valdemar; Schroll, Achim; Wüstner, Daniel

    2015-01-01

    Fluorescence loss in photobleaching (FLIP) is a modern microscopy method for visualization of transport processes in living cells. Although FLIP is widespread, an automated reliable analysis of image data is still lacking. This paper presents a framework for modeling and simulation of FLIP...

  20. Electricity load modelling using computational intelligence

    NARCIS (Netherlands)

    Ter Borg, R.W.

    2005-01-01

    As a consequence of the liberalisation of the electricity markets in Europe, market players have to continuously adapt their future supply to match their customers' demands. This poses the challenge of obtaining a predictive model that accurately describes electricity loads; this is the challenge addressed in this thesis.

  1. A Computational Model of Spatial Development

    Science.gov (United States)

    Hiraki, Kazuo; Sashima, Akio; Phillips, Steven

    Psychological experiments on children's development of spatial knowledge suggest experience at self-locomotion with visual tracking as important factors. Yet, the mechanism underlying development is unknown. We propose a robot that learns to mentally track a target object (i.e., maintaining a representation of an object's position when outside the field-of-view) as a model for spatial development. Mental tracking is treated as prediction of an object's position given the previous environmental state and motor commands, and the current environmental state resulting from movement. Following Jordan & Rumelhart's (1992) forward modeling architecture, the system consists of two components: an inverse model of sensory input to desired motor commands, and a forward model of motor commands to predicted sensory input (goals). The robot was tested on the 'three cups' paradigm (where children are required to select the cup containing the hidden object under various movement conditions). Consistent with child development, without the capacity for self-locomotion the robot's errors are self-centered (egocentric); when given the ability of self-locomotion, the robot responds allocentrically.
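
    A hedged sketch of the forward-model component, assuming linear dynamics purely for illustration: the robot learns to predict the object's next egocentric position from its current estimate and the motor command, which is what sustains tracking once the object leaves the field of view.

    ```python
    # Toy forward model: predict the object's next egocentric position from
    # the current estimate plus the motor command. Linear dynamics assumed.
    import numpy as np

    rng = np.random.default_rng(0)
    # Simulated experience: the object's egocentric position shifts opposite
    # to the robot's own movement.
    X = rng.uniform(-1, 1, size=(500, 4))   # [obj_x, obj_y, move_x, move_y]
    Y = X[:, :2] - X[:, 2:]                 # next egocentric position

    W, *_ = np.linalg.lstsq(X, Y, rcond=None)   # fit the forward model

    state = np.array([0.3, 0.5, 0.2, 0.0])      # object at (0.3, 0.5); step right
    print("predicted position after movement:", state @ W)   # ~ [0.1, 0.5]
    ```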

  2. Emotion in Music: representation and computational modeling

    NARCIS (Netherlands)

    Aljanaki, A.

    2016-01-01

    Music emotion recognition (MER) deals with music classification by emotion using signal processing and machine learning techniques. Emotion ontology for music is not well established yet. Musical emotion can be conceptualized through various emotional models: categorical, dimensional, or domain-spec

  3. Computational Failure Modeling of Lower Extremities

    Science.gov (United States)

    2012-01-01

    [Only a table fragment of this record survived extraction; the row label of the first material was lost] ν = 0.3, σc = 132 MPa, c = 0.1, ρ = 1810 kg/m³ [15]. Trabecular bone: elastic, with a maximum principal stress-based fracture model; E = 300 MPa, ν = 0.45, σc ...

  4. Predictive models applied to groundwater level forecasting: a preliminary experience on the alluvial aquifer of the Magra River (Italy).

    Science.gov (United States)

    Brozzo, Gianpiero; Doveri, Marco; Lelli, Matteo; Scozzari, Andrea

    2010-05-01

    Computer-based decision support systems are attracting growing interest from water management authorities and water distribution companies. This work discusses a preliminary experience in the application of computational intelligence in a hydrological modeling framework, regarding the study area of the alluvial aquifer of the Magra River (Italy). Two sites in the studied area, corresponding to two distinct groups of wells (Battifollo and Fornola), are managed by the local drinking water distribution company (ACAM Acque), which serves the area of La Spezia, on the Ligurian coast. Battifollo has 9 wells with a total extraction rate of about 240 liters per second, while Fornola has 44 wells with an extraction rate of about 900 liters per second. The objective of this work is to use time series from long-term monitoring activities to assess the trend of the groundwater level with respect to a set of environmental and exploitation parameters; this is accomplished by experimenting with a suitable model, eligible to be used as a predictor. This activity starts from the modeling of the system behavior based on a set of input/output data, characterizing it without necessarily requiring prior knowledge of any deterministic mechanism (system identification). In this context, data series collected by continuous hydrological monitoring instrumentation installed at the studied sites, together with meteorological and water extraction data, have been analyzed in order to assess the applicability and performance of a predictive model of the groundwater level. A mixed approach (both data-driven and process-based) has been tested on the whole dataset relating to the last ten years of continuous monitoring activity. The system identification approach presented here is based on the integration of an adaptive technique based on Artificial Neural Networks (ANNs) and a blind deterministic identification approach. According to this concept, the behavior of
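
    A sketch of the data-driven half of such an approach, with synthetic stand-ins for the rainfall and extraction series: an ANN is trained on lagged inputs to predict groundwater level. The feature choices and scales here are assumptions, not the study's.

    ```python
    # Synthetic illustration: an ANN maps one week of lagged rainfall and
    # extraction data to groundwater level.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    n, lags = 1000, 7
    rain = rng.gamma(2.0, 2.0, n)                  # daily rainfall (mm), synthetic
    pumping = 900 + 50 * rng.standard_normal(n)    # extraction (L/s), synthetic
    level = (10 + 0.05 * np.convolve(rain, np.ones(30), "same")
             - 0.004 * pumping + 0.2 * rng.standard_normal(n))

    X = np.column_stack([np.roll(rain, k) for k in range(1, lags + 1)] +
                        [np.roll(pumping, k) for k in range(1, lags + 1)])[lags:]
    y = level[lags:]

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                       random_state=0))
    model.fit(X[:-100], y[:-100])                  # hold out the last 100 days
    print("held-out R^2:", model.score(X[-100:], y[-100:]))
    ```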

  5. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres and providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  6. Enabling Grid Computing resources within the KM3NeT computing model

    Science.gov (United States)

    Filippidis, Christos

    2016-04-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation neutrino detectors that - located at the bottom of the Mediterranean Sea - will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres and providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  7. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white. This book is not just about math, not just about computing, and not just about applications, but about all three; in other words, computational science. Whether us...

  8. PanDA's Role in ATLAS Computing Model Evolution

    CERN Document Server

    Maeno, T; The ATLAS collaboration

    2014-01-01

    During Run 1 at the Large Hadron Collider from 2009-2013, the ATLAS experiment successfully met the computing challenge of accumulating, managing and analyzing a volume of data now exceeding 140 PB, processed at over 100 sites around the world, and accessed by thousands of physicists. This accomplishment required nimbleness and flexibility in the distributed computing infrastructure, both hardware and software, as the operational computing model evolved during the run based on experience. A critical enabler for this evolution was PanDA, the ATLAS workload management system used for production and distributed analysis. PanDA's capabilities were utilized and extended to dynamically and intelligently distribute data and processing workloads across ATLAS resources based on data popularity and resource availability, thereby 'flattening' an originally hierarchical computing model, in order to use resources more efficiently. A new round of PanDA development now taking place will continue to evolve the model for bett...

  9. Computational challenges in modeling and simulating living matter

    Science.gov (United States)

    Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.; de Castro, Maria Clicia Stelling

    2016-12-01

    Computational modeling has been successfully used to help scientists understand physical and biological phenomena. Recent technological advances allow the simulation of larger systems with greater accuracy. However, devising those systems requires new approaches and novel architectures, such as the use of parallel programming, so that the application can run in the new high-performance environments, which are often computer clusters composed of different computation devices, such as traditional CPUs, GPGPUs, Xeon Phis and even FPGAs. It is expected that scientists will take advantage of the increasing computational power to model and simulate more complex structures and even merge different models into larger and more extensive ones. This paper aims at discussing the challenges of using those devices to simulate such complex systems.

  10. A Lumped Computational Model for Sodium Sulfur Battery Analysis

    Science.gov (United States)

    Wu, Fan

    Due to the cost of materials and time-consuming testing procedures, the development of new batteries is a slow and expensive practice. The purpose of this study is to develop a computational model and assess the capabilities of such a model, designed to aid in the design process and control of sodium sulfur batteries. To this end, a transient lumped computational model derived from an integral analysis of the transport of species, energy and charge throughout the battery has been developed. The computation processes are coupled with the use of Faraday's law, and solutions for the species concentrations, electrical potential and current are produced in a time-marching fashion. Properties required for solving the governing equations are calculated and updated as a function of time based on the composition of each control volume. The proposed model is validated against multi-dimensional simulations and experimental results from the literature, and simulation results using the proposed model are presented and analyzed. The computational model and electrochemical model used to solve the equations for the lumped model are compared with similar ones found in the literature. The results obtained from the current model compare favorably with those from experiments and other models.
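
    The core of the time-marching idea can be shown in a few lines: at each step, Faraday's law converts the cell current into a change in species inventory. All numbers below are illustrative, not from the study.

    ```python
    # Miniature time-marching loop: Faraday's law, dn/dt = -I/(zF), consumes
    # the reacting species as charge flows.
    F = 96485.0              # Faraday constant (C/mol)
    z = 2                    # electrons per mole reacted (assumed)
    I = 30.0                 # discharge current (A)
    dt, t_end = 1.0, 3600.0  # 1 s steps for one hour

    n_S, t = 5.0, 0.0        # moles of sulfur in the control volume
    while t < t_end and n_S > 0:
        n_S -= I / (z * F) * dt
        t += dt
    print(f"sulfur remaining after {t / 3600:.1f} h: {n_S:.3f} mol")
    ```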

  11. Modeling Strategic Use of Human Computer Interfaces with Novel Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Laura Jane Mariano

    2015-07-01

    Full Text Available Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game’s functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic
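
    The BP-HMM machinery is involved; as a deliberately simplified stand-in, the sketch below decodes a toy activity log with an ordinary two-state HMM via the Viterbi algorithm. States, actions, and probabilities are all hypothetical.

    ```python
    # Simplified stand-in for the BP-HMM: Viterbi decoding of a toy activity
    # log with an ordinary two-state HMM.
    import numpy as np

    states = ["exploring", "exploiting"]
    pi = np.log([0.6, 0.4])                  # initial state probabilities
    A = np.log([[0.8, 0.2], [0.3, 0.7]])     # state transition matrix
    B = np.log([[0.5, 0.4, 0.1],             # emissions over logged actions:
                [0.1, 0.1, 0.8]])            # open_menu, undo, commit

    obs = [0, 1, 0, 2, 2, 2, 1, 2]           # one participant's action sequence
    V, back = pi + B[:, obs[0]], []
    for o in obs[1:]:
        scores = V[:, None] + A              # score of every state-to-state move
        back.append(scores.argmax(axis=0))   # best predecessor of each state
        V = scores.max(axis=0) + B[:, o]

    path = [int(V.argmax())]                 # backtrack the most likely path
    for ptr in reversed(back):
        path.append(int(ptr[path[-1]]))
    print([states[s] for s in reversed(path)])
    ```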

  12. Images as drivers of progress in cardiac computational modelling.

    Science.gov (United States)

    Lamata, Pablo; Casero, Ramón; Carapella, Valentina; Niederer, Steve A; Bishop, Martin J; Schneider, Jürgen E; Kohl, Peter; Grau, Vicente

    2014-08-01

    Computational models have become a fundamental tool in cardiac research. Models are evolving to cover multiple scales and physical mechanisms. They are moving towards mechanistic descriptions of personalised structure and function, including effects of natural variability. These developments are underpinned to a large extent by advances in imaging technologies. This article reviews how novel imaging technologies, or the innovative use and extension of established ones, integrate with computational models and drive novel insights into cardiac biophysics. In terms of structural characterization, we discuss how imaging is allowing a wide range of scales to be considered, from cellular levels to whole organs. We analyse how the evolution from structural to functional imaging is opening new avenues for computational models, and in this respect we review methods for measurement of electrical activity, mechanics and flow. Finally, we consider ways in which combined imaging and modelling research is likely to continue advancing cardiac research, and identify some of the main challenges that remain to be solved.

  13. Computer modeling of road bridge for simulation moving load

    Directory of Open Access Journals (Sweden)

    Miličić Ilija M.

    2016-01-01

    Full Text Available This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders made up of 1D finite elements, using the CAA applications Tower and Bridge Designer 2016 (2nd Edition). Computer simulations were run for each model to compare the effects of moving load according to the recommendations of the two standards, SRPS and AASHTO. The variant of the bridge structure modelled in Bridge Designer 2016 (2nd Edition) was modelled identically in the Tower environment. As important information for the selection of a computer application, we point out that Bridge Designer 2016 (2nd Edition) was not able to treat the moving-load model required by the national standard V600.

  14. A novel computer simulation for modeling grain growth

    Energy Technology Data Exchange (ETDEWEB)

    Chen, L.Q. (Pennsylvania State Univ., University Park, PA (United States). Dept. of Materials Science and Engineering)

    1995-01-01

    In this paper, the author proposes a new computer simulation model for investigating grain growth kinetics, born from recent work on the domain growth kinetics of a quenched system with many non-conserved order parameters. A key new feature of this model for studying grain growth is that the grain boundaries are diffuse, as opposed to previous mean-field and statistical theories and Monte-Carlo simulations, which assumed that grain boundaries were sharp. Unlike the Monte-Carlo simulations, in which grain boundaries are made up of kinks, grain boundaries in the continuum model are smooth. Below, he describes this model in detail, gives prescriptions for computer simulation, and then presents computer simulation results on a two-dimensional model system.
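
    In the same spirit as the model described (though with illustrative coefficients, not the author's), the sketch below evolves several non-conserved order parameters under Allen-Cahn dynamics, producing grains separated by diffuse boundaries.

    ```python
    # Diffuse-interface grain growth sketch: Q non-conserved order parameters
    # relaxed by Allen-Cahn dynamics on a periodic 2D grid. Coefficients are
    # illustrative.
    import numpy as np

    N, Q, dt, L, kappa = 64, 4, 0.1, 1.0, 0.5
    rng = np.random.default_rng(0)
    eta = rng.uniform(0.0, 0.01, (Q, N, N))   # one order parameter per grain

    def lap(f):   # periodic 5-point Laplacian
        return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f)

    for _ in range(500):
        sq = np.sum(eta**2, axis=0)
        for q in range(Q):
            # variational derivative of a standard multi-well free energy
            df = -eta[q] + eta[q]**3 + 3.0 * eta[q] * (sq - eta[q]**2)
            eta[q] += -L * dt * (df - kappa * lap(eta[q]))
    # Each site is now dominated by one eta_q; the boundaries between grains
    # remain diffuse, unlike the kinked boundaries of Monte-Carlo models.
    ```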

  15. Integrated Multiscale Modeling of Molecular Computing Devices

    Energy Technology Data Exchange (ETDEWEB)

    Gregory Beylkin

    2012-03-23

    Significant advances were made on all objectives of the research program. We have developed fast multiresolution methods for performing electronic structure calculations with emphasis on constructing efficient representations of functions and operators. We extended our approach to problems of scattering in solids, i.e. constructing fast algorithms for computing above the Fermi energy level. Part of the work was done in collaboration with Robert Harrison and George Fann at ORNL. Specific results (in part supported by this grant) are listed here and are described in greater detail. (1) We have implemented a fast algorithm to apply the Green's function for the free space (oscillatory) Helmholtz kernel. The algorithm maintains its speed and accuracy when the kernel is applied to functions with singularities. (2) We have developed a fast algorithm for applying periodic and quasi-periodic, oscillatory Green's functions and those with boundary conditions on simple domains. Importantly, the algorithm maintains its speed and accuracy when applied to functions with singularities. (3) We have developed a fast algorithm for obtaining and applying multiresolution representations of periodic and quasi-periodic Green's functions and Green's functions with boundary conditions on simple domains. (4) We have implemented modifications to improve the speed of adaptive multiresolution algorithms for applying operators which are represented via a Gaussian expansion. (5) We have constructed new nearly optimal quadratures for the sphere that are invariant under the icosahedral rotation group. (6) We obtained new results on approximation of functions by exponential sums and/or rational functions, one of the key methods that allows us to construct separated representations for Green's functions. (7) We developed a new fast and accurate reduction algorithm for obtaining optimal approximation of functions by exponential sums and/or their rational representations.
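
    Result (6) can be illustrated on a toy scale: fit 1/x on [1, 10] by a short sum of exponentials with fixed, hypothetical exponents and least-squares weights. The report's reduction algorithms choose both exponents and weights near-optimally, which this sketch does not attempt.

    ```python
    # Toy exponential-sum approximation of 1/x on [1, 10] with fixed,
    # assumed exponents and least-squares weights.
    import numpy as np

    x = np.linspace(1.0, 10.0, 400)
    target = 1.0 / x

    exps = np.array([0.1, 0.3, 0.9, 2.7])            # assumed exponents s_k
    A = np.exp(-np.outer(x, exps))                   # columns exp(-s_k * x)
    w, *_ = np.linalg.lstsq(A, target, rcond=None)   # least-squares weights

    print(f"4-term max error on [1, 10]: {np.max(np.abs(A @ w - target)):.2e}")
    ```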

  16. Instability phenomena in plasticity: Modelling and computation

    Science.gov (United States)

    Stein, E.; Steinmann, P.; Miehe, C.

    1995-12-01

    We presented aspects and results related to the broad field of strain localization, with special focus on large-strain elastoplastic response. We first re-examined issues related to the classification of discontinuities and the classical description of localization, with a particular emphasis on an Eulerian geometric representation. We touched on the problem of mesh objectivity and discussed results of a particular regularization method, namely the micropolar approach. Generally, regularization has to preserve ellipticity and to reflect the underlying physics: for example, ductile materials have to be modelled including viscous effects, whereas geomaterials are adequately described by the micropolar approach. We then considered localization phenomena within solids undergoing large-strain elastoplastic deformations. Here, we documented the influence of isotropic damage on the failure analysis. Next, the interesting influence of an orthotropic yield condition on the spatial orientation of localized zones was studied. Finally, we investigated the localization condition for an algorithmic model of finite-strain single-crystal plasticity.

  17. Computational Intelligence in a Human Brain Model

    Directory of Open Access Journals (Sweden)

    Viorel Gaftea

    2016-06-01

    Full Text Available This paper focuses on current trends in the brain research domain and the current stage of development of research into software and hardware solutions, communication capabilities between human beings and machines, new technologies, nano-science and Internet of Things (IoT devices. The proposed model of the human brain assumes a strong similarity between human intelligence and the thinking process of the chess game. Tactical and strategic reasoning and the need to follow the rules of the chess game are all very similar to the activities of the human brain. The main objectives of a living being and of a chess player are the same: securing a position, surviving and eliminating the adversaries. The brain resolves these goals, and moreover, a being's movement, actions and speech are sustained by the five vital senses and equilibrium. The chess game strategy helps us better understand the human brain and replicate it more easily in the proposed ‘Software and Hardware’ (SAH model.

  19. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  20. Hearing dummies: individualized computer models of hearing impairment.

    Science.gov (United States)

    Panda, Manasa R; Lecluyse, Wendy; Tan, Christine M; Jürgens, Tim; Meddis, Ray

    2014-10-01

    Objective: Our aim was to explore the use of individualized computer models to simulate hearing loss based on detailed psychophysical assessment and to offer hypothetical diagnoses of the underlying pathology. Individualized computer models of normal and impaired hearing were constructed and evaluated using psychophysical data obtained from human listeners. Computer models of impaired hearing were generated to reflect the hypothesized underlying pathology (e.g. dead regions, outer hair cell dysfunction, or reductions in endocochlear potential). These models were evaluated in terms of their ability to replicate the original patient data. Auditory profiles were measured for two normal-hearing and five hearing-impaired listeners using a battery of three psychophysical tests (absolute thresholds, frequency selectivity, and compression). The individualized computer models were found to match the data. Useful fits to the impaired profiles could be obtained by changing only a single parameter in the model of normal hearing; sometimes, however, it was necessary to include an additional dead region. The creation of individualized computer models of hearing loss can be used to simulate auditory profiles of impaired listeners and to suggest hypotheses concerning the underlying peripheral pathology.