WorldWideScience

Sample records for computer model called

  1. Turchin's Relation for Call-by-Name Computations: A Formal Approach

    Directory of Open Access Journals (Sweden)

    Antonina Nepeivoda

    2016-07-01

    Supercompilation is a program transformation technique that was first described by V. F. Turchin in the 1970s. In supercompilation, Turchin's relation as a similarity relation on call-stack configurations is used both for call-by-value and call-by-name semantics to terminate unfolding of the program being transformed. In this paper, we give a formal grammar model of call-by-name stack behaviour. We classify the model in terms of the Chomsky hierarchy and then formally prove that Turchin's relation can terminate all computations generated by the model.

  2. Call Centre - Computer Telephone Integration

    Directory of Open Access Journals (Sweden)

    Dražen Kovačević

    2012-10-01

    Call centres largely came into being as a result of consumer needs converging with enabling technology, and by companies recognising the revenue opportunities generated by meeting those needs, thereby increasing customer satisfaction. Regardless of the specific application or activity of a Call centre, customer satisfaction with the interaction is critical to the revenue generated or protected by the Call centre. Physically, a Call centre set-up is a place that includes computer, telephone and supervisor stations. A Call centre can be available 24 hours a day - when the customer wants to make a purchase, needs information, or simply wishes to register a complaint.

  3. Computer Assisted Language Learning (CALL)

    Directory of Open Access Journals (Sweden)

    Nazlı Gündüz

    2005-10-01

    This article provides an overview of computers; an overview of the history of CALL, its pros and cons, the Internet, the World Wide Web, multimedia, and research related to the uses of computers in the language classroom. It also aims to provide some background for beginners on using the Internet in language classes today. It discusses some of the common types of Internet activities that are being used today, what the minimum requirements are for using the Internet for language learning, and some easy activities you can adapt for your classes. Some special terminology related to computers will also be used in this paper. For example, computer assisted language learning (CALL) refers to the use of computers in the language classroom. It should be borne in mind that CALL does not refer to the use of a computer by a teacher to type out a worksheet or a class list, or to prepare his/her own teaching alone. Hardware refers to any computer equipment used, including the computer itself, the keyboard, the screen (or the monitor), the disc-drive, and the printer. Software (computer programs) refers to the sets of instructions which need to be loaded into the computer for it to be able to work.

  4. Interactionability in Computer Game: Call of Duty

    Directory of Open Access Journals (Sweden)

    masoud Kowsari

    2009-11-01

    For a long time communication theorists have criticized public media for being unilateral. What they prescribe is to transform the pattern of communication into a bilateral one; in other words, to make media interactional. The telephone was the first fully interactional medium; however, there was a long road to the contemporary communication and information media, which are highly interactional. Nevertheless, not all modern media are equally interactional. Therefore, it is necessary to evaluate levels of interactionability in modern media and to strengthen them. As with other concepts in communication, interaction has different definitions, implying various aspects of the audience-media relation, and this multi-dimensionality has to be considered. Quistis (2002) suggests a model in which all the technical, social, and comprehensive aspects of interaction are inherent. In other words, not only the technical aspect of the media but also the audience’s perception plays a key role in the model. Visual computer games are a good instance of interactionability in modern media. However, not all games are equally interactional. Analyzing the well-known computer game “Call of Duty”, this article attempts to study different levels of interactionability. The main question is: how can one offer a pragmatic definition of the three dimensions of interactionability for studying computer games, and how are these features applied in Call of Duty?

  5. The Effects of the CALL Model on College English Reading Teaching

    Directory of Open Access Journals (Sweden)

    Dan Zhang

    2017-12-01

    Computer Assisted Language Learning (CALL) is an important concept in English teaching method reform. College students’ English reading ability is an important indicator in the evaluation of college students’ English proficiency. Therefore, this paper applies the CALL model to English reading teaching. Firstly, it introduces the application and development prospects of the CALL model, and analyzes its advantages and disadvantages; secondly, it analyzes the present situation of college English teaching and its influencing factors and then designs an application example to integrate the CALL model with different aspects of English reading. Finally, it analyzes the teaching results of college English reading under the CALL model. Therefore, in both theory and practice, this paper demonstrates the effectiveness and innovativeness of the CALL model.

  6. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for developing asynchronous and concurrent programs compositionally, which frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and supports compositional development. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for controlling and operating can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have three terminals and four ordered rules at most, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can briefly express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it highly applicable to compositional bit-level computation, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our research may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.

  7. Computer Assisted Language Learning (CALL) Software: Evaluation ...

    African Journals Online (AJOL)

    Evaluating the nature and extent of the influence of Computer Assisted Language Learning (CALL) on the quality of language learning is highly problematic. This is owing to the number and complexity of interacting variables involved in setting the items for teaching and learning languages. This paper identified and ...

  8. Attitudes of Jordanian Undergraduate Students towards Using Computer Assisted Language Learning (CALL)

    Directory of Open Access Journals (Sweden)

    Farah Jamal Abed Alrazeq Saeed

    2018-01-01

    The study aimed at investigating the attitudes of Jordanian undergraduate students towards using computer-assisted language learning (CALL) and its effectiveness in the process of learning the English language. In order to fulfill the study’s objective, the researchers used a questionnaire to collect data, followed up with semi-structured interviews to investigate the students’ beliefs towards CALL. Twenty-one Jordanian BA students majoring in English language and literature were selected according to simple random sampling. The results revealed positive attitudes towards CALL in facilitating the process of writing assignments, gaining information, making learning enjoyable, improving their creativity, productivity, academic achievement, and critical thinking skills, and enhancing their knowledge about vocabulary, grammar, and culture. Furthermore, they believed that computers can motivate them to learn the English language and help them to communicate and interact with their teachers and colleagues. The researchers recommended conducting research on the same topic, taking into consideration the variables of age, gender, experience in using computers, and computer skills.

  9. Pre-Service Teachers' Uses of and Barriers from Adopting Computer-Assisted Language Learning (CALL) Programs

    Science.gov (United States)

    Samani, Ebrahim; Baki, Roselan; Razali, Abu Bakar

    2014-01-01

    Success in implementation of computer-assisted language learning (CALL) programs depends on the teachers' understanding of the roles of CALL programs in education. Consequently, it is also important to understand the barriers teachers face in the use of computer-assisted language learning (CALL) programs. The current study was conducted on 14…

  10. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, George S.; Brown, William Michael

    2007-09-01

    Techniques for high throughput determinations of interactomes, together with high resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  11. Turchin's Relation for Call-by-Name Computations: A Formal Approach

    OpenAIRE

    Antonina Nepeivoda

    2016-01-01

    Supercompilation is a program transformation technique that was first described by V. F. Turchin in the 1970s. In supercompilation, Turchin's relation as a similarity relation on call-stack configurations is used both for call-by-value and call-by-name semantics to terminate unfolding of the program being transformed. In this paper, we give a formal grammar model of call-by-name stack behaviour. We classify the model in terms of the Chomsky hierarchy and then formally prove that Turchin's rel...

  12. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

    In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role as theories of human cognition. Many computational models have successfully simulated the results of controlled psychological experiments. However, there have been only a few attempts to apply the models to complex realistic phenomena. We call such a situation an "open-ended situation". In this study, MAC/FAC ("many are called, but few are chosen"), proposed by [Forbus 95], which models the two stages of analogical reasoning, was applied to our open-ended psychological experiment. In our experiment, subjects were presented with a cue story and retrieved cases that had been learned in their everyday life. Following this, they rated the inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support MAC/FAC's theoretical assumption that different similarities are involved in the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognition.
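    The content-vector stage can be made concrete with a small sketch. The Python fragment below is only an illustration of the MAC idea (cheap similarity between flat predicate-count vectors used to pre-select candidates before structural matching); the predicate names and stored cases are invented, and it does not reproduce the authors' implementation.

    ```python
    # Minimal sketch of the MAC stage idea from MAC/FAC: each case is summarized
    # by a "content vector" counting its predicates, and cheap dot-product
    # similarity pre-selects candidates before structural matching.
    # Predicate names and cases are hypothetical illustrations, not the paper's data.
    from collections import Counter
    from math import sqrt

    def content_vector(predicates):
        """Count predicate occurrences to form a flat content vector."""
        return Counter(predicates)

    def dot(u, v):
        return sum(u[k] * v[k] for k in set(u) | set(v))

    def cosine(u, v):
        denom = sqrt(dot(u, u)) * sqrt(dot(v, v))
        return dot(u, v) / denom if denom else 0.0

    probe = content_vector(["cause", "flow", "greater", "pressure", "pressure"])
    memory = {
        "heat-flow story": content_vector(["cause", "flow", "greater", "temperature", "temperature"]),
        "shopping story": content_vector(["buy", "pay", "own"]),
    }

    # MAC stage: rank stored cases by cheap content-vector similarity.
    ranked = sorted(memory.items(), key=lambda kv: cosine(probe, kv[1]), reverse=True)
    for name, vec in ranked:
        print(f"{name}: {cosine(probe, vec):.2f}")
    ```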

  13. Computer Assisted Language Learning (CALL): Using Internet for Effective Language Learning

    NARCIS (Netherlands)

    Kremenska, Anelly

    2006-01-01

    Please, cite this publication as: Kremenska, A. (2006). Computer Assisted Language Learning (CALL): Using Internet for Effective Language Learning. Proceedings of International Workshop in Learning Networks for Lifelong Competence Development, TENCompetence Conference. March 30th-31st, Sofia,

  14. SPLAI: Computational Finite Element Model for Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ruzana Ishak

    2006-01-01

    Wireless sensor network refers to a group of sensors, linked by a wireless medium to perform a distributed sensing task. The primary interest is their capability in monitoring the physical environment through the deployment of numerous tiny, intelligent, wireless networked sensor nodes. Our interest consists of a sensor network which includes a few specialized nodes, called processing elements, that have some limited computational capabilities. In this paper, we propose a model called SPLAI that allows the network to compute a finite element problem, where the processing elements are modeled as the nodes in the linear triangular approximation problem. Our model also considers the case of some failures of the sensors. A simulation model to visualize this network has been developed using C++ in the Windows environment.
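    For reference, the linear triangular approximation mentioned above is the standard first-order finite element on a triangle; the formulas below are the textbook form of its shape functions and are not taken from the paper itself. For a triangle with vertices $(x_1,y_1)$, $(x_2,y_2)$, $(x_3,y_3)$ and area $A$, a field is approximated as $u(x,y) \approx \sum_{i=1}^{3} N_i(x,y)\, u_i$ with

    $$N_i(x,y) = \frac{a_i + b_i x + c_i y}{2A}, \qquad a_i = x_j y_k - x_k y_j, \quad b_i = y_j - y_k, \quad c_i = x_k - x_j,$$

    where $(i,j,k)$ runs over the cyclic permutations of $(1,2,3)$ and $2A = a_1 + a_2 + a_3$.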

  15. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU

    Science.gov (United States)

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation does at the group-level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis. PMID:23840507

  16. Computer-Aided Transformation of PDE Models: Languages, Representations, and a Calculus of Operations

    Science.gov (United States)

    2016-01-05

    A domain-specific embedded language called ibvp was developed to model initial boundary value problems for physical and engineered systems and to support computer-aided transformation of PDE models through suitable languages, representations, and a calculus of operations.

  17. Calling computers names in Swedish

    International Nuclear Information System (INIS)

    Carlsson, Johan

    2017-01-01

    I very much enjoyed reading Jim Fleming’s article on Carl-Gustaf Rossby and the seminal contributions Rossby made to meteorology. However, the otherwise excellent article has two errors. Something must have gotten lost in translation to cause Fleming to claim that “Rossby pursued numerical weather prediction in Sweden in an era in which there was no Swedish word for digital computer.” With applied mathematician Germund Dahlquist, Rossby developed a weather model for the Binär Elektronisk Sekvens Kalkylator (BESK; Binary Electronic Sequence Calculator). Designed and built in Sweden, BESK was the world’s fastest computer when it became operational in 1953. From September 1954, BESK weather simulations enabled routine 24-hour national forecasts.

  18. Performance indicators for call centers with impatience

    NARCIS (Netherlands)

    Jouini, O.; Koole, G.M.; Roubos, A.

    2013-01-01

    An important feature of call center modeling is the presence of impatient customers. This article considers single-skill call centers including customer abandonments. A number of different service-level definitions are structured, including all those used in practice, and the explicit computation of

  19. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  20. Risk factors for computer visual syndrome (CVS) among operators of two call centers in São Paulo, Brazil.

    Science.gov (United States)

    Sa, Eduardo Costa; Ferreira Junior, Mario; Rocha, Lys Esther

    2012-01-01

    The aims of this study were to investigate work conditions, to estimate the prevalence and to describe risk factors associated with Computer Vision Syndrome among operators of two call centers in São Paulo (n = 476). The methods include a quantitative cross-sectional observational study and an ergonomic work analysis, using work observation, interviews and questionnaires. The case definition was the presence of one or more specific ocular symptoms answered as always, often or sometimes. The multiple logistic regression model was created using the stepwise forward likelihood method, retaining the variables with significance levels below 5% (p < 0.05); [...] vision (43.5%). The prevalence of Computer Vision Syndrome was 54.6%. Associations verified were: being female (OR 2.6, 95% CI 1.6 to 4.1), lack of recognition at work (OR 1.4, 95% CI 1.1 to 1.8), organization of work in the call center (OR 1.4, 95% CI 1.1 to 1.7) and high demand at work (OR 1.1, 95% CI 1.0 to 1.3). The organization and psychosocial factors at work should be included in prevention programs of visual syndrome among call center operators.
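    As a hedged aside on how figures like the odds ratios above arise: the study reports adjusted odds ratios from a multiple logistic regression, which the sketch below does not reproduce; it only shows the textbook computation of a crude odds ratio and its 95% confidence interval from a hypothetical 2x2 exposure-outcome table (all counts invented).

    ```python
    # Crude odds ratio and 95% CI from a 2x2 table (Woolf/logit method).
    # The counts are invented for illustration only; the study reports adjusted
    # ORs from a multiple logistic regression, which this sketch does not reproduce.
    from math import log, exp, sqrt

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """a=exposed cases, b=exposed non-cases, c=unexposed cases, d=unexposed non-cases."""
        or_ = (a * d) / (b * c)
        se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts: CVS cases vs non-cases among female and male operators.
    or_, lo, hi = odds_ratio_ci(a=180, b=120, c=80, d=96)
    print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
    ```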

  1. Attitudes of Jordanian Undergraduate Students towards Using Computer Assisted Language Learning (CALL)

    Science.gov (United States)

    Saeed, Farah Jamal Abed Alrazeq; Al-Zayed, Norma Nawaf

    2018-01-01

    The study aimed at investigating the attitudes of Jordanian undergraduate students towards using computer assisted-language learning (CALL) and its effectiveness in the process of learning the English language. In order to fulfill the study's objective, the researchers used a questionnaire to collect data, followed-up with semi-structured…

  2. Analytical Call Center Model with Voice Response Unit and Wrap-Up Time

    Directory of Open Access Journals (Sweden)

    Petr Hampl

    2015-01-01

    The last twenty years of computer integration significantly changed the process of service in call center service systems. The basic building modules of classical call centers – a switching system and a group of human agents – were extended with other special modules such as a skills-based routing module, an automatic call distribution module, an interactive voice response module and others to minimize the customer waiting time and wage costs. A calling customer of a modern call center is served in the first stage by the interactive voice response module without any human interaction. If the customer requirements are not satisfied in the first stage, the service continues to the second stage, realized by the group of human agents. The service time of the second stage – the average handle time – is divided into a conversation time and a wrap-up time. During the conversation time, the agent answers customer questions and collects the customer's requirements, and during the wrap-up time (administrative time) the agent completes the task without any customer interaction. The analytical model presented in this contribution is solved under the condition of statistical equilibrium and takes into account the interactive voice response module service time, the conversation time and the wrap-up time.
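    The paper's analytical model keeps the IVR stage, the conversation time and the wrap-up time separate. A much cruder, commonly used baseline folds conversation plus wrap-up into a single mean handle time and evaluates the agent group as an M/M/c (Erlang C) queue; the sketch below shows only that baseline, with made-up traffic figures, not the model of the paper.

    ```python
    # Simplified baseline for the agent stage: fold conversation time and wrap-up
    # time into one mean handle time and evaluate an M/M/c (Erlang C) queue.
    # This is a textbook approximation, not the two-stage model of the paper.
    from math import factorial

    def erlang_c(arrival_rate, mean_handle_time, agents):
        """Return (probability of waiting, average speed of answer)."""
        a = arrival_rate * mean_handle_time          # offered load in Erlangs
        if a >= agents:
            raise ValueError("Unstable: offered load exceeds number of agents")
        top = (a ** agents / factorial(agents)) * (agents / (agents - a))
        bottom = sum(a ** k / factorial(k) for k in range(agents)) + top
        p_wait = top / bottom
        asa = p_wait * mean_handle_time / (agents - a)   # mean wait in queue
        return p_wait, asa

    # Example: 120 calls/hour, 4 min conversation + 1 min wrap-up per call, 12 agents.
    lam = 120 / 60.0            # calls per minute
    aht = 4.0 + 1.0             # minutes (conversation + wrap-up)
    p_wait, asa = erlang_c(lam, aht, agents=12)
    print(f"P(wait) = {p_wait:.2f}, ASA = {asa:.2f} min")
    ```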

  3. Perceiving a calling, living a calling, and job satisfaction: testing a moderated, multiple mediator model.

    Science.gov (United States)

    Duffy, Ryan D; Bott, Elizabeth M; Allan, Blake A; Torrey, Carrie L; Dik, Bryan J

    2012-01-01

    The current study examined the relation between perceiving a calling, living a calling, and job satisfaction among a diverse group of employed adults who completed an online survey (N = 201). Perceiving a calling and living a calling were positively correlated with career commitment, work meaning, and job satisfaction. Living a calling moderated the relations of perceiving a calling with career commitment and work meaning, such that these relations were more robust for those with a stronger sense they were living their calling. Additionally, a moderated, multiple mediator model was run to examine the mediating role of career commitment and work meaning in the relation of perceiving a calling and job satisfaction, while accounting for the moderating role of living a calling. Results indicated that work meaning and career commitment fully mediated the relation between perceiving a calling and job satisfaction. However, the indirect effects of work meaning and career commitment were only significant for individuals with high levels of living a calling, indicating the importance of living a calling in the link between perceiving a calling and job satisfaction. Implications for research and practice are discussed. (c) 2012 APA, all rights reserved.

  4. Computer Vision Syndrome among Call Center Employees at Telecommunication Company in Bandung

    Directory of Open Access Journals (Sweden)

    Ghea Nursyifa

    2016-06-01

    Background: The occurrence of Computer Vision Syndrome (CVS) at the workplace has increased within decades due to the prolonged use of computers. Knowledge of CVS is necessary in order to develop an awareness of how to prevent and alleviate its prevalence. The objective of this study was to assess the knowledge of CVS among call center employees and to explore the most frequent CVS symptom experienced by the workers. Methods: A descriptive cross-sectional study was conducted during the period of September to November 2014 at a telecommunication company in Bandung using a questionnaire consisting of 30 questions. Out of the 30 questions/statements, 15 statements were about knowledge of CVS and the other 15 questions were about the occurrence of CVS and its symptoms. In this study 125 call center employees participated as respondents, selected by consecutive sampling. The level of knowledge was divided into 3 categories: good (76–100%), fair (56–75%) and poor (<56%). The collected data were presented in frequency tabulations. Results: 74.4% of the respondents had poor knowledge of CVS. The most frequent symptom experienced by the respondents was asthenopia. Conclusions: CVS occurs in call center employees with various symptoms and signs. This situation is not supported by good knowledge of the syndrome, which can hamper prevention programs.

  5. BUSINESS MODELS FOR EXTENDING OF 112 EMERGENCY CALL CENTER CAPABILITIES WITH E-CALL FUNCTION INSERTION

    Directory of Open Access Journals (Sweden)

    Pop Dragos Paul

    2010-12-01

    The present article concerns the present status of implementation of the eCall service in Romania and Europe and the proposed business models regarding eCall function implementation in Romania. The eCall system is used for reliable transmission, in case of a crash, between the In-Vehicle System and the Public Safety Answering Point, via the voice channel of cellular and Public Switched Telephone Networks (PSTN). The eCall service can be initiated automatically or manually by the driver. All data presented in this article are part of research carried out by the authors in the sectorial contract "Implementation study regarding the eCall system", having as partners ITS Romania and Electronic Solution, with the Romanian Ministry of Communication and Information Technology as beneficiary.

  6. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    Science.gov (United States)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models, by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents a developed package called GRAVTool based on MATLAB software to compute local geoid models by RCR technique and its application in a study area. The studied area comprehends the federal district of Brazil, with ~6000 km², wavy relief, heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example on the studied area show the local geoid model computed by the GRAVTool package (Figure), using 1377 terrestrial gravity data, SRTM data with 3 arc second of resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ± 0.071 m, RMS = 0.069 m, maximum = 0.178 m and minimum = -0.123 m) matches the uncertainty (σ =± 0.073) of 21 points randomly spaced where the geoid was computed by geometrical leveling technique supported by positioning GNSS. The results were also better than those achieved by Brazilian official regional geoid model (σ = ± 0.099 m, RMS = 0.208 m, maximum = 0.419 m and minimum = -0.040 m).

  7. Computer-Assisted Language Learning (CALL) in Support of (Re)-Learning Native Languages: The Case of Runyakitara

    Science.gov (United States)

    Katushemererwe, Fridah; Nerbonne, John

    2015-01-01

    This study presents the results from a computer-assisted language learning (CALL) system of Runyakitara (RU_CALL). The major objective was to provide an electronic language learning environment that can enable learners with mother tongue deficiencies to enhance their knowledge of grammar and acquire writing skills in Runyakitara. The system…

  8. From Computer Assisted Language Learning (CALL) to Mobile Assisted Language Use (MALU)

    Science.gov (United States)

    Jarvis, Huw; Achilleos, Marianna

    2013-01-01

    This article begins by critiquing the long-established acronym CALL (Computer Assisted Language Learning). We then go on to report on a small-scale study which examines how student non-native speakers of English use a range of digital devices beyond the classroom in both their first (L1) and second (L2) languages. We look also at the extent to…

  9. A Generative Computer Model for Preliminary Design of Mass Housing

    Directory of Open Access Journals (Sweden)

    Ahmet Emre DİNÇER

    2014-05-01

    Today, we live in what we call the “Information Age”, an age in which information technologies are constantly being renewed and developed. Out of this has emerged a new approach called “Computational Design” or “Digital Design”. In addition to significantly influencing all fields of engineering, this approach has come to play a similar role in all stages of the design process in the architectural field. In providing solutions for analytical problems in design, such as cost estimation, circulation-system evaluation and environmental effects, which are similar to engineering problems, this approach is being used in the evaluation, representation and presentation of traditionally designed buildings. With developments in software and hardware technology, it has evolved through studies on the design of architectural products and their production with digital tools used in the preliminary design stages. This paper presents a digital model which may be used in the preliminary stage of mass housing design with Cellular Automata, one of the generative design systems based on computational design approaches. This computational model, developed with scripts in 3ds Max software, has been implemented for the site plan design of mass housing, floor plan organizations made according to user preferences, and facade designs. By using the developed computer model, many alternative housing types can be rapidly produced. The interactive design tool of this computational model allows the user to transfer dimensional and functional housing preferences by means of the interface prepared for the model. The results of the study are discussed in the light of innovative architectural approaches.
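    A cellular automaton of the kind used above can be illustrated in a few lines of code. The sketch below is a generic 2D automaton whose cells could be read as occupied/empty housing units on a site grid; the update rule, grid size and seeding are invented for illustration, and the paper's actual model lives in 3ds Max scripts driven by user preferences.

    ```python
    # Minimal 2D cellular automaton sketch: each cell (a candidate housing unit)
    # flips between empty (0) and occupied (1) from a simple neighbour-count rule.
    # The rule, grid size, and seeding are invented for illustration; the paper's
    # model is implemented as 3ds Max scripts and uses user-driven preferences.
    import random

    def step(grid):
        rows, cols = len(grid), len(grid[0])
        new = [[0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                neighbours = sum(
                    grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0)
                )
                # Occupy a cell with 2-3 occupied neighbours, otherwise leave it empty.
                new[r][c] = 1 if neighbours in (2, 3) else 0
        return new

    random.seed(1)
    grid = [[random.randint(0, 1) for _ in range(16)] for _ in range(8)]
    for _ in range(5):                      # evolve the site layout a few steps
        grid = step(grid)
    print("\n".join("".join("#" if cell else "." for cell in row) for row in grid))
    ```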

  10. A Computational Model for Observation in Quantum Mechanics.

    Science.gov (United States)

    1987-03-16

    This record preserves only the report's table of contents and a fragment of the text. The contents cover a double-slit interferometer experiment and the EPR paradox experiment, an overview of the computational model, its implementation (including code for the EPR paradox and double-slit interferometer experiments), and conclusions. The fragment notes that the EPR paradox experiment (see section 2.3) is hard to resolve with this class of models, collectively called hidden-variable models.

  11. On turbulence models for rod bundle flow computations

    International Nuclear Information System (INIS)

    Hazi, Gabor

    2005-01-01

    In commercial computational fluid dynamics codes there is more than one turbulence model built in. It is the user's responsibility to choose one of those models, suitable for the problem studied. In the last decade, several computations were presented using computational fluid dynamics for the simulation of various problems of the nuclear industry. A common feature in a number of those simulations is that they were performed using the standard k-ε turbulence model without justifying the choice of the model. The simulation results were rarely satisfactory. In this paper, we shall consider the flow in a fuel rod bundle as a case study and discuss why the application of the standard k-ε model fails to give reasonable results in this situation. We also show that a turbulence model based on the Reynolds stress transport equations can provide qualitatively correct results. Generally, our aim is pedagogical: we would like to call the reader's attention to the fact that turbulence models have to be selected based on theoretical considerations and/or adequate information obtained from measurements.
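    For reference (standard textbook form, not taken from the paper), the standard k-ε model closes the Reynolds stresses with an isotropic eddy viscosity,

    $$\nu_t = C_\mu \frac{k^2}{\varepsilon}, \qquad -\overline{u_i' u_j'} = \nu_t \left( \frac{\partial U_i}{\partial x_j} + \frac{\partial U_j}{\partial x_i} \right) - \frac{2}{3} k\, \delta_{ij}, \qquad C_\mu \approx 0.09,$$

    and a commonly cited reason for its failure in rod-bundle subchannels is that this isotropic, linear eddy-viscosity assumption cannot reproduce the anisotropy-driven secondary flows there, which is why closures based on the Reynolds stress transport equations fare better.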

  12. Distributed parallel computing in stochastic modeling of groundwater systems.

    Science.gov (United States)

    Dong, Yanhui; Li, Guomin; Xu, Haizhen

    2013-03-01

    Stochastic modeling is a rapidly evolving, popular approach to the study of the uncertainty and heterogeneity of groundwater systems. However, the use of Monte Carlo-type simulations to solve practical groundwater problems often encounters computational bottlenecks that hinder the acquisition of meaningful results. To improve the computational efficiency, a system that combines stochastic model generation with MODFLOW-related programs and distributed parallel processing is investigated. The distributed computing framework, called the Java Parallel Processing Framework, is integrated into the system to allow the batch processing of stochastic models in distributed and parallel systems. As an example, the system is applied to the stochastic delineation of well capture zones in the Pinggu Basin in Beijing. Through the use of 50 processing threads on a cluster with 10 multicore nodes, the execution times of 500 realizations are reduced to 3% compared with those of a serial execution. Through this application, the system demonstrates its potential in solving difficult computational problems in practical stochastic modeling. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.
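    The batch distribution of independent realizations can be illustrated without the Java Parallel Processing Framework or MODFLOW. The Python sketch below spreads a set of stand-in "model runs" over local worker processes; the stand-in function, worker count and output statistics are placeholders, not the cited system.

    ```python
    # Toy illustration of distributing independent Monte Carlo realizations over
    # local worker processes. The real system batches MODFLOW runs over a JPPF
    # cluster; here a cheap stand-in function replaces the groundwater model.
    from multiprocessing import Pool
    import random
    import statistics

    def run_realization(seed):
        """Stand-in for one stochastic model run (e.g., a capture-zone simulation)."""
        rng = random.Random(seed)
        # Pretend the "model output" is some scalar summary of the realization.
        return sum(rng.gauss(0.0, 1.0) for _ in range(10_000))

    if __name__ == "__main__":
        seeds = range(500)                       # 500 realizations, as in the example above
        with Pool(processes=8) as pool:          # spread runs over 8 local workers
            results = pool.map(run_realization, seeds)
        print("mean:", statistics.fmean(results))
        print("stdev:", statistics.stdev(results))
    ```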

  13. Persons with Alzheimer's Disease Make Phone Calls Independently Using a Computer-Aided Telephone System

    Science.gov (United States)

    Perilli, Viviana; Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Cassano, Germana; Cordiano, Noemi; Pinto, Katia; Minervini, Mauro G.; Oliva, Doretta

    2012-01-01

    This study assessed whether four patients with a diagnosis of Alzheimer's disease could make independent phone calls via a computer-aided telephone system. The study was carried out according to a non-concurrent multiple baseline design across participants. All participants started with baseline during which the telephone system was not available,…

  14. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes having scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol's indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows estimation of the sensitivity indices of each scalar model input, while the 'dispersion model' allows derivation of the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
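    For completeness, the variance-based (Sobol's) indices mentioned above have the standard definitions (generic notation, not specific to this paper):

    $$S_i = \frac{\operatorname{Var}_{X_i}\big(\mathbb{E}[Y \mid X_i]\big)}{\operatorname{Var}(Y)}, \qquad S_{T_i} = 1 - \frac{\operatorname{Var}_{X_{\sim i}}\big(\mathbb{E}[Y \mid X_{\sim i}]\big)}{\operatorname{Var}(Y)},$$

    where $S_i$ is the first-order index of input $X_i$, $S_{T_i}$ its total index, and $X_{\sim i}$ denotes all inputs except $X_i$; in the joint-modeling approach above, the mean model yields the indices of the scalar inputs while the dispersion model yields the total index of the functional inputs.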

  15. Model to Implement Virtual Computing Labs via Cloud Computing Services

    Directory of Open Access Journals (Sweden)

    Washington Luna Encalada

    2017-07-01

    In recent years, we have seen a significant number of new technological ideas appearing in literature discussing the future of education. For example, E-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs), and bring your own device (BYOD) are all new concepts of immersive and global education that have emerged in educational literature. One of the greatest challenges presented to e-learning solutions is the reproduction of the benefits of an educational institution’s physical laboratory. For a university without a computing lab, to obtain hands-on IT training with software, operating systems, networks, servers, storage, and cloud computing similar to that which could be received on a university campus computing lab, it is necessary to use a combination of technological tools. Such teaching tools must promote the transmission of knowledge, encourage interaction and collaboration, and ensure students obtain valuable hands-on experience. That, in turn, allows the universities to focus more on teaching and research activities than on the implementation and configuration of complex physical systems. In this article, we present a model for implementing ecosystems which allow universities to teach practical Information Technology (IT) skills. The model utilizes what is called a “social cloud”, which utilizes all cloud computing services, such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Additionally, it integrates the cloud learning aspects of a MOOC and several aspects of social networking and support. Social clouds have striking benefits such as centrality, ease of use, scalability, and ubiquity, providing a superior learning environment when compared to that of a simple physical lab. The proposed model allows students to foster all the educational pillars such as learning to know, learning to be, learning

  16. Computer synthesis of bird songs and calls

    Science.gov (United States)

    Kahrs, Mark; Avanzini, Federico

    2002-05-01

    We describe the computer simulation of a one-mass source together with a simple transmission line model for a psittacine bird. The syrinx is modeled as a lumped mass subject to elastic restoring forces and internal dissipation. Nonlinear interaction with the airflow follows the Ishizaka and Flanagan glottal model. This is also used to model complete closure of the syrinx: during closure an additional restoring force is added and dissipation is increased. During the whole closed phase the syringeal flow is zero, and consequently its spectrum is broadened and higher partials are generated. The vocal tract transmission lines are implemented with fractional delay lines and the transmission lines are assumed lossless. The beak is implemented as two complementary fifth order Butterworth filters. The pole trajectory for the filters approximates the nonlinear path of the beak cutoff as discussed by Fletcher. [M.K. was supported by the Fulbright Foundation as well as Tekes. F.A. was funded by the Sound Source Modeling Project. Both authors were supported in part by the Academy of Finland.
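    The beak stage alone is easy to sketch. The Python/SciPy fragment below splits a crude source signal into complementary fifth-order Butterworth low-pass and high-pass branches at a fixed cutoff; the sample rate, cutoff and pulse-train source are placeholders, and the paper's model instead drives the filter poles along a time-varying trajectory and feeds them from the syrinx model.

    ```python
    # Sketch of the beak stage only: split a crude source signal into complementary
    # low-pass / high-pass fifth-order Butterworth branches, as the abstract describes.
    # Cutoff, sample rate, and the pulse-train source are placeholders; the paper's
    # model drives the cutoff (pole trajectory) dynamically and uses a syrinx source.
    import numpy as np
    from scipy.signal import butter, lfilter

    fs = 22050                       # sample rate (Hz), assumed
    t = np.arange(0, 0.2, 1 / fs)
    source = (np.sin(2 * np.pi * 1200 * t) > 0.95).astype(float)   # crude pulse train

    cutoff = 2500.0                  # beak cutoff frequency (Hz), assumed
    b_lo, a_lo = butter(5, cutoff, btype="low", fs=fs)
    b_hi, a_hi = butter(5, cutoff, btype="high", fs=fs)

    above = lfilter(b_hi, a_hi, source)   # branch passed above the cutoff
    below = lfilter(b_lo, a_lo, source)   # complementary branch below the cutoff
    print("high-band RMS:", np.sqrt(np.mean(above ** 2)))
    print("low-band RMS:", np.sqrt(np.mean(below ** 2)))
    ```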

  17. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  18. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  19. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  20. Cosmic logic: a computational model

    International Nuclear Information System (INIS)

    Vanchurin, Vitaly

    2016-01-01

    We initiate a formal study of logical inferences in context of the measure problem in cosmology or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal that halts in finite time and immortal that runs forever. In context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps

  1. Computer models for optimizing radiation therapy

    International Nuclear Information System (INIS)

    Duechting, W.

    1998-01-01

    The aim of this contribution is to outline how methods of system analysis, control theory and modelling can be applied to simulate normal and malignant cell growth and to optimize cancer treatment, for instance radiation therapy. Based on biological observations and cell kinetic data, several types of models have been developed describing the growth of tumor spheroids and the cell renewal of normal tissue. The irradiation model is represented by the so-called linear-quadratic model describing the survival fraction as a function of the dose. Based thereon, numerous simulation runs for different treatment schemes can be performed. Thus, it is possible to study the radiation effect on tumor and normal tissue separately. Finally, this method enables a computer-assisted recommendation for an optimal patient-specific treatment schedule prior to clinical therapy. (orig.)
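    The linear-quadratic model referred to above has the standard form

    $$S(D) = \exp\!\big( -(\alpha D + \beta D^2) \big),$$

    where $S$ is the surviving fraction after a single dose $D$ and $\alpha$, $\beta$ are tissue-specific radiosensitivity parameters; for $n$ fractions of dose $d$ the survival becomes $S = \exp\!\big( -n(\alpha d + \beta d^2) \big)$, which is what makes different treatment schedules comparable in such simulation runs.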

  2. Definition, modeling and simulation of a grid computing system for high throughput computing

    CERN Document Server

    Caron, E; Tsaregorodtsev, A Yu

    2006-01-01

    In this paper, we study and compare grid and global computing systems and outline the benefits of having a hybrid system called dirac. To evaluate the dirac scheduling for high throughput computing, a new model is presented and a simulator was developed for many clusters of heterogeneous nodes belonging to a local network. These clusters are assumed to be connected to each other through a global network and each cluster is managed via a local scheduler which is shared by many users. We validate our simulator by comparing the experimental and analytical results of an M/M/4 queuing system. Next, we do the comparison with a real batch system and we obtain an average error of 10.5% for the response time and 12% for the makespan. We conclude that the simulator is realistic and well describes the behaviour of a large-scale system. Thus we can study the scheduling of our system called dirac in a high throughput context. We justify our decentralized, adaptive and opportunistic approach in comparison to a centralize...

  3. SNP calling using genotype model selection on high-throughput sequencing data

    KAUST Repository

    You, Na; Murillo, Gabriel; Su, Xiaoquan; Zeng, Xiaowei; Xu, Jian; Ning, Kang; Zhang, ShouDong; Zhu, Jian-Kang; Cui, Xinping

    2012-01-01

    calling SNPs. Thus, errors not involved in base-calling or alignment, such as those in genomic sample preparation, are not accounted for.Results: A novel method of consensus and SNP calling, Genotype Model Selection (GeMS), is given which accounts

  4. Predictive model for determining the quality of a call

    Science.gov (United States)

    Voznak, M.; Rozhon, J.; Partila, P.; Safarik, J.; Mikulec, M.; Mehic, M.

    2014-05-01

    In this paper a predictive model for speech quality estimation is described. This model allows its user to gain information about the speech quality in VoIP networks without the need to perform an actual call and the consequent time-consuming sound file evaluation. This rapidly increases the usability of speech quality measurement, especially in high-load networks, where the actual processing of all calls is rendered difficult or even impossible. Based only on network state parameters that are easily obtainable with commonly used software tools, this model can reach results that are highly conformant with the PESQ algorithm. Experiments were carried out to investigate whether different languages (English, Czech) have an effect on perceived voice quality for the same network conditions, and the language factor was incorporated directly into the model.
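    The abstract does not give the form of the predictive model, so the following is only a generic illustration of the idea: a least-squares fit mapping easily measured network-state parameters (packet loss, jitter, delay) to a MOS-like score, trained on synthetic data. Neither the coefficients nor the data correspond to the paper.

    ```python
    # Illustrative only: fit a linear map from network-state parameters (packet loss,
    # jitter, delay) to a MOS-like quality score on synthetic data. The paper's
    # predictive model and its training data are not described here; this simply
    # shows the kind of regression such a predictor could use.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    loss = rng.uniform(0, 10, n)        # packet loss (%)
    jitter = rng.uniform(0, 60, n)      # jitter (ms)
    delay = rng.uniform(20, 400, n)     # one-way delay (ms)

    # Synthetic "ground truth": quality degrades with loss, jitter, and delay.
    mos = 4.4 - 0.25 * loss - 0.012 * jitter - 0.003 * delay + rng.normal(0, 0.1, n)
    mos = np.clip(mos, 1.0, 4.5)

    X = np.column_stack([np.ones(n), loss, jitter, delay])
    coef, *_ = np.linalg.lstsq(X, mos, rcond=None)
    pred = X @ coef
    print("coefficients:", np.round(coef, 4))
    print("RMSE:", round(float(np.sqrt(np.mean((pred - mos) ** 2))), 3))
    ```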

  5. Computational comparison of quantum-mechanical models for multistep direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1993-01-01

    We have carried out a computational comparison of all existing quantum-mechanical models for multistep direct (MSD) reactions. The various MSD models, including the so-called Feshbach-Kerman-Koonin, Tamura-Udagawa-Lenske and Nishioka-Yoshida-Weidenmueller models, have been implemented in a single computer system. All model calculations thus use the same set of parameters and the same numerical techniques; only one adjustable parameter is employed. The computational results have been compared with experimental energy spectra and angular distributions for several nuclear reactions, namely, 90 Zr(p,p') at 80 MeV, 209 Bi(p,p') at 62 MeV, and 93 Nb(n,n') at 25.7 MeV. In addition, the results have been compared with the Kalbach systematics and with semiclassical exciton model calculations. All quantum MSD models provide a good fit to the experimental data. In addition, they reproduce the systematics very well and are clearly better than semiclassical model calculations. We furthermore show that the calculated predictions do not differ very strongly between the various quantum MSD models, leading to the conclusion that the simplest MSD model (the Feshbach-Kerman-Koonin model) is adequate for the analysis of experimental data

  6. Fluid approximation analysis of a call center model with time-varying arrivals and after-call work

    Directory of Open Access Journals (Sweden)

    Yosuke Kawai

    2015-12-01

    Important features to be included in queueing-theoretic models of the call center operation are multiple servers, impatient customers, time-varying arrival process, and operator’s after-call work (ACW). We propose a fluid approximation technique for the queueing model with these features by extending the analysis of a similar model without ACW recently developed by Liu and Whitt (2012). Our model assumes that the service for each quantum of fluid consists of a sequence of two stages, the first stage for the conversation with a customer and the second stage for the ACW. When the duration of each stage has exponential, hyperexponential or hypo-exponential distribution, we derive the time-dependent behavior of the content of fluid in each stage of service as well as that in the waiting room. Numerical examples are shown to illustrate the system performance for the cases in which the input rate and/or the number of servers vary in sinusoidal fashion as well as in adaptive ways and in stationary cases.

  7. Modelling physics detectors in a computer aided design system for simulation purposes

    International Nuclear Information System (INIS)

    Ahvenainen, J.; Oksakivi, T.; Vuoskoski, J.

    1995-01-01

    The possibility of transferring physics detector models from computer aided design systems into physics simulation packages like GEANT is receiving increasing attention. The problem of exporting detector models constructed in CAD systems into GEANT is well known. We discuss the problem and describe an application, called DDT, which allows one to design detector models in a CAD system and then transfer the models into GEANT for simulation purposes. (orig.)

  8. Integrating Computer-Assisted Language Learning in Saudi Schools: A Change Model

    Science.gov (United States)

    Alresheed, Saleh; Leask, Marilyn; Raiker, Andrea

    2015-01-01

    Computer-assisted language learning (CALL) technology and pedagogy have gained recognition globally for their success in supporting second language acquisition (SLA). In Saudi Arabia, the government aims to provide most educational institutions with computers and networking for integrating CALL into classrooms. However, the recognition of CALL's…

  9. Simulation models for computational plasma physics: Concluding report

    International Nuclear Information System (INIS)

    Hewett, D.W.

    1994-01-01

    In this project, the authors enhanced their ability to numerically simulate bounded plasmas that are dominated by low-frequency electric and magnetic fields. They moved towards this goal in several ways; they are now in a position to play significant roles in the modeling of low-frequency electromagnetic plasmas in several new industrial applications. They have significantly increased their facility with the computational methods invented to solve the low frequency limit of Maxwell's equations (DiPeso, Hewett, accepted, J. Comp. Phys., 1993). This low frequency model, called the Streamlined Darwin Field (SDF) model (Hewett, Larson, and Doss, J. Comp. Phys., 1992), has now been implemented in a fully non-neutral SDF code BEAGLE (Larson, Ph.D. dissertation, 1993) and has been further extended to the quasi-neutral limit (DiPeso, Hewett, Comp. Phys. Comm., 1993). In addition, they have resurrected the quasi-neutral, zero-electron-inertia model (ZMR) and begun the task of incorporating into this model internal boundary conditions that have the flexibility of those in GYMNOS, a magnetostatic code now used in ion source work (Hewett, Chen, ICF Quarterly Report, July--September, 1993). Finally, near the end of this project, they invented a new type of banded matrix solver that can be implemented on a massively parallel computer -- thus opening the door for the use of all their ADI schemes on these new computer architectures (Mattor, Williams, Hewett, submitted to Parallel Computing, 1993).
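    As a point of reference for the banded matrix solver mentioned at the end, the simplest serial special case, a tridiagonal system solved by the Thomas algorithm, is sketched below; the report's contribution is a parallel banded solver for massively parallel machines, which this sketch does not attempt to reproduce.

    ```python
    # Serial Thomas algorithm for a tridiagonal system Ax = d, shown only as the
    # baseline that parallel banded solvers (like the one cited above) generalize.
    def thomas(a, b, c, d):
        """Solve a tridiagonal system: a = sub-diagonal (len n-1), b = diagonal (len n),
        c = super-diagonal (len n-1), d = right-hand side (len n)."""
        n = len(b)
        cp, dp = [0.0] * n, [0.0] * n
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):
            denom = b[i] - a[i - 1] * cp[i - 1]
            cp[i] = c[i] / denom if i < n - 1 else 0.0
            dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom
        x = [0.0] * n
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    # Example: -u'' = 1 on a small grid with u(0) = u(1) = 0, by finite differences.
    n, h = 5, 1.0 / 6
    a = [-1.0] * (n - 1)
    b = [2.0] * n
    c = [-1.0] * (n - 1)
    d = [h * h] * n
    print(thomas(a, b, c, d))
    ```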

  10. A computationally exact method of Dawson's model for hole dynamics of one-dimensional plasma

    International Nuclear Information System (INIS)

    Kitahara, Kazuo; Tanno, Kohki; Takada, Toshio; Hatori, Tadatsugu; Urata, Kazuhiro; Irie, Haruyuki; Nambu, Mitsuhiro; Saeki, Kohichi.

    1990-01-01

    We show a simple but computationally exact solution of the one-dimensional plasma model, so-called 'Dawson's model'. Using this solution, we can describe the evolution of the plasma and find the relative stabilization of a big hole after the instability of two streams. (author)

  11. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  12. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  13. Sharing programming resources between Bio* projects through remote procedure call and native call stack strategies

    DEFF Research Database (Denmark)

    Prins, Pjotr; Goto, Naohisa; Yates, Andrew

    2012-01-01

    Open-source software (OSS) encourages computer programmers to reuse software components written by others. In evolutionary bioinformatics, OSS comes in a broad range of programming languages, including C/C++, Perl, Python, Ruby, Java, and R. To avoid writing the same functionality multiple times...... for different languages, it is possible to share components by bridging computer languages and Bio* projects, such as BioPerl, Biopython, BioRuby, BioJava, and R/Bioconductor. In this chapter, we compare the two principal approaches for sharing software between different programming languages: either by remote...... procedure call (RPC) or by sharing a local call stack. RPC provides a language-independent protocol over a network interface; examples are RSOAP and Rserve. The local call stack provides a between-language mapping not over the network interface, but directly in computer memory; examples are R bindings, RPy...

  14. Cloud Computing Platform for an Online Model Library System

    Directory of Open Access Journals (Sweden)

    Mingang Chen

    2013-01-01

    Full Text Available The rapid development of the digital content industry calls for online model libraries. For the efficiency, user experience, and reliability merits of the model library, this paper designs a Web 3D model library system based on a cloud computing platform. Taking into account complex models, which cause difficulties in real-time 3D interaction, we adopt model simplification and size-adaptive adjustment methods to make interaction with the system more efficient. Meanwhile, a cloud-based architecture is developed to ensure the reliability and scalability of the system. The 3D model library system is intended to be accessible to online users with a good interactive experience. The feasibility of the solution has been tested by experiments.

  15. ADGEN: ADjoint GENerator for computer models

    Energy Technology Data Exchange (ETDEWEB)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated in a factor of 6.8 times that of the reference model for each response of interest; determining these derivatives by parameter perturbations would instead require on the order of 3000 model runs. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs.
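
    The economy of the adjoint approach described above comes from the standard adjoint identity: for a response R(u, p) constrained by model equations F(u, p) = 0, a single adjoint solve per response yields derivatives with respect to every parameter at once, which is why 3000 parameter derivatives cost a small fixed multiple of one reference run. In generic notation (not ADGEN's own):

        \frac{dR}{dp} = \frac{\partial R}{\partial p} - \lambda^{T}\frac{\partial F}{\partial p},
        \qquad \text{where} \qquad
        \left(\frac{\partial F}{\partial u}\right)^{T}\lambda = \left(\frac{\partial R}{\partial u}\right)^{T}.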

  16. ADGEN: ADjoint GENerator for computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated in a factor of 6.8 times that of the reference model for each response of interest; determining these derivatives by parameter perturbations would instead require on the order of 3000 model runs. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs

  17. Computer tomography of flows external to test models

    Science.gov (United States)

    Prikryl, I.; Vest, C. M.

    1982-01-01

    Computer tomographic techniques for reconstruction of three-dimensional aerodynamic density fields from interferograms recorded from several different viewing directions were studied. Emphasis is on the case in which an opaque object such as a test model in a wind tunnel obscures significant regions of the interferograms (projection data). A method called the Iterative Convolution Method (ICM), existing methods in which the field is represented by series expansions, and analysis of real experimental data in the form of aerodynamic interferograms are discussed.

  18. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  19. An ODP computational model of a cooperative binding object

    Science.gov (United States)

    Logé, Christophe; Najm, Elie; Chen, Ken

    1997-12-01

    The next generation of systems will have to manage several geographically distributed users simultaneously. These systems belong to the class of computer-supported cooperative work systems (CSCW). The development of such complex systems requires rigorous development methods and flexible open architectures. Open distributed processing (ODP) is a standardization effort that aims at providing such architectures. ODP features appropriate abstraction levels and a clear articulation between requirements, programming and infrastructure support. ODP advocates the use of formal methods for the specification of systems and components. The computational model, an object-based model, one of the abstraction levels identified within ODP, plays a central role in the global architecture. In this model, basic objects can be composed with communication and distribution abstractions (called binding objects) to form a computational specification of distributed systems, or applications. Computational specifications can then be mapped (in a mechanism akin to compilation) onto an engineering solution. We use an ODP-inspired method to computationally specify a cooperative system. We start from a general purpose component that we progressively refine into a collection of basic and binding objects. We focus on two issues of a co-authoring application, namely, dynamic reconfiguration and multiview synchronization. We discuss solutions for these issues and formalize them using the MT-LOTOS specification language that is currently studied in the ISO standardization formal description techniques group.

  20. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    NREL uses computational modeling to study plant cell walls, which are the source of biofuels and biomaterials, and applies quantum mechanical models of chemical and electronic properties and processes to reduce barriers to bioenergy conversion.

  1. SNP calling using genotype model selection on high-throughput sequencing data

    KAUST Repository

    You, Na

    2012-01-16

    Motivation: A review of the available single nucleotide polymorphism (SNP) calling procedures for Illumina high-throughput sequencing (HTS) platform data reveals that most rely mainly on base-calling and mapping qualities as sources of error when calling SNPs. Thus, errors not involved in base-calling or alignment, such as those in genomic sample preparation, are not accounted for. Results: A novel method of consensus and SNP calling, Genotype Model Selection (GeMS), is given which accounts for the errors that occur during the preparation of the genomic sample. Simulations and real data analyses indicate that GeMS has the best performance balance of sensitivity and positive predictive value among the tested SNP callers. © The Author 2012. Published by Oxford University Press. All rights reserved.

  2. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  3. The job demands-resources model of work engagement in South African call centres

    Directory of Open Access Journals (Sweden)

    Yolandi Janse van Rensburg

    2013-09-01

    Full Text Available Orientation: A ‘sacrificial human resource strategy’ is practised in call centres, resulting in poor employee occupational health. Consequently, questions are posed in terms of the consequences of call centre work and which salient antecedent variables impact the engagement and wellbeing of call centre representatives. Research purpose: Firstly, to gauge the level of employee engagement amongst a sample of call centre representatives in South Africa and, secondly, to track the paths through which salient personal and job resources affect this engagement. More specifically, the relationships between sense of coherence, leadership effectiveness, team effectiveness and engagement were investigated, thus testing the Job Demands-Resources model of work engagement. Motivation for the study: To present an application of the Job Demands-Resources model of work engagement in a call centre environment in order to diagnose current ills and consequently propose remedies. Research design: A cross-sectional survey design was used and a non-probability convenient sample of 217 call centre representatives was selected. The measuring instruments comprise the Utrecht Work Engagement Scale to measure engagement, the Team Diagnostic Survey to measure team effectiveness, the leadership practices inventory to gauge leadership effectiveness, and the Orientation to Life Questionnaire to measure sense of coherence. A series of structural equation modelling analyses were performed. Main findings: Contrary to the ‘electronic sweatshop’ image attached to call centre jobs depicted in the literature, results show a high level of employee engagement for call centre representatives in the sample. Also, personal resources such as sense of coherence and job resources such as team effectiveness related significantly to engagement. A non-significant relationship exists between leadership effectiveness and engagement. Practical/managerial implications: Both the content and

  4. The job demands-resources model of work engagement in South African call centres

    Directory of Open Access Journals (Sweden)

    Yolandi Janse van Rensburg

    2013-09-01

    Full Text Available Orientation: A ‘sacrificial human resource strategy’ is practised in call centres, resulting in poor employee occupational health. Consequently, questions are posed in terms of the consequences of call centre work and which salient antecedent variables impact the engagement and wellbeing of call centre representatives. Research purpose: Firstly, to gauge the level of employee engagement amongst a sample of call centre representatives in South Africa and, secondly, to track the paths through which salient personal and job resources affect this engagement. More specifically, the relationships between sense of coherence, leadership effectiveness, team effectiveness and engagement were investigated, thus testing the Job Demands-Resources model of work engagement. Motivation for the study: To present an application of the Job Demands-Resources model of work engagement in a call centre environment in order to diagnose current ills and consequently propose remedies. Research design: A cross-sectional survey design was used and a non-probability convenient sample of 217 call centre representatives was selected. The measuring instruments comprise the Utrecht Work Engagement Scale to measure engagement, the Team Diagnostic Survey to measure team effectiveness, the leadership practices inventory to gauge leadership effectiveness, and the Orientation to Life Questionnaire to measure sense of coherence. A series of structural equation modelling analyses were performed. Main findings: Contrary to the ‘electronic sweatshop’ image attached to call centre jobs depicted in the literature, results show a high level of employee engagement for call centre representatives in the sample. Also, personal resources such as sense of coherence and job resources such as team effectiveness related significantly to engagement. A non-significant relationship exists between leadership effectiveness and engagement. Practical/managerial implications: Both the content and

  5. Mechatronic Model Based Computed Torque Control of a Parallel Manipulator

    Directory of Open Access Journals (Sweden)

    Zhiyong Yang

    2008-11-01

    Full Text Available With their high speed and accuracy, parallel manipulators have wide application in industry, but there still exist many difficulties in the actual control process because of time-varying dynamics and coupling. Unfortunately, present-day commercial controllers cannot provide satisfactory performance, since they offer only single-axis linear control. Therefore, aimed at a novel 2-DOF (Degree of Freedom) parallel manipulator called Diamond 600, a motor-mechanism coupling dynamic model based control scheme employing the computed torque control algorithm is presented in this paper. First, the integrated dynamic coupling model is deduced, according to equivalent torques between the mechanical structure and the PM (Permanent Magnet) servomotor. Second, the computed torque controller is described in detail for the above proposed model. At last, a series of numerical simulations and experiments are carried out to test the effectiveness of the system, and the results verify the favourable tracking ability and robustness.
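
    For reference, the computed torque law mentioned above has the standard form below, where M, C and G are the coupled inertia, Coriolis/centrifugal and gravity terms of the manipulator dynamics, q_d is the desired trajectory, e = q_d - q, and K_p, K_v are gain matrices chosen by the designer (notation ours, not the paper's):

        \tau = M(q)\left(\ddot{q}_d + K_v\dot{e} + K_p e\right) + C(q,\dot{q})\,\dot{q} + G(q).

    When the model terms are exact, substituting this torque into the rigid-body dynamics cancels the nonlinearities and leaves the linear error dynamics \ddot{e} + K_v\dot{e} + K_p e = 0, which is what makes the scheme attractive for coupled, time-varying parallel mechanisms.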

  6. Mechatronic Model Based Computed Torque Control of a Parallel Manipulator

    Directory of Open Access Journals (Sweden)

    Zhiyong Yang

    2008-03-01

    Full Text Available With their high speed and accuracy, parallel manipulators have wide application in industry, but there still exist many difficulties in the actual control process because of time-varying dynamics and coupling. Unfortunately, present-day commercial controllers cannot provide satisfactory performance, since they offer only single-axis linear control. Therefore, aimed at a novel 2-DOF (Degree of Freedom) parallel manipulator called Diamond 600, a motor-mechanism coupling dynamic model based control scheme employing the computed torque control algorithm is presented in this paper. First, the integrated dynamic coupling model is deduced, according to equivalent torques between the mechanical structure and the PM (Permanent Magnet) servomotor. Second, the computed torque controller is described in detail for the above proposed model. At last, a series of numerical simulations and experiments are carried out to test the effectiveness of the system, and the results verify the favourable tracking ability and robustness.

  7. Sharing programming resources between Bio* projects through remote procedure call and native call stack strategies.

    Science.gov (United States)

    Prins, Pjotr; Goto, Naohisa; Yates, Andrew; Gautier, Laurent; Willis, Scooter; Fields, Christopher; Katayama, Toshiaki

    2012-01-01

    Open-source software (OSS) encourages computer programmers to reuse software components written by others. In evolutionary bioinformatics, OSS comes in a broad range of programming languages, including C/C++, Perl, Python, Ruby, Java, and R. To avoid writing the same functionality multiple times for different languages, it is possible to share components by bridging computer languages and Bio* projects, such as BioPerl, Biopython, BioRuby, BioJava, and R/Bioconductor. In this chapter, we compare the two principal approaches for sharing software between different programming languages: either by remote procedure call (RPC) or by sharing a local call stack. RPC provides a language-independent protocol over a network interface; examples are RSOAP and Rserve. The local call stack provides a between-language mapping not over the network interface, but directly in computer memory; examples are R bindings, RPy, and languages sharing the Java Virtual Machine stack. This functionality provides strategies for sharing of software between Bio* projects, which can be exploited more often. Here, we present cross-language examples for sequence translation, and measure throughput of the different options. We compare calling into R through native R, RSOAP, Rserve, and RPy interfaces, with the performance of native BioPerl, Biopython, BioJava, and BioRuby implementations, and with call stack bindings to BioJava and the European Molecular Biology Open Software Suite. In general, call stack approaches outperform native Bio* implementations and these, in turn, outperform RPC-based approaches. To test and compare strategies, we provide a downloadable BioNode image with all examples, tools, and libraries included. The BioNode image can be run on VirtualBox-supported operating systems, including Windows, OSX, and Linux.
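
    The difference between the two sharing strategies can be illustrated outside the Bio* projects with nothing more than the Python standard library: ctypes places a foreign C function on the local call stack of the running process, while xmlrpc routes an equivalent call over a network interface. This is a generic sketch (assuming a Unix-like system with a loadable C math library), not the chapter's RSOAP/Rserve or BioJava examples.

        import ctypes
        import ctypes.util
        import threading
        from xmlrpc.server import SimpleXMLRPCServer
        from xmlrpc.client import ServerProxy

        # 1) Native call stack: call the C math library directly, in-process.
        libm = ctypes.CDLL(ctypes.util.find_library("m"))
        libm.cos.restype = ctypes.c_double
        libm.cos.argtypes = [ctypes.c_double]
        print("call stack:", libm.cos(0.0))

        # 2) Remote procedure call: the same function exposed over a network interface.
        server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False, allow_none=True)
        server.register_function(lambda x: libm.cos(x), "cos")
        threading.Thread(target=server.handle_request, daemon=True).start()
        print("RPC:", ServerProxy("http://localhost:8000").cos(0.0))

    The in-process call avoids serialization and network latency, which is consistent with the throughput ordering reported above: call-stack bridges outperform native Bio* code, which in turn outperforms RPC-based approaches.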

  8. Evaluation of Static JavaScript Call Graph Algorithms

    NARCIS (Netherlands)

    J.-J. Dijkstra (Jorryt-Jan)

    2014-01-01

    This thesis consists of a replication study in which two algorithms to compute JavaScript call graphs have been implemented and evaluated. Existing IDE support for JavaScript is hampered due to the dynamic nature of the language. Previous studies partially solve call graph computation

  9. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    OpenAIRE

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  10. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Herberger, Sarah Elizabeth Marie [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  11. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    International Nuclear Information System (INIS)

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS 'pathways,' or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  12. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Science.gov (United States)

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

    Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computation models to obtain general conclusions that can provide useful guidance to construct more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have been earlier used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that their final formulas could all be converted to a linear form; based on this finding, we propose a new algorithm called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
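
    As one concrete reading of the weighted profile idea, the sketch below scores unobserved drug-ADR pairs as a Jaccard-similarity-weighted average of the known ADR profiles of the other drugs. The toy association matrix and the exact weighting are illustrative assumptions, not the formula from the paper.

        import numpy as np

        def jaccard(u, v):
            """Jaccard coefficient between two binary profiles."""
            inter = np.logical_and(u, v).sum()
            union = np.logical_or(u, v).sum()
            return inter / union if union else 0.0

        def weighted_profile_scores(A):
            """Score each drug-ADR pair as a similarity-weighted average of the
            known ADR profiles of all other drugs (the drug itself is left out)."""
            n = A.shape[0]
            S = np.array([[jaccard(A[i], A[j]) for j in range(n)] for i in range(n)])
            np.fill_diagonal(S, 0.0)
            denom = S.sum(axis=1, keepdims=True)
            denom[denom == 0.0] = 1.0
            return S @ A / denom

        # toy 4-drug x 3-ADR association matrix (1 = known association)
        A = np.array([[1, 0, 1],
                      [1, 1, 0],
                      [0, 1, 1],
                      [1, 0, 0]], dtype=float)
        print(weighted_profile_scores(A).round(2))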

  13. A Call for Computational Thinking in Undergraduate Psychology

    Science.gov (United States)

    Anderson, Nicole D.

    2016-01-01

    Computational thinking is an approach to problem solving that is typically employed by computer programmers. The advantage of this approach is that solutions can be generated through algorithms that can be implemented as computer code. Although computational thinking has historically been a skill that is exclusively taught within computer science,…

  14. Models of natural computation : gene assembly and membrane systems

    NARCIS (Netherlands)

    Brijder, Robert

    2008-01-01

    This thesis is concerned with two research areas in natural computing: the computational nature of gene assembly and membrane computing. Gene assembly is a process occurring in unicellular organisms called ciliates. During this process genes are transformed through cut-and-paste operations. We

  15. Asiddhatva Principle in Computational Model of Aṣṭādhyāyī

    Science.gov (United States)

    Subbanna, Sridhar; Varakhedi, Shrinivasa

    Pāṇini's Aṣṭādhyāyī can be thought of as an automaton to generate Sanskrit words and sentences. Aṣṭādhyāyī consists of sūtras that are organized in a systematic manner. The words are derived from the roots and affixes by the application of these sūtras that follow a well defined procedure. Therefore, Aṣṭādhyāyī is best suited for computational modeling. A computational model with conflict resolution techniques was discussed by us (Sridhar et al, 2009)[12]. In continuation with that, this paper presents, an improvised computational model of Aṣṭādhyāyī. This model is further developed based on the principle of asiddhatva. A new mathematical technique called 'filter' is introduced to comprehensively envisage all usages of asiddhatva in Aṣṭādhyāyī.

  16. A simplified computational memory model from information processing.

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper is intended to propose a computational model for memory from the view of information processing. The model, called simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory; an intra-modular network is then developed with the modeling algorithm by mapping nodes and edges, and the bi-modular network is delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing view.

  17. Positioning graphical objects on computer screens: a three-phase model.

    Science.gov (United States)

    Pastel, Robert

    2011-02-01

    This experiment identifies and models phases during the positioning of graphical objects (called cursors in this article) on computer displays. The human-computer interaction community has traditionally used Fitts' law to model selection in graphical user interfaces, whereas human factors experiments have found the single-component Fitts' law inadequate to model positioning of real objects. Participants (N=145) repeatedly positioned variably sized square cursors within variably sized rectangular targets using computer mice. The times for the cursor to just touch the target, for the cursor to enter the target, and for participants to indicate positioning completion were observed. The positioning tolerances were varied from very precise and difficult to imprecise and easy. The time for the cursor to touch the target was proportional to the initial cursor-target distance. The time for the cursor to completely enter the target after touching was proportional to the logarithm of the cursor size divided by the target tolerance. The time for participants to indicate positioning after entering was inversely proportional to the tolerance. A three-phase model defined by regions--distant, proximate, and inside the target--was proposed and could model the positioning tasks. The three-phase model provides a framework for ergonomists to evaluate new positioning techniques and can explain their deficiencies. The model provides a means to analyze tasks and enhance interaction during positioning.
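
    Written out, the three proportionalities reported above amount to an additive three-phase completion-time model of roughly the following form, where D is the initial cursor-target distance, S the cursor size, W the positioning tolerance, and the a_i, b_i are empirically fitted constants (the symbols are ours, not the article's):

        T_{\text{total}} = \underbrace{(a_1 + b_1 D)}_{\text{distant}}
                         + \underbrace{\left(a_2 + b_2 \log\frac{S}{W}\right)}_{\text{proximate}}
                         + \underbrace{\left(a_3 + \frac{b_3}{W}\right)}_{\text{inside}}.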

  18. Valuation of European Call Option via Inverse Fourier Transform

    Directory of Open Access Journals (Sweden)

    Rubenis Oskars

    2017-12-01

    Full Text Available Very few models allow expressing European call option price in closed form. Out of them, the famous Black-Scholes approach sets strong constraints - innovations should be normally distributed and independent. Availability of a corresponding characteristic function of log returns of the underlying asset in analytical form allows pricing a European call option by application of the inverse Fourier transform. The characteristic function corresponds to the Normal Inverse Gaussian (NIG) probability density function. The NIG distribution is obtained based on the assumption that the time series of log returns follows an APARCH process. Thus, volatility clustering and the leptokurtic nature of log returns are taken into account. The Fast Fourier transform based on trapezoidal quadrature is numerically unstable if a standard cumulative probability function is used. To solve the problem, a dampened cumulative probability is introduced. As a computation tool, the Matlab framework is chosen because it contains many effective vectorization tools that greatly enhance code readability and maintenance. The characteristic function of the Normal Inverse Gaussian distribution is taken and exercised with the chosen set of parameters. Finally, the call price dependence on strike price is obtained and rendered in an XY plot. Valuation of a European call option with an analytical form of the characteristic function allows further developing models with higher accuracy, as well as developing models for some exotic options.
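
    The damped inverse Fourier pricing step can be sketched as follows (in Python rather than the Matlab used in the paper). A Black-Scholes lognormal characteristic function stands in for the NIG/APARCH one, and the damping constant alpha, the integration grid and the market parameters are illustrative choices following the standard Carr-Madan formulation rather than the paper's exact scheme.

        import numpy as np

        def bs_charfn(u, S0, r, sigma, T):
            """Characteristic function of ln(S_T) under Black-Scholes; a stand-in
            for the NIG/APARCH characteristic function used in the paper."""
            m = np.log(S0) + (r - 0.5 * sigma**2) * T
            return np.exp(1j * u * m - 0.5 * sigma**2 * u**2 * T)

        def call_by_inverse_fourier(K, S0=100.0, r=0.02, sigma=0.25, T=1.0,
                                    alpha=1.5, vmax=200.0, n=4096):
            """European call price via a damped inverse Fourier integral."""
            k = np.log(K)
            v = np.linspace(1e-8, vmax, n)
            phi = bs_charfn(v - 1j * (alpha + 1.0), S0, r, sigma, T)
            psi = np.exp(-r * T) * phi / (alpha**2 + alpha - v**2 + 1j * (2 * alpha + 1) * v)
            f = np.real(np.exp(-1j * v * k) * psi)
            dv = v[1] - v[0]
            integral = dv * (f.sum() - 0.5 * (f[0] + f[-1]))  # trapezoidal rule
            return np.exp(-alpha * k) / np.pi * integral

        for K in (80.0, 100.0, 120.0):
            print(K, round(call_by_inverse_fourier(K), 4))

    Damping by exp(alpha * k) removes the singularity of the undamped transform near v = 0; the dampened cumulative probability introduced in the paper plays a similar stabilizing role for its quadrature.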

  19. DREAMS and IMAGE: A Model and Computer Implementation for Concurrent, Life-Cycle Design of Complex Systems

    Science.gov (United States)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.

  20. Perceiving a Calling, Living a Calling, and Job Satisfaction: Testing a Moderated, Multiple Mediator Model

    Science.gov (United States)

    Duffy, Ryan D.; Bott, Elizabeth M.; Allan, Blake A.; Torrey, Carrie L.; Dik, Bryan J.

    2012-01-01

    The current study examined the relation between perceiving a calling, living a calling, and job satisfaction among a diverse group of employed adults who completed an online survey (N = 201). Perceiving a calling and living a calling were positively correlated with career commitment, work meaning, and job satisfaction. Living a calling moderated…

  1. Investigating the Effectiveness of Computer-Assisted Language Learning (CALL) Using Google Documents in Enhancing Writing--A Study on Senior 1 Students in a Chinese Independent High School

    Science.gov (United States)

    Ambrose, Regina Maria; Palpanathan, Shanthini

    2017-01-01

    Computer-assisted language learning (CALL) has evolved through various stages in both technology as well as the pedagogical use of technology (Warschauer & Healey, 1998). Studies show that the CALL trend has facilitated students in their English language writing with useful tools such as computer based activities and word processing. Students…

  2. A Step-indexed Semantic Model of Types for the Call-by-Name Lambda Calculus

    OpenAIRE

    Meurer, Benedikt

    2011-01-01

    Step-indexed semantic models of types were proposed as an alternative to purely syntactic safety proofs using subject-reduction. Building upon the work by Appel and others, we introduce a generalized step-indexed model for the call-by-name lambda calculus. We also show how to prove type safety of general recursion in our call-by-name model.

  3. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community

  4. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs.

    Directory of Open Access Journals (Sweden)

    Qifan Kuang

    Full Text Available Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computation models to obtain general conclusions that can provide useful guidance to construct more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have been earlier used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that their final formulas could all be converted to a linear form; based on this finding, we propose a new algorithm called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.

  5. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper is intended to propose a computational model for memory from the view of information processing. The model, called simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory; an intra-modular network is then developed with the modeling algorithm by mapping nodes and edges, and the bi-modular network is delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing view. PMID:27876847

  6. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  7. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  8. A complex-plane strategy for computing rotating polytropic models - Numerical results for strong and rapid differential rotation

    International Nuclear Information System (INIS)

    Geroyannis, V.S.

    1990-01-01

    In this paper, a numerical method, called the complex-plane strategy, is implemented for the computation of polytropic models distorted by strong and rapid differential rotation. The differential rotation model results from a direct generalization of the classical model, in the framework of the complex-plane strategy; this generalization yields very strong differential rotation. Accordingly, the polytropic models assume extremely distorted interiors, while their boundaries are slightly distorted. For an accurate simulation of differential rotation, a versatile method, called the multiple partition technique, is developed and implemented. It is shown that the method remains reliable up to rotation states where other elaborate techniques fail to give accurate results. 11 refs

  9. ABrox-A user-friendly Python module for approximate Bayesian computation with a focus on model comparison.

    Science.gov (United States)

    Mertens, Ulf Kai; Voss, Andreas; Radev, Stefan

    2018-01-01

    We give an overview of the basic principles of approximate Bayesian computation (ABC), a class of stochastic methods that enable flexible and likelihood-free model comparison and parameter estimation. Our new open-source software called ABrox is used to illustrate ABC for model comparison on two prominent statistical tests, the two-sample t-test and the Levene test. We further highlight the flexibility of ABC compared to classical Bayesian hypothesis testing by computing an approximate Bayes factor for two multinomial processing tree models. Last but not least, throughout the paper, we introduce ABrox using the accompanying graphical user interface.
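
    For readers unfamiliar with the mechanics, a minimal rejection-ABC model comparison (deliberately much cruder than ABrox, with made-up data, priors and tolerance) looks roughly like this: the approximate Bayes factor is the ratio of how often each model's prior-predictive simulations land within a tolerance eps of the observed summary statistic.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(0.0, 1.0, 50)      # observed group 1
        y = rng.normal(0.4, 1.0, 50)      # observed group 2
        obs = abs(x.mean() - y.mean())    # summary statistic: absolute mean difference

        def simulate(delta):
            a = rng.normal(0.0, 1.0, 50)
            b = rng.normal(delta, 1.0, 50)
            return abs(a.mean() - b.mean())

        def abc_evidence(model, n_sim=20000, eps=0.05):
            """Fraction of prior-predictive simulations whose summary statistic
            falls within eps of the observed one (a crude marginal-likelihood proxy)."""
            hits = 0
            for _ in range(n_sim):
                delta = 0.0 if model == "null" else rng.normal(0.0, 1.0)  # prior on the effect
                if abs(simulate(delta) - obs) < eps:
                    hits += 1
            return hits / n_sim

        bf_10 = abc_evidence("alternative") / abc_evidence("null")
        print("approximate Bayes factor (alternative vs. null):", round(bf_10, 2))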

  10. Modeling of BWR core meltdown accidents - for application in the MELRPI. MOD2 computer code

    Energy Technology Data Exchange (ETDEWEB)

    Koh, B R; Kim, S H; Taleyarkhan, R P; Podowski, M Z; Lahey, Jr, R T

    1985-04-01

    This report summarizes improvements and modifications made in the MELRPI computer code. A major difference between this new, updated version of the code, called MELRPI.MOD2, and the one reported previously, concerns the inclusion of a model for the BWR emergency core cooling systems (ECCS). This model and its computer implementation, the ECCRPI subroutine, account for various emergency injection modes, for both intact and rubblized geometries. Other changes to MELRPI deal with an improved model for canister wall oxidation, rubble bed modeling, and numerical integration of system equations. A complete documentation of the entire MELRPI.MOD2 code is also given, including an input guide, list of subroutines, sample input/output and program listing.

  11. Evaluating the Theoretic Adequacy and Applied Potential of Computational Models of the Spacing Effect.

    Science.gov (United States)

    Walsh, Matthew M; Gluck, Kevin A; Gunzelmann, Glenn; Jastrzembski, Tiffany; Krusmark, Michael

    2018-03-02

    The spacing effect is among the most widely replicated empirical phenomena in the learning sciences, and its relevance to education and training is readily apparent. Yet successful applications of spacing effect research to education and training are rare. Computational modeling can provide the crucial link between a century of accumulated experimental data on the spacing effect and the emerging interest in using that research to enable adaptive instruction. In this paper, we review relevant literature and identify 10 criteria for rigorously evaluating computational models of the spacing effect. Five relate to evaluating the theoretic adequacy of a model, and five relate to evaluating its application potential. We use these criteria to evaluate a novel computational model of the spacing effect called the Predictive Performance Equation (PPE). PPE combines elements of earlier models of learning and memory, including the General Performance Equation, Adaptive Control of Thought-Rational, and the New Theory of Disuse, giving rise to a novel computational account of the spacing effect that performs favorably across the complete sets of theoretic and applied criteria. We implemented two other previously published computational models of the spacing effect and compared them to PPE using the theoretic and applied criteria as guides. © 2018 Cognitive Science Society, Inc.

  12. Modeling and performance analysis for composite network–compute service provisioning in software-defined cloud environments

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2015-08-01

    Full Text Available The crucial role of networking in Cloud computing calls for a holistic vision of both networking and computing systems that leads to composite network–compute service provisioning. Software-Defined Network (SDN) is a fundamental advancement in networking that enables network programmability. SDN and software-defined compute/storage systems form a Software-Defined Cloud Environment (SDCE) that may greatly facilitate composite network–compute service provisioning to Cloud users. Therefore, networking and computing systems need to be modeled and analyzed as composite service provisioning systems in order to obtain a thorough understanding of service performance in SDCEs. In this paper, a novel approach for modeling composite network–compute service capabilities and a technique for evaluating composite network–compute service performance are developed. The analytic method proposed in this paper is general and agnostic to service implementation technologies; thus it is applicable to a wide variety of network–compute services in SDCEs. The results obtained in this paper provide useful guidelines for federated control and management of networking and computing resources to achieve Cloud service performance guarantees.

  13. An analytical model for computation of reliability of waste management facilities with intermediate storages

    International Nuclear Information System (INIS)

    Kallweit, A.; Schumacher, F.

    1977-01-01

    High reliability is called for in waste management facilities within the fuel cycle of nuclear power stations; this can be achieved by providing intermediate storage facilities and reserve capacities. In this report a model based on the theory of Markov processes is described which allows computation of reliability characteristics of waste management facilities containing intermediate storage facilities. The application of the model is demonstrated by an example. (orig.) [de]
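
    To make the idea concrete, the sketch below computes the stationary distribution of a hypothetical three-state continuous-time Markov chain for a treatment facility backed by an intermediate store; the states, rates and the "blocked" interpretation are invented for illustration and are not the facility model of the report.

        import numpy as np

        # States: 0 = facility up, 1 = facility down / intermediate store buffering,
        #         2 = facility down / store full (waste stream blocked).
        lam, mu, fill = 0.02, 0.5, 0.1    # failure, repair and store-filling rates (per day)
        Q = np.array([[-lam,          lam,   0.0],
                      [  mu,  -(mu + fill),  fill],
                      [  mu,          0.0,   -mu]])

        # Stationary distribution: solve pi Q = 0 with the probabilities summing to one.
        A = np.vstack([Q.T, np.ones(3)])
        b = np.array([0.0, 0.0, 0.0, 1.0])
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        print("availability of the waste stream:", 1.0 - pi[2])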

  14. Queueing System with Heterogeneous Customers as a Model of a Call Center with a Call-Back for Lost Customers

    OpenAIRE

    Dudin, Sergey; Kim, Chesoong; Dudina, Olga; Baek, Janghyun

    2013-01-01

    A multiserver queueing system with infinite and finite buffers, two types of customers, and two types of servers as a model of a call center with a call-back for lost customers is investigated. Type 1 customers arrive to the system according to a Markovian arrival process. All rejected type 1 customers become type 2 customers. Type r, r=1,2, servers serve type r customers if there are any in the system and serve type r′, r′=1,2,  r′≠r, customers if there are no type r customers in the system....

  15. GRAVTool, Advances on the Package to Compute Geoid Model path by the Remove-Compute-Restore Technique, Following Helmert's Condensation Method

    Science.gov (United States)

    Marotta, G. S.

    2017-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astrogeodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and a Global Geopotential Model (GGM), respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents the advances in the package called GRAVTool to compute geoid models by the RCR technique, following Helmert's condensation method, and its application in a study area. The study area comprises the Federal District of Brazil, covering 6000 km² of undulating relief with heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example show that, after analysis of the density, DTM and GGM values, the geoid model computed by the GRAVTool package agrees well with the reference values used in the study area. The accuracy of the computed model (σ = ±0.058 m, RMS = 0.067 m, maximum = 0.124 m and minimum = -0.155 m), using a density value of 2.702 ± 0.024 g/cm³, the SRTM Void Filled 3 arc-second DTM and the GGM EIGEN-6C4 up to degree and order 250, matches the uncertainty (σ = ±0.073 m) of 26 randomly spaced points where the geoid was determined by geometric levelling supported by GNSS positioning. The results were also better than those achieved by the Brazilian official regional geoid model (σ = ±0.076 m, RMS = 0.098 m, maximum = 0.320 m and minimum = -0.061 m).
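
    For orientation, the remove-compute-restore technique referenced here splits the geoid height into contributions from the three wavelength bands. In a commonly used notation (ours, not necessarily the paper's), N_GGM is the long-wavelength part from the global geopotential model, N_{\Delta g} is obtained by Stokes integration of the residual gravity anomalies, and N_ind is the indirect effect of Helmert's condensation of the topographic masses:

        N = N_{\mathrm{GGM}} + N_{\Delta g} + N_{\mathrm{ind}},
        \qquad
        \Delta g_{\mathrm{res}} = \Delta g - \Delta g_{\mathrm{GGM}} - \delta g_{\mathrm{topo}}.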

  16. A border-ownership model based on computational electromagnetism.

    Science.gov (United States)

    Zainal, Zaem Arif; Satoh, Shunji

    2018-03-01

    The mathematical relation between a vector electric field and its corresponding scalar potential field is useful for formulating computational problems of lower/middle-order visual processing, specifically those related to the assignment of borders to the side of the object: so-called border ownership (BO). BO coding is a key process for extracting objects from the background, allowing one to organize a cluttered scene. We propose that the problem is solvable simultaneously by application of a theorem of electromagnetism, i.e., that conservative vector fields have zero rotation, or "curl." We hypothesize that (i) the BO signal is definable as a vector electric field with arrowheads pointing to the inner side of perceived objects, and (ii) its corresponding scalar field carries information related to the perceived order in depth of occluding/occluded objects. A simple model was developed based on this computational theory. Model results qualitatively agree with the object-side selectivity of BO-coding neurons, and with perceptions of object order. The model update rule can be reproduced as a plausible neural network that presents new interpretations of existing physiological results. Results of this study also suggest that T-junction detectors are unnecessary to calculate depth order. Copyright © 2017 Elsevier Ltd. All rights reserved.
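
    The electromagnetic identity the model rests on is simply that a field derived from a scalar potential is irrotational; in the authors' analogy the BO signal plays the role of E and the perceived depth order that of V (symbols ours):

        \mathbf{E} = -\nabla V \;\Longrightarrow\; \nabla \times \mathbf{E} = \mathbf{0}.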

  17. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  18. The Effectiveness of Using Contextual Clues, Dictionary Strategy and Computer Assisted Language Learning (CALL) in Learning Vocabulary

    Directory of Open Access Journals (Sweden)

    Zuraina Ali

    2013-07-01

    Full Text Available This study investigates the effectiveness of three vocabulary learning methods, namely Contextual Clues, Dictionary Strategy, and Computer Assisted Language Learning (CALL), in learning vocabulary among ESL learners. First, it aims at finding which of the vocabulary learning methods, namely Dictionary Strategy, Contextual Clues, and CALL, results in the highest number of words learnt in the immediate and delayed recall tests. Second, it compares the results of the Pre-test and the Delayed Recall Post-test to determine the differences of learning vocabulary using the methods. A quasi-experiment that tested the effectiveness of learning vocabulary using Dictionary Strategy, Contextual Clues, and CALL involved 123 first-year university students. Qualitative procedures included the collection of data from interviews which were conducted to triangulate the data obtained from the quantitative inquiries. Findings from the study using ANOVA revealed that there were significant differences when students were exposed to Dictionary Strategy, Contextual Clues and CALL in the immediate recall tests but not in the Delayed Recall Post-test. Also, there were significant differences when a t-test was used to compare the scores between the Pre-test and the Delayed Recall Post-test for the three methods of vocabulary learning. Although many researchers have advocated the relative effectiveness of Dictionary Strategy, Contextual Clues, and CALL in learning vocabulary, this study is still valuable since no previous study has empirically investigated the relative efficacy of these three methods in a single study.

  19. Client/server models for transparent, distributed computational resources

    International Nuclear Information System (INIS)

    Hammer, K.E.; Gilman, T.L.

    1991-01-01

    Client/server models are proposed to address issues of shared resources in a distributed, heterogeneous UNIX environment. The recent development of an automated Remote Procedure Call (RPC) interface generator has simplified the development of client/server models. Previously, implementation of the models was only possible at the UNIX socket level. An overview of RPCs and the interface generator will be presented and will include a discussion of generation and installation of remote services, the RPC paradigm, and the three levels of RPC programming. Two applications, the Nuclear Plant Analyzer (NPA) and a fluids simulation using molecular modelling, will be presented to demonstrate how client/server models using RPCs and External Data Representations (XDR) have been used in production/computation situations. The NPA incorporates a client/server interface for transferring/translating TRAC or RELAP results from the UNICOS Cray to a UNIX workstation. The fluids simulation program utilizes the client/server model to access the Cray via a single function, allowing it to become a shared co-processor to the workstation application. 5 refs., 6 figs
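
    The client/server pattern described above can be sketched with Python's standard-library xmlrpc modules; this is only a loose modern analogy, since the original work used Sun RPC interface generators and XDR on UNIX, and the service name below is hypothetical:

      # Self-contained sketch: start an RPC server in a background thread, then call it
      # as a client. Illustrates the remote-procedure-call pattern only.
      import threading
      import xmlrpc.client
      from xmlrpc.server import SimpleXMLRPCServer

      def translate_results(run_id):
          # Hypothetical remote service standing in for result translation on a remote host.
          return {"run": run_id, "status": "translated"}

      server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True, logRequests=False)
      server.register_function(translate_results, "translate_results")
      threading.Thread(target=server.serve_forever, daemon=True).start()

      proxy = xmlrpc.client.ServerProxy("http://localhost:8000/")
      print(proxy.translate_results(42))   # -> {'run': 42, 'status': 'translated'}
      server.shutdown()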

  20. Biomimetic agent based modelling using male Frog calling behaviour as a case study

    DEFF Research Database (Denmark)

    Jørgensen, Søren V.; Demazeau, Yves; Christensen-Dalsgaard, Jakob

    2014-01-01

    by individuals to generate their observed population behaviour. A number of existing agent-modelling frameworks are considered, but none have the ability to handle large numbers of time-dependent event-generating agents; hence the construction of a new tool, RANA. The calling behaviour of the Puerto Rican Tree...... Frog, E. coqui, is implemented as a case study for the presentation and discussion of the tool, and results from this model are presented. RANA, in its present stage of development, is shown to be able to handle the problem of modelling calling frogs, and several fruitful extensions are proposed...

  1. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    Models are playing important roles in design and analysis of chemicals based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working...... development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines....... In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse...

  2. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

    Full text of the publication follows. The evaluation of the effects of ionizing radiation and the risk of radiation exposure on the human body has become one of the most important issues for the radiation protection and radiotherapy fields, as it helps to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a more realistic computational phantom. However, manual description and verification of the models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in the Treatment Plan System (TPS), as well as radiation exposure of the human body in radiation protection. (authors)

  3. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...

  4. Factors Hindering the Integration of CALL in a Tertiary Institution

    Directory of Open Access Journals (Sweden)

    Izaham Shah Ismail

    2008-12-01

    Full Text Available The field of Computer Assisted Language Learning (CALL) is constantly evolving as it is very much dependent on the advancement of computer technologies. With new technologies being invented almost every day, experts in the field are looking for ways to apply these new technologies in the language classroom. Despite that, teachers are said to be slow at adopting technology in their classrooms, and language teachers, whether at schools or tertiary institutions, are no exception. This study attempts to investigate the factors that hinder ESL instructors at an institution of higher learning from integrating CALL in their lessons. Interviews were conducted with five ESL instructors and results revealed that the factors which hinder them from integrating CALL in their teaching are universal factors such as knowledge in technology and pedagogy, computer facilities and resources, absence of exemplary integration of CALL, personal beliefs on language teaching, views on the role of the computer as a teacher, and evaluation of learning outcomes.

  5. Model-based quality assessment and base-calling for second-generation sequencing data.

    Science.gov (United States)

    Bravo, Héctor Corrada; Irizarry, Rafael A

    2010-09-01

    Second-generation sequencing (sec-gen) technology can sequence millions of short fragments of DNA in parallel, making it capable of assembling complex genomes for a small fraction of the price and time of previous technologies. In fact, a recently formed international consortium, the 1000 Genomes Project, plans to fully sequence the genomes of approximately 1200 people. The prospect of comparative analysis at the sequence level of a large number of samples across multiple populations may be achieved within the next five years. These data present unprecedented challenges in statistical analysis. For instance, analysis operates on millions of short nucleotide sequences, or reads (strings of A, C, G, or T between 30 and 100 characters long), which are the result of complex processing of noisy continuous fluorescence intensity measurements known as base-calling. The complexity of the base-calling discretization process results in reads of widely varying quality within and across sequence samples. This variation in processing quality results in infrequent but systematic errors that we have found to mislead downstream analysis of the discretized sequence read data. For instance, a central goal of the 1000 Genomes Project is to quantify across-sample variation at the single nucleotide level. At this resolution, small error rates in sequencing prove significant, especially for rare variants. Sec-gen sequencing is a relatively new technology for which potential biases and sources of obscuring variation are not yet fully understood. Therefore, modeling and quantifying the uncertainty inherent in the generation of sequence reads is of utmost importance. In this article, we present a simple model to capture uncertainty arising in the base-calling procedure of the Illumina/Solexa GA platform. Model parameters have a straightforward interpretation in terms of the chemistry of base-calling, allowing for informative and easily interpretable metrics that capture the variability in

  6. Computational morphology of the lung and its virtual imaging

    International Nuclear Information System (INIS)

    Kitaoka, Hiroko

    2002-01-01

    The author proposes an entirely new approach called 'virtual imaging' of an organ based on 'computational morphology'. Computational morphology mathematically describes the design principles of an organ's structure in order to generate the organ model via computer, which can be called a virtual organ. Virtual imaging simulates image data using the virtual organ. The virtual organ is divided into cubic voxels, and the CT value or other intensity value for each voxel is calculated according to the tissue properties within the voxel. The validity of the model is examined by comparing virtual images with clinical images. Computational image analysis methods can be developed based on validated models. In this paper, the computational anatomy of the lung and its virtual X-ray imaging are introduced

  7. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    Full Text Available I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  8. Bio-inspired computational heuristics to study Lane-Emden systems arising in astrophysics model.

    Science.gov (United States)

    Ahmad, Iftikhar; Raja, Muhammad Asif Zahoor; Bilal, Muhammad; Ashraf, Farooq

    2016-01-01

    This study reports novel hybrid computational methods for the solution of the nonlinear singular Lane-Emden type differential equation arising in astrophysics models, exploiting the strength of unsupervised neural network models and stochastic optimization techniques. In the scheme, the neural network, a sub-field of the larger field called soft computing, is exploited for modelling of the equation in an unsupervised manner. The proposed approximate solutions of the higher order ordinary differential equation are calculated with the weights of neural networks trained with a genetic algorithm and pattern search, hybridized with sequential quadratic programming for rapid local convergence. The results of the proposed solvers for the nonlinear singular systems are in good agreement with the standard solutions. Accuracy and convergence of the designed schemes are demonstrated by the results of statistical performance measures based on a sufficiently large number of independent runs.
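
    For reference, the classical Lane-Emden equation of index m takes the singular form below (a standard textbook statement; the paper's exact system may differ):

      % Singular initial-value problem: the 2/x term makes x = 0 a singular point,
      % which is what motivates the neural-network/stochastic solvers described above.
      y''(x) + \frac{2}{x}\,y'(x) + y^{m}(x) = 0, \qquad y(0) = 1, \quad y'(0) = 0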

  9. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  10. Modeling biological problems in computer science: a case study in genome assembly.

    Science.gov (United States)

    Medvedev, Paul

    2018-01-30

    As computer scientists working in bioinformatics/computational biology, we often face the challenge of coming up with an algorithm to answer a biological question. This occurs in many areas, such as variant calling, alignment and assembly. In this tutorial, we use the example of the genome assembly problem to demonstrate how to go from a question in the biological realm to a solution in the computer science realm. We show the modeling process step-by-step, including all the intermediate failed attempts. Please note this is not an introduction to how genome assembly algorithms work and, if treated as such, would be incomplete and unnecessarily long-winded. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Applying computer modeling to eddy current signal analysis for steam generator and heat exchanger tube inspections

    International Nuclear Information System (INIS)

    Sullivan, S.P.; Cecco, V.S.; Carter, J.R.; Spanner, M.; McElvanney, M.; Krause, T.W.; Tkaczyk, R.

    2000-01-01

    Licensing requirements for eddy current inspections of nuclear steam generators and heat exchangers are becoming increasingly stringent. The traditional industry-standard method of comparing inspection signals with flaw signals from simple in-line calibration standards is proving to be inadequate. A more complete understanding of eddy current and magnetic field interactions with flaws and other anomalies is required for the industry to generate consistently reliable inspections. Computer modeling is a valuable tool for improving the reliability of eddy current signal analysis. Results from computer modeling are helping inspectors to properly discriminate between real flaw signals and false calls, and improving reliability in flaw sizing. This presentation will discuss complementary eddy current computer modeling techniques such as the Finite Element Method (FEM), the Volume Integral Method (VIM), the Layer Approximation and other analytic methods. Each of these methods has advantages and limitations. An extension of the Layer Approximation to model eddy current probe responses to ferromagnetic materials will also be presented. Finally, examples will be discussed demonstrating how some significant eddy current signal analysis problems have been resolved using appropriate electromagnetic computer modeling tools

  12. A machine learning model to determine the accuracy of variant calls in capture-based next generation sequencing.

    Science.gov (United States)

    van den Akker, Jeroen; Mishne, Gilad; Zimmer, Anjali D; Zhou, Alicia Y

    2018-04-17

    Next generation sequencing (NGS) has become a common technology for clinical genetic tests. The quality of NGS calls varies widely and is influenced by features like reference sequence characteristics, read depth, and mapping accuracy. With recent advances in NGS technology and software tools, the majority of variants called using NGS alone are in fact accurate and reliable. However, a small subset of difficult-to-call variants that still require orthogonal confirmation exists. For this reason, many clinical laboratories confirm NGS results using orthogonal technologies such as Sanger sequencing. Here, we report the development of a deterministic machine-learning-based model to differentiate between these two types of variant calls: those that do not require confirmation using an orthogonal technology (high confidence), and those that require additional quality testing (low confidence). This approach allows reliable NGS-based calling in a clinical setting by identifying the few important variant calls that require orthogonal confirmation. We developed and tested the model using a set of 7179 variants identified by a targeted NGS panel and re-tested by Sanger sequencing. The model incorporated several signals of sequence characteristics and call quality to determine if a variant was identified at high or low confidence. The model was tuned to eliminate false positives, defined as variants that were called by NGS but not confirmed by Sanger sequencing. The model achieved very high accuracy: 99.4% (95% confidence interval: +/- 0.03%). It categorized 92.2% (6622/7179) of the variants as high confidence, and 100% of these were confirmed to be present by Sanger sequencing. Among the variants that were categorized as low confidence, defined as NGS calls of low quality that are likely to be artifacts, 92.1% (513/557) were found to be not present by Sanger sequencing. This work shows that NGS data contains sufficient characteristics for a machine-learning-based model to
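
    A minimal sketch of the general idea of scoring call confidence from quality features is given below; the feature set, labels and the 0.99 cut-off are invented for illustration and do not reproduce the published model or its zero-false-positive tuning:

      # Toy confidence classifier for NGS variant calls based on call-quality features.
      # All data here are synthetic placeholders.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 1000
      X = np.column_stack([
          rng.poisson(80, n),           # read depth
          rng.normal(55, 8, n),         # mapping quality
          rng.uniform(0.2, 0.8, n),     # strand balance
      ])
      # Hypothetical labels: 1 = confirmed by orthogonal (Sanger) testing.
      y = ((X[:, 0] > 60) & (X[:, 1] > 50)).astype(int)

      clf = LogisticRegression().fit(X, y)
      confidence = clf.predict_proba(X)[:, 1]
      needs_confirmation = confidence < 0.99   # route low-confidence calls to Sanger
      print(f"{needs_confirmation.mean():.1%} of calls flagged for orthogonal confirmation")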

  13. Overhead Crane Computer Model

    Science.gov (United States)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

    The paper describes a computer model of an overhead crane system. The designed overhead crane system consists of hoisting, trolley and crane mechanisms as well as a two-axis payload system. With the help of the differential equations of motion of the specified mechanisms, derived through the Lagrange equation of the second kind, it is possible to build an overhead crane computer model. The computer model was obtained using Matlab software. Transients of the coordinate, linear speed and motor torque of the trolley and crane mechanism systems were simulated. In addition, transients of payload swaying were obtained with respect to the vertical axis. A trajectory of the trolley mechanism operating simultaneously with the crane mechanism is presented in the paper, as well as a two-axis trajectory of the payload. The designed computer model of an overhead crane is a useful means for studying positioning control and anti-sway control systems.
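
    As a hedged illustration of the kind of equations such a model rests on, a simplified planar trolley-plus-pendulum payload subsystem derived from the Lagrange equations of the second kind reads as follows (this is the textbook cart-pendulum form, not the paper's full multi-mechanism model):

      % M: trolley mass, m: payload mass, l: rope length, x: trolley position,
      % \theta: sway angle, F: drive force, g: gravitational acceleration.
      (M + m)\,\ddot{x} + m l\,\ddot{\theta}\cos\theta - m l\,\dot{\theta}^{2}\sin\theta = F
      l\,\ddot{\theta} + \ddot{x}\cos\theta + g\sin\theta = 0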

  14. The Adaptation Study of the Questionnaires of the Attitude towards CALL (A-CALL), the Attitude towards CAL (A-CAL), the Attitude towards Foreign Language Learning (A-FLL) to Turkish Language

    Science.gov (United States)

    Erdem, Cahit; Saykili, Abdullah; Kocyigit, Mehmet

    2018-01-01

    This study primarily aims to adapt the Foreign Language Learning (FLL), Computer assisted Learning (CAL) and Computer assisted Language Learning (CALL) scales developed by Vandewaetere and Desmet into Turkish context. The instrument consists of three scales which are "the attitude towards CALL questionnaire" ("A-CALL")…

  15. RAPPORT-BUILDING THROUGH CALL IN TEACHING CHINESE AS A FOREIGN LANGUAGE: AN EXPLORATORY STUDY

    Directory of Open Access Journals (Sweden)

    Wenying Jiang

    2005-05-01

    Full Text Available Technological advances have brought about the ever-increasing utilisation of computer-assisted language learning (CALL) media in the learning of a second language (L2). Computer-mediated communication, for example, provides a practical means for extending the learning of spoken language, a challenging process in tonal languages such as Chinese, beyond the realms of the classroom. In order to effectively improve spoken language competency, however, CALL applications must also reproduce the social interaction that lies at the heart of language learning and language use. This study draws on data obtained from the utilisation of CALL in the learning of L2 Chinese to explore whether this medium can be used to extend opportunities for rapport-building in language teaching beyond the face-to-face interaction of the classroom. Rapport's importance lies in its potential to enhance learning, motivate learners, and reduce learner anxiety. To date, CALL's potential in relation to this facet of social interaction remains a neglected area of research. The results of this exploratory study suggest that CALL may help foster learner-teacher rapport and that scaffolding, such as strategically composing rapport-fostering questions in sound-files, is conducive to this outcome. The study provides an instruction model for this application of CALL.

  16. Help Options in CALL: A Systematic Review

    Science.gov (United States)

    Cardenas-Claros, Monica S.; Gruba, Paul A.

    2009-01-01

    This paper is a systematic review of research investigating help options in the different language skills in computer-assisted language learning (CALL). In this review, emerging themes along with issues affecting help option research are identified and discussed. We argue that help options in CALL are application resources that do not only seem…

  17. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo eLee

    2013-12-01

    Full Text Available We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.

  18. Cloud Computing as Evolution of Distributed Computing – A Case Study for SlapOS Distributed Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2013-01-01

    Full Text Available The cloud computing paradigm has been defined from several points of view, the main two directions being either as an evolution of the grid and distributed computing paradigm, or, on the contrary, as a disruptive revolution in the classical paradigms of operating systems, network layers and web applications. This paper presents a distributed cloud computing platform called SlapOS, which unifies technologies and communication protocols into a new technology model for offering any application as a service. Both cloud and distributed computing can be efficient methods for optimizing resources that are aggregated from a grid of standard PCs hosted in homes, offices and small data centers. The paper fills a gap in the existing distributed computing literature by providing a distributed cloud computing model which can be applied for deploying various applications.

  19. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP has developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the models.

  20. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing and it is expected to develop further in the near future.

  1. Two adults with multiple disabilities use a computer-aided telephone system to make phone calls independently.

    Science.gov (United States)

    Lancioni, Giulio E; O'Reilly, Mark F; Singh, Nirbhay N; Sigafoos, Jeff; Oliva, Doretta; Alberti, Gloria; Lang, Russell

    2011-01-01

    This study extended the assessment of a newly developed computer-aided telephone system with two participants (adults) who presented with blindness or severe visual impairment and motor or motor and intellectual disabilities. For each participant, the study was carried out according to an ABAB design, in which A represented baseline phases and B represented intervention phases, during which the special telephone system was available. The system involved, among other components, a netbook computer provided with specific software, a global system for mobile communication (GSM) modem, and a microswitch. Both participants learned to use the system very rapidly and managed to make phone calls independently to a variety of partners such as family members, friends and staff personnel. The results were discussed in terms of the technology under investigation (its advantages, drawbacks, and need for improvement) and the social-communication impact it can have for persons with multiple disabilities. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. The Effect of Computer Assisted Language Learning (CALL) on Performance in the Test of English for International Communication (TOEIC) Listening Module

    Science.gov (United States)

    van Han, Nguyen; van Rensburg, Henriette

    2014-01-01

    Many companies and organizations have been using the Test of English for International Communication (TOEIC) for business and commercial communication purpose in Vietnam and around the world. The present study investigated the effect of Computer Assisted Language Learning (CALL) on performance in the Test of English for International Communication…

  3. Investigation of a Markov Model for Computer System Security Threats

    Directory of Open Access Journals (Sweden)

    Alexey A. A. Magazev

    2017-01-01

    Full Text Available In this work, a model for computer system security threats formulated in terms of Markov processes is investigated. In the framework of this model, the functioning of the computer system is considered as a sequence of failures and recovery actions which appear as results of information security threats acting on the system. We provide a detailed description of the model: explicit analytical formulas for the probabilities of the computer system states at any arbitrary moment of time are derived, some limiting cases are discussed, and the long-run dynamics of the system is analysed. The dependence of the security state probability (i.e. the probability of the state in which threats are absent) on the probabilities of threats is separately investigated. In particular, it is shown that this dependence is qualitatively different for odd and even moments of time. For instance, in the case of one threat the security state probability demonstrates a non-monotonic dependence on the probability of the threat at even moments of time; this function admits at least one local minimum in its domain of definition. It is believed that the mentioned feature is important because it allows one to locate the most dangerous areas of threats, where the security state probability can be lower than the permissible level. Finally, we introduce an important characteristic of the model, called the relaxation time, by means of which we construct the permissible domain of the security parameters. The prospects of applying the obtained results to the problem of finding the optimal values of the security parameters are also discussed.
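
    A minimal two-state numerical sketch of such a chain is given below; the 'secure'/'failed' states and the transition probabilities are illustrative assumptions, not the paper's parameters or its analytical formulas:

      # Discrete-time Markov sketch: evolution of the probability of the secure state
      # for a system that fails with probability p and recovers with probability r per
      # step. The chain relaxes towards the stationary value r / (p + r) = 0.75 here.
      import numpy as np

      p, r = 0.2, 0.6                      # illustrative threat and recovery probabilities
      P = np.array([[1 - p, p],            # transitions out of "secure"
                    [r, 1 - r]])           # transitions out of "failed"
      state = np.array([1.0, 0.0])         # start in the secure state

      for t in range(1, 11):
          state = state @ P
          print(f"t={t:2d}  P(secure)={state[0]:.4f}")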

  4. Programmable architecture for quantum computing

    NARCIS (Netherlands)

    Chen, J.; Wang, L.; Charbon, E.; Wang, B.

    2013-01-01

    A programmable architecture called “quantum FPGA (field-programmable gate array)” (QFPGA) is presented for quantum computing, which is a hybrid model combining the advantages of the qubus system and the measurement-based quantum computation. There are two kinds of buses in QFPGA, the local bus and

  5. Demonstration of a computer model for residual radioactive material guidelines, RESRAD

    International Nuclear Information System (INIS)

    Yu, C.; Yuan, Y.C.; Zielen, A.J.; Wallo, A. III

    1989-01-01

    A computer model was developed to calculate residual radioactive material guidelines for the US Department of Energy (DOE). This model, called RESRAD, can be run on IBM or IBM-compatible microcomputers. Seven potential exposure pathways from contaminated soil are analyzed, including external radiation exposure and internal radiation exposure from inhalation and food ingestion. The RESRAD code has been applied to several DOE sites to derive soil cleanup guidelines. The experience gained indicates that a comprehensive set of site-specific hydrogeologic and geochemical input parameters must be used for a realistic pathway analysis. The RESRAD code is a useful tool; it is easy to run and very user-friendly. 6 refs., 12 figs

  6. 76 Computer Assisted Language Learning (CALL) Software ...

    African Journals Online (AJOL)

    Ike Odimegwu

    combination with other factors which may enhance or ameliorate the ... form of computer-based learning which carries two important features: .... To take some commonplace examples, a ... photographs, and even full-motion video clips.

  7. Assessing call centers’ success:

    Directory of Open Access Journals (Sweden)

    Hesham A. Baraka

    2013-07-01

    This paper introduces a model to evaluate the performance of call centers based on the DeLone and McLean Information Systems success model. A number of indicators are identified to track the call center's performance. A mapping of the proposed indicators to the six dimensions of the D&M model is presented. A Weighted Call Center Performance Index is proposed to assess call center performance, and the index is used to analyze the effect of the identified indicators. A policy-weighted approach was used to assign the weights, with an analysis of different weights for each dimension. The analysis of the different weight cases gave priority to the user satisfaction and net benefits dimensions as the two outcomes of the system. For the input dimensions, higher priority was given to the system quality and the service quality dimensions. Call center decision makers can use the tool to tune the different weights in order to reach the objectives set by the organization. Multiple linear regression analysis was used in order to provide a linear formula for the user satisfaction dimension and the net benefits dimension, so that the values for these two dimensions can be forecast as a function of the other dimensions.

  8. Inbound Call Centers and Emotional Dissonance in the Job Demands - Resources Model.

    Science.gov (United States)

    Molino, Monica; Emanuel, Federica; Zito, Margherita; Ghislieri, Chiara; Colombo, Lara; Cortese, Claudio G

    2016-01-01

    Emotional labor, defined as the process of regulating feelings and expressions as part of the work role, is a major characteristic of call centers. In particular, when interacting with customers, agents are required to show certain emotions that are considered acceptable by the organization, even though these emotions may be different from their true feelings. This kind of experience is defined as emotional dissonance and represents a feature of the job, especially for call center inbound activities. The present study was aimed at investigating whether emotional dissonance mediates the relationship between job demands (workload and customer verbal aggression) and job resources (supervisor support, colleague support, and job autonomy) on the one hand, and, on the other, affective discomfort, using the job demands-resources model as a framework. The study also observed differences between two different types of inbound activities: customer assistance service (CA) and information service. The study involved agents of an Italian Telecommunication Company, 352 of whom worked in the CA and 179 in the information service. The hypothesized model was tested across the two groups through multi-group structural equation modeling. Analyses showed that CA agents experience greater customer verbal aggression and emotional dissonance than information service agents. Results also showed, only for the CA group, a full mediation by emotional dissonance between workload and affective discomfort, and a partial mediation between customer verbal aggression and job autonomy, on the one hand, and affective discomfort, on the other. This study's findings contributed both to the emotional labor literature, investigating the mediational role of emotional dissonance in the job demands-resources model, and to call center literature, considering differences between two specific kinds of inbound activities. Suggestions for organizations and practitioners emerged in order to identify practical implications useful both to support

  9. Black-Scholes finite difference modeling in forecasting of call warrant prices in Bursa Malaysia

    Science.gov (United States)

    Mansor, Nur Jariah; Jaffar, Maheran Mohd

    2014-07-01

    A call warrant is a type of structured warrant in Bursa Malaysia. It gives the holder the right to buy the underlying share at a specified price within a limited period of time. The issuer of structured warrants usually uses the European style to exercise the call warrant on the maturity date. A warrant is very similar to an option. Usually, practitioners in the financial field use the Black-Scholes model to value the option. The Black-Scholes equation is hard to solve analytically. Therefore, the finite difference approach is applied to approximate the value of the call warrant prices. A central-in-time and central-in-space scheme is produced to approximate the value of the call warrant prices. It allows the warrant holder to forecast the value of the call warrant prices before the expiry date.
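
    A hedged sketch of the finite-difference idea is shown below using a simple explicit scheme on a price grid; the parameter values are illustrative and the paper's central-in-time, central-in-space discretisation is not reproduced exactly:

      # Explicit finite-difference valuation of a European-style call under
      # Black-Scholes dynamics. Parameters are illustrative placeholders.
      import numpy as np

      K, r, sigma, T = 100.0, 0.05, 0.30, 1.0     # strike, rate, volatility, maturity
      S_max, M, N = 200.0, 200, 10000             # price range, space steps, time steps

      S = np.linspace(0.0, S_max, M + 1)
      dS, dt = S_max / M, T / N
      V = np.maximum(S - K, 0.0)                  # payoff at maturity

      for n in range(N):                          # march backwards from maturity
          tau = (n + 1) * dt                      # time to maturity at the new level
          delta = (V[2:] - V[:-2]) / (2 * dS)
          gamma = (V[2:] - 2 * V[1:-1] + V[:-2]) / dS**2
          V[1:-1] += dt * (0.5 * sigma**2 * S[1:-1]**2 * gamma
                           + r * S[1:-1] * delta - r * V[1:-1])
          V[0] = 0.0                              # boundary at S = 0
          V[-1] = S_max - K * np.exp(-r * tau)    # deep in-the-money boundary

      print(f"call value at S = {K:.0f}: {np.interp(K, S, V):.3f}")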

  10. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security, and Computational Models ICC3 2015, organized by PSG College of Technology, Coimbatore, India, during December 17-19, 2015. The book gathers innovations in broad areas of research such as computational modeling, computational intelligence and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing design, analysis and modeling of the aforementioned key areas.

  11. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of the model's failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  12. Modeling an Excitable Biosynthetic Tissue with Inherent Variability for Paired Computational-Experimental Studies.

    Directory of Open Access Journals (Sweden)

    Tanmay A Gokhale

    2017-01-01

    Full Text Available To understand how excitable tissues give rise to arrhythmias, it is crucially necessary to understand the electrical dynamics of cells in the context of their environment. Multicellular monolayer cultures have proven useful for investigating arrhythmias and other conduction anomalies, and because of their relatively simple structure, these constructs lend themselves to paired computational studies that often help elucidate mechanisms of the observed behavior. However, tissue cultures of cardiomyocyte monolayers currently require the use of neonatal cells with ionic properties that change rapidly during development and have thus been poorly characterized and modeled to date. Recently, Kirkton and Bursac demonstrated the ability to create biosynthetic excitable tissues from genetically engineered and immortalized HEK293 cells with well-characterized electrical properties and the ability to propagate action potentials. In this study, we developed and validated a computational model of these excitable HEK293 cells (called "Ex293" cells) using existing electrophysiological data and a genetic search algorithm. In order to reproduce not only the mean but also the variability of experimental observations, we examined what sources of variation were required in the computational model. Random cell-to-cell and inter-monolayer variation in both ionic conductances and tissue conductivity was necessary to explain the experimentally observed variability in action potential shape and macroscopic conduction, and the spatial organization of cell-to-cell conductance variation was found to not impact macroscopic behavior; the resulting model accurately reproduces both normal and drug-modified conduction behavior. The development of a computational Ex293 cell and tissue model provides a novel framework to perform paired computational-experimental studies to study normal and abnormal conduction in multidimensional excitable tissue, and the methodology of modeling

  13. What do reversible programs compute?

    DEFF Research Database (Denmark)

    Axelsen, Holger Bock; Glück, Robert

    2011-01-01

    Reversible computing is the study of computation models that exhibit both forward and backward determinism. Understanding the fundamental properties of such models is not only relevant for reversible programming, but has also been found important in other fields, e.g., bidirectional model...... transformation, program transformations such as inversion, and general static prediction of program properties. Historically, work on reversible computing has focussed on reversible simulations of irreversible computations. Here, we take the viewpoint that the property of reversibility itself should...... are not strictly classically universal, but that they support another notion of universality; we call this RTM-universality. Thus, even though the RTMs are sub-universal in the classical sense, they are powerful enough as to include a self-interpreter. Lifting this to other computation models, we propose r...

  14. Effort-reward imbalance and one-year change in neck-shoulder and upper extremity pain among call center computer operators.

    Science.gov (United States)

    Krause, Niklas; Burgel, Barbara; Rempel, David

    2010-01-01

    The literature on psychosocial job factors and musculoskeletal pain is inconclusive in part due to insufficient control for confounding by biomechanical factors. The aim of this study was to investigate prospectively the independent effects of effort-reward imbalance (ERI) at work on regional musculoskeletal pain of the neck and upper extremities of call center operators after controlling for (i) duration of computer use both at work and at home, (ii) ergonomic workstation design, (iii) physical activities during leisure time, and (iv) other individual worker characteristics. This was a one-year prospective study among 165 call center operators who participated in a randomized ergonomic intervention trial that has been described previously. Over an approximate four-week period, we measured ERI and 28 potential confounders via a questionnaire at baseline. Regional upper-body pain and computer use was measured by weekly surveys for up to 12 months following the implementation of ergonomic interventions. Regional pain change scores were calculated as the difference between average weekly pain scores pre- and post intervention. A significant relationship was found between high average ERI ratios and one-year increases in right upper-extremity pain after adjustment for pre-intervention regional mean pain score, current and past physical workload, ergonomic workstation design, and anthropometric, sociodemographic, and behavioral risk factors. No significant associations were found with change in neck-shoulder or left upper-extremity pain. This study suggests that ERI predicts regional upper-extremity pain in computer operators working 20 or more hours per week. Control for physical workload and ergonomic workstation design was essential for identifying ERI as a risk factor.

  15. Modeling Computer Virus and Its Dynamics

    Directory of Open Access Journals (Sweden)

    Mei Peng

    2013-01-01

    Full Text Available Based on the fact that a computer can be infected by infected and exposed computers, and that some computers in susceptible or exposed status can gain immunity through antivirus capability, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for the spread of the computer virus on the internet, is determined. Second, this model has a virus-free equilibrium P0, which means that the infected part of the computer population disappears and the virus dies out; P0 is a globally asymptotically stable equilibrium if R0 < 1. If R0 > 1, then this model has only one viral equilibrium P*, which means that the virus persists at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
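
    A hedged numerical sketch of a compartment model in this spirit is shown below; the compartment structure and rates are illustrative assumptions and do not reproduce the paper's equations or its R0 threshold analysis:

      # Illustrative SEIR-style computer-virus model: Susceptible -> Exposed -> Infected,
      # with antivirus moving susceptible/exposed machines to a Recovered (immune) class.
      # Rates are invented for illustration only.
      import numpy as np
      from scipy.integrate import solve_ivp

      beta, alpha, gamma, mu = 0.5, 0.3, 0.2, 0.1   # contact, incubation, cure, immunisation rates

      def rhs(t, y):
          S, E, I, R = y
          dS = -beta * S * (E + I) - mu * S
          dE = beta * S * (E + I) - alpha * E - mu * E
          dI = alpha * E - gamma * I
          dR = mu * (S + E) + gamma * I
          return [dS, dE, dI, dR]

      sol = solve_ivp(rhs, (0, 60), [0.97, 0.02, 0.01, 0.0], dense_output=True)
      for ti, (S, E, I, R) in zip(np.linspace(0, 60, 7), sol.sol(np.linspace(0, 60, 7)).T):
          print(f"t={ti:4.0f}  S={S:.3f}  E={E:.3f}  I={I:.3f}  R={R:.3f}")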

  16. An Evaluation Framework for CALL

    Science.gov (United States)

    McMurry, Benjamin L.; Williams, David Dwayne; Rich, Peter J.; Hartshorn, K. James

    2016-01-01

    Searching prestigious Computer-assisted Language Learning (CALL) journals for references to key publications and authors in the field of evaluation yields a short list. The "American Journal of Evaluation"--the flagship journal of the American Evaluation Association--is only cited once in both the "CALICO Journal and Language…

  17. COMPUTATIONAL MODELS FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar

    2011-01-01

    Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...

  18. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

    The study of biological systems demands computational support. When targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding on potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more so when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to decide on the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
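
    A minimal sketch of the ranked-retrieval idea, using TF-IDF weighting and cosine similarity over toy model descriptions, is shown below; the identifiers and texts are invented and this is not the BioModels Database implementation:

      # Toy ranked retrieval over model descriptions/annotations.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      models = {
          "MODEL_A": "glycolysis kinetic model yeast enzyme reactions",
          "MODEL_B": "cell cycle cyclin cdk oscillation model",
          "MODEL_C": "calcium signalling oscillation endoplasmic reticulum model",
      }
      vectorizer = TfidfVectorizer()
      matrix = vectorizer.fit_transform(models.values())

      query = "calcium oscillation"
      scores = cosine_similarity(vectorizer.transform([query]), matrix).ravel()
      for name, score in sorted(zip(models, scores), key=lambda x: -x[1]):
          print(f"{name}: {score:.3f}")    # models ranked by relevance to the query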

  19. Staffing to Maximize Profit for Call Centers with Impatient and Repeat-Calling Customers

    Directory of Open Access Journals (Sweden)

    Jun Gong

    2015-01-01

    Full Text Available Motivated by call center practice, we study the optimal staffing of many-server queues with impatient and repeat-calling customers. A call center is modeled as an M/M/s+M queue, which is developed into a behavioral queuing model in which customers come and go based on their satisfaction with waiting time. We explicitly take into account customer repeat behavior, which implies that satisfied customers might return and have an impact on the arrival rate. Optimality is defined as the number of agents that maximizes revenues net of staffing costs, and we account for the characteristic that revenues are a direct function of staffing. Finally, we use numerical experiments to make certain comparisons with traditional models that do not consider customer repeat behavior. Furthermore, we indicate how managers might allocate staffing optimally under various customer behavior mechanisms.
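
    A hedged numerical sketch of the underlying staffing trade-off is given below for a plain M/M/s+M queue: stationary probabilities follow from the birth-death structure and the staffing level is chosen by direct search. The rates, the profit form and the omission of repeat-calling feedback are all simplifying assumptions of this sketch:

      # Choose the number of agents s that maximises revenue minus staffing cost for an
      # M/M/s+M queue (arrival rate lam, service rate mu, abandonment rate theta).
      import numpy as np

      lam, mu, theta = 50.0, 1.0, 0.5      # illustrative arrival, service, abandonment rates
      revenue, cost = 1.0, 0.8             # revenue per unit of served work, cost per agent
      N_MAX = 500                          # truncation level for the state space

      def profit(s):
          # Unnormalised stationary probabilities of the birth-death chain.
          p = np.zeros(N_MAX + 1)
          p[0] = 1.0
          for n in range(1, N_MAX + 1):
              death = min(n, s) * mu + max(n - s, 0) * theta
              p[n] = p[n - 1] * lam / death
          p /= p.sum()
          served_rate = sum(min(n, s) * mu * p[n] for n in range(N_MAX + 1))
          return revenue * served_rate - cost * s

      best = max(range(1, 121), key=profit)
      print(f"optimal staffing s* = {best}, profit = {profit(best):.2f}")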

  20. Acoustic model optimisation for a call routing system

    CSIR Research Space (South Africa)

    Kleynhans, N

    2012-11-01

    Full Text Available Secretary system and provides background on some application-specific ASR issues. Section III details the ASR development effort as well as corpus selection and design. Our experiments are described in Section IV, and results and a discussion are presented in Section V. Lastly, the conclusion and possible future work appear in Section VI. (Fig. 1: high-level AutoSecretary call flow.) II. BACKGROUND. A. AutoSecretary IVR System. Figure 1 shows the high-level call flow of the AutoSecretary call routing system...

  1. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  2. Metabolic Models of Protein Allocation Call for the Kinetome

    DEFF Research Database (Denmark)

    Nilsson, Avlant; Nielsen, Jens; Palsson, Bernhard

    2017-01-01

    The flux of metabolites in the living cell depends on enzyme activities. Recently, many metabolic phenotypes have been explained by computer models that incorporate enzyme activity data. To move further, the scientific community needs to measure the kinetics of all enzymes in a systematic way....

  3. Evaluating the Motivational Impact of CALL Systems: Current Practices and Future Directions

    Science.gov (United States)

    Bodnar, Stephen; Cucchiarini, Catia; Strik, Helmer; van Hout, Roeland

    2016-01-01

    A major aim of computer-assisted language learning (CALL) is to create computer environments that facilitate students' second language (L2) acquisition. To achieve this aim, CALL employs technological innovations to create novel types of language practice. Evaluations of the new practice types serve the important role of distinguishing effective…

  4. The use of Computer Assisted Language Learning (CALL) Devices ...

    African Journals Online (AJOL)

    Despite the numerous advantages that can be made of the computer in language teaching, many ... be made of the computer in English Language teaching by low technologically exposed teachers. ... http://dx.doi.org/10.4314/ict.v2i1.31950.

  5. CMS computing model evolution

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  6. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  7. A three-parameter model for classifying anurans into four genera based on advertisement calls.

    Science.gov (United States)

    Gingras, Bruno; Fitch, William Tecumseh

    2013-01-01

    The vocalizations of anurans are innate in structure and may therefore contain indicators of phylogenetic history. Thus, advertisement calls of species which are more closely related phylogenetically are predicted to be more similar than those of distant species. This hypothesis was evaluated by comparing several widely used machine-learning algorithms. Recordings of advertisement calls from 142 species belonging to four genera were analyzed. A logistic regression model, using mean values for dominant frequency, coefficient of variation of root-mean square energy, and spectral flux, correctly classified advertisement calls with regard to genus with an accuracy above 70%. Similar accuracy rates were obtained using these parameters with a support vector machine model, a K-nearest neighbor algorithm, and a multivariate Gaussian distribution classifier, whereas a Gaussian mixture model performed slightly worse. In contrast, models based on mel-frequency cepstral coefficients did not fare as well. Comparable accuracy levels were obtained on out-of-sample recordings from 52 of the 142 original species. The results suggest that a combination of low-level acoustic attributes is sufficient to discriminate efficiently between the vocalizations of these four genera, thus supporting the initial premise and validating the use of high-throughput algorithms on animal vocalizations to evaluate phylogenetic hypotheses.
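
    As an illustration of the kind of classifier the abstract describes, the sketch below fits a multinomial logistic regression to the three acoustic features named there (dominant frequency, coefficient of variation of RMS energy, and spectral flux). The genus labels and feature values are synthetic placeholders rather than the study's data, and scikit-learn merely stands in for whatever tooling the authors used.

```python
# Three-feature genus classifier in the spirit of the abstract above.
# All feature values and genus labels below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Columns: dominant frequency (Hz), CV of RMS energy, spectral flux (arbitrary units)
genera = ["GenusA", "GenusB", "GenusC", "GenusD"]            # hypothetical labels
centres = [(1200, 0.2, 0.5), (2800, 0.4, 0.9), (2000, 0.6, 0.3), (900, 0.3, 0.7)]
n_per_genus = 50

X, y = [], []
for genus, centre in zip(genera, centres):
    X.append(rng.normal(centre, [200, 0.05, 0.1], size=(n_per_genus, 3)))
    y += [genus] * n_per_genus
X = np.vstack(X)

clf = LogisticRegression(max_iter=1000)        # multinomial logistic regression
scores = cross_val_score(clf, X, y, cv=5)      # cross-validated genus accuracy
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```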

  8. Voice Communications over 802.11 Ad Hoc Networks: Modeling, Optimization and Call Admission Control

    Science.gov (United States)

    Xu, Changchun; Xu, Yanyi; Liu, Gan; Liu, Kezhong

    Supporting quality-of-service (QoS) of multimedia communications over IEEE 802.11 based ad hoc networks is a challenging task. This paper develops a simple 3-D Markov chain model for queuing analysis of IEEE 802.11 MAC layer. The model is applied for performance analysis of voice communications over IEEE 802.11 single-hop ad hoc networks. By using the model, we finish the performance optimization of IEEE MAC layer and obtain the maximum number of voice calls in IEEE 802.11 ad hoc networks as well as the statistical performance bounds. Furthermore, we design a fully distributed call admission control (CAC) algorithm which can provide strict statistical QoS guarantee for voice communications over IEEE 802.11 ad hoc networks. Extensive simulations indicate the accuracy of the analytical model and the CAC scheme.

  9. The IceCube Computing Infrastructure Model

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Besides the big LHC experiments a number of mid-size experiments is coming online which need to define new computing models to meet the demands on processing and storage requirements of those experiments. We present the hybrid computing model of IceCube which leverages GRID models with a more flexible direct user model as an example of a possible solution. In IceCube a central datacenter at UW-Madison serves as Tier-0 with a single Tier-1 datacenter at DESY Zeuthen. We describe the setup of the IceCube computing infrastructure and report on our experience in successfully provisioning the IceCube computing needs.

  10. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming a representation of modeling methodology in computer science lessons. The necessity of studying computer modeling is that current trends towards strengthening the general-education and worldview functions of computer science define the necessity of additional research of the…

  11. Impact of implementation choices on quantitative predictions of cell-based computational models

    Science.gov (United States)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
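
    The time-step sensitivity discussed above can be illustrated with a minimal, single-cell caricature of a vertex model: one polygonal cell relaxes towards a target area under an area-elasticity force integrated with explicit Euler steps. This toy example is not one of the tissue-scale models analysed in the paper; it only shows how the choice of time step can change, or even break, the numerical outcome.

```python
# Toy single-cell "vertex model": area elasticity relaxed with explicit Euler steps.
# Illustrates time-step sensitivity only; not the tissue-scale models of the paper.
import numpy as np

def area(v):
    # Shoelace formula for a polygon with vertices v (counter-clockwise)
    x, y = v[:, 0], v[:, 1]
    return 0.5 * np.sum(x * np.roll(y, -1) - np.roll(x, -1) * y)

def area_gradient(v):
    # dA/dx_i = 0.5*(y_{i+1} - y_{i-1}),  dA/dy_i = 0.5*(x_{i-1} - x_{i+1})
    x, y = v[:, 0], v[:, 1]
    return np.column_stack([0.5 * (np.roll(y, -1) - np.roll(y, 1)),
                            0.5 * (np.roll(x, 1) - np.roll(x, -1))])

def steps_to_converge(dt, k=1.0, target_area=1.0, tol=1e-6, max_steps=10000):
    theta = np.linspace(0.0, 2.0 * np.pi, 6, endpoint=False)
    v = 0.8 * np.column_stack([np.cos(theta), np.sin(theta)])   # hexagonal cell
    for n in range(1, max_steps + 1):
        force = -k * (area(v) - target_area) * area_gradient(v)
        v = v + dt * force                                       # explicit Euler update
        error = abs(area(v) - target_area)
        if error < tol:
            return n
        if error > 1e6:
            return None                                          # diverged: dt too large
    return None

for dt in (0.01, 0.1, 1.0):
    print(f"dt = {dt}: steps to converge = {steps_to_converge(dt)}")
```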

  12. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  13. Inbound Call Centers and Emotional Dissonance in the Job Demands – Resources Model

    Science.gov (United States)

    Molino, Monica; Emanuel, Federica; Zito, Margherita; Ghislieri, Chiara; Colombo, Lara; Cortese, Claudio G.

    2016-01-01

    Background: Emotional labor, defined as the process of regulating feelings and expressions as part of the work role, is a major characteristic in call centers. In particular, interacting with customers, agents are required to show certain emotions that are considered acceptable by the organization, even though these emotions may be different from their true feelings. This kind of experience is defined as emotional dissonance and represents a feature of the job especially for call center inbound activities. Aim: The present study was aimed at investigating whether emotional dissonance mediates the relationship between job demands (workload and customer verbal aggression) and job resources (supervisor support, colleague support, and job autonomy) on the one hand, and, on the other, affective discomfort, using the job demands-resources model as a framework. The study also observed differences between two different types of inbound activities: customer assistance service (CA) and information service. Method: The study involved agents of an Italian Telecommunication Company, 352 of whom worked in the CA and 179 in the information service. The hypothesized model was tested across the two groups through multi-group structural equation modeling. Results: Analyses showed that CA agents experience greater customer verbal aggression and emotional dissonance than information service agents. Results also showed, only for the CA group, a full mediation of emotional dissonance between workload and affective discomfort, and a partial mediation of customer verbal aggression and job autonomy, and affective discomfort. Conclusion: This study’s findings contributed both to the emotional labor literature, investigating the mediational role of emotional dissonance in the job demands-resources model, and to call center literature, considering differences between two specific kinds of inbound activities. Suggestions for organizations and practitioners emerged in order to identify

  14. OccuPeak: ChIP-Seq peak calling based on internal background modelling

    NARCIS (Netherlands)

    de Boer, Bouke A.; van Duijvenboden, Karel; van den Boogaard, Malou; Christoffels, Vincent M.; Barnett, Phil; Ruijter, Jan M.

    2014-01-01

    ChIP-seq has become a major tool for the genome-wide identification of transcription factor binding or histone modification sites. Most peak-calling algorithms require input control datasets to model the occurrence of background reads to account for local sequencing and GC bias. However, the

  15. A model of phone call intervention in sensitizing the change of dietary pattern

    Directory of Open Access Journals (Sweden)

    Eliane Corrêa Chaves

    2010-06-01

    Full Text Available Objective: To propose a model of phone call intervention for changing dietary patterns and to assess its effectiveness. Method: A study carried out at the Health Promotion School of Medicine, University of São Paulo, with 27 subjects and 3-5 phone call contacts per user, through which orientations and interventions on healthy eating were given based on the principles of Cognitive Behavioral Therapy and the Transtheoretical Model. We analyzed the variables weight and body mass index, dietary patterns and overall stage of motivation to change. The data were submitted to analysis of variance with repeated measures at different stages of evaluation: pre-contact, 3rd and 5th phone calls. Results: After the intervention, users showed a change in eating behavior by the third contact, and a change in weight and BMI occurred in one patient. None of these findings was statistically significant. There was an improvement, also not significant, in the motivation to acquire new eating habits. Conclusion: There was a slight change in eating behavior and the motivation to change improved for all participants; however, this type of approach was not shown to be effective.

  16. Modelling computer networks

    International Nuclear Information System (INIS)

    Max, G

    2011-01-01

    Traffic in computer networks behaves as a complicated system: it shows non-linear features, and simulating the behaviour of such systems is difficult. Before deploying network equipment, users want to know the capability of their computer network; they do not want the servers to be overloaded during temporary traffic peaks when more requests arrive than the server is designed for. As a starting point for our study, a non-linear system model of network traffic is established to examine the behaviour of the planned network. The paper presents the setup of a non-linear simulation model that helps us observe dataflow problems in networks. This simple model captures the relationship between the competing traffic and the input and output dataflow. We also focus on measuring the bottleneck of the network, defined as the difference between the link capacity and the competing traffic volume on the link that limits end-to-end throughput. We validate the model using measurements on a working network. The results show that the initial model estimates the main behaviours and critical parameters of the network well. Based on this study, we propose to develop a new algorithm that experimentally determines and predicts the available parameters of the modelled network.
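
    The bottleneck notion used in this abstract can be written down directly: each link's spare capacity is its capacity minus the competing traffic already on it, and the end-to-end throughput of a path is limited by the link with the least spare capacity. The link names and figures in the sketch below are hypothetical and serve only to illustrate the definition.

```python
# Minimal sketch of the bottleneck definition above: end-to-end throughput is
# limited by the link with the least spare capacity (capacity minus competing
# traffic). All link names and numbers are hypothetical.
links = {                      # capacity and competing traffic in Mbit/s
    "client-switch": {"capacity": 1000.0, "competing": 120.0},
    "switch-router": {"capacity": 1000.0, "competing": 850.0},
    "router-server": {"capacity": 100.0, "competing": 30.0},
}

available = {name: l["capacity"] - l["competing"] for name, l in links.items()}
bottleneck = min(available, key=available.get)

print(f"available bandwidth per link: {available}")
print(f"bottleneck link: {bottleneck}, "
      f"end-to-end throughput ~ {available[bottleneck]:.0f} Mbit/s")
```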

  17. Mathematical Modeling and Computational Thinking

    Science.gov (United States)

    Sanford, John F.; Naidu, Jaideep T.

    2017-01-01

    The paper argues that mathematical modeling is the essence of computational thinking. Learning a computer language is a valuable assistance in learning logical thinking but of less assistance when learning problem-solving skills. The paper is third in a series and presents some examples of mathematical modeling using spreadsheets at an advanced…

  18. Call Forecasting for Inbound Call Center

    Directory of Open Access Journals (Sweden)

    Peter Vinje

    2009-01-01

    Full Text Available In a scenario of inbound call center customer service, the ability to forecast calls is a key element and advantage. By forecasting the correct number of calls a company can predict staffing needs, meet service level requirements, improve customer satisfaction, and benefit from many other optimizations. This project will show how elementary statistics can be used to predict calls for a specific company, forecast the rate at which calls are increasing/decreasing, and determine if the calls may stop at some point.
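
    A minimal sketch of the elementary-statistics approach the abstract describes: fit a least-squares trend line to historical weekly call volumes and extrapolate it a few weeks ahead. The call counts below are invented for illustration; a real centre would also model seasonality and intraday profiles.

```python
# Elementary-statistics call forecast: fit a linear trend to weekly call volumes
# and extrapolate. The historical counts are made-up placeholder numbers.
import numpy as np

weekly_calls = np.array([1210, 1255, 1302, 1340, 1378, 1430, 1466, 1521])  # hypothetical
weeks = np.arange(len(weekly_calls))

slope, intercept = np.polyfit(weeks, weekly_calls, deg=1)   # least-squares trend line
growth_rate = slope / weekly_calls.mean()                   # relative weekly growth

next_weeks = np.arange(len(weekly_calls), len(weekly_calls) + 4)
forecast = intercept + slope * next_weeks

print(f"calls are growing by about {slope:.0f} per week ({growth_rate:.1%} of the mean)")
print("forecast for the next four weeks:", np.round(forecast).astype(int))
```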

  19. Beyond mean-field approximations for accurate and computationally efficient models of on-lattice chemical kinetics

    Science.gov (United States)

    Pineda, M.; Stamatakis, M.

    2017-07-01

    Modeling the kinetics of surface catalyzed reactions is essential for the design of reactors and chemical processes. The majority of microkinetic models employ mean-field approximations, which lead to an approximate description of catalytic kinetics by assuming spatially uncorrelated adsorbates. On the other hand, kinetic Monte Carlo (KMC) methods provide a discrete-space continuous-time stochastic formulation that enables an accurate treatment of spatial correlations in the adlayer, but at a significant computation cost. In this work, we use the so-called cluster mean-field approach to develop higher order approximations that systematically increase the accuracy of kinetic models by treating spatial correlations at a progressively higher level of detail. We further demonstrate our approach on a reduced model for NO oxidation incorporating first nearest-neighbor lateral interactions and construct a sequence of approximations of increasingly higher accuracy, which we compare with KMC and mean-field. The latter is found to perform rather poorly, overestimating the turnover frequency by several orders of magnitude for this system. On the other hand, our approximations, while more computationally intense than the traditional mean-field treatment, still achieve tremendous computational savings compared to KMC simulations, thereby opening the way for employing them in multiscale modeling frameworks.
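
    For readers unfamiliar with the mean-field baseline the abstract criticises, the sketch below integrates a generic two-species Langmuir-Hinshelwood surface mechanism under the assumption of spatially uncorrelated coverages. The mechanism and rate constants are arbitrary illustrations, not the NO-oxidation model of the paper, and the sketch omits the lateral interactions and cluster corrections the paper introduces.

```python
# Generic mean-field microkinetic sketch (spatially uncorrelated coverages).
# Mechanism and rate constants are placeholders, not the paper's NO-oxidation model.
from scipy.integrate import solve_ivp

k_ads_A, k_ads_B, k_des_A, k_des_B, k_rxn = 1.0, 0.5, 0.1, 0.05, 2.0

def mean_field_rhs(t, theta):
    a, b = theta                      # coverages of adsorbed A and B
    empty = 1.0 - a - b               # fraction of empty sites
    da = k_ads_A * empty - k_des_A * a - k_rxn * a * b
    db = k_ads_B * empty - k_des_B * b - k_rxn * a * b
    return [da, db]

sol = solve_ivp(mean_field_rhs, (0.0, 50.0), [0.0, 0.0])
a_ss, b_ss = sol.y[:, -1]
tof = k_rxn * a_ss * b_ss             # mean-field turnover frequency per site

print(f"steady-state coverages: theta_A = {a_ss:.3f}, theta_B = {b_ss:.3f}")
print(f"mean-field turnover frequency: {tof:.3f} per site per unit time")
```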

  20. Turbulent Bubbly Flow in a Vertical Pipe Computed By an Eddy-Resolving Reynolds Stress Model

    Science.gov (United States)

    2014-09-19

    the numerical code OpenFOAM®. 1 Introduction. Turbulent bubbly flows are encountered in many industrially relevant applications, such as chemical in... performed using the OpenFOAM-2.2.2 computational code utilizing a cell-center-based finite volume method on an unstructured numerical grid. The... the mean Courant number is always below 0.4. The utilized turbulence models were implemented into the so-called twoPhaseEulerFoam solver in OpenFOAM, to

  1. LHCb computing model

    CERN Document Server

    Frank, M; Pacheco, Andreu

    1998-01-01

    This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10^6 MIPS. This will be installed at the experiment and will be reused during non data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10^6 MIPS) and physics analysis (0.5 × 10^6 MIPS): CPU resources may either be located at the physicist's homelab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu...

  2. 40 CFR 194.23 - Models and computer codes.

    Science.gov (United States)

    2010-07-01

    ... 40 CFR Protection of Environment, General Requirements, § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  3. Is CALL Obsolete? Language Acquisition and Language Learning Revisited in a Digital Age

    Science.gov (United States)

    Jarvis, Huw; Krashen, Stephen

    2014-01-01

    In this article, Huw Jarvis and Stephen Krashen ask "Is CALL Obsolete?" When the term CALL (Computer-Assisted Language Learning) was introduced in the 1960s, the language education profession knew only about language learning, not language acquisition, and assumed the computer's primary contribution to second language acquisition…

  4. Trust Models in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2008-01-01

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  5. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students the possibility to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students the possibility to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes taking advantage of basic scientific computation methods and tools.

  6. Computer models for economic and silvicultural decisions

    Science.gov (United States)

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  7. The Ghost in the Machine: Are "Teacherless" CALL Programs Really Possible?

    Science.gov (United States)

    Davies, Ted; Williamson, Rodney

    1998-01-01

    Reflects critically on pedagogical issues in the production of computer-assisted language learning (CALL) courseware and ways CALL has affected the practice of language learning. Concludes that if CALL is to reach full potential, it must be more than a simple medium of information; it should provide a teaching/learning process, with the real…

  8. Quantum vertex model for reversible classical computing.

    Science.gov (United States)

    Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C

    2017-05-12

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  9. Flight calls and orientation

    DEFF Research Database (Denmark)

    Larsen, Ole Næsbye; Andersen, Bent Bach; Kropp, Wibke

    2008-01-01

    In a pilot experiment a European Robin, Erithacus rubecula, expressing migratory restlessness with a stable orientation, was video filmed in the dark with an infrared camera and its directional migratory activity was recorded. The flight overhead of migrating conspecifics uttering nocturnal flight calls was simulated by sequential computer controlled activation of five loudspeakers placed in a linear array perpendicular to the bird's migration course. The bird responded to this stimulation by changing its migratory course in the direction of that of the ‘flying conspecifics' but after about 30 minutes it drifted back to its original migration course. The results suggest that songbirds migrating alone at night can use the flight calls from conspecifics as additional cues for orientation and that they may compare this information with other cues to decide what course to keep.

  10. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    Science.gov (United States)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. There exist a number of data-worth and experimental design strategies developed for this purpose. However, these studies often ignore issues related to real-world groundwater models such as computational expense, existing observation data, high-parameter dimension, etc. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established d-optimality criterion, and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification. And, a heuristic methodology, based on the concept of the greedy algorithm, is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
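
    The greedy minimax idea mentioned in the abstract can be sketched as follows: given a table of data worth for each candidate observation under each posterior parameter sample, repeatedly add the observation that maximises the worst-case cumulative worth. The worth matrix below is random, standing in for the output of an uncertainty analysis such as the Null-Space Monte Carlo procedure; this illustrates the flavour of the heuristic, not the authors' implementation.

```python
# Greedy max-min (robust) selection sketch over a hypothetical data-worth matrix.
import numpy as np

rng = np.random.default_rng(42)
n_candidates, n_samples = 30, 20
worth = rng.random((n_candidates, n_samples))   # worth[i, j]: value of obs i under sample j

def greedy_minimax_design(worth, budget):
    chosen = []
    remaining = list(range(worth.shape[0]))
    cumulative = np.zeros(worth.shape[1])       # cumulative worth per parameter sample
    for _ in range(budget):
        # score each candidate by the worst-case cumulative worth after adding it
        scores = [np.min(cumulative + worth[i]) for i in remaining]
        best = remaining[int(np.argmax(scores))]
        chosen.append(best)
        cumulative += worth[best]
        remaining.remove(best)
    return chosen, float(cumulative.min())

design, worst_case = greedy_minimax_design(worth, budget=5)
print("selected observation indices:", design)
print(f"worst-case cumulative worth: {worst_case:.3f}")
```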

  11. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden , Renier ,; Pieterse , Heloise; Irwin , Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; International audience; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance. The...

  12. PREVENTIVE SIGNATURE MODEL FOR SECURE CLOUD DEPLOYMENT THROUGH FUZZY DATA ARRAY COMPUTATION

    Directory of Open Access Journals (Sweden)

    R. Poorvadevi

    2017-01-01

    Full Text Available Cloud computing is a resource pool that offers boundless services, in the form of resources, to end users who depend heavily on cloud service providers. The cloud provides efficient service access across geographic locations. However, although it offers numerous services, the client-end system lacks adequate methods, security policies and other protocols for protecting cloud customers' secret-level transactions and other privacy-related information. The proposed model therefore provides a solution for securing cloud users' confidential data and application deployment, and for verifying the genuineness of the user, by applying a scheme referred to as fuzzy data array computation. Fuzzy data array computation provides an effective system, called the signature retrieval and evaluation system, through which customers' data can be safeguarded along with their applications. This signature system can be implemented in the cloud environment using the CloudSim 3.0 simulator tools. It facilitates security operations over the data centre and cloud vendor locations in an effective manner.

  13. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-01-01

    The application of computers to controlled thermonuclear research (CTR) is essential. In the near future the use of computers in the numerical modeling of fusion systems should increase substantially. A recent panel has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies is called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. To meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR Laboratories by a communication network. The crucial element needed for success is trained personnel. The number of people with knowledge of plasma science and engineering trained in numerical methods and computer science must be increased substantially in the next few years. Nuclear engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing

  14. Inbound Call Centers and Emotional Dissonance in the Job Demands – Resources Model

    Directory of Open Access Journals (Sweden)

    Monica Molino

    2016-07-01

    Full Text Available Background: Emotional labor, defined as the process of regulating feelings and expressions as part of the work role, is a major characteristic in call centers. In particular, interacting with customers, agents are required to show certain emotions that are considered acceptable by the organization, even though these emotions may be different from their true feelings. This kind of experience is defined as emotional dissonance and represents a feature of the job especially for call center inbound activities. Aim: The present study was aimed at investigating whether emotional dissonance mediates the relationship between job demands (workload and customer verbal aggression) and job resources (supervisor support, colleague support and job autonomy) on the one hand, and, on the other, affective discomfort, using the job demands-resources model as a framework. The study also observed differences between two different types of inbound activities: customer assistance service and information service. Method: The study involved agents of an Italian Telecommunication Company, 352 of whom worked in the customer assistance service and 179 in the information service. The hypothesized model was tested across the two groups through multi-group structural equation modeling. Results: Analyses showed that customer assistance service agents experience greater customer verbal aggression and emotional dissonance than information service agents. Results also showed, only for the customer assistance service group, a full mediation of emotional dissonance between workload and affective discomfort, and a partial mediation of customer verbal aggression and job autonomy, and affective discomfort. Conclusion: This study’s findings contributed both to the emotional labor literature, investigating the mediational role of emotional dissonance in the job demands-resources model, and to call center literature, considering differences between two specific kinds of inbound activities

  15. Electronic Performance Monitoring in Call Centers: An Ethical Decision Model

    OpenAIRE

    Perkins, David

    2013-01-01

    Ever since it emerged on a widespread basis in the 1990s, electronic performance monitoring of employees has received significant scrutiny in the literature. Call centers have been the focus of many of these studies. This particular study addresses the issue of electronic performance monitoring in call centers from an ethical perspective. The following ethical dilemma is offered: "Is it ethical for a call center manager to evaluate the performance of a call center employee using electronic pe...

  16. Sustainability in CALL Learning Environments: A Systemic Functional Grammar Approach

    Science.gov (United States)

    McDonald, Peter

    2014-01-01

    This research aims to define a sustainable resource in Computer-Assisted Language Learning (CALL). In order for a CALL resource to be sustainable it must work within existing educational curricula. This feature is a necessary prerequisite of sustainability because, despite the potential for educational change that digitalization has offered since…

  17. Impact of Using CALL on Iranian EFL Learners' Vocabulary Knowledge

    Science.gov (United States)

    Yunus, Melor Md; Salehi, Hadi; Amini, Mahdi

    2016-01-01

    Computer Assisted Language Learning (CALL) integration in EFL contexts has intensified noticeably in recent years. This integration might be in different ways and for different purposes such as vocabulary acquisition, grammar learning, phonology, writing skills, etc. More explicitly, this study is an attempt to explore the effect of using CALL on…

  18. Models of parallel computation :a survey and classification

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yunquan; CHEN Guoliang; SUN Guangzhong; MIAO Qiankun

    2007-01-01

    In this paper, the state-of-the-art parallel computational model research is reviewed. We will introduce various models that were developed during the past decades. According to their targeting architecture features, especially memory organization, we classify these parallel computational models into three generations. These models and their characteristics are discussed based on this three-generation classification. We believe that with the ever increasing speed gap between the CPU and memory systems, incorporating non-uniform memory hierarchy into computational models will become unavoidable. With the emergence of multi-core CPUs, the parallelism hierarchy of current computing platforms becomes more and more complicated. Describing this complicated parallelism hierarchy in future computational models becomes more and more important. A semi-automatic toolkit that can extract model parameters and their values on real computers can reduce the model analysis complexity, thus allowing more complicated models with more parameters to be adopted. Hierarchical memory and hierarchical parallelism will be two very important features that should be considered in future model design and research.

  19. Computer model for ductile fracture

    International Nuclear Information System (INIS)

    Moran, B.; Reaugh, J. E.

    1979-01-01

    A computer model is described for predicting ductile fracture initiation and propagation. The computer fracture model is calibrated by simple and notched round-bar tension tests and a precracked compact tension test. The model is used to predict fracture initiation and propagation in a Charpy specimen and to compare the results with experiments. The calibrated model provides a correlation between Charpy V-notch (CVN) fracture energy and any measure of fracture toughness, such as J/sub Ic/. A second, simpler empirical correlation was obtained using the energy to initiate fracture in the Charpy specimen rather than the total CVN energy; the results were compared with the empirical correlation of Rolfe and Novak

  20. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for intelligent information processing and harmonious communication between human beings and computers. A new emotional agent model is proposed in this paper, based on granular computing theory and the traditional BDI agent model, to give agents the ability to handle emotions. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it handles simple emotions efficiently.

  1. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  2. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  3. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and their performance characteristics of such a hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics, using the identified objectives of computing, which can be used on any platform, with any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  4. A physicist's model of computation

    International Nuclear Information System (INIS)

    Fredkin, E.

    1991-01-01

    An attempt is presented to make a statement about what a computer is and how it works from the perspective of physics. The single observation that computation can be a reversible process allows for the same kind of insight into computing as was obtained by Carnot's discovery that heat engines could be modelled as reversible processes. It allows us to bring computation into the realm of physics, where the power of physics allows us to ask and answer questions that seemed intractable from the viewpoint of computer science. Strangely enough, this effort makes it clear why computers get cheaper every year. (author) 14 refs., 4 figs

  5. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Measurements and computer modeling of fast ion emission from plasma accelerators of the rod plasma injector type

    International Nuclear Information System (INIS)

    Malinowski, Karol; Sadowski, Marek J; Skladnik-Sadowska, Elzbieta

    2014-01-01

    This paper reports on the results of experimental studies and computer simulations of the emission of fast ion streams from so-called rod plasma injectors (RPI). Various RPI facilities have been used at the National Centre for Nuclear Research (NCBJ) for basic plasma studies as well as for material engineering. In fact, the RPI facilities have been studied experimentally for many years, particularly at the Institute for Nuclear Sciences (now the NCBJ), and numerous experimental data have been collected. Unfortunately, the ion emission characteristics have so far not been explained theoretically in a satisfactory way. In this paper, in order to explain these characteristics, use was made of a single-particle model. Taking into account the stochastic character of the ion emission, we applied a Monte Carlo method. The performed computer simulations of a pinhole image and energy spectrum of deuterons emitted from RPI-IBIS, which were computed on the basis of the applied model, appeared to be in reasonable agreement with the experimental data. (paper)

  7. Applications of membrane computing in systems and synthetic biology

    CERN Document Server

    Gheorghe, Marian; Pérez-Jiménez, Mario

    2014-01-01

    Membrane Computing was introduced as a computational paradigm in Natural Computing. The models introduced, called Membrane (or P) Systems, provide a coherent platform to describe and study living cells as computational systems. Membrane Systems have been investigated for their computational aspects and employed to model problems in other fields, like: Computer Science, Linguistics, Biology, Economy, Computer Graphics, Robotics, etc. Their inherent parallelism, heterogeneity and intrinsic versatility allow them to model a broad range of processes and phenomena, being also an efficient means to solve and analyze problems in a novel way. Membrane Computing has been used to model biological systems, becoming with time a thorough modeling paradigm comparable, in its modeling and predicting capabilities, to more established models in this area. This book is the result of the need to collect, in an organic way, different facets of this paradigm. The chapters of this book, together with the web pages accompanying th...

  8. uPy: a ubiquitous computer graphics Python API with Biological Modeling Applications

    Science.gov (United States)

    Autin, L.; Johnson, G.; Hake, J.; Olson, A.; Sanner, M.

    2015-01-01

    In this paper we describe uPy, an extension module for the Python programming language that provides a uniform abstraction of the APIs of several 3D computer graphics programs called hosts, including: Blender, Maya, Cinema4D, and DejaVu. A plugin written with uPy is a unique piece of code that will run in all uPy-supported hosts. We demonstrate the creation of complex plug-ins for molecular/cellular modeling and visualization and discuss how uPy can more generally simplify programming for many types of projects (not solely science applications) intended for multi-host distribution. uPy is available at http://upy.scripps.edu PMID:24806987

  9. Call for participation first ACM workshop on education in computer security

    OpenAIRE

    Irvine, Cynthia; Orman, Hilarie

    1997-01-01

    Taken from the NPS website. The security of information systems and networks is a growing concern. Experts are needed to design and organize the protection mechanisms for these systems. Both government and industry increasingly seek individuals with knowledge and skills in computer security. In the past, most traditional computer science curricula bypassed formal studies in computer security altogether. An understanding of computer security was achieved largely through on-the-job ...

  10. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided...... methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task....... The methodology has been implemented into a computer-aided modeling framework, which combines expert skills, tools, and database connections that are required for the different steps of the model development work-flow with the goal to increase the efficiency of the modeling process. The framework has two main...

  11. Elements of matrix modeling and computing with Matlab

    CERN Document Server

    White, Robert E

    2006-01-01

    As discrete models and computing have become more common, there is a need to study matrix computation and numerical linear algebra. Encompassing a diverse mathematical core, Elements of Matrix Modeling and Computing with MATLAB examines a variety of applications and their modeling processes, showing you how to develop matrix models and solve algebraic systems. Emphasizing practical skills, it creates a bridge from problems with two and three variables to more realistic problems that have additional variables. Elements of Matrix Modeling and Computing with MATLAB focuses on seven basic applicat

  12. Vehicle - Bridge interaction, comparison of two computing models

    Science.gov (United States)

    Melcer, Jozef; Kuchárová, Daniela

    2017-07-01

    The paper presents the calculation of the bridge response to a vehicle moving along the bridge at various velocities. A multi-body plane computing model of the vehicle is adopted. The bridge computing models are created in two variants: one represents the bridge as a Bernoulli-Euler beam with continuously distributed mass and the second represents the bridge as a lumped-mass model with 1 degree of freedom. The mid-span bridge dynamic deflections are calculated for both computing models. The results are mutually compared and quantitatively evaluated.
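
    The simpler of the two bridge idealisations mentioned above, the lumped-mass model with one degree of freedom, can be sketched as a single oscillator excited by the first-mode generalised force of a constant load crossing the span at constant speed. All parameter values below are made up for illustration and are not taken from the paper.

```python
# 1-DOF lumped-mass bridge sketch: a single oscillator driven by P*sin(pi*v*t/L)
# while a constant load P crosses the span at speed v. Parameters are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

m, k, c = 2.0e5, 8.0e7, 3.0e4      # modal mass (kg), stiffness (N/m), damping (N s/m)
P, L, v = 1.0e5, 30.0, 20.0        # vehicle weight (N), span (m), speed (m/s)
t_cross = L / v

def rhs(t, y):
    w, wdot = y
    force = P * np.sin(np.pi * v * t / L) if t <= t_cross else 0.0
    return [wdot, (force - c * wdot - k * w) / m]

sol = solve_ivp(rhs, (0.0, 2 * t_cross), [0.0, 0.0], max_step=1e-3)
print(f"peak mid-span deflection ~ {1000 * np.abs(sol.y[0]).max():.2f} mm")
```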

  13. Development and application of a complex numerical model and software for the computation of dose conversion factors for radon progenies.

    Science.gov (United States)

    Farkas, Árpád; Balásházy, Imre

    2015-04-01

    A more exact determination of dose conversion factors associated with radon progeny inhalation was possible due to the advancements in epidemiological health risk estimates in the last years. The enhancement of computational power and the development of numerical techniques allow computing dose conversion factors with increasing reliability. The objective of this study was to develop an integrated model and software based on a self-developed airway deposition code, an own bronchial dosimetry model and the computational methods accepted by International Commission on Radiological Protection (ICRP) to calculate dose conversion coefficients for different exposure conditions. The model was tested by its application for exposure and breathing conditions characteristic of mines and homes. The dose conversion factors were 8 and 16 mSv WLM(-1) for homes and mines when applying a stochastic deposition model combined with the ICRP dosimetry model (named PM-A model), and 9 and 17 mSv WLM(-1) when applying the same deposition model combined with authors' bronchial dosimetry model and the ICRP bronchiolar and alveolar-interstitial dosimetry model (called PM-B model). User friendly software for the computation of dose conversion factors has also been developed. The software allows one to compute conversion factors for a large range of exposure and breathing parameters and to perform sensitivity analyses. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  15. Creation of 'Ukrytie' objects computer model

    International Nuclear Information System (INIS)

    Mazur, A.B.; Kotlyarov, V.T.; Ermolenko, A.I.; Podbereznyj, S.S.; Postil, S.D.; Shaptala, D.V.

    1999-01-01

    A partial computer model of the 'Ukrytie' object was created with the use of geoinformation technologies. The computer model makes it possible to provide information support for the works related to stabilization of the 'Ukrytie' object and its conversion into an ecologically safe system, for analyzing, forecasting and controlling the processes occurring in the 'Ukrytie' object. Elements and structures of the 'Ukrytie' object were designed and input into the model

  16. Synthetic digital radiographs using exposure computer models of Voxels / EGS4 Phantoms

    International Nuclear Information System (INIS)

    Kenned, Roberto; Vieira, Jose W.; Lima, Fernando R.A.; Loureiro, Eduardo

    2008-01-01

    The objective of this work is to produce synthetic digital radiographs from synthetic phantoms with the use of a Computational Model of Exposure (MCE). The literature describes such a model as consisting of a phantom, a Monte Carlo code and an algorithm for the radioactive source. In this work the FAX phantom (Female Adult voXel) was used, together with the EGS4 system code (Electron Gamma Shower, version 4) and an external source similar to that used in diagnostic radiology. The implementation of the MCE creates files with information on the external energy deposited in the voxels of the phantom used, here called EnergiaPorVoxel.dat. These files, along with the targeted phantom (fax.sgi), served as input data for the DIP software (Digital Image Processing) to build the synthetic phantoms based on energy and effective dose. In this way each slice of the stack of images of these synthetic phantoms can be saved; these have been called synthetic digital radiographs. Using them, spatial enhancement techniques can be applied to increase the contrast or delineate contours between organs and tissues. The practical use of these images is to allow planning of examinations performed in clinics and hospitals and to reduce unnecessary exposure of patients due to errors in radiographic technique. (author)

  17. Exploratory analysis regarding the domain definitions for computer based analytical models

    Science.gov (United States)

    Raicu, A.; Oanta, E.; Barhalescu, M.

    2017-08-01

    Our previous computer-based studies dedicated to structural problems using analytical methods defined the composite cross section of a beam as a result of Boolean operations with so-called ‘simple’ shapes. Through generalisation, the class of ‘simple’ shapes came to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. In order to move beyond the current limitations, we conceived a general definition of the cross sections, which are now considered calculus domains consisting of several subdomains. The corresponding set of input data uses complex parameterizations. This new vision allows us to naturally assign an arbitrary number of attributes to the subdomains. In this way new phenomena that use map-wise information, such as metal alloy equilibrium diagrams, may be modelled. The hierarchy and structure of the input data text files, which use the comma-separated-value format, are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data processing software instruments already developed. The corresponding software to be subsequently developed will be modularised and generalised in order to be used in upcoming projects that require rapid development of computer-based models.
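
    The idea of defining a calculus domain by Boolean operations on 'simple' shapes can be illustrated with a generic geometry library. In the sketch below, shapely stands in for the authors' own domain-definition software, and the dimensions are arbitrary; it only shows how area and centroid of a composite cross section follow directly from the Boolean composition.

```python
# Composite cross section as a Boolean composition of simple shapes:
# a rectangle minus a circular hole. Dimensions are arbitrary placeholders.
from shapely.geometry import box, Point

outer = box(0.0, 0.0, 0.2, 0.3)            # 0.2 m x 0.3 m rectangle
hole = Point(0.10, 0.20).buffer(0.05)      # circular hole, radius 0.05 m
section = outer.difference(hole)           # Boolean subtraction

print(f"area     = {section.area:.5f} m^2")
print(f"centroid = ({section.centroid.x:.4f}, {section.centroid.y:.4f}) m")
```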

  18. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. This paper reviews model characteristics that have a direct bearing on the input-modeling process and gives reasons for using probability-based modeling of the inputs. The author also presents ways to model distributions for individual inputs, and multivariate input structures, when dependence and other constraints may be present.
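
    A minimal sketch of one common way to generate dependent inputs for a risk-assessment model: sample a Gaussian copula and push the correlated uniforms through each input's marginal distribution. The marginal distributions, correlation value and variable names are illustrative assumptions, not taken from the paper.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Assumed dependence between two model inputs.
        corr = np.array([[1.0, 0.7],
                         [0.7, 1.0]])

        # Gaussian copula: correlated standard normals mapped to uniforms.
        z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=10_000)
        u = stats.norm.cdf(z)

        # Assumed marginal distributions for the individual inputs.
        flow_rate = stats.lognorm(s=0.5, scale=10.0).ppf(u[:, 0])
        leak_area = stats.uniform(loc=1e-4, scale=9e-4).ppf(u[:, 1])

        rho = stats.spearmanr(flow_rate, leak_area).correlation
        print(f"sample Spearman correlation: {rho:.3f}")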

  19. Do ergonomics improvements increase computer workers' productivity?: an intervention study in a call centre.

    Science.gov (United States)

    Smith, Michael J; Bayehi, Antoinette Derjani

    2003-01-15

    This paper examines whether improving physical ergonomics working conditions affects worker productivity in a call centre with computer-intensive work. A field study was conducted at a catalogue retail service organization to explore the impact of ergonomics improvements on worker production. There were three levels of ergonomics interventions, each adding incrementally to the previous one. The first level was ergonomics training for all computer users accompanied by workstation ergonomics analysis leading to specific customized adjustments to better fit each worker (Group C). The second level added specific workstation accessories to improve the worker fit if the ergonomics analysis indicated a need for them (Group B). The third level met Group B requirements plus an improved chair (Group A). Productivity data were gathered from 72 volunteer participants who received ergonomics improvements to their workstations and 370 control subjects working in the same departments. Daily company records of production outputs for each worker were taken before ergonomics intervention (baseline) and 12 months after ergonomics intervention. Productivity improvement from baseline to 12 months post-intervention was examined across all ergonomics conditions combined, and also compared to the control group. The findings showed that worker performance increased for 50% of the ergonomics improvement participants and decreased for 50%. Overall, there was a 4.87% output increase for the ergonomics improvement group as compared to a 3.46% output decrease for the control group. The level of productivity increase varied with the type of ergonomics improvement, with Group C showing the best improvement (9.43%). Even though the average production improved, caution must be used in interpreting the findings since the ergonomics interventions were not successful for one-half of the participants.

  20. Acoustic model optimisation for a call routing system

    CSIR Research Space (South Africa)

    Kleynhans, N

    2012-11-01

    Full Text Available (excerpt): Results and a discussion are presented in Section V; the conclusion and possible future work appear in Section VI. Section II (Background) introduces the AutoSecretary IVR system, and Figure 1 shows the high-level call flow of the AutoSecretary call routing system. Table I gives the number of training utterances and duration for each corpus:
    Corpus Name | # utterances | Duration (hours)
    Lwazi English | 5843 | 5.03
    Lwazi English plus Lwazi language prompts | 7770 | 5.57
    NCHLT English | 106018 | 76.97
    AST English (5 dialects) | 51745 | 29.80

  1. A varying coefficient model to measure the effectiveness of mass media anti-smoking campaigns in generating calls to a Quitline.

    Science.gov (United States)

    Bui, Quang M; Huggins, Richard M; Hwang, Wen-Han; White, Victoria; Erbas, Bircan

    2010-01-01

    Anti-smoking advertisements are an effective population-based smoking reduction strategy. The Quitline telephone service provides a first point of contact for adults considering quitting. Because of data complexity, the relationship between anti-smoking advertising placement, intensity, and time trends in total call volume is poorly understood. In this study we use a recently developed semi-varying coefficient model to elucidate this relationship. Semi-varying coefficient models comprise parametric and nonparametric components. The model is fitted to the daily number of calls to Quitline in Victoria, Australia, to estimate a nonparametric long-term trend and parametric terms for day-of-the-week effects and to clarify the relationship with target audience rating points (TARPs) for the Quit and nicotine replacement advertising campaigns. The number of calls to Quitline increased with the TARP value of both the Quit and the other smoking-cessation advertisements; the TARP values associated with the Quit program were almost twice as effective. The varying coefficient term was statistically significant for peak periods with little or no advertising. Semi-varying coefficient models are useful for modeling public health data when there is little or no information on other factors related to the at-risk population. These models are well suited to modeling call volume to Quitline, because the varying coefficient allows the underlying time trend to depend on fixed covariates that also vary with time, thereby explaining more of the variation in the call model.
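
    A minimal sketch of a fit in the same spirit: a Poisson regression of daily call counts on a smooth long-term trend (natural cubic spline basis), day-of-week effects and TARP covariates, using statsmodels on synthetic data. The variable names, spline basis and simulated data are assumptions for illustration; the paper's semi-varying coefficient estimator differs.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 365
        df = pd.DataFrame({
            "day": np.arange(n),
            "dow": np.tile(np.arange(7), n // 7 + 1)[:n],    # day of week
            "tarp_quit": rng.gamma(2.0, 20.0, n),             # Quit campaign TARPs
            "tarp_nrt": rng.gamma(2.0, 10.0, n),              # nicotine-replacement TARPs
        })
        # Synthetic call counts with a slow trend, a weekly pattern and TARP effects.
        mu = np.exp(3 + 0.3 * np.sin(2 * np.pi * df.day / 365)
                    - 0.2 * (df.dow >= 5) + 0.004 * df.tarp_quit + 0.002 * df.tarp_nrt)
        df["calls"] = rng.poisson(mu)

        # Poisson GLM: spline trend + day-of-week factor + advertising covariates.
        fit = smf.glm("calls ~ cr(day, df=6) + C(dow) + tarp_quit + tarp_nrt",
                      data=df, family=sm.families.Poisson()).fit()
        print(fit.params.filter(like="tarp"))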

  2. Call center performance with direct response advertising

    NARCIS (Netherlands)

    M. Kiygi Calli (Meltem); M. Weverbergh (Marcel); Ph.H.B.F. Franses (Philip Hans)

    2017-01-01

    This study investigates the manpower planning and the performance of a national call center dealing with car repairs and on the road interventions. We model the impact of advertising on the capacity required. The starting point is a forecasting model for the incoming calls, where we take

  3. A methodology for the design of experiments in computational intelligence with multiple regression models.

    Science.gov (United States)

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  4. A methodology for the design of experiments in computational intelligence with multiple regression models

    Directory of Open Access Journals (Sweden)

    Carlos Fernandez-Lozano

    2016-12-01

    Full Text Available The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
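
    A minimal sketch of the kind of comparison such a methodology formalises: several regression models evaluated with repeated cross-validation on the same splits, followed by a paired statistical test on the fold scores. scikit-learn and scipy are used here as stand-ins; RRegrs itself is an R package and its actual pipeline differs.

        import numpy as np
        from scipy import stats
        from sklearn.datasets import load_diabetes
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.linear_model import Lasso
        from sklearn.model_selection import RepeatedKFold, cross_val_score

        X, y = load_diabetes(return_X_y=True)
        cv = RepeatedKFold(n_splits=5, n_repeats=10, random_state=0)

        models = {
            "lasso": Lasso(alpha=0.1),
            "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
        }
        scores = {name: cross_val_score(m, X, y, cv=cv, scoring="r2")
                  for name, m in models.items()}

        for name, s in scores.items():
            print(f"{name}: mean R2 = {s.mean():.3f} (sd {s.std():.3f})")

        # Paired test on the per-fold scores (same folds for both models).
        t, p = stats.ttest_rel(scores["lasso"], scores["random_forest"])
        print(f"paired t-test: t = {t:.2f}, p = {p:.4f}")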

  5. Computational modelling in fluid mechanics

    International Nuclear Information System (INIS)

    Hauguel, A.

    1985-01-01

    Modelling most environmental or industrial flow problems leads to very similar types of equations. The considerable increase in computing capacity over the last ten years has consequently allowed numerical models of growing complexity to be processed. The varied group of computer codes presented here now complements experimental facilities in fluid-mechanics studies. Several codes applied in the nuclear field (reactors, cooling towers, exchangers, plumes...) are presented among others [fr]

  6. Computer Language For Optimization Of Design

    Science.gov (United States)

    Scotti, Stephen J.; Lucas, Stephen H.

    1991-01-01

    SOL is a computer language geared to the solution of design problems. It includes the mathematical-modeling and logical capabilities of a language like FORTRAN, and adds the power of nonlinear mathematical-programming methods at the language level. The SOL compiler takes SOL-language statements and generates equivalent FORTRAN code and system calls. It provides syntactic and semantic checking for recovery from errors and produces detailed reports containing cross-references to show where each variable is used. Implemented on VAX/VMS computer systems. Requires the VAX FORTRAN compiler to produce an executable program.

  7. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets, and the increasing time needed to analyze them, are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared the results with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  8. Language teacher education in CALL: history and perspectives

    Directory of Open Access Journals (Sweden)

    Ana Cristina Biondo Salomão

    2013-01-01

    Full Text Available In recent years, new technologies have changed the way we relate to information and communicate with other people, which has had an impact on foreign language teaching and learning and, consequently, on the area of foreign language teacher education. The abbreviation CALL (Computer Assisted Language Learning) has been used to designate the processes of language teaching and learning with the use of computers, and language teacher education in CALL to name teacher education for and with the use of new technologies, since a number of authors point to the interdependence of both processes. In this article we present an overview of the current literature on language teacher education in CALL and discuss issues related to the use of new technologies, concerning their integration into teacher education and the functional and institutional roles to be taken. We also present two proposals for teacher education with the use of new technologies which are being implemented, and at the same time studied, in Brazil, and which we believe contain essential elements for the development of language teachers for and with the use of new technologies.

  9. Effects of airgun sounds on bowhead whale calling rates: evidence for two behavioral thresholds.

    Directory of Open Access Journals (Sweden)

    Susanna B Blackwell

    Full Text Available In proximity to seismic operations, bowhead whales (Balaena mysticetus) decrease their calling rates. Here, we investigate the transition from normal calling behavior to decreased calling and identify two threshold levels of received sound from airgun pulses at which calling behavior changes. Data were collected in August-October 2007-2010, during the westward autumn migration in the Alaskan Beaufort Sea. Up to 40 directional acoustic recorders (DASARs) were deployed at five sites offshore of the Alaskan North Slope. Using triangulation, whale calls localized within 2 km of each DASAR were identified and tallied every 10 minutes each season, so that the detected call rate could be interpreted as the actual call production rate. Moreover, airgun pulses were identified on each DASAR, analyzed, and a cumulative sound exposure level was computed for each 10-min period each season (CSEL10-min). A Poisson regression model was used to examine the relationship between the received CSEL10-min from airguns and the number of detected bowhead calls. Calling rates increased as soon as airgun pulses were detectable, compared to calling rates in the absence of airgun pulses. After the initial increase, calling rates leveled off at a received CSEL10-min of ~94 dB re 1 μPa²·s (the lower threshold). In contrast, once CSEL10-min exceeded ~127 dB re 1 μPa²·s (the upper threshold), whale calling rates began decreasing, and when CSEL10-min values were above ~160 dB re 1 μPa²·s, the whales were virtually silent.
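
    A minimal sketch of the two pieces of the analysis described above: computing a cumulative sound exposure level over a 10-minute window and fitting a Poisson regression of call counts on that exposure. The pulse data, window handling and single linear exposure term are simplifying assumptions; the paper's two-threshold analysis is more elaborate.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)

        # Hypothetical per-pulse sound exposure levels (dB re 1 uPa^2 s) in one 10-min window.
        pulse_sel_db = rng.normal(110, 5, size=60)

        # Cumulative SEL: sum the pulse energies, then convert back to dB.
        csel_10min = 10 * np.log10(np.sum(10 ** (pulse_sel_db / 10)))
        print(f"CSEL10-min = {csel_10min:.1f} dB re 1 uPa^2 s")

        # Poisson regression of detected call counts on CSEL across many windows.
        csel = rng.uniform(80, 160, size=500)
        calls = rng.poisson(np.exp(2.0 - 0.01 * (csel - 80)))   # synthetic counts
        X = sm.add_constant(csel)
        fit = sm.GLM(calls, X, family=sm.families.Poisson()).fit()
        print(fit.params)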

  10. An empirical analysis of the corporate call decision

    International Nuclear Information System (INIS)

    Carlson, M.D.

    1998-01-01

    An economic study of the behaviour of financial managers of utility companies was presented. The study examined whether or not an option-pricing-based model of the call decision does a better job of explaining callable preferred share prices and call decisions than other models. In this study, the Rust (1987) empirical technique was extended to use information from preferred share prices in addition to the call decisions. Reasonable estimates of the transaction costs associated with a call were obtained from data on shares of the Pacific Gas and Electric Company (PGE). It was concluded that the managers of PGE clearly take into account the value of the option to delay the call when making their call decisions.

  11. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    GENERAL ARTICLE. Computer Based ... universities, and later did system analysis, ... personal computers (PCs) and low-cost software packages and tools. They can serve as a useful learning experience through student projects. Models are ... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  12. Investigating CALL in the Classroom: Situational Variables to Consider

    Directory of Open Access Journals (Sweden)

    Darlene Liutkus

    2012-01-01

    Full Text Available A new paradigm in second language pedagogy has Computer Assisted Language Learning (CALL) playing a significant role. Much of the literature to date claims that CALL can have a positive impact on students' second language acquisition (SLA). A mixed-methods research design produces data to investigate whether CALL positively affects student language proficiency, motivation and autonomy. Classroom observation of participants in their natural environment is a qualitative technique, but it involves situational variables that could skew results if the observation is not structured. A questionnaire is a quantitative tool that can offer insight into participants' perception of their performance but can contradict what the researcher has observed. This paper takes an in-depth look at variables such as the instructor's pedagogical application, blending CALL into the curriculum, the types of CALL implemented, the feedback received, and their implications for the design of the data collection tools.

  13. Control-flow analysis of function calls and returns by abstract interpretation

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Jensen, Thomas P.

    2009-01-01

    We derive a control-flow analysis that approximates the interprocedural control-flow of both function calls and returns in the presence of first-class functions and tail-call optimization. In addition to an abstract environment, our analysis computes for each expression an abstract control stack, effectively approximating where function calls return across optimized tail calls. The analysis is systematically calculated by abstract interpretation of the stack-based CaEK abstract machine of Flanagan et al. using a series of Galois connections. Abstract interpretation provides a unifying setting in which...

  14. FamSeq: a variant calling program for family-based sequencing data using graphics processing units.

    Directory of Open Access Journals (Sweden)

    Gang Peng

    2014-10-01

    Full Text Available Various algorithms have been developed for variant calling using next-generation sequencing data, and various methods have been applied to reduce the associated false positive and false negative rates. Few variant calling programs, however, utilize the pedigree information when family-based sequencing data are available. Here, we present a program, FamSeq, which reduces both false positive and false negative rates by incorporating the pedigree information from the Mendelian genetic model into variant calling. To accommodate variations in data complexity, FamSeq consists of four distinct implementations of the Mendelian genetic model: the Bayesian network algorithm, a graphics processing unit version of the Bayesian network algorithm, the Elston-Stewart algorithm and the Markov chain Monte Carlo algorithm. To make the software efficient and applicable to large families, we parallelized, on an NVIDIA graphics processing unit, the Bayesian network algorithm that copes with pedigrees containing inbreeding loops, without losing calculation precision. To compare the differences among the four methods, we applied FamSeq to pedigree sequencing data with family sizes that varied from 7 to 12. When there is no inbreeding loop in the pedigree, the Elston-Stewart algorithm gives analytical results in a short time. If there are inbreeding loops in the pedigree, we recommend the Bayesian network method, which provides exact answers. To improve the computing speed of the Bayesian network method, we parallelized the computation on a graphics processing unit. This allowed the Bayesian network method to process the whole genome sequencing data of a family of 12 individuals within two days, which was a 10-fold time reduction compared to the time required for this computation on a central processing unit.
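
    A minimal sketch of the core idea of incorporating pedigree information into variant calling: for a parent-offspring trio at a biallelic site, weight each joint genotype configuration by Mendelian transmission probabilities and per-sample genotype likelihoods, then marginalise to get a posterior for the child. The genotype likelihoods, allele frequency and trio-only setting are illustrative; FamSeq's algorithms handle general pedigrees and are considerably more involved.

        import itertools
        import numpy as np

        # Genotypes coded as number of alternate alleles: 0 (ref/ref), 1 (het), 2 (alt/alt).
        GENOS = (0, 1, 2)
        ALT_FREQ = 0.01                                  # assumed population allele frequency

        def prior(g):
            """Hardy-Weinberg prior for a founder genotype."""
            p = ALT_FREQ
            return {0: (1 - p) ** 2, 1: 2 * p * (1 - p), 2: p ** 2}[g]

        def transmission(child, mother, father):
            """P(child genotype | parents) under Mendelian inheritance."""
            alt = {0: 0.0, 1: 0.5, 2: 1.0}               # P(transmit alt allele | genotype)
            pm, pf = alt[mother], alt[father]
            probs = {0: (1 - pm) * (1 - pf),
                     1: pm * (1 - pf) + (1 - pm) * pf,
                     2: pm * pf}
            return probs[child]

        # Hypothetical genotype likelihoods P(reads | genotype) for each trio member.
        lik = {
            "mother": {0: 0.70, 1: 0.29, 2: 0.01},
            "father": {0: 0.80, 1: 0.19, 2: 0.01},
            "child":  {0: 0.05, 1: 0.55, 2: 0.40},
        }

        # Joint posterior over trio genotypes, marginalised to the child.
        child_post = {g: 0.0 for g in GENOS}
        for gm, gf, gc in itertools.product(GENOS, repeat=3):
            joint = (prior(gm) * prior(gf) * transmission(gc, gm, gf)
                     * lik["mother"][gm] * lik["father"][gf] * lik["child"][gc])
            child_post[gc] += joint

        total = sum(child_post.values())
        for g in GENOS:
            print(f"P(child genotype = {g} | trio data) = {child_post[g] / total:.3f}")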

  15. Computational multiscale modeling of intergranular cracking

    International Nuclear Information System (INIS)

    Simonovski, Igor; Cizelj, Leon

    2011-01-01

    A novel computational approach for simulation of intergranular cracks in a polycrystalline aggregate is proposed in this paper. The computational model includes a topological model of the experimentally determined microstructure of a 400 μm diameter stainless steel wire and automatic finite element discretization of the grains and grain boundaries. The microstructure was spatially characterized by X-ray diffraction contrast tomography and contains 362 grains and some 1600 grain boundaries. Available constitutive models currently include isotropic elasticity for the grain interior and cohesive behavior with damage for the grain boundaries. The experimentally determined lattice orientations are employed to distinguish between resistant low energy and susceptible high energy grain boundaries in the model. The feasibility and performance of the proposed computational approach are demonstrated by simulating the onset and propagation of intergranular cracking. The preliminary numerical results are outlined and discussed.

  16. Quantum Vertex Model for Reversible Classical Computing

    Science.gov (United States)

    Chamon, Claudio; Mucciolo, Eduardo; Ruckenstein, Andrei; Yang, Zhicheng

    We present a planar vertex model that encodes the result of a universal reversible classical computation in its ground state. The approach involves Boolean variables (spins) placed on links of a two-dimensional lattice, with vertices representing logic gates. Large short-ranged interactions between at most two spins implement the operation of each gate. The lattice is anisotropic with one direction corresponding to computational time, and with transverse boundaries storing the computation's input and output. The model displays no finite temperature phase transitions, including no glass transitions, independent of the circuit. The computational complexity is encoded in the scaling of the relaxation rate into the ground state with the system size. We use thermal annealing and a novel and more efficient heuristic, 'annealing with learning', to study various computational problems. To explore faster relaxation routes, we construct an explicit mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating a novel approach to reversible classical computation based on quantum annealing.

  17. Radar data processing using a distributed computational system

    Science.gov (United States)

    Mota, Gilberto F.

    1992-06-01

    This research specifies and validates a new concurrent decomposition scheme, called Confined Space Search Decomposition (CSSD), to exploit parallelism of Radar Data Processing algorithms using a Distributed Computational System. To formalize the specification, we propose and apply an object-oriented methodology called Decomposition Cost Evaluation Model (DCEM). To reduce the penalties of load imbalance, we propose a distributed dynamic load balance heuristic called Object Reincarnation (OR). To validate the research, we first compare our decomposition with an identified alternative using the proposed DCEM model and then develop a theoretical prediction of selected parameters. We also develop a simulation to check the Object Reincarnation Concept.

  18. Factors Affecting the Normalization of CALL in Chinese Senior High Schools

    Science.gov (United States)

    He, Bi; Puakpong, Nattaya; Lian, Andrew

    2015-01-01

    With the development of Information Technology, increasing attention has been paid to Computer Assisted Language Learning (CALL). Meanwhile, increasing enthusiasm is seen for English learning and teaching in China. Yet, few research studies have focused on the normalization of CALL in ethnically diverse areas. In response to this research gap,…

  19. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the addition of new computers to the network and the removal of old computers from it are considered. The computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through analysis of the model, the disease-free and endemic equilibrium points are calculated, and the stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
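
    A minimal sketch of an epidemic-style computer-virus model with recruitment of new computers and removal of old ones, integrated with SciPy; the compartments, parameter values and the basic-reproduction-number expression are illustrative assumptions, not the paper's exact model.

        import numpy as np
        from scipy.integrate import solve_ivp

        # SIR-style virus model: b = recruitment of new computers, mu = removal rate,
        # beta = infection rate, gamma = cure rate (e.g. by antivirus software).
        b, mu, beta, gamma = 5.0, 0.01, 0.0004, 0.1

        def rhs(t, y):
            S, I, R = y
            dS = b - beta * S * I - mu * S
            dI = beta * S * I - gamma * I - mu * I
            dR = gamma * I - mu * R
            return [dS, dI, dR]

        sol = solve_ivp(rhs, (0, 400), [499, 1, 0])
        print(f"infected computers at t=400: {sol.y[1, -1]:.1f}")

        # Basic reproduction number for this illustrative model: R0 = beta*S0/(gamma+mu),
        # with S0 = b/mu the susceptible population at the disease-free equilibrium.
        R0 = beta * (b / mu) / (gamma + mu)
        print(f"R0 = {R0:.2f}  (endemic if R0 > 1)")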

  20. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  1. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.

  2. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar; Henkel, Ron; Hoehndorf, Robert; Kacprowski, Tim; Knuepfer, Christian; Liebermeister, Wolfram

    2016-01-01

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.

  3. MINIMARS interim report appendix halo model and computer code

    International Nuclear Information System (INIS)

    Santarius, J.F.; Barr, W.L.; Deng, B.Q.; Emmert, G.A.

    1985-01-01

    A tenuous, cool plasma called the halo shields the core plasma in a tandem mirror from neutral gas and impurities. The neutral particles are ionized and then pumped by the halo to the end tanks of the device, since flow of plasma along field lines is much faster than radial flow. Plasma reaching the end tank walls recombines, and the resulting neutral gas is vacuum pumped. The basic geometry of the MINIMARS halo is shown. For halo modeling purposes, the core plasma and cold gas regions may be treated as single radial zones leading to halo source and sink terms. The halo itself is divided into two major radial zones: the halo scraper and the halo dump. The halo scraper zone is defined by the radial distance required for the ion end plugging potential to drop to the central cell value, and thus have no effect on axial confinement; this distance is typically a sloshing plug ion Larmor diameter. The outer edge of the halo dump zone is defined by the last central cell flux tube to pass through the choke coil. This appendix summarizes the halo model that has been developed for MINIMARS and the methodology used in implementing that model as a computer code.

  4. A cortical edge-integration model of object-based lightness computation that explains effects of spatial context and individual differences

    Science.gov (United States)

    Rudd, Michael E.

    2014-01-01

    Previous work has demonstrated that perceived surface reflectance (lightness) can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatially integrates these steps along paths through the image to compute lightness (Rudd and Zemach, 2004, 2005, 2007). This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013) suggests that human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010) further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer's interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd and Zemach, 2005). Here, I show how the separate influences of grouping and attention on lightness can be modeled in tandem by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013), and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4. PMID:25202253

  5. A Cortical Edge-integration Model of Object-Based Lightness Computation that Explains Effects of Spatial Context and Individual Differences

    Directory of Open Access Journals (Sweden)

    Michael E Rudd

    2014-08-01

    Full Text Available Previous work demonstrated that perceived surface reflectance (lightness) can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatially integrates these steps along paths through the image to compute lightness (Rudd & Zemach, 2004, 2005, 2007). This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013) suggests that human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010) further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer's interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd & Zemach, 2005). Here, I show how the separate influences of grouping and attention on lightness can be modeled together by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013), and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4.

  6. A cortical edge-integration model of object-based lightness computation that explains effects of spatial context and individual differences.

    Science.gov (United States)

    Rudd, Michael E

    2014-01-01

    Previous work has demonstrated that perceived surface reflectance (lightness) can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatially integrates these steps along paths through the image to compute lightness (Rudd and Zemach, 2004, 2005, 2007). This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013) suggests that human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010) further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer's interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd and Zemach, 2005). Here, I show how the separate influences of grouping and attention on lightness can be modeled in tandem by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013), and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4.
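
    A minimal sketch of the edge-integration idea itself: extract local steps in log luminance along a path from the background to a target and sum them, optionally with different weights for incremental and decremental edges, to obtain a lightness estimate. The luminance values, path and weighting scheme are illustrative assumptions, not Rudd's fitted model.

        import numpy as np

        # Luminance values (arbitrary units) along a path from a common background
        # region to the target patch, e.g. background -> surround -> target.
        path_luminance = np.array([100.0, 40.0, 60.0])

        # Local, directed steps in log luminance along the path.
        log_steps = np.diff(np.log(path_luminance))

        # Hypothetical weights: decremental (darkening) edges weighted more heavily
        # than incremental ones, as edge-integration fits often find.
        weights = np.where(log_steps < 0, 1.0, 0.7)

        # Edge integration: weighted sum of log-luminance steps gives log lightness
        # of the target relative to the background.
        log_lightness = np.sum(weights * log_steps)
        print(f"estimated log relative lightness: {log_lightness:.3f}")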

  7. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data

    Science.gov (United States)

    Ma, Jianhua

    2018-01-01

    A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated from the use of prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work is aimed at establishing a unique and comprehensive description of an individual to mesh with various personalized services and applications. An extensive research literature exists on, or related to, psychological modelling, in particular automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling in Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that can provide plenty of personal data, e.g., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject's persona. Therefore, this research is focused on designing an archetype-based model of persona covering an individual's facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model. PMID:29495343

  8. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data

    Directory of Open Access Journals (Sweden)

    Ao Guo

    2018-02-01

    Full Text Available A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated from the use of prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work is aimed at establishing a unique and comprehensive description of an individual to mesh with various personalized services and applications. An extensive research literature exists on, or related to, psychological modelling, in particular automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling in Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that can provide plenty of personal data, e.g., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject's persona. Therefore, this research is focused on designing an archetype-based model of persona covering an individual's facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model.

  9. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data.

    Science.gov (United States)

    Guo, Ao; Ma, Jianhua

    2018-02-25

    A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated from the use of prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work is aimed at establishing a unique and comprehensive description of an individual to mesh with various personalized services and applications. An extensive research literature exists on, or related to, psychological modelling, in particular automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling in Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that can provide plenty of personal data, e.g., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject's persona. Therefore, this research is focused on designing an archetype-based model of persona covering an individual's facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model.

  10. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  11. Computational models of airway branching morphogenesis.

    Science.gov (United States)

    Varner, Victor D; Nelson, Celeste M

    2017-07-01

    The bronchial network of the mammalian lung consists of millions of dichotomous branches arranged in a highly complex, space-filling tree. Recent computational models of branching morphogenesis in the lung have helped uncover the biological mechanisms that construct this ramified architecture. In this review, we focus on three different theoretical approaches - geometric modeling, reaction-diffusion modeling, and continuum mechanical modeling - and discuss how, taken together, these models have identified the geometric principles necessary to build an efficient bronchial network, as well as the patterning mechanisms that specify airway geometry in the developing embryo. We emphasize models that are integrated with biological experiments and suggest how recent progress in computational modeling has advanced our understanding of airway branching morphogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with applications in several fields of engineering, such as automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, in a multidisciplinary approach. Authors from five countries and 16 different research centers contribute their expertise in both the fundamentals and applications to real problems, based upon their strong background in modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools such as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  13. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and examples relevant to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
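
    A minimal sketch of one of the four approaches named above, discrete-event simulation, applied to an emergency department queue using the SimPy library; the arrival rate, treatment time and bed count are illustrative assumptions.

        import random
        import simpy

        random.seed(0)

        ARRIVAL_MEAN = 10.0     # minutes between patient arrivals (assumed)
        TREATMENT_MEAN = 25.0   # minutes of treatment per patient (assumed)
        N_BEDS = 3

        waits = []

        def patient(env, beds):
            arrived = env.now
            with beds.request() as req:
                yield req                                  # wait for a free bed
                waits.append(env.now - arrived)
                yield env.timeout(random.expovariate(1.0 / TREATMENT_MEAN))

        def arrivals(env, beds):
            while True:
                yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
                env.process(patient(env, beds))

        env = simpy.Environment()
        beds = simpy.Resource(env, capacity=N_BEDS)
        env.process(arrivals(env, beds))
        env.run(until=8 * 60)                              # simulate one 8-hour shift

        print(f"patients seen: {len(waits)}, "
              f"mean wait for a bed: {sum(waits) / len(waits):.1f} min")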

  14. Exploration of Textual Interactions in CALL Learning Communities: Emerging Research and Opportunities

    Science.gov (United States)

    White, Jonathan R.

    2017-01-01

    Computer-assisted language learning (CALL) has greatly enhanced the realm of online social interaction and behavior. In language classrooms, it allows the opportunity for students to enhance their learning experiences. "Exploration of Textual Interactions in CALL Learning Communities: Emerging Research and Opportunities" is an ideal…

  15. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas of pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  16. Do's and Don'ts of Computer Models for Planning

    Science.gov (United States)

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  17. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  18. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  19. Bayesian Action–Perception Computational Model: Interaction of Production and Recognition of Cursive Letters

    Science.gov (United States)

    Gilet, Estelle; Diard, Julien; Bessière, Pierre

    2011-01-01

    In this paper, we study the collaboration of perception and action representations involved in cursive letter recognition and production. We propose a mathematical formulation for the whole perception–action loop, based on probabilistic modeling and Bayesian inference, which we call the Bayesian Action–Perception (BAP) model. As a model of both perception and action processes, it is designed to study the interaction of these processes. More precisely, the model includes a feedback loop from motor production, which implements an internal simulation of movement. Motor knowledge can therefore be involved during perception tasks. In this paper, we formally define the BAP model and show how it solves the following six varied cognitive tasks using Bayesian inference: i) letter recognition (purely sensory), ii) writer recognition, iii) letter production (with different effectors), iv) copying of trajectories, v) copying of letters, and vi) letter recognition (with internal simulation of movements). We present computer simulations of each of these cognitive tasks, and discuss experimental predictions and theoretical developments. PMID:21674043

  20. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-02-01

    The role of Nuclear Engineering Education in the application of computers to controlled fusion research can be a very important one. In the near future the use of computers in the numerical modelling of fusion systems should increase substantially. A recent study group has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. In order to meet the needs of the fusion program, the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR laboratories by a communications network. The crucial element that is needed for success is trained personnel. The number of people with knowledge of plasma science and engineering who are trained in numerical methods and computer science is quite small, and must be increased substantially in the next few years. Nuclear Engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing. (U.S.)

  1. Towards The Deep Model : Understanding Visual Recognition Through Computational Models

    OpenAIRE

    Wang, Panqu

    2017-01-01

    Understanding how visual recognition is achieved in the human brain is one of the most fundamental questions in vision research. In this thesis I seek to tackle this problem from a neurocomputational modeling perspective. More specifically, I build machine learning-based models to simulate and explain cognitive phenomena related to human visual recognition, and I improve computational models using brain-inspired principles to excel at computer vision tasks. I first describe how a neurocomputat...

  2. EFL Teachers' Knowledge of the Use and Development of Computer-Assisted Language Learning (CALL) Materials

    Science.gov (United States)

    Dashtestani, Reza

    2014-01-01

    Even though there are a plethora of CALL materials available to EFL teachers nowadays, very limited attention has been directed toward the issue that most EFL teachers are merely the consumers of CALL materials. The main challenge is to equip EFL teachers with the required CALL materials development skills to enable them to be contributors to CALL…

  3. Determination of the Value of a European Call Option with Dividend Payments (Penentuan Nilai Opsi Call Eropa Dengan Pembayaran Dividen)

    Directory of Open Access Journals (Sweden)

    Diana Purwandari

    2016-08-01

Full Text Available Fluctuations in stock prices make stock trading risky. Options are an alternative instrument for reducing the risk in stock trading. A European option is a financial contract that gives the holder the right, but not the obligation, to buy from or sell to the writer the underlying asset at the maturity date for a predetermined price. The option pricing model most widely accepted in finance is the Black-Scholes model. The purpose of this study is to determine the effect of dividend distribution on the stock price and to determine the value of a European call option with dividend payments at a predetermined time. The value of the European call option with dividend payments at a predetermined time, obtained by numerical integration with Simpson's method, is 12.6388. Key words: European call option, Black-Scholes model, dividend, Simpson's method.
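
    The reported value of 12.6388 cannot be reproduced here because the abstract does not state the underlying parameters. The sketch below only illustrates, with made-up inputs, how a European call on a dividend-paying stock can be priced by applying the composite Simpson rule to the discounted risk-neutral payoff, using the common escrowed-dividend adjustment.

        # Illustrative sketch (hypothetical parameters, not those of the paper):
        # price a European call by integrating the discounted risk-neutral payoff
        # with the composite Simpson rule. A single known cash dividend D at time
        # t_d is handled by the escrowed-dividend adjustment S0' = S0 - D*exp(-r*t_d).
        import numpy as np

        def bs_call_simpson(S0, K, r, sigma, T, D=0.0, t_d=0.0, n=2000, width=8.0):
            S0_adj = S0 - D * np.exp(-r * t_d)            # escrowed-dividend adjustment
            mu = np.log(S0_adj) + (r - 0.5 * sigma ** 2) * T
            s = sigma * np.sqrt(T)
            # Integrate over x = ln(S_T) on [mu - width*s, mu + width*s].
            x = np.linspace(mu - width * s, mu + width * s, 2 * n + 1)
            payoff = np.maximum(np.exp(x) - K, 0.0)
            density = np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))
            f = payoff * density
            h = x[1] - x[0]
            simpson = h / 3 * (f[0] + f[-1] + 4 * f[1:-1:2].sum() + 2 * f[2:-1:2].sum())
            return np.exp(-r * T) * simpson

        print(bs_call_simpson(S0=100, K=95, r=0.05, sigma=0.25, T=1.0, D=2.0, t_d=0.5))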

  4. SoftWAXS: a computational tool for modeling wide-angle X-ray solution scattering from biomolecules.

    Science.gov (United States)

    Bardhan, Jaydeep; Park, Sanghyun; Makowski, Lee

    2009-10-01

    This paper describes a computational approach to estimating wide-angle X-ray solution scattering (WAXS) from proteins, which has been implemented in a computer program called SoftWAXS. The accuracy and efficiency of SoftWAXS are analyzed for analytically solvable model problems as well as for proteins. Key features of the approach include a numerical procedure for performing the required spherical averaging and explicit representation of the solute-solvent boundary and the surface of the hydration layer. These features allow the Fourier transform of the excluded volume and hydration layer to be computed directly and with high accuracy. This approach will allow future investigation of different treatments of the electron density in the hydration shell. Numerical results illustrate the differences between this approach to modeling the excluded volume and a widely used model that treats the excluded-volume function as a sum of Gaussians representing the individual atomic excluded volumes. Comparison of the results obtained here with those from explicit-solvent molecular dynamics clarifies shortcomings inherent to the representation of solvent as a time-averaged electron-density profile. In addition, an assessment is made of how the calculated scattering patterns depend on input parameters such as the solute-atom radii, the width of the hydration shell and the hydration-layer contrast. These results suggest that obtaining predictive calculations of high-resolution WAXS patterns may require sophisticated treatments of solvent.

  5. LinkImputeR: user-guided genotype calling and imputation for non-model organisms.

    Science.gov (United States)

    Money, Daniel; Migicovsky, Zoë; Gardner, Kyle; Myles, Sean

    2017-07-10

    Genomic studies such as genome-wide association and genomic selection require genome-wide genotype data. All existing technologies used to create these data result in missing genotypes, which are often then inferred using genotype imputation software. However, existing imputation methods most often make use only of genotypes that are successfully inferred after having passed a certain read depth threshold. Because of this, any read information for genotypes that did not pass the threshold, and were thus set to missing, is ignored. Most genomic studies also choose read depth thresholds and quality filters without investigating their effects on the size and quality of the resulting genotype data. Moreover, almost all genotype imputation methods require ordered markers and are therefore of limited utility in non-model organisms. Here we introduce LinkImputeR, a software program that exploits the read count information that is normally ignored, and makes use of all available DNA sequence information for the purposes of genotype calling and imputation. It is specifically designed for non-model organisms since it requires neither ordered markers nor a reference panel of genotypes. Using next-generation DNA sequence (NGS) data from apple, cannabis and grape, we quantify the effect of varying read count and missingness thresholds on the quantity and quality of genotypes generated from LinkImputeR. We demonstrate that LinkImputeR can increase the number of genotype calls by more than an order of magnitude, can improve genotyping accuracy by several percent and can thus improve the power of downstream analyses. Moreover, we show that the effects of quality and read depth filters can differ substantially between data sets and should therefore be investigated on a per-study basis. By exploiting DNA sequence data that is normally ignored during genotype calling and imputation, LinkImputeR can significantly improve both the quantity and quality of genotype data generated from
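
    The sketch below is not LinkImputeR's algorithm; it only illustrates the general idea of keeping low-depth sites by converting reference/alternate read counts into genotype likelihoods with a simple binomial error model. The error rate and read counts are made up.

        # Illustrative only (not LinkImputeR): genotype likelihoods computed directly
        # from reference/alternate read counts with a binomial error model, instead
        # of setting low-depth sites to missing.
        from math import comb

        def genotype_likelihoods(ref_reads, alt_reads, error=0.01):
            """Return normalized likelihoods for (hom-ref, het, hom-alt)."""
            n = ref_reads + alt_reads
            likelihoods = []
            for p_alt in (error, 0.5, 1.0 - error):   # expected alt fraction per genotype
                likelihoods.append(comb(n, alt_reads) * p_alt ** alt_reads
                                   * (1 - p_alt) ** ref_reads)
            total = sum(likelihoods)
            return [l / total for l in likelihoods] if total > 0 else [1 / 3] * 3

        # Even a shallow site (2 reference reads, 1 alternate read) carries information:
        print(genotype_likelihoods(2, 1))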

  6. THE COMPUTATIONAL INTELLIGENCE TECHNIQUES FOR PREDICTIONS - ARTIFICIAL NEURAL NETWORKS

    OpenAIRE

    Mary Violeta Bar

    2014-01-01

    The computational intelligence techniques are used in problems which can not be solved by traditional techniques when there is insufficient data to develop a model problem or when they have errors.Computational intelligence, as he called Bezdek (Bezdek, 1992) aims at modeling of biological intelligence. Artificial Neural Networks( ANNs) have been applied to an increasing number of real world problems of considerable complexity. Their most important advantage is solving problems that are too c...

  7. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  8. TADtool: visual parameter identification for TAD-calling algorithms.

    Science.gov (United States)

    Kruse, Kai; Hug, Clemens B; Hernández-Rodríguez, Benjamín; Vaquerizas, Juan M

    2016-10-15

Eukaryotic genomes are hierarchically organized into topologically associating domains (TADs). The computational identification of these domains and their associated properties critically depends on the choice of suitable parameters of TAD-calling algorithms. To reduce the element of trial-and-error in parameter selection, we have developed TADtool: an interactive plot to find robust TAD-calling parameters with immediate visual feedback. TADtool allows the direct export of TADs called with a chosen set of parameters for two of the most common TAD calling algorithms: directionality and insulation index. It can be used as an intuitive, standalone application or as a Python package for maximum flexibility. TADtool is available as a Python package from GitHub (https://github.com/vaquerizaslab/tadtool) or can be installed directly via PyPI, the Python package index (tadtool). kai.kruse@mpi-muenster.mpg.de, jmv@mpi-muenster.mpg.de. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  9. Criteria for Evaluating a Game-Based CALL Platform

    Science.gov (United States)

    Ní Chiaráin, Neasa; Ní Chasaide, Ailbhe

    2017-01-01

    Game-based Computer-Assisted Language Learning (CALL) is an area that currently warrants attention, as task-based, interactive, multimodal games increasingly show promise for language learning. This area is inherently multidisciplinary--theories from second language acquisition, games, and psychology must be explored and relevant concepts from…

  10. Modeling multimodal human-computer interaction

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2004-01-01

    Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range off software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze

  11. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 6; Issue 3. Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article Volume 6 Issue 3 March 2001 pp 46-54. Fulltext. Click here to view fulltext PDF. Permanent link:

  12. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  13. Editorial: Modelling and computational challenges in granular materials

    OpenAIRE

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss the current progress and latest advancements in the field of advanced numerical methods and modelling of granular materials. The focus will be on computational methods, improved algorithms and the m...

  14. SmartShadow models and methods for pervasive computing

    CERN Document Server

    Wu, Zhaohui

    2013-01-01

SmartShadow: Models and Methods for Pervasive Computing offers a new perspective on pervasive computing with SmartShadow, which is designed to model a user as a personality "shadow" and to model pervasive computing environments as user-centric dynamic virtual personal spaces. Just like human beings' shadows in the physical world, it follows people wherever they go, providing them with pervasive services. The model, methods, and software infrastructure for SmartShadow are presented and an application for smart cars is also introduced. The book can serve as a valuable reference work for resea

  15. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  16. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

Full Text Available Abstract Background Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results The workshop, "ESF Exploratory Workshop on Computational disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  17. Models of the Learner in Computer-Assisted Instruction

    Science.gov (United States)

    1975-12-01

...and computer programming. Additional efforts are being made to extend this approach to less formal subject matter such as South American geography... Laubsch's call for a time-dependent forgetting process. Despite the inclusion of a forgetting process, presentation strategies based on the family of

  18. A computational model of selection by consequences.

    OpenAIRE

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...

  19. A Categorisation of Cloud Computing Business Models

    OpenAIRE

    Chang, Victor; Bacigalupo, David; Wills, Gary; De Roure, David

    2010-01-01

    This paper reviews current cloud computing business models and presents proposals on how organisations can achieve sustainability by adopting appropriate models. We classify cloud computing business models into eight types: (1) Service Provider and Service Orientation; (2) Support and Services Contracts; (3) In-House Private Clouds; (4) All-In-One Enterprise Cloud; (5) One-Stop Resources and Services; (6) Government funding; (7) Venture Capitals; and (8) Entertainment and Social Networking. U...

  20. Chaos Modelling with Computers

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Chaos Modelling with Computers Unpredicatable Behaviour of Deterministic Systems. Balakrishnan Ramasamy T S K V Iyer. General Article Volume 1 Issue 5 May 1996 pp 29-39 ...

  1. A computer simulation model to compute the radiation transfer of mountainous regions

    Science.gov (United States)

    Li, Yuguang; Zhao, Feng; Song, Rui

    2011-11-01

In mountainous regions, the radiometric signal recorded at the sensor depends on a number of factors such as sun angle, atmospheric conditions, surface cover type, and topography. In this paper, a computer simulation model of radiation transfer is designed and evaluated. This model implements Monte Carlo ray-tracing techniques and is specifically dedicated to the study of light propagation in mountainous regions. The radiative processes between sunlight and the objects within the mountainous region are realized by using forward Monte Carlo ray-tracing methods. The performance of the model is evaluated through detailed comparisons with the well-established 3D computer simulation model RGM (Radiosity-Graphics combined Model) based on the same scenes and identical spectral parameters, which shows good agreement between the two models' results. By using the newly developed computer model, a series of typical mountainous scenes is generated to analyze the physical mechanism of mountainous radiation transfer. The results show that the effects of the adjacent slopes are important for deep valleys and particularly affect shadowed pixels, and that the topographic effect needs to be considered in mountainous terrain before accurate inferences can be made from remotely sensed data.

  2. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: Development and application of gyrokinetic particle codes to tokamak transport, development of techniques to take advantage of parallel computers; model dynamo and bootstrap current drive; and in general maintain our broad-based program in basic plasma physics and computer modeling

  3. Sustainability in CALL Learning Environments: A Systemic Functional Grammar Approach

    Directory of Open Access Journals (Sweden)

    Peter McDonald

    2014-09-01

    Full Text Available This research aims to define a sustainable resource in Computer-Assisted Language Learning (CALL. In order for a CALL resource to be sustainable it must work within existing educational curricula. This feature is a necessary prerequisite of sustainability because, despite the potential for educational change that digitalization has offered since the nineteen nineties, curricula in traditional educational institutions have not fundamentally changed, even as we move from a pre-digital society towards a digital society. Curricula have failed to incorporate CALL resources because no agreed-upon pedagogical language enables teachers to discuss CALL classroom practices. Systemic Functional Grammar (SFG can help to provide this language and bridge the gap between the needs of the curriculum and the potentiality of CALL-based resources. This paper will outline how SFG principles can be used to create a pedagogical language for CALL and it will give practical examples of how this language can be used to create sustainable resources in classroom contexts.

  4. Trust models in ubiquitous computing.

    Science.gov (United States)

    Krukow, Karl; Nielsen, Mogens; Sassone, Vladimiro

    2008-10-28

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  5. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  6. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  7. Blackboard architecture and qualitative model in a computer aided assistant designed to define computers for HEP computing

    International Nuclear Information System (INIS)

    Nodarse, F.F.; Ivanov, V.G.

    1991-01-01

Using a BLACKBOARD architecture and a qualitative model, an expert system was developed to assist the user in defining the computers for High Energy Physics computing. The COMEX system requires an IBM AT personal computer or compatible with more than 640 Kb RAM and a hard disk. 5 refs.; 9 figs

  8. Call Admission Scheme for Multidimensional Traffic Assuming Finite Handoff User

    Directory of Open Access Journals (Sweden)

    Md. Baitul Al Sadi

    2017-01-01

Full Text Available Usually, the number of users within a cell in a mobile cellular network is considered infinite; hence, the M/M/n/k model is appropriate for newly originated traffic, but the number of ongoing calls around a cell is always finite. Hence, the traffic model for handoff calls will be M/M/n/k/N. In this paper, a K-dimensional traffic model of a mobile cellular network is proposed using a combination of the limited- and unlimited-user cases. A new call admission scheme (CAS) is proposed based on both a thinning scheme and the fading condition of the wireless channel. Access for a handoff call is prioritized over newly originated calls.
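
    For the newly originated traffic, the abstract adopts an M/M/n/k model; a minimal sketch of its blocking probability, with a hypothetical arrival rate, service rate, channel count and system capacity, is given below.

        # Minimal sketch (hypothetical parameters): steady-state probabilities and
        # blocking probability of an M/M/n/k queue with n channels, capacity k and
        # offered load a = lambda/mu.
        from math import factorial

        def mmnk_blocking(lam, mu, n, k):
            a = lam / mu
            weights = []
            for j in range(k + 1):
                if j <= n:
                    weights.append(a ** j / factorial(j))
                else:
                    weights.append(a ** j / (factorial(n) * n ** (j - n)))
            total = sum(weights)
            p = [w / total for w in weights]
            return p[k]            # probability that an arriving call is blocked

        print(mmnk_blocking(lam=30.0, mu=1.0, n=32, k=40))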

  9. Computational models in physics teaching: a framework

    Directory of Open Access Journals (Sweden)

    Marco Antonio Moreira

    2012-08-01

    Full Text Available The purpose of the present paper is to present a theoretical framework to promote and assist meaningful physics learning through computational models. Our proposal is based on the use of a tool, the AVM diagram, to design educational activities involving modeling and computer simulations. The idea is to provide a starting point for the construction and implementation of didactical approaches grounded in a coherent epistemological view about scientific modeling.

  10. A model for calculating the optimal replacement interval of computer systems

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1981-08-01

A mathematical model for calculating the optimal replacement interval of computer systems is described. The model estimates the most economical interval for computer replacement when the computing demand and the cost and performance of the computer are known. The computing demand is assumed to increase monotonically every year. Four kinds of models are described. In model 1, the computer system is represented by only a central processing unit (CPU) and all of the computing demand is to be processed on the present computer until the next replacement. In model 2, on the other hand, excess demand is admitted and may be transferred to another computing center and processed there at a cost. In model 3, the computer system is represented by a CPU, memories (MEM) and input/output devices (I/O), and it must process all of the demand. Model 4 is the same as model 3, but excess demand is admitted to be processed in another center. (1) The computing demand at JAERI, (2) the conformity of Grosch's law for recent computers, and (3) the replacement cost of computer systems are also described. (author)
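
    A much-simplified stand-in for model 1 is sketched below: demand grows monotonically each year, the machine bought at a replacement must cover the peak demand reached before the next replacement, and the interval with the lowest average annual cost is selected. All cost figures and the growth rate are hypothetical.

        # Simplified stand-in for "model 1" (all names and cost figures hypothetical):
        # search for the replacement interval with the lowest average annual cost.
        def average_annual_cost(interval_years, demand0, growth, base_cost, cost_per_unit):
            # The machine bought now must cover the peak demand reached before replacement.
            peak_demand = demand0 * (1 + growth) ** interval_years
            purchase = base_cost + cost_per_unit * peak_demand
            return purchase / interval_years

        def optimal_interval(max_years=15, **kwargs):
            costs = {t: average_annual_cost(t, **kwargs) for t in range(1, max_years + 1)}
            return min(costs, key=costs.get), costs

        best, costs = optimal_interval(demand0=100.0, growth=0.25,
                                       base_cost=500.0, cost_per_unit=10.0)
        print("optimal replacement interval (years):", best)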

  11. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  12. Computer models for kinetic equations of magnetically confined plasmas

    International Nuclear Information System (INIS)

    Killeen, J.; Kerbel, G.D.; McCoy, M.G.; Mirin, A.A.; Horowitz, E.J.; Shumaker, D.E.

    1987-01-01

    This paper presents four working computer models developed by the computational physics group of the National Magnetic Fusion Energy Computer Center. All of the models employ a kinetic description of plasma species. Three of the models are collisional, i.e., they include the solution of the Fokker-Planck equation in velocity space. The fourth model is collisionless and treats the plasma ions by a fully three-dimensional particle-in-cell method

  13. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
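
    The second-order maximum noise entropy model described in the abstract is a logistic function whose argument is quadratic in the stimulus; a hedged sketch with arbitrary parameter values follows.

        # Hedged sketch of a second-order maximum-noise-entropy model for a binary
        # output: P(spike = 1 | s) = 1 / (1 + exp(-(a + b.s + s'Cs))).
        # Parameter values are arbitrary illustrations.
        import numpy as np

        def second_order_response(s, a, b, C):
            arg = a + b @ s + s @ C @ s
            return 1.0 / (1.0 + np.exp(-arg))

        rng = np.random.default_rng(0)
        s = rng.standard_normal(2)       # stimulus projected onto two relevant dimensions
        a, b = -1.0, np.array([0.8, -0.3])
        C = np.array([[0.5, 0.1], [0.1, -0.2]])
        print(second_order_response(s, a, b, C))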

  14. COMPUTATIONAL TOXICOLOGY-WHERE IS THE DATA? ...

    Science.gov (United States)

This talk will briefly describe the state of the data world for computational toxicology and one approach to improve the situation, called ACToR (Aggregated Computational Toxicology Resource).

  15. From nestling calls to fledgling silence: adaptive timing of change in response to aerial alarm calls.

    Science.gov (United States)

    Magrath, Robert D; Platzen, Dirk; Kondo, Junko

    2006-09-22

Young birds and mammals are extremely vulnerable to predators and so should benefit from responding to parental alarm calls warning of danger. However, young often respond differently from adults. This difference may reflect: (i) an imperfect stage in the gradual development of adult behaviour or (ii) an adaptation to different vulnerability. Altricial birds provide an excellent model to test for adaptive changes with age in response to alarm calls, because fledglings are vulnerable to a different range of predators than nestlings. For example, a flying hawk is irrelevant to a nestling in an enclosed nest, but is dangerous to that individual once it has left the nest, so we predict that young develop a response to aerial alarm calls to coincide with fledging. Supporting our prediction, recently fledged white-browed scrubwrens, Sericornis frontalis, fell silent immediately after playback of their parents' aerial alarm call, whereas nestlings continued calling despite hearing the playback. Young scrubwrens are therefore exquisitely adapted to the changing risks faced during development.

  16. A response-modeling alternative to surrogate models for support in computational analyses

    International Nuclear Information System (INIS)

    Rutherford, Brian

    2006-01-01

Often, the objectives in a computational analysis involve characterization of system performance based on some function of the computed response. In general, this characterization includes (at least) an estimate or prediction for some performance measure and an estimate of the associated uncertainty. Surrogate models can be used to approximate the response in regions where simulations were not performed. For most surrogate modeling approaches, however, (1) estimates are based on smoothing of available data and (2) uncertainty in the response is specified in a point-wise (in the input space) fashion. These aspects of the surrogate model construction might limit their capabilities. One alternative is to construct a probability measure, G(r), for the computer response, r, based on available data. This 'response-modeling' approach will permit probability estimation for an arbitrary event, E(r), based on the computer response. In this general setting, event probabilities can be computed: prob(E) = ∫ I(E(r)) dG(r), where I is the indicator function. Furthermore, one can use G(r) to calculate an induced distribution on a performance measure, pm. For prediction problems where the performance measure is a scalar, its distribution F_pm is determined by: F_pm(z) = ∫ I(pm(r) ≤ z) dG(r). We introduce response models for scalar computer output and then generalize the approach to more complicated responses that utilize multiple response models
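
    Once the response measure G(r) can be sampled, both quoted integrals reduce to Monte Carlo averages of indicator functions; the sketch below uses a placeholder Gaussian response model and a hypothetical performance measure.

        # Sketch of prob(E) = ∫ I(E(r)) dG(r) and F_pm(z) = ∫ I(pm(r) <= z) dG(r)
        # as Monte Carlo averages. The response model and performance measure are
        # placeholders, not taken from the paper.
        import numpy as np

        rng = np.random.default_rng(1)
        samples = rng.normal(loc=2.0, scale=0.5, size=100_000)   # draws r ~ G(r)

        def pm(r):                    # hypothetical scalar performance measure
            return r ** 2

        event_prob = np.mean(samples > 3.0)        # prob(E) for the event {r > 3}
        F_pm_at_z = np.mean(pm(samples) <= 5.0)    # induced CDF of pm evaluated at z = 5
        print(event_prob, F_pm_at_z)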

  17. Grow--a computer subroutine that projects the growth of trees in the Lake States' forests.

    Science.gov (United States)

    Gary J. Brand

    1981-01-01

    A computer subroutine, Grow, has been written in 1977 Standard FORTRAN to implement a distance-independent, individual tree growth model for Lake States' forests. Grow is a small and easy-to-use version of the growth model. All the user has to do is write a calling program to read initial conditions, call Grow, and summarize the results.

  18. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  19. Towards a Formal Framework for Computational Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl Kristian; Sassone, Vladimiro

    2006-01-01

    We define a mathematical measure for the quantitative comparison of probabilistic computational trust systems, and use it to compare a well-known class of algorithms based on the so-called beta model. The main novelty is that our approach is formal, rather than based on experimental simulation....
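
    A minimal sketch of the beta model underlying the compared algorithms: interaction outcomes are summarised by counts of successes and failures, and the trust value is the mean of the resulting Beta distribution. The interaction history below is invented.

        # Minimal sketch of the so-called beta model: trust is the mean of
        # Beta(successes + 1, failures + 1). Values are illustrative.
        def beta_trust(successes, failures):
            return (successes + 1) / (successes + failures + 2)

        history = [1, 1, 0, 1, 1, 1, 0, 1]          # 1 = satisfactory interaction
        s, f = sum(history), len(history) - sum(history)
        print(beta_trust(s, f))                      # = 0.7 for 6 good / 2 bad outcomes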

  20. Computable general equilibrium modelling in the context of trade and environmental policy

    Energy Technology Data Exchange (ETDEWEB)

    Koesler, Simon Tobias

    2014-10-14

    This thesis is dedicated to the evaluation of environmental policies in the context of climate change. Its objectives are twofold. Its first part is devoted to the development of potent instruments for quantitative impact analysis of environmental policy. In this context, the main contributions include the development of a new computable general equilibrium (CGE) model which makes use of the new comprehensive and coherent World Input-Output Dataset (WIOD) and which features a detailed representation of bilateral and bisectoral trade flows. Moreover it features an investigation of input substitutability to provide modellers with adequate estimates for key elasticities as well as a discussion and amelioration of the standard base year calibration procedure of most CGE models. Building on these tools, the second part applies the improved modelling framework and studies the economic implications of environmental policy. This includes an analysis of so called rebound effects, which are triggered by energy efficiency improvements and reduce their net benefit, an investigation of how firms restructure their production processes in the presence of carbon pricing mechanisms, and an analysis of a regional maritime emission trading scheme as one of the possible options to reduce emissions of international shipping in the EU context.

  1. Computable general equilibrium modelling in the context of trade and environmental policy

    International Nuclear Information System (INIS)

    Koesler, Simon Tobias

    2014-01-01

    This thesis is dedicated to the evaluation of environmental policies in the context of climate change. Its objectives are twofold. Its first part is devoted to the development of potent instruments for quantitative impact analysis of environmental policy. In this context, the main contributions include the development of a new computable general equilibrium (CGE) model which makes use of the new comprehensive and coherent World Input-Output Dataset (WIOD) and which features a detailed representation of bilateral and bisectoral trade flows. Moreover it features an investigation of input substitutability to provide modellers with adequate estimates for key elasticities as well as a discussion and amelioration of the standard base year calibration procedure of most CGE models. Building on these tools, the second part applies the improved modelling framework and studies the economic implications of environmental policy. This includes an analysis of so called rebound effects, which are triggered by energy efficiency improvements and reduce their net benefit, an investigation of how firms restructure their production processes in the presence of carbon pricing mechanisms, and an analysis of a regional maritime emission trading scheme as one of the possible options to reduce emissions of international shipping in the EU context.

  2. The Architectural Designs of a Nanoscale Computing Model

    Directory of Open Access Journals (Sweden)

    Mary M. Eshaghian-Wilner

    2004-08-01

    Full Text Available A generic nanoscale computing model is presented in this paper. The model consists of a collection of fully interconnected nanoscale computing modules, where each module is a cube of cells made out of quantum dots, spins, or molecules. The cells dynamically switch between two states by quantum interactions among their neighbors in all three dimensions. This paper includes a brief introduction to the field of nanotechnology from a computing point of view and presents a set of preliminary architectural designs for fabricating the nanoscale model studied.

  3. Geometric and computer-aided spline hob modeling

    Science.gov (United States)

    Brailov, I. G.; Myasoedova, T. M.; Panchuk, K. L.; Krysova, I. V.; Rogoza, YU A.

    2018-03-01

    The paper considers acquiring the spline hob geometric model. The objective of the research is the development of a mathematical model of spline hob for spline shaft machining. The structure of the spline hob is described taking into consideration the motion in parameters of the machine tool system of cutting edge positioning and orientation. Computer-aided study is performed with the use of CAD and on the basis of 3D modeling methods. Vector representation of cutting edge geometry is accepted as the principal method of spline hob mathematical model development. The paper defines the correlations described by parametric vector functions representing helical cutting edges designed for spline shaft machining with consideration for helical movement in two dimensions. An application for acquiring the 3D model of spline hob is developed on the basis of AutoLISP for AutoCAD environment. The application presents the opportunity for the use of the acquired model for milling process imitation. An example of evaluation, analytical representation and computer modeling of the proposed geometrical model is reviewed. In the mentioned example, a calculation of key spline hob parameters assuring the capability of hobbing a spline shaft of standard design is performed. The polygonal and solid spline hob 3D models are acquired by the use of imitational computer modeling.
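
    As a hedged illustration of the parametric vector functions mentioned above, the sketch below evaluates a plain helix of radius R and lead p; the numerical values are arbitrary and do not come from the paper.

        # Hedged illustration only: a parametric helical curve of the kind used to
        # describe a helical cutting edge. Radius and lead are arbitrary (mm).
        import numpy as np

        def helical_edge(t, R=30.0, lead=20.0):
            """Return the (x, y, z) point of a helix at parameter t (radians)."""
            return np.array([R * np.cos(t), R * np.sin(t), lead * t / (2 * np.pi)])

        ts = np.linspace(0.0, 4 * np.pi, 200)        # two turns of the edge
        edge_points = np.array([helical_edge(t) for t in ts])
        print(edge_points[:3])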

  4. Brine: a computer program to compute brine migration adjacent to a nuclear waste canister in a salt repository

    International Nuclear Information System (INIS)

    Duckworth, G.D.; Fuller, M.E.

    1980-01-01

    This report presents a mathematical model used to predict brine migration toward a nuclear waste canister in a bedded salt repository. The mathematical model is implemented in a computer program called BRINE. The program is written in FORTRAN and executes in the batch mode on a CDC 7600. A description of the program input requirements and output available is included. Samples of input and output are given

  5. Limits of computational white-light holography

    International Nuclear Information System (INIS)

    Mader, Sebastian; Kozacki, Tomasz; Tompkin, Wayne

    2013-01-01

Recently, computational holograms are being used in applications where previously conventional holograms were applied. Compared to conventional holography, computational holography is based on imaging of virtual objects instead of real objects, which gives it somewhat more flexibility. Here, computational holograms are calculated based on the superposition of point sources, which are placed at the mesh vertices of arbitrary 3D models. The computed holograms have full parallax and exhibit a problem in viewing that we have called "ghosting", which is linked to the viewing of computational holograms based on 3D models close to the image plane. Experimental white-light reconstruction of these holograms showed significant blurring, which is explained here based on simulations of the lateral as well as the axial resolution of a point image with respect to the source spectrum and image distance. In accordance with these simulations, an upper limit of the distance to the image plane is determined, which ensures high quality imaging.
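
    A hedged sketch of the point-source superposition described above: spherical waves emitted from the vertices of a 3D model are summed on a plane of hologram pixels and interfered with a reference wave. The wavelength, pixel pitch and vertex positions are arbitrary choices, not the authors' settings.

        # Illustrative sketch (not the authors' code): superpose spherical waves from
        # point sources at 3D model vertices on a plane of hologram pixels.
        import numpy as np

        wavelength = 532e-9                     # metres
        k = 2 * np.pi / wavelength
        pitch, n_pix = 8e-6, 256                # hologram pixel pitch and count

        xs = (np.arange(n_pix) - n_pix / 2) * pitch
        X, Y = np.meshgrid(xs, xs)
        points = np.array([[0.0, 0.0, 0.02],    # hypothetical mesh vertices (x, y, z)
                           [1e-4, -1e-4, 0.021]])

        field = np.zeros((n_pix, n_pix), dtype=complex)
        for px, py, pz in points:
            r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
            field += np.exp(1j * k * r) / r     # spherical wave from one point source

        hologram = np.abs(field + 1.0) ** 2     # interference with a unit reference wave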

  6. Structure, function, and behaviour of computational models in systems biology.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  7. Evaluation of a Postdischarge Call System Using the Logic Model.

    Science.gov (United States)

    Frye, Timothy C; Poe, Terri L; Wilson, Marisa L; Milligan, Gary

    2018-02-01

    This mixed-method study was conducted to evaluate a postdischarge call program for congestive heart failure patients at a major teaching hospital in the southeastern United States. The program was implemented based on the premise that it would improve patient outcomes and overall quality of life, but it had never been evaluated for effectiveness. The Logic Model was used to evaluate the input of key staff members to determine whether the outputs and results of the program matched the expectations of the organization. Interviews, online surveys, reviews of existing patient outcome data, and reviews of publicly available program marketing materials were used to ascertain current program output. After analyzing both qualitative and quantitative data from the evaluation, recommendations were made to the organization to improve the effectiveness of the program.

  8. Computer modeling of nuclide adsorption on geologic materials

    International Nuclear Information System (INIS)

    Silva, R.J.; White, A.R.; Yee, A.W.

    1980-07-01

    A computer program, called MINEQL, has been developed and is being tested for use in predicting the distribution of radionuclides between solid and aqueous species for a variety of geologic materials and solution conditions. MINEQL is designed to accept a list of components of a system (electrolytes, solid substrates and radionuclides) and their total analytical concentrations, solve the appropriate set of mass balance and equilibrium expressions, and produce a list of the identities and concentrations of all species formed by interactions among the components and between them and/or water
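
    As a toy illustration of the kind of mass-balance and equilibrium problem MINEQL solves, the sketch below handles a single 1:1 complexation reaction M + X <-> MX, reduced to one unknown and solved by bisection; real speciation codes solve many coupled components simultaneously.

        # Toy equilibrium speciation (not MINEQL itself): K = [MX]/([M][X]) with mass
        # balances M_total = [M] + [MX] and X_total = [X] + [MX], solved by bisection.
        def complex_concentration(K, M_total, X_total, tol=1e-12):
            def residual(x):                     # K*[M][X] - [MX] after applying balances
                return K * (M_total - x) * (X_total - x) - x
            lo, hi = 0.0, min(M_total, X_total)
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if residual(mid) > 0:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        # Example: K = 1e4 L/mol, 1e-3 mol/L total metal, 2e-3 mol/L total ligand.
        print(complex_concentration(1e4, 1e-3, 2e-3))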

  9. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

Full Text Available Results of numerical modeling of dynamic problems are summed up in the article. These problems are characteristic of various areas of human activity, in particular of problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas cleaning equipment, and modeling of biogas formation processes.

  10. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems...

  11. Computer Modeling of Direct Metal Laser Sintering

    Science.gov (United States)

    Cross, Matthew

    2014-01-01

    A computational approach to modeling direct metal laser sintering (DMLS) additive manufacturing process is presented. The primary application of the model is for determining the temperature history of parts fabricated using DMLS to evaluate residual stresses found in finished pieces and to assess manufacturing process strategies to reduce part slumping. The model utilizes MSC SINDA as a heat transfer solver with imbedded FORTRAN computer code to direct laser motion, apply laser heating as a boundary condition, and simulate the addition of metal powder layers during part fabrication. Model results are compared to available data collected during in situ DMLS part manufacture.

  12. Highway traffic simulation on multi-processor computers

    Energy Technology Data Exchange (ETDEWEB)

    Hanebutte, U.R.; Doss, E.; Tentner, A.M.

    1997-04-01

A computer model has been developed to simulate highway traffic for various degrees of automation with a high level of fidelity in regard to driver control and vehicle characteristics. The model simulates vehicle maneuvering in a multi-lane highway traffic system and allows for the use of Intelligent Transportation System (ITS) technologies such as an Automated Intelligent Cruise Control (AICC). The structure of the computer model facilitates the use of parallel computers for the highway traffic simulation, since domain decomposition techniques can be applied in a straightforward fashion. In this model, the highway system (i.e. a network of road links) is divided into multiple regions; each region is controlled by a separate link manager residing on an individual processor. A graphical user interface augments the computer model by allowing for real-time interactive simulation control and interaction with each individual vehicle and roadside infrastructure element on each link. Average speed and traffic volume data are collected at user-specified loop detector locations. Further, as a measure of safety the so-called Time To Collision (TTC) parameter is recorded.
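
    The Time To Collision measure mentioned above can be written as the gap to the lead vehicle divided by the closing speed; a minimal sketch with invented speeds and gap follows.

        # Minimal sketch of the Time To Collision (TTC) safety measure: gap divided by
        # closing speed, defined only while the follower is faster. Values are invented.
        def time_to_collision(gap_m, follower_speed, leader_speed):
            closing = follower_speed - leader_speed
            return gap_m / closing if closing > 0 else float("inf")

        print(time_to_collision(gap_m=25.0, follower_speed=30.0, leader_speed=24.0))  # ~4.2 s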

  13. Climate Ocean Modeling on Parallel Computers

    Science.gov (United States)

    Wang, P.; Cheng, B. N.; Chao, Y.

    1998-01-01

    Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.

  14. New CALL-SLA Research Interfaces for the 21st Century: Towards Equitable Multilingualism

    Science.gov (United States)

    Ortega, Lourdes

    2017-01-01

    The majority of the world is multilingual, but inequitably multilingual, and much of the world is also technologized, but inequitably so. Thus, researchers in the fields of computer-assisted language learning (CALL) and second language acquisition (SLA) would profit from considering multilingualism and social justice when envisioning new CALL-SLA…

  15. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  16. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres and providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  17. Ch. 33 Modeling: Computational Thermodynamics

    International Nuclear Information System (INIS)

    Besmann, Theodore M.

    2012-01-01

    This chapter considers methods and techniques for computational modeling for nuclear materials with a focus on fuels. The basic concepts for chemical thermodynamics are described and various current models for complex crystalline and liquid phases are illustrated. Also included are descriptions of available databases for use in chemical thermodynamic studies and commercial codes for performing complex equilibrium calculations.

  18. Too close to call

    DEFF Research Database (Denmark)

    Kurrild-Klitgaard, Peter

    2012-01-01

    a number of other frequent explanations and is found to be quite robust. When augmented with approval ratings for incumbent presidents, the explanatory power increases to 83 pct. and only incorrectly calls one of the last 15 US presidential elections. Applied to the 2012 election as a forecasting model...

  19. Probabilistic Anomaly Detection Based On System Calls Analysis

    Directory of Open Access Journals (Sweden)

    Przemysław Maciołek

    2007-01-01

    Full Text Available We present an application of a probabilistic approach to anomaly detection (PAD). By analyzing selected system calls (and their arguments), the chosen applications are monitored in the Linux environment. This allows us to estimate the “(ab)normality” of their behavior (by comparison to previously collected profiles). We have attached results of threat detection in a typical computer environment.
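
    The abstract describes scoring the "(ab)normality" of application behaviour by comparing monitored system calls against previously collected profiles. The sketch below illustrates one simple probabilistic variant of that idea (a smoothed frequency profile scored by average negative log-likelihood); it is not the authors' PAD implementation, and the traces and call names are made up for illustration.

```python
# Minimal sketch of probabilistic system-call anomaly scoring (not the paper's PAD code).
# A profile of call frequencies is learned from "normal" traces; new traces are scored by
# average negative log-likelihood, so rare or unseen calls raise the anomaly score.
import math
from collections import Counter

def build_profile(traces, smoothing=1.0):
    counts = Counter(call for trace in traces for call in trace)
    vocab = set(counts)
    total = sum(counts.values()) + smoothing * (len(vocab) + 1)   # +1 reserves mass for unseen calls
    return {call: (counts[call] + smoothing) / total for call in vocab}, smoothing / total

def anomaly_score(trace, profile, unseen_prob):
    probs = [profile.get(call, unseen_prob) for call in trace]
    return -sum(math.log(p) for p in probs) / max(len(trace), 1)

# Hypothetical traces of an application under normal operation vs. suspicious behaviour.
normal = [["open", "read", "read", "close"], ["open", "read", "write", "close"]]
profile, unseen = build_profile(normal)
print(anomaly_score(["open", "read", "close"], profile, unseen))        # low score
print(anomaly_score(["execve", "socket", "connect"], profile, unseen))  # high score
```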

  20. ASTHENOPIA PADA PEKERJA WANITA DI CALL CENTRE-X

    Directory of Open Access Journals (Sweden)

    Frans X Suharyanto

    2012-07-01

    Full Text Available Abstract. Nowadays, computers are used widely in every kind of occupation. One of the health problems of computer use is eyestrain, or asthenopia. Some experts have tried to correlate exposure to Video Display Terminals (VDTs) with the occurrence of asthenopia, but to date there is no database on its prevalence, although computer operators are found to become asthenopic more easily. The complaint of asthenopia is subjective and varies between individuals, and therefore a measurable objective value is needed to determine the occurrence of asthenopia. The study used a "pre and post test" design; it involved 72 subjects in the "X" call centre and used the photostress test to measure the occurrence of asthenopia objectively, through the increase in Macular Recovery Time (MRT) before and after working. The average shift in MRT was about 2.98 ± 3.57 seconds in 68.1% of the subjects, evaluated after 4 hours of working time with their VDT. The distance between the eyes and the VDT, and satisfaction with the working shift arrangement, had a significant correlation with the occurrence of asthenopia. There was also a significant correlation between subjective complaints such as pain in the area around the eyes, headache and dry eyes, and the increase in MRT. Keywords: Asthenopia, Macular Recovery Time (MRT), Call Centre, Video Display Terminal (VDT).

  1. Call centers with a postponed callback offer

    NARCIS (Netherlands)

    B. Legros (Benjamin); S. Ding (Sihan); R.D. van der Mei (Rob); O. Jouini (Oualid)

    2017-01-01

    textabstractWe study a call center model with a postponed callback option. A customer at the head of the queue whose elapsed waiting time achieves a given threshold receives a voice message mentioning the option to be called back later. This callback option differs from the traditional ones found in

  2. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    Science.gov (United States)

    When the news media talks about models they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  3. Got a call from "Microsoft"? The social way infecting your PC

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Have you recently been called by “Microsoft Security”? At home!? Then beware: these are fake calls trying to make you install malicious software on your (home) PC!   This scam is currently prevalent throughout the Geneva area, targeting many different international organizations and companies. If you receive such a call, just ignore it and hang up. If you have fallen for this scam and followed their pleas, please contact us at Computer.Security@cern.ch. This kind of scam is called “Social Engineering”. Some “Microsoft” call-centre agent informs you that your PC is infected and will try to convince you to download some software from the web. If you do – BANG – your PC is compromised and your local data is at stake. In other Social Engineering scams, the attackers try to convince you to give them your password or sensitive documents. Thus, recall that the real Microsoft will never call you – and definite...

  4. Deployment Models: Towards Eliminating Security Concerns From Cloud Computing

    OpenAIRE

    Zhao, Gansen; Chunming, Rong; Jaatun, Martin Gilje; Sandnes, Frode Eika

    2010-01-01

    Cloud computing has become a popular choice as an alternative to investing in new IT systems. When making decisions on adopting cloud computing-related solutions, security has always been a major concern. This article summarizes security concerns in cloud computing and proposes five service deployment models to ease these concerns. The proposed models provide different security-related features to address different requirements and scenarios and can serve as reference models for deployment. D...

  5. Optimal scheduling in call centers with a callback option

    OpenAIRE

    Legros , Benjamin; Jouini , Oualid; Koole , Ger

    2016-01-01

    International audience; We consider a call center model with a callback option, which allows an inbound call to be transformed into an outbound one. A delayed call, with a long anticipated waiting time, receives the option to be called back. We assume a probabilistic customer reaction to the callback offer (option). The objective of the system manager is to characterize the optimal call scheduling that minimizes the expected waiting and abandonment costs. For the single-server case, we prove that ...

  6. A Computer-Assisted Instruction in Teaching Abstract Statistics to Public Affairs Undergraduates

    Science.gov (United States)

    Ozturk, Ali Osman

    2012-01-01

    This article attempts to demonstrate the applicability of a computer-assisted instruction supported with simulated data in teaching abstract statistical concepts to political science and public affairs students in an introductory research methods course. The software is called the Elaboration Model Computer Exercise (EMCE) in that it takes a great…

  7. The emerging role of cloud computing in molecular modelling.

    Science.gov (United States)

    Ebejer, Jean-Paul; Fulle, Simone; Morris, Garrett M; Finn, Paul W

    2013-07-01

    There is a growing recognition of the importance of cloud computing for large-scale and data-intensive applications. The distinguishing features of cloud computing and their relationship to other distributed computing paradigms are described, as are the strengths and weaknesses of the approach. We review the use made to date of cloud computing for molecular modelling projects and the availability of front ends for molecular modelling applications. Although the use of cloud computing technologies for molecular modelling is still in its infancy, we demonstrate its potential by presenting several case studies. Rapid growth can be expected as more applications become available and costs continue to fall; cloud computing can make a major contribution not just in terms of the availability of on-demand computing power, but could also spur innovation in the development of novel approaches that utilize that capacity in more effective ways. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  9. Life system modeling and intelligent computing. Pt. I. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen's Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science]; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation]

    2010-07-01

    This book is part I of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010 and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The 55 papers in this volume are organized in topical sections on intelligent modeling, monitoring, and control of complex nonlinear systems; autonomy-oriented computing and intelligent agents; advanced theory and methodology in fuzzy systems and soft computing; computational intelligence in utilization of clean and renewable energy resources; intelligent modeling, control and supervision for energy saving and pollution reduction; intelligent methods in developing vehicles, engines and equipments; computational methods and intelligence in modeling genetic and biochemical networks and regulation. (orig.)

  10. Computational Models of Rock Failure

    Science.gov (United States)

    May, Dave A.; Spiegelman, Marc

    2017-04-01

    Practitioners in computational geodynamics, as in many other branches of applied science, typically do not analyse the underlying PDEs being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or simply have not been proven. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDEs, we first attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure-dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers / risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of
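
    The abstract refers to Drucker-Prager type flow laws without stating the yield criterion explicitly. As a minimal illustration, the sketch below evaluates a Drucker-Prager yield function of the form sqrt(J2) - (A + B p) for a given stress state; the mapping from cohesion and friction angle to (A, B) follows one common convention among several, and all numerical values are invented for the example.

```python
# Minimal sketch: Drucker-Prager yield check for a 2D stress state (compression negative).
# The (A, B) mapping from cohesion C and friction angle phi is one common convention (assumption);
# others exist, and the numbers below are purely illustrative.
import numpy as np

def drucker_prager_yield(sigma, cohesion, phi_deg):
    phi = np.radians(phi_deg)
    A = cohesion * np.cos(phi)                   # assumed mapping from cohesion
    B = np.sin(phi)                              # assumed mapping from friction angle
    p = -np.trace(sigma) / sigma.shape[0]        # mean pressure (positive in compression)
    s = sigma + p * np.eye(sigma.shape[0])       # deviatoric stress
    tau = np.sqrt(0.5 * np.sum(s * s))           # sqrt of the second deviatoric invariant J2
    return tau - (A + B * p)                     # > 0 means the state lies outside the yield surface

sigma = np.array([[-50e6, 10e6],
                  [ 10e6, -20e6]])               # Pa, illustrative stress state
print(drucker_prager_yield(sigma, cohesion=20e6, phi_deg=30.0))
```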

  11. On the Edge: Intelligent CALL in the 1990s.

    Science.gov (United States)

    Underwood, John

    1989-01-01

    Examines the possibilities of developing computer-assisted language learning (CALL) based on the best of modern technology, arguing that artificial intelligence (AI) strategies will radically improve the kinds of exercises that can be performed. Recommends combining AI technology with other tools for delivering instruction, such as simulation and…

  12. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to the modeling of tree crown development. These approaches are experimental (i.e. regression), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The common assumption of all three is that a tree can be regarded as a fractal object, i.e. a collection of self-similar objects that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and of light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model on experimental data. In the paper the different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling and a variant of the computer model are presented. (author). 9 refs, 4 figs
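
    Since the abstract hinges on a fractal measure of the crown, a minimal box-counting sketch may help fix ideas: it estimates a fractal dimension from a point cloud standing in for digitized crown elements. The synthetic data and box sizes are illustrative assumptions, not the authors' spruce data.

```python
# Minimal box-counting sketch for estimating a fractal dimension of a crown-like point cloud.
# The synthetic points below stand in for digitized crown elements; they are illustrative only.
import numpy as np

def box_counting_dimension(points, box_sizes):
    points = (points - points.min(axis=0)) / np.ptp(points, axis=0)   # normalize to the unit cube
    counts = []
    for eps in box_sizes:
        occupied = {tuple(np.floor(p / eps).astype(int)) for p in points}
        counts.append(len(occupied))
    # slope of log N(eps) versus log(1/eps) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
points = rng.random((5000, 3))                      # stand-in for crown element coordinates
print(box_counting_dimension(points, box_sizes=[0.5, 0.25, 0.125, 0.0625]))
```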

  13. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    Science.gov (United States)

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  14. Model-based and model-free Pavlovian reward learning: revaluation, revision, and revelation.

    Science.gov (United States)

    Dayan, Peter; Berridge, Kent C

    2014-06-01

    Evidence supports at least two methods for learning about reward and punishment and making predictions for guiding actions. One method, called model-free, progressively acquires cached estimates of the long-run values of circumstances and actions from retrospective experience. The other method, called model-based, uses representations of the environment, expectations, and prospective calculations to make cognitive predictions of future value. Extensive attention has been paid to both methods in computational analyses of instrumental learning. By contrast, although a full computational analysis has been lacking, Pavlovian learning and prediction has typically been presumed to be solely model-free. Here, we revise that presumption and review compelling evidence from Pavlovian revaluation experiments showing that Pavlovian predictions can involve their own form of model-based evaluation. In model-based Pavlovian evaluation, prevailing states of the body and brain influence value computations, and thereby produce powerful incentive motivations that can sometimes be quite new. We consider the consequences of this revised Pavlovian view for the computational landscape of prediction, response, and choice. We also revisit differences between Pavlovian and instrumental learning in the control of incentive motivation.
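
    The contrast drawn in the abstract between cached (model-free) and prospective (model-based) values can be illustrated with a toy revaluation example: a cached value learned by prediction errors stays put when the body's state changes, whereas a model-based evaluation of the same cue shifts immediately. The environment, utilities, and learning rate below are invented for illustration and are not taken from the paper.

```python
# Minimal sketch contrasting model-free (cached, prediction-error driven) and model-based
# (prospective, model-driven) value estimates for a single cue; everything here is illustrative.

alpha = 0.1                      # learning rate for the cached (model-free) estimate
cached_value = {"cue": 0.0}      # model-free: value acquired from retrospective experience

def model_free_update(state, reward):
    """Temporal-difference-style update of a cached value from an experienced outcome."""
    delta = reward - cached_value[state]          # prediction error
    cached_value[state] += alpha * delta

# Model-based: an explicit model of outcomes, evaluated under the body's current state.
outcome_model = {"cue": "salt_solution"}          # the cue predicts a salty outcome
def model_based_value(state, sodium_depleted):
    """Prospective evaluation: the same outcome is revalued under a new bodily state."""
    if outcome_model[state] == "salt_solution":
        return 1.0 if sodium_depleted else -0.2   # illustrative utilities

for _ in range(20):                               # training while sated: salt is mildly aversive
    model_free_update("cue", reward=-0.2)

print("cached (model-free) value:", round(cached_value["cue"], 3))                        # stays near -0.2
print("model-based value when depleted:", model_based_value("cue", sodium_depleted=True)) # jumps to 1.0
```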

  15. "Tennis elbow". A challenging call for computation and medicine

    Science.gov (United States)

    Sfetsioris, D.; Bontioti, E. N.

    2014-10-01

    An attempt to give insight into the features composing this musculotendinous disorder. We address the issues of definition, pathophysiology and the mechanisms underlying the onset and occurrence of the disease, diagnosis and diagnostic tools, as well as the methods of treatment. We focus mostly on conservative treatment protocols, and we recognize the need for a more thorough investigation with the aid of computation.

  16. Computational modeling of neural activities for statistical inference

    CERN Document Server

    Kolossa, Antonio

    2016-01-01

    This authored monograph supplies empirical evidence for the Bayesian brain hypothesis by modeling event-related potentials (ERP) of the human electroencephalogram (EEG) during successive trials in cognitive tasks. The observer models employed are used to compute probability distributions over observable events and hidden states, depending on which are present in the respective tasks. Bayesian model selection is then used to choose the model which best explains the ERP amplitude fluctuations. Thus, this book constitutes a decisive step towards a better understanding of the neural coding and computing of probabilities following Bayesian rules. The target audience primarily comprises research experts in the field of computational neurosciences, but the book may also be beneficial for graduate students who want to specialize in this field.

  17. An integrative computational modelling of music structure apprehension

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2014-01-01

    An objectivization of music analysis requires a detailed formalization of the underlying principles and methods. The formalization of the most elementary structural processes is hindered by the complexity of music, both in terms of the profusion of entities (such as notes) and of the tight interactions between a large number of dimensions. Computational modeling would enable systematic and exhaustive tests on sizeable pieces of music, yet current research covers particular musical dimensions with limited success. The aim of this research is to conceive a computational modeling of music analysis. The computational model, by virtue of its generality, extensiveness and operationality, is suggested as a blueprint for the establishment of a cognitively validated model of music structure apprehension. Available as a Matlab module, it can be used for practical musicological purposes.

  18. Computational algebraic geometry of epidemic models

    Science.gov (United States)

    Rodríguez Vega, Martín.

    2014-06-01

    Computational Algebraic Geometry is applied to the analysis of various epidemic models for Schistosomiasis and Dengue, both for the case without control measures and for the case where control measures are applied. The models were analyzed using the mathematical software Maple. Explicitly, the analysis is performed using Groebner bases, Hilbert dimension and Hilbert polynomials. These computational tools are included automatically in Maple. Each of these models is represented by a system of ordinary differential equations, and for each model the basic reproductive number (R0) is calculated. The effects of the control measures are observed through the changes in the algebraic structure of R0, the changes in the Groebner basis, the changes in Hilbert dimension, and the changes in Hilbert polynomials. It is hoped that the results obtained in this paper will be of importance for designing control measures against the epidemic diseases described. For future research, the use of algebraic epidemiology to analyze models for airborne and waterborne diseases is proposed.
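
    The kind of computation the abstract describes (equilibrium polynomials, Groebner bases, R0) can be reproduced in miniature for a textbook SIR model with demography. The sketch below uses SymPy rather than Maple and a generic SIR system rather than the paper's Schistosomiasis or Dengue models; it is purely illustrative.

```python
# Minimal sketch: equilibrium polynomials of a textbook SIR model with demography,
# analysed with a Groebner basis in SymPy (the paper uses Maple and richer models).
from sympy import symbols, groebner, simplify

S, I, beta, gamma, mu = symbols("S I beta gamma mu", positive=True)

# Steady-state equations of dS/dt and dI/dt with the population normalized to 1.
eqs = [mu - beta * S * I - mu * S,
       beta * S * I - (gamma + mu) * I]

G = groebner(eqs, S, I, order="lex")
print(G)                       # the basis encodes both the disease-free and the endemic equilibrium

R0 = beta / (gamma + mu)       # basic reproductive number for this particular model
print(simplify(R0))
```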

  19. Evaluation of Software Quality to Improve Application Performance Using Mc Call Model

    Directory of Open Access Journals (Sweden)

    Inda D Lestantri

    2018-04-01

    Full Text Available Software should add value by improving the performance of the organization, in addition to its primary function of automation. Before being implemented in an operational environment, software must pass testing in stages to ensure that it functions properly, meets user needs and is convenient to use. This test is performed on a web-based application, taking the e-SAP application as a test case. E-SAP is an application used to monitor teaching and learning activities at a university in Jakarta. To measure software quality, testing can be done on randomly selected users. The user sample selected for this test consists of users aged 18 to 25 years with an information technology background; the test was conducted on 30 respondents. The test uses the Mc Call model, which consists of 11 dimensions grouped into 3 categories. This paper describes testing with reference to the product operation category, which includes 5 dimensions: correctness, usability, efficiency, reliability, and integrity. Testing on each dimension is discussed as a way to measure software quality and improve performance. The result is that the e-SAP application has good quality, with a product operation value of 85.09%. This indicates that the e-SAP application is of high quality and deserves to be examined in the next stage in the operational environment.
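
    A minimal sketch of how the five product-operation dimensions can be aggregated into a single quality score is shown below; the per-dimension scores and the equal weighting are hypothetical stand-ins, not the survey data behind the reported 85.09%.

```python
# Minimal sketch of aggregating Mc Call product-operation dimensions into one quality score.
# The per-dimension scores and the equal weights are hypothetical, not the paper's survey data.

dimension_scores = {          # mean respondent score for each dimension, in %
    "correctness": 88.0,
    "usability":   84.0,
    "efficiency":  83.5,
    "reliability": 86.0,
    "integrity":   85.0,
}
weights = {name: 1 / len(dimension_scores) for name in dimension_scores}   # equal weighting assumed

overall = sum(weights[name] * score for name, score in dimension_scores.items())
print(f"Product operation score: {overall:.2f}%")
```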

  20. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of the plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and, ultimately, errors of commission in nuclear power plants. Quantitative or prescriptive models that predict an operator's situation assessment, i.e. the results of situation assessment, provide many benefits such as HSI design solutions, human performance data, and human reliability. Unfortunately, only a few computational situation assessment models for NPP operators have been proposed, and those insufficiently embed human cognitive characteristics. We therefore proposed a new computational situation assessment model of nuclear power plant operators. The proposed model incorporates significant cognitive factors and uses a Bayesian belief network (BBN) as its architecture. It is believed that communication between nuclear power plant operators affects their situation assessment and its result, situation awareness. We tried to verify that the proposed model represents the effects of communication on situation assessment. As a result, the proposed model succeeded in representing the operators' behavior, and this paper shows the details
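
    The core of a BBN-based situation assessment model is Bayesian belief updating over plant states as indicator readings arrive. The sketch below shows that updating step for two invented states and two invented indicators assumed conditionally independent given the state; it illustrates the mechanism only and is not the proposed model.

```python
# Minimal sketch of Bayesian belief updating over a plant state, the core idea behind a
# BBN-based situation assessment model. States, indicators and probabilities are invented.

prior = {"normal": 0.9, "loss_of_coolant": 0.1}

# P(indicator reading | plant state); indicators assumed conditionally independent given the state.
likelihood = {
    "pressure_low":   {"normal": 0.05, "loss_of_coolant": 0.80},
    "level_dropping": {"normal": 0.10, "loss_of_coolant": 0.70},
}

def update_belief(prior, observations):
    posterior = dict(prior)
    for obs in observations:
        posterior = {s: posterior[s] * likelihood[obs][s] for s in posterior}
        norm = sum(posterior.values())
        posterior = {s: p / norm for s, p in posterior.items()}
    return posterior

print(update_belief(prior, ["pressure_low", "level_dropping"]))
# Belief shifts sharply toward "loss_of_coolant" as evidence accumulates.
```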

  1. Computer models of vocal tract evolution: an overview and critique

    NARCIS (Netherlands)

    de Boer, B.; Fitch, W. T.

    2010-01-01

    Human speech has been investigated with computer models since the invention of digital computers, and models of the evolution of speech first appeared in the late 1960s and early 1970s. Speech science and computer models have a long shared history because speech is a physical signal and can be

  2. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments

  3. COMPUTATIONAL MODELING OF AIRFLOW IN NONREGULAR SHAPED CHANNELS

    Directory of Open Access Journals (Sweden)

    A. A. Voronin

    2013-05-01

    Full Text Available The basic approaches to computational modeling of airflow in the human nasal cavity are analyzed. Different models of turbulent flow which may be used to calculate air velocity and pressure are discussed. Experimental measurement results of airflow temperature are illustrated. A geometrical model of the human nasal cavity reconstructed from computed tomography scans and numerical simulation results of airflow inside this model are also given. Spatial distributions of velocity and temperature for inhaled and exhaled air are shown.

  4. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  5. Editorial: Modelling and computational challenges in granular materials

    NARCIS (Netherlands)

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss

  6. Generating Computational Models for Serious Gaming

    NARCIS (Netherlands)

    Westera, Wim

    2018-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  7. Airfoil Computations using the γ - Reθ Model

    DEFF Research Database (Denmark)

    Sørensen, Niels N.

    computations. Based on this, an estimate of the error in the computations is determined to be approximately one percent in the attached region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64- 018, NACA64-218, NACA64-418 and NACA64-618 and the results...

  8. Blended call center with idling times during the call service

    NARCIS (Netherlands)

    Legros, Benjamin; Jouini, Oualid; Koole, Ger

    We consider a blended call center with calls arriving over time and an infinitely backlogged amount of outbound jobs. Inbound calls have a non-preemptive priority over outbound jobs. The inbound call service is characterized by three successive stages where the second one is a break; i.e., there is

  9. A System Computational Model of Implicit Emotional Learning.

    Science.gov (United States)

    Puviani, Luca; Rama, Sidita

    2016-01-01

    Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unfortunately, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of the extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders and in panic attacks). In this manuscript, starting from the experimental results available from the literature, a computational model of implicit emotional learning based both on prediction-error computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance-to-extinction of the traumatic emotional responses, (c) the mathematical relation between classical conditioning and unconditioned stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for resistant-to-extinction emotional reactions and novel methodologies of emotion modulation.

  10. Category-theoretic models of algebraic computer systems

    Science.gov (United States)

    Kovalyov, S. P.

    2016-01-01

    A computer system is said to be algebraic if it contains nodes that implement unconventional computation paradigms based on universal algebra. A category-based approach to modeling such systems that provides a theoretical basis for mapping tasks to these systems' architecture is proposed. The construction of algebraic models of general-purpose computations involving conditional statements and overflow control is formally described by a reflector in an appropriate category of algebras. It is proved that this reflector takes the modulo ring whose operations are implemented in the conventional arithmetic processors to the Łukasiewicz logic matrix. Enrichments of the set of ring operations that form bases in the Łukasiewicz logic matrix are found.

  11. Computational Methods for Modeling Aptamers and Designing Riboswitches

    Directory of Open Access Journals (Sweden)

    Sha Gong

    2017-11-01

    Full Text Available Riboswitches, which are located within certain noncoding RNA regions, perform functions as genetic “switches”, regulating when and where genes are expressed in response to certain ligands. Understanding the numerous functions of riboswitches requires computational models to predict the structures and structural changes of their aptamer domains. Although aptamers often form complex structures, computational approaches such as RNAComposer and Rosetta have already been applied to model the tertiary (three-dimensional, 3D) structure of several aptamers. As structural changes in aptamers must be achieved within a certain time window for effective regulation, kinetics is another key point for understanding aptamer function in riboswitch-mediated gene regulation. The coarse-grained self-organized polymer (SOP model using Langevin dynamics simulation has been successfully developed to investigate the folding kinetics of aptamers, while their co-transcriptional folding kinetics can be modeled by the helix-based computational method and the BarMap approach. Based on known aptamers, the web server Riboswitch Calculator and other theoretical methods provide new tools to design synthetic riboswitches. This review presents an overview of these computational methods for modeling the structure and kinetics of riboswitch aptamers and for designing riboswitches.

  12. Computational modelling of the impact of AIDS on business.

    Science.gov (United States)

    Matthews, Alan P

    2007-07-01

    An overview of computational modelling of the impact of AIDS on business in South Africa, with a detailed description of the AIDS Projection Model (APM) for companies, developed by the author, and suggestions for further work. Computational modelling of the impact of AIDS on business in South Africa requires modelling of the epidemic as a whole, and of its impact on a company. This paper gives an overview of epidemiological modelling, with an introduction to the Actuarial Society of South Africa (ASSA) model, the most widely used such model for South Africa. The APM produces projections of HIV prevalence, new infections, and AIDS mortality on a company, based on the anonymous HIV testing of company employees, and projections from the ASSA model. A smoothed statistical model of the prevalence test data is computed, and then the ASSA model projection for each category of employees is adjusted so that it matches the measured prevalence in the year of testing. FURTHER WORK: Further techniques that could be developed are microsimulation (representing individuals in the computer), scenario planning for testing strategies, and models for the business environment, such as models of entire sectors, and mapping of HIV prevalence in time and space, based on workplace and community data.
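
    The calibration step described above, adjusting the ASSA projection for each employee category so that it matches the prevalence measured in the year of testing, can be sketched as a simple ratio adjustment; the published APM may adjust differently, and all figures below are hypothetical.

```python
# Minimal sketch of the calibration step described in the abstract: the ASSA projection for each
# employee category is rescaled so it matches measured workforce prevalence in the testing year.
# The ratio adjustment is an assumption for illustration; all figures are hypothetical.

assa_projection = {          # projected HIV prevalence by category and year (fraction of employees)
    "skilled":   {2005: 0.12, 2006: 0.13, 2007: 0.14},
    "unskilled": {2005: 0.22, 2006: 0.23, 2007: 0.24},
}
measured_2005 = {"skilled": 0.09, "unskilled": 0.26}     # anonymous workplace testing, year 2005

def calibrate(projection, measured, test_year):
    adjusted = {}
    for category, series in projection.items():
        scale = measured[category] / series[test_year]    # match the measured prevalence in the test year
        adjusted[category] = {year: p * scale for year, p in series.items()}
    return adjusted

print(calibrate(assa_projection, measured_2005, test_year=2005))
```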

  13. Computational neurorehabilitation: modeling plasticity and learning to predict recovery.

    Science.gov (United States)

    Reinkensmeyer, David J; Burdet, Etienne; Casadio, Maura; Krakauer, John W; Kwakkel, Gert; Lang, Catherine E; Swinnen, Stephan P; Ward, Nick S; Schweighofer, Nicolas

    2016-04-30

    Despite progress in using computational approaches to inform medicine and neuroscience in the last 30 years, there have been few attempts to model the mechanisms underlying sensorimotor rehabilitation. We argue that a fundamental understanding of neurologic recovery, and as a result accurate predictions at the individual level, will be facilitated by developing computational models of the salient neural processes, including plasticity and learning systems of the brain, and integrating them into a context specific to rehabilitation. Here, we therefore discuss Computational Neurorehabilitation, a newly emerging field aimed at modeling plasticity and motor learning to understand and improve movement recovery of individuals with neurologic impairment. We first explain how the emergence of robotics and wearable sensors for rehabilitation is providing data that make development and testing of such models increasingly feasible. We then review key aspects of plasticity and motor learning that such models will incorporate. We proceed by discussing how computational neurorehabilitation models relate to the current benchmark in rehabilitation modeling - regression-based, prognostic modeling. We then critically discuss the first computational neurorehabilitation models, which have primarily focused on modeling rehabilitation of the upper extremity after stroke, and show how even simple models have produced novel ideas for future investigation. Finally, we conclude with key directions for future research, anticipating that soon we will see the emergence of mechanistic models of motor recovery that are informed by clinical imaging results and driven by the actual movement content of rehabilitation therapy as well as wearable sensor-based records of daily activity.

  14. Computational modeling and engineering in pediatric and congenital heart disease.

    Science.gov (United States)

    Marsden, Alison L; Feinstein, Jeffrey A

    2015-10-01

    Recent methodological advances in computational simulations are enabling increasingly realistic simulations of hemodynamics and physiology, driving increased clinical utility. We review recent developments in the use of computational simulations in pediatric and congenital heart disease, describe the clinical impact in modeling in single-ventricle patients, and provide an overview of emerging areas. Multiscale modeling combining patient-specific hemodynamics with reduced order (i.e., mathematically and computationally simplified) circulatory models has become the de-facto standard for modeling local hemodynamics and 'global' circulatory physiology. We review recent advances that have enabled faster solutions, discuss new methods (e.g., fluid structure interaction and uncertainty quantification), which lend realism both computationally and clinically to results, highlight novel computationally derived surgical methods for single-ventricle patients, and discuss areas in which modeling has begun to exert its influence including Kawasaki disease, fetal circulation, tetralogy of Fallot (and pulmonary tree), and circulatory support. Computational modeling is emerging as a crucial tool for clinical decision-making and evaluation of novel surgical methods and interventions in pediatric cardiology and beyond. Continued development of modeling methods, with an eye towards clinical needs, will enable clinical adoption in a wide range of pediatric and congenital heart diseases.

  15. Challenges and Security in Cloud Computing

    Science.gov (United States)

    Chang, Hyokyung; Choi, Euiin

    People want to solve problems as soon as they arise. An IT technology called ubiquitous computing is intended to make such situations easier to handle, and cloud computing is the technology that makes it even more capable and powerful. Cloud computing, however, is still at an early stage of implementation and use, and it faces many technical challenges and security issues. This paper looks at cloud computing security.

  16. Climate models on massively parallel computers

    International Nuclear Information System (INIS)

    Vitart, F.; Rouvillois, P.

    1993-01-01

    First results obtained on massively parallel computers (Multiple Instruction Multiple Data and Single Instruction Multiple Data) make it possible to consider building coupled models with high resolutions. This would make possible the simulation of thermohaline circulation and other interaction phenomena between atmosphere and ocean. The increase in computing power, and the resulting improvement in resolution, will lead us to revise our approximations. The hydrostatic approximation (in ocean circulation), for instance, will no longer be valid when the grid mesh reaches a dimension of less than a few kilometers: we shall have to find other models. The expertise in numerical analysis acquired at the Center of Limeil-Valenton (CEL-V) will be reused to devise global models taking into account atmosphere, ocean, sea ice and biosphere, allowing climate simulation down to a regional scale

  17. Rough – Granular Computing knowledge discovery models

    Directory of Open Access Journals (Sweden)

    Mohammed M. Eissa

    2016-11-01

    Full Text Available The medical domain has become one of the most important areas of research, owing to the huge amounts of medical information about the symptoms of diseases and how to distinguish between them in order to diagnose them correctly. Knowledge discovery models play a vital role in the refinement and mining of medical indicators to help medical experts make treatment decisions. This paper introduces four hybrid Rough–Granular Computing knowledge discovery models based on Rough Sets Theory, Artificial Neural Networks, Genetic Algorithms and Rough Mereology Theory. A comparative analysis of various knowledge discovery models that use different knowledge discovery techniques for data pre-processing, reduction, and data mining helps medical experts extract the main medical indicators, reduce misdiagnosis rates and improve decision-making for medical diagnosis and treatment. The proposed models utilized two medical datasets: a Coronary Heart Disease dataset and a Hepatitis C Virus dataset. The main purpose of this paper was to explore and evaluate the proposed models, based on the Granular Computing methodology, for knowledge extraction according to different evaluation criteria for the classification of medical datasets. Another purpose is to enhance the framework of KDD processes for supervised learning using the Granular Computing methodology.

  18. Basket call option pricing for CCVG using sparse grids

    KAUST Repository

    Crocce, Fabian

    2016-01-06

    The use of processes with jumps to overcome the shortcomings of the classical Black and Scholes model when modelling stock prices has become very popular. One of the best-known models is the Common Clock Variance Gamma model (CCVG), introduced by Madan and Seneta in 1990 [3]. We propose a method to price European basket call options modelled by the CCVG. The method could be extended to other models obtained by the subordination of a multidimensional Brownian motion and to more general options. To simplify the exposition, we consider calls under the CCVG.
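
    For readers wanting a concrete reference point, the sketch below prices a European basket call under a Common Clock Variance Gamma model by plain Monte Carlo, with all assets sharing a single gamma subordinator (the common clock). This is not the sparse-grid method proposed in the record, and all parameters are illustrative.

```python
# Minimal Monte Carlo sketch of a European basket call under a Common Clock Variance Gamma model:
# each asset is Brownian motion with drift, subordinated by ONE gamma clock shared across assets.
# Plain MC reference only, not the sparse-grid method of the record; parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
S0     = np.array([100.0, 95.0])       # initial prices
w      = np.array([0.5, 0.5])          # basket weights
sigma  = np.array([0.20, 0.25])        # VG diffusion parameters
theta  = np.array([-0.10, -0.05])      # VG drift parameters
nu, r, T, K = 0.2, 0.03, 1.0, 100.0    # gamma-clock variance rate, interest rate, maturity, strike
n_paths = 200_000

# Mean correction so that each discounted asset price has the risk-neutral drift.
omega = np.log(1.0 - theta * nu - 0.5 * sigma**2 * nu) / nu

G = rng.gamma(shape=T / nu, scale=nu, size=(n_paths, 1))        # one common clock per path
Z = rng.standard_normal((n_paths, len(S0)))
X = theta * G + sigma * np.sqrt(G) * Z                          # VG increments over [0, T]
ST = S0 * np.exp((r + omega) * T + X)

payoff = np.maximum(ST @ w - K, 0.0)
price = np.exp(-r * T) * payoff.mean()
stderr = np.exp(-r * T) * payoff.std(ddof=1) / np.sqrt(n_paths)
print(f"basket call ~ {price:.3f} +/- {stderr:.3f}")
```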

  19. Basket call option pricing for CCVG using sparse grids

    KAUST Repository

    Crocce, Fabian; Häppölä, Juho; Iania, Alessandro; Tempone, Raul

    2016-01-01

    The use of processes with jumps to overcome the shortcomings of the classical Black and Scholes model when modelling stock prices has become very popular. One of the best-known models is the Common Clock Variance Gamma model (CCVG), introduced by Madan and Seneta in 1990 [3]. We propose a method to price European basket call options modelled by the CCVG. The method could be extended to other models obtained by the subordination of a multidimensional Brownian motion and to more general options. To simplify the exposition, we consider calls under the CCVG.

  20. Computation of high Reynolds number internal/external flows

    International Nuclear Information System (INIS)

    Cline, M.C.; Wilmoth, R.G.

    1981-01-01

    A general, user oriented computer program, called VNAP2, has been developed to calculate high Reynolds number, internal/external flows. VNAP2 solves the two-dimensional, time-dependent Navier-Stokes equations. The turbulence is modeled with either a mixing-length, a one transport equation, or a two transport equation model. Interior grid points are computed using the explicit MacCormack scheme with special procedures to speed up the calculation in the fine grid. All boundary conditions are calculated using a reference plane characteristic scheme with the viscous terms treated as source terms. Several internal, external, and internal/external flow calculations are presented

  1. Computation of high Reynolds number internal/external flows

    Science.gov (United States)

    Cline, M. C.; Wilmoth, R. G.

    1981-01-01

    A general, user oriented computer program, called VNAP2, was developed to calculate high Reynolds number, internal/ external flows. The VNAP2 program solves the two dimensional, time dependent Navier-Stokes equations. The turbulence is modeled with either a mixing-length, a one transport equation, or a two transport equation model. Interior grid points are computed using the explicit MacCormack Scheme with special procedures to speed up the calculation in the fine grid. All boundary conditions are calculated using a reference plane characteristic scheme with the viscous terms treated as source terms. Several internal, external, and internal/external flow calculations are presented.

  2. EMS call center models with and without function differentiation: a comparison

    NARCIS (Netherlands)

    van Buuren, M.; Kommer, G.J.; van der Mei, R.D.; Bhulai, Sandjai

    2017-01-01

    In pre-hospital health care the call center plays an important role in the coordination of emergency medical services (EMS). An EMS call center handles inbound requests for EMS and dispatches an ambulance if necessary. The time needed for triage and dispatch is part of the total response time to the

  3. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
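
    GRESS and ADGEN obtain model derivatives by computer calculus, i.e. automatic differentiation applied to the simulation code. The sketch below shows the forward-mode idea with dual numbers on a toy response function; it illustrates the principle only and is unrelated to the GRESS/ADGEN implementations.

```python
# Minimal forward-mode automatic differentiation sketch (dual numbers), illustrating the kind of
# "computer calculus" that GRESS/ADGEN apply to full simulation codes. Toy example only.
import math

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def exp(x):
    # chain rule for the exponential
    return Dual(math.exp(x.value), math.exp(x.value) * x.deriv)

def response(k, t):
    """Toy model response: R(k) = k * exp(-k * t)."""
    return k * exp(Dual(-t) * k)

k = Dual(0.5, 1.0)                     # seed derivative d/dk = 1
R = response(k, t=2.0)
print(R.value, R.deriv)                # response and dR/dk at k = 0.5 (analytically zero here)
```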

  4. Computer-aided drug design at Boehringer Ingelheim

    Science.gov (United States)

    Muegge, Ingo; Bergner, Andreas; Kriegl, Jan M.

    2017-03-01

    Computer-Aided Drug Design (CADD) is an integral part of the drug discovery endeavor at Boehringer Ingelheim (BI). CADD contributes to the evaluation of new therapeutic concepts, identifies small molecule starting points for drug discovery, and develops strategies for optimizing hit and lead compounds. The CADD scientists at BI benefit from the global use and development of both software platforms and computational services. A number of computational techniques developed in-house have significantly changed the way early drug discovery is carried out at BI. In particular, virtual screening in vast chemical spaces, which can be accessed by combinatorial chemistry, has added a new option for the identification of hits in many projects. Recently, a new framework has been implemented allowing fast, interactive predictions of relevant on and off target endpoints and other optimization parameters. In addition to the introduction of this new framework at BI, CADD has been focusing on the enablement of medicinal chemists to independently perform an increasing amount of molecular modeling and design work. This is made possible through the deployment of MOE as a global modeling platform, allowing computational and medicinal chemists to freely share ideas and modeling results. Furthermore, a central communication layer called the computational chemistry framework provides broad access to predictive models and other computational services.

  5. Assessment of weld thickness loss in offshore pipelines using computed radiography and computational modeling

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Oliveira, D.F.; Silva, A.X.; Lopes, R.T.; Marinho, C.; Camerini, C.S.

    2009-01-01

    In order to guarantee the structural integrity of oil plants it is crucial to monitor the amount of weld thickness loss in offshore pipelines. However, in spite of its relevance, this parameter is very difficult to determine, due to both the large diameter of most pipes and the complexity of the multi-variable system involved. In this study, a computational modeling based on Monte Carlo MCNPX code is combined with computed radiography to estimate the weld thickness loss in large-diameter offshore pipelines. Results show that computational modeling is a powerful tool to estimate intensity variations in radiographic images generated by weld thickness variations, and it can be combined with computed radiography to assess weld thickness loss in offshore and subsea pipelines.

  6. Introduction to computation and modeling for differential equations

    CERN Document Server

    Edsberg, Lennart

    2008-01-01

    An introduction to scientific computing for differential equations. Introduction to Computation and Modeling for Differential Equations provides a unified and integrated view of numerical analysis, mathematical modeling in applications, and programming to solve differential equations, which is essential in problem-solving across many disciplines, such as engineering, physics, and economics. This book successfully introduces readers to the subject through a unique "Five-M" approach: Modeling, Mathematics, Methods, MATLAB, and Multiphysics. This approach facilitates a thorough understanding of h

  7. Time-domain seismic modeling in viscoelastic media for full waveform inversion on heterogeneous computing platforms with OpenCL

    Science.gov (United States)

    Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Giroux, Bernard

    2017-03-01

    Full Waveform Inversion (FWI) aims at recovering the elastic parameters of the Earth by matching recordings of the ground motion with the direct solution of the wave equation. Modeling the wave propagation for realistic scenarios is computationally intensive, which limits the applicability of FWI. The current hardware evolution brings increasing parallel computing power that can speed up the computations in FWI. However, to take advantage of the diversity of parallel architectures presently available, new programming approaches are required. In this work, we explore the use of OpenCL to develop a portable code that can take advantage of the many parallel processor architectures now available. We present a program called SeisCL for 2D and 3D viscoelastic FWI in the time domain. The code computes the forward and adjoint wavefields using finite differences and outputs the gradient of the misfit function given by the adjoint state method. To demonstrate the code portability on different architectures, the performance of SeisCL is tested on three different devices: Intel CPUs, NVidia GPUs and Intel Xeon PHI. Results show that the use of GPUs with OpenCL can speed up the computations by nearly two orders of magnitude over a single-threaded application on the CPU. Although OpenCL allows code portability, we show that some device-specific optimization is still required to get the best performance out of a specific architecture. Using OpenCL in conjunction with MPI allows the domain decomposition of large models on several devices located on different nodes of a cluster. For large enough models, the speedup of the domain decomposition varies quasi-linearly with the number of devices. Finally, we investigate two different approaches to compute the gradient by the adjoint state method and show the significant advantages of using OpenCL for FWI.
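
    The forward-modeling kernel at the heart of such a code is a finite-difference stencil update. As a minimal illustration of the kind of kernel that SeisCL parallelizes with OpenCL, the sketch below advances the 1D acoustic wave equation with a second-order scheme and a Ricker source; it is not SeisCL code, and the grid, velocity and source parameters are arbitrary.

```python
# Minimal 1D acoustic finite-difference sketch (2nd order in time and space): the kind of stencil
# update that a code like SeisCL parallelizes on GPUs. Grid, velocity and source are illustrative.
import numpy as np

nx, dx, dt, nt = 400, 5.0, 0.001, 800
c = np.full(nx, 2000.0)                      # m/s, homogeneous model for illustration
src_pos, f0 = nx // 2, 15.0                  # source location and Ricker peak frequency

p_prev = np.zeros(nx)
p_curr = np.zeros(nx)
for it in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = (p_curr[2:] - 2.0 * p_curr[1:-1] + p_curr[:-2]) / dx**2
    p_next = 2.0 * p_curr - p_prev + (c * dt) ** 2 * lap
    t = it * dt - 1.0 / f0                                        # delayed Ricker wavelet source term
    ricker = (1.0 - 2.0 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)
    p_next[src_pos] += dt**2 * ricker
    p_prev, p_curr = p_curr, p_next

print("max |p| at final step:", np.abs(p_curr).max())
```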

  8. A novel patient-specific model to compute coronary fractional flow reserve.

    Science.gov (United States)

    Kwon, Soon-Sung; Chung, Eui-Chul; Park, Jin-Seo; Kim, Gook-Tae; Kim, Jun-Woo; Kim, Keun-Hong; Shin, Eun-Seok; Shim, Eun Bo

    2014-09-01

    The fractional flow reserve (FFR) is a widely used clinical index to evaluate the functional severity of coronary stenosis. A computer simulation method based on patients' computed tomography (CT) data is a plausible non-invasive approach for computing the FFR. This method can provide a detailed solution for the stenosed coronary hemodynamics by coupling computational fluid dynamics (CFD) with the lumped parameter model (LPM) of the cardiovascular system. In this work, we have implemented a simple computational method to compute the FFR. As this method uses only coronary arteries for the CFD model and includes only the LPM of the coronary vascular system, it provides simpler boundary conditions for the coronary geometry and is computationally more efficient than existing approaches. To test the efficacy of this method, we simulated a three-dimensional straight vessel using CFD coupled with the LPM. The computed results were compared with those of the LPM. To validate this method in terms of clinically realistic geometry, a patient-specific model of stenosed coronary arteries was constructed from CT images, and the computed FFR was compared with clinically measured results. We evaluated the effect of a model aorta on the computed FFR and compared this with a model without the aorta. Computationally, the model without the aorta was more efficient than that with the aorta, reducing the CPU time required for computing a cardiac cycle to 43.4%. Copyright © 2014. Published by Elsevier Ltd.
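
    Once a coupled CFD-LPM simulation produces aortic and distal coronary pressure traces, the FFR itself is a simple cycle-averaged pressure ratio. The sketch below computes that ratio from synthetic traces; the waveforms are placeholders, and the 0.80 cutoff is the commonly quoted clinical threshold rather than anything specific to this paper.

```python
# Minimal sketch: FFR as the ratio of mean distal coronary pressure to mean aortic pressure over a
# cardiac cycle, as it would be evaluated on CFD/LPM output. The pressure traces here are synthetic.
import numpy as np

t = np.linspace(0.0, 1.0, 1000)                                  # one cardiac cycle, s
pa = 93.0 + 20.0 * np.sin(2 * np.pi * t) ** 2                    # aortic pressure, mmHg (synthetic)
pd = 0.78 * pa + 2.0 * np.sin(2 * np.pi * 2 * t)                 # distal pressure behind a stenosis

ffr = pd.mean() / pa.mean()                                      # cycle-averaged pressure ratio
verdict = "significant" if ffr <= 0.80 else "not significant"    # usual clinical cutoff (assumption here)
print(f"FFR = {ffr:.2f} ({verdict})")
```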

  9. Toward a computational model of hemostasis

    Science.gov (United States)

    Leiderman, Karin; Danes, Nicholas; Schoeman, Rogier; Neeves, Keith

    2017-11-01

    Hemostasis is the process by which a blood clot forms to prevent bleeding at a site of injury. The formation time, size and structure of a clot depends on the local hemodynamics and the nature of the injury. Our group has previously developed computational models to study intravascular clot formation, a process confined to the interior of a single vessel. Here we present the first stage of an experimentally-validated, computational model of extravascular clot formation (hemostasis) in which blood through a single vessel initially escapes through a hole in the vessel wall and out a separate injury channel. This stage of the model consists of a system of partial differential equations that describe platelet aggregation and hemodynamics, solved via the finite element method. We also present results from the analogous, in vitro, microfluidic model. In both models, formation of a blood clot occludes the injury channel and stops flow from escaping while blood in the main vessel retains its fluidity. We discuss the different biochemical and hemodynamic effects on clot formation using distinct geometries representing intra- and extravascular injuries.

  10. Computer-Aided Modeling of Lipid Processing Technology

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel

    2011-01-01

    increase along with growing interest in biofuels, the oleochemical industry faces in the upcoming years major challenges in terms of design and development of better products and more sustainable processes to make them. Computer-aided methods and tools for process synthesis, modeling and simulation...... are widely used for design, analysis, and optimization of processes in the chemical and petrochemical industries. These computer-aided tools have helped the chemical industry to evolve beyond commodities toward specialty chemicals and ‘consumer oriented chemicals based products’. Unfortunately...... to develop systematic computer-aided methods (property models) and tools (database) related to the prediction of the necessary physical properties suitable for design and analysis of processes employing lipid technologies. The methods and tools include: the development of a lipid-database (CAPEC...

  11. Development of a common data model for scientific simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J. [Los Alamos National Lab., NM (United States); Butler, D.M. [Limit Point Systems, Inc. (United States); Matarazzo, C.; Miller, M. [Lawrence Livermore National Lab., CA (United States); Schoof, L. [Sandia National Lab., Albuquerque, NM (United States)

    1999-06-01

    The problem of sharing data among scientific simulation models is a difficult and persistent one. Computational scientists employ an enormous variety of discrete approximations in modeling physical processes on computers. Problems occur when models based on different representations are required to exchange data with one another, or with some other software package. Within the DOE's Accelerated Strategic Computing Initiative (ASCI), a cross-disciplinary group called the Data Models and Formats (DMF) group has been working to develop a common data model. The current model comprises several layers of increasing semantic complexity. One of these layers is an abstract model based on set theory and topology called the fiber bundle kernel (FBK). This layer provides the flexibility needed to describe a wide range of mesh-approximated functions as well as other entities. This paper briefly describes the ASCI common data model, its mathematical basis, and ASCI prototype development. These prototypes include an object-oriented data management library developed at Los Alamos called the Common Data Model Library or CDMlib, the Vector Bundle API from the Lawrence Livermore Laboratory, and the DMF API from Sandia National Laboratory.

  12. An ELT's Solution to Combat Plagiarism: "Birth" of CALL.

    Science.gov (United States)

    Sabieh, Christine

    One English-as-a-Second-Language professor fought plagiarism using computer-assisted language learning (CALL). She succeeded in getting half of her class to write documented research papers free of plagiarism. Although all of the students claimed to know how to avoid plagiarizing, 35 percent presented work with minor traces of plagiarism. The…

  13. The job demands-resources model of work engagement in South African call centres

    OpenAIRE

    Yolandi Janse van Rensburg; Billy Boonzaier; Michèle Boonzaier

    2013-01-01

    Orientation: A ‘sacrificial human resource strategy’ is practised in call centres, resulting in poor employee occupational health. Consequently, questions are posed in terms of the consequences of call centre work and which salient antecedent variables impact the engagement and wellbeing of call centre representatives. Research purpose: Firstly, to gauge the level of employee engagement amongst a sample of call centre representatives in South Africa and, secondly, to track the paths throu...

  14. Computational model for dosimetric purposes in dental procedures

    International Nuclear Information System (INIS)

    Kawamoto, Renato H.; Campos, Tarcisio R.

    2013-01-01

    This study aims to develop a computational model of the oral region for dosimetric purposes, based on the computational tools SISCODES and MCNP-5, to predict deterministic effects and minimize stochastic effects caused by ionizing radiation in radiodiagnosis. Based on a set of digital information provided by computed tomography, a three-dimensional voxel model was created and its tissues represented. The model was exported to the MCNP code. In association with SISCODES, we used the Monte Carlo N-Particle Transport Code (MCNP-5) to reproduce the statistical process of interaction of nuclear particles with human tissues. The study will serve as a source of data for dosimetric studies in the oral region, helping to predict deterministic effects and minimize stochastic effects of ionizing radiation.

  15. Organizing Mass-Interaction Physical Models: the CORDIS-ANIMA Musical Instrumentarium

    OpenAIRE

    Tache , Olivier; Cadoz , Claude

    2009-01-01

    International audience; Practicing physical modeling leads to reconsider the relationship with the musical computer, at a conceptual and practical level. This calls for specific learning tools, which fully take into account the specificities of this activity and go further than the traditional concepts of libraries of models and tutorials used in computer music environments. This paper presents a framework, called the Instrumentarium, which addresses this issue in the context of the CORDIS-AN...

  16. The Role of Computer-Assisted Language Learning (CALL) in Promoting Learner Autonomy

    Science.gov (United States)

    Mutlu, Arzu; Eroz-Tuga, Betil

    2013-01-01

    Problem Statement: Teaching a language with the help of computers and the Internet has attracted the attention of many practitioners and researchers in the last 20 years, so the number of studies that investigate whether computers and the Internet promote language learning continues to increase. These studies have focused on exploring the beliefs…

  17. On the Bayesian calibration of computer model mixtures through experimental data, and the design of predictive models

    Science.gov (United States)

    Karagiannis, Georgios; Lin, Guang

    2017-08-01

    For many real systems, several computer models may exist with different physics and predictive abilities. To achieve more accurate simulations/predictions, it is desirable for these models to be properly combined and calibrated. We propose the Bayesian calibration of computer model mixture method, which relies on the idea of representing the real system output as a mixture of the available computer model outputs with unknown input-dependent weight functions. The method builds a fully Bayesian predictive model as an emulator for the real system output by combining, weighting, and calibrating the available models in the Bayesian framework. Moreover, it fits a mixture of calibrated computer models that can be used by the domain scientist as a means of combining the available computer models, in a flexible and principled manner, and performing reliable simulations. It can address realistic cases where one model may be more accurate than the others at different input values because the mixture weights, indicating the contribution of each model, are functions of the input. Inference on the calibration parameters can consider multiple computer models associated with different physics. The method does not require knowledge of the fidelity order of the models. We provide a technique, suitable for the mixture model framework, that mitigates the computational overhead due to the consideration of multiple computer models. We implement the proposed method in a real-world application involving the Weather Research and Forecasting large-scale climate model.
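
    A minimal sketch of the core idea, a prediction formed from input-dependent mixture weights over several computer models; the softmax weight parameterization and the two toy models below are assumptions and stand in for the paper's fully Bayesian weight functions.

```python
import numpy as np

# Two hypothetical "computer models" of the same physical quantity.
def model_a(x):
    return np.sin(x)            # assumed more accurate for small x

def model_b(x):
    return x - x**3 / 6         # assumed more accurate elsewhere

def mixture_prediction(x, theta):
    """Combine model outputs with input-dependent softmax weights.

    theta parameterizes simple linear logits; in the paper the weight
    functions are inferred in a fully Bayesian way, which is not done here.
    """
    logits = np.stack([theta[0] + theta[1] * x,
                       theta[2] + theta[3] * x])
    w = np.exp(logits - logits.max(axis=0))
    w /= w.sum(axis=0)          # weights sum to one at every input value
    return w[0] * model_a(x) + w[1] * model_b(x)

x = np.linspace(0, 2, 50)
print(mixture_prediction(x, theta=[0.0, -1.0, 0.0, 1.0])[:5])
```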

  18. Computer-based theory of strategies

    Energy Technology Data Exchange (ETDEWEB)

    Findler, N V

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. Three long-term projects which aim at automatically analyzing and synthesizing strategies are discussed. 27 references.

  19. Inhibitors of calling behavior of Plodia interpunctella.

    Science.gov (United States)

    Hirashima, Akinori; Shigeta, Yoko; Eiraku, Tomohiko; Kuwano, Eiichi

    2003-01-01

    Some octopamine agonists were found to suppress the calling behavior of the stored product Indian meal moth, Plodia interpunctella. Compounds were screened using a calling behavior bioassay using female P. interpunctella. Four active derivatives, with inhibitory activity at the nanomolar range, were identified in order of decreasing activity: 2-(1-phenylethylamino)-2-oxazoline > 2-(2-ethyl,6-methylanilino)oxazolidine > 2-(2-methyl benzylamino)-2-thiazoline > 2-(2,6-diethylanilino)thiazolidine. Three-dimensional pharmacophore hypotheses were built from a set of 15 compounds. Among the ten common-featured models generated by the program Catalyst/HipHop, a hypothesis including a hydrogen-bond acceptor lipid, a hydrophobic aromatic and two hydrophobic aliphatic features was considered to be essential for inhibitory activity in the calling behavior. Active compounds mapped well onto all the hydrogen-bond acceptor lipid, hydrophobic aromatic and hydrophobic aliphatic features of the hypothesis. On the other hand, less active compounds were shown not to achieve the energetically favorable conformation that is found in the active molecules in order to fit the 3D common-feature pharmacophore models. The present studies demonstrate that inhibition of calling behavior is via an octopamine receptor.

  20. Inhibitors of calling behavior of Plodia interpunctella

    Directory of Open Access Journals (Sweden)

    Akinori Hirashima

    2003-01-01

    Full Text Available Some octopamine agonists were found to suppress the calling behavior of the stored product Indian meal moth, Plodia interpunctella. Compounds were screened using a calling behavior bioassay using female P. interpunctella. Four active derivatives, with inhibitory activity at the nanomolar range, were identified in order of decreasing activity: 2-(1-phenylethylamino)-2-oxazoline > 2-(2-ethyl,6-methylanilino)oxazolidine > 2-(2-methyl benzylamino)-2-thiazoline > 2-(2,6-diethylanilino)thiazolidine. Three-dimensional pharmacophore hypotheses were built from a set of 15 compounds. Among the ten common-featured models generated by the program Catalyst/HipHop, a hypothesis including a hydrogen-bond acceptor lipid, a hydrophobic aromatic and two hydrophobic aliphatic features was considered to be essential for inhibitory activity in the calling behavior. Active compounds mapped well onto all the hydrogen-bond acceptor lipid, hydrophobic aromatic and hydrophobic aliphatic features of the hypothesis. On the other hand, less active compounds were shown not to achieve the energetically favorable conformation that is found in the active molecules in order to fit the 3D common-feature pharmacophore models. The present studies demonstrate that inhibition of calling behavior is via an octopamine receptor.

  1. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state of the art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under mechanical stimulus up to optimizing the performance of sports equipment, through patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image-based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease.   The book will be of interest to researchers, Ph.D. students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, experimental analysis.

  2. Getting computer models to communicate

    International Nuclear Information System (INIS)

    Caremoli, Ch.; Erhard, P.

    1999-01-01

    Today's computers have the processing power to deliver detailed and global simulations of complex industrial processes such as the operation of a nuclear reactor core. So should we be producing new, global numerical models to take full advantage of this new-found power? If so, it would be a long-term job. There is, however, another solution: to couple the existing validated numerical models together so that they work as one. (authors)

  3. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for ever larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  4. Cloud Computing Adoption Business Model Factors: Does Enterprise Size Matter?

    OpenAIRE

    Bogataj Habjan, Kristina; Pucihar, Andreja

    2017-01-01

    This paper presents the results of research investigating the impact of business model factors on cloud computing adoption. The introduced research model consists of 40 cloud computing business model factors, grouped into eight factor groups. Their impact and importance for cloud computing adoption were investigated among enterprises in Slovenia. Furthermore, differences in opinion according to enterprise size were investigated. Research results show no statistically significant impacts of in...

  5. Using AMDD method for Database Design in Mobile Cloud Computing Systems

    OpenAIRE

    Silviu Claudiu POPA; Mihai-Constantin AVORNICULUI; Vasile Paul BRESFELEAN

    2013-01-01

    The development of wireless telecommunications technologies gave birth to new kinds of e-commerce, the so-called Mobile e-Commerce or m-Commerce. Mobile Cloud Computing (MCC) represents a new IT research area that combines mobile computing and cloud computing techniques. Behind a cloud mobile commerce system there is a database containing all necessary information for transactions. By means of the Agile Model Driven Development (AMDD) method, we are able to achieve many benefits that smoo...

  6. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented...

  7. Computational Design Modelling : Proceedings of the Design Modelling Symposium

    CERN Document Server

    Kilian, Axel; Palz, Norbert; Scheurer, Fabian

    2012-01-01

    This book publishes the peer-reviewed proceedings of the third Design Modeling Symposium Berlin. The conference constitutes a platform for dialogue on experimental practice and research within the field of computationally informed architectural design. More than 60 leading experts examine the computational processes within the field of computationally informed architectural design in order to develop a broader and less exotic building practice that bears more subtle but powerful traces of the complex tool set and approaches we have developed and studied over recent years. The outcome is new strategies for a reasonable and innovative implementation of digital potential in truly innovative and radical design guided by both responsibility towards processes and the consequences they initiate.

  8. Information systems performance evaluation, introducing a two-level technique: Case study call centers

    Directory of Open Access Journals (Sweden)

    Hesham A. Baraka

    2015-03-01

    The objective of this paper was to introduce a new technique that can support decision makers in the call center industry in evaluating and analyzing the performance of call centers. The technique presented is derived from research done on measuring the success or failure of information systems. Two models are adopted, namely the DeLone and McLean model, first introduced in 1992, and the Design-Reality Gap model introduced by Heeks in 2002. Two indices are defined to calculate the performance of the call center: the Success Index and the Gap Index. An evaluation tool has been developed to allow call center managers to evaluate the performance of their call centers in a systematic, analytical approach; the tool was applied to four call centers from different areas, ranging from simple applications such as food ordering, marketing and sales, and technical support systems, to real-time services such as emergency control systems. Results showed the importance of using information systems models to evaluate complex systems such as call centers. The models used allow identification of the dimensions of a call center that are facing challenges, together with identification of the individual indicators in these dimensions that are causing the poor performance of the call center.
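
    A minimal sketch of how two such indices could be computed from dimension ratings; the dimension names, rating scale and simple averaging below are illustrative assumptions, not the paper's definitions of the Success Index and Gap Index.

```python
# Hypothetical dimension ratings (0-10) for one call center.
success_dimensions = {       # DeLone & McLean style success dimensions (illustrative)
    "system_quality": 7, "information_quality": 6, "service_quality": 8,
    "use": 7, "user_satisfaction": 5, "net_benefits": 6,
}
design = {"objectives": 9, "processes": 8, "staffing": 7, "technology": 9}
reality = {"objectives": 6, "processes": 5, "staffing": 6, "technology": 8}

# Success index: average of the success-dimension ratings, scaled to [0, 1].
success_index = sum(success_dimensions.values()) / (10 * len(success_dimensions))

# Gap index: mean absolute design-reality gap, scaled to [0, 1]
# (larger values indicate a higher risk of poor performance).
gap_index = sum(abs(design[k] - reality[k]) for k in design) / (10 * len(design))

print(f"success index = {success_index:.2f}, gap index = {gap_index:.2f}")
```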

  9. Computing chemical organizations in biological networks.

    Science.gov (United States)

    Centler, Florian; Kaleta, Christoph; di Fenizio, Pietro Speroni; Dittrich, Peter

    2008-07-15

    Novel techniques are required to analyze computational models of intracellular processes as they increase steadily in size and complexity. The theory of chemical organizations has recently been introduced as such a technique that links the topology of biochemical reaction network models to their dynamical repertoire. The network is decomposed into algebraically closed and self-maintaining subnetworks called organizations. They form a hierarchy representing all feasible system states, including all steady states. We present three algorithms to compute the hierarchy of organizations for network models provided in SBML format. Two of them compute the complete organization hierarchy, while the third one uses heuristics to obtain a subset of all organizations for large models. While the constructive approach computes the hierarchy starting from the smallest organization in a bottom-up fashion, the flux-based approach employs self-maintaining flux distributions to determine organizations. A runtime comparison on 16 different network models of natural systems showed that neither of the two exhaustive algorithms is superior in all cases. Studying a 'genome-scale' network model with 762 species and 1193 reactions, we demonstrate how the organization hierarchy helps to uncover the model structure and allows the model's quality to be evaluated, for example by detecting components and subsystems of the model whose maintenance is not explained by the model. All data and a Java implementation that plugs into the Systems Biology Workbench are available from http://www.minet.uni-jena.de/csb/prj/ot/tools.
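
    A minimal sketch of the closure test underlying such algorithms: a species set is closed if every reaction whose reactants are all present produces only species already in the set. The toy network is hypothetical and the self-maintenance (flux) test is omitted.

```python
# Each reaction maps a set of reactant species to a set of product species.
reactions = [
    ({"A", "B"}, {"C"}),
    ({"C"}, {"A", "D"}),
    ({"D"}, {"D", "B"}),
]

def closure(species, reactions):
    """Smallest closed superset of `species`: keep adding the products of every
    applicable reaction until nothing new appears (self-maintenance not checked)."""
    closed = set(species)
    changed = True
    while changed:
        changed = False
        for reactants, products in reactions:
            if reactants <= closed and not products <= closed:
                closed |= products
                changed = True
    return closed

print(closure({"A", "B"}, reactions))   # {'A', 'B', 'C', 'D'}
```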

  10. Description of mathematical models and computer programs

    International Nuclear Information System (INIS)

    1977-01-01

    The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules, etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives

  11. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are Flash and NetLogo environments that make three domains of chemistry simultaneously available: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  12. Optimal service using Matlab - simulink controlled Queuing system at call centers

    Science.gov (United States)

    Balaji, N.; Siva, E. P.; Chandrasekaran, A. D.; Tamilazhagan, V.

    2018-04-01

    This paper presents graphically integrated, model-based academic research on telephone call centres. It introduces an important feature of the queuing system: impatient customers and abandonments. The modern call centre is, however, a complex socio-technical system. Queuing theory has now become a suitable application in the telecom industry to provide better online services. Matlab-Simulink structured multi-queuing models provide better solutions in complex situations at call centres. Service performance measures are analysed at the optimal level through the Simulink queuing model.
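
    A minimal discrete-event sketch of the abandonment feature described above, an M/M/c queue in which waiting callers hang up after an exponentially distributed patience time; this stands in for the Matlab-Simulink blocks, and all rates are hypothetical.

```python
import heapq
import random

def simulate_abandonment_queue(lam=5.0, mu=1.0, theta=0.5, servers=4, horizon=10_000):
    """M/M/c queue with exponential abandonment, event-driven simulation.

    lam: arrival rate, mu: service rate per agent, theta: abandonment rate.
    Waiting callers are picked by least remaining patience (not strict FIFO)
    to keep the sketch short.
    """
    random.seed(1)
    t, busy = 0.0, 0
    queue = []                                   # (abandonment time, arrival time)
    events = [(random.expovariate(lam), "arrival")]
    served = abandoned = 0
    while t < horizon:
        t, kind = heapq.heappop(events)
        if kind == "arrival":
            heapq.heappush(events, (t + random.expovariate(lam), "arrival"))
            if busy < servers:
                busy += 1
                heapq.heappush(events, (t + random.expovariate(mu), "departure"))
            else:
                heapq.heappush(queue, (t + random.expovariate(theta), t))
        else:                                    # a service completes
            served += 1
            busy -= 1
            while queue:
                abandon_time, _ = heapq.heappop(queue)
                if abandon_time > t:             # caller is still waiting
                    busy += 1
                    heapq.heappush(events, (t + random.expovariate(mu), "departure"))
                    break
                abandoned += 1                   # caller hung up before being reached
    return served, abandoned

served, abandoned = simulate_abandonment_queue()
print("abandonment fraction:", abandoned / (served + abandoned))
```

    Varying the patience rate theta shows qualitatively how customer impatience trades queueing delay for abandonments.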

  13. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
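
    A minimal sketch of the kind of redundancy-scheme equations such a repository holds, here plain parallel redundancy and triple modular redundancy with a hypothetical module reliability; CARE's actual equation set (self-repair, coverage factors, etc.) is much richer.

```python
def parallel_reliability(r, n):
    """System survives if at least one of n independent identical modules survives."""
    return 1.0 - (1.0 - r) ** n

def tmr_reliability(r):
    """Triple modular redundancy with a perfect voter: at least 2 of 3 modules work."""
    return 3 * r**2 - 2 * r**3

r = 0.95                               # hypothetical module reliability at mission time t
print(parallel_reliability(r, n=2))    # ~0.9975
print(tmr_reliability(r))              # ~0.99275
```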

  14. Computational challenges in modeling gene regulatory events.

    Science.gov (United States)

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  15. Shadow Replication: An Energy-Aware, Fault-Tolerant Computational Model for Green Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xiaolong Cui

    2014-08-01

    Full Text Available As the demand for cloud computing continues to increase, cloud service providers face the daunting challenge of meeting the negotiated SLA, in terms of reliability and timely performance, while achieving cost-effectiveness. This challenge is increasingly compounded by the growing likelihood of failure in large-scale clouds and the rising impact of energy consumption and CO2 emission on the environment. This paper proposes Shadow Replication, a novel fault-tolerance model for cloud computing, which seamlessly addresses failure at scale while minimizing energy consumption and reducing its impact on the environment. The basic tenet of the model is to associate a suite of shadow processes that execute concurrently with the main process, but initially at a much reduced execution speed, to overcome failures as they occur. Two computationally feasible schemes are proposed to achieve Shadow Replication. A performance evaluation framework is developed to analyze these schemes and compare their performance to traditional replication-based fault tolerance methods, focusing on the inherent tradeoff between fault tolerance, the specified SLA and profit maximization. The results show that Shadow Replication leads to significant energy reduction, and is better suited for compute-intensive execution models, where up to 30% higher profit can be achieved due to reduced energy consumption.
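
    A minimal sketch of the energy intuition behind Shadow Replication, assuming dynamic power scales roughly with the cube of execution speed (a common DVFS assumption that is not stated in the abstract); all numbers are hypothetical.

```python
def energy(speed, work):
    """Hypothetical dynamic-energy model: power ~ speed**3, time = work / speed."""
    return speed**3 * (work / speed)          # equals work * speed**2

work = 1.0                                     # normalized amount of computation
main = energy(1.0, work)                       # main process at full speed
traditional_replica = energy(1.0, work)        # classic replication: second full-speed copy
shadow = energy(0.5, work)                     # shadow runs slowly unless the main fails

print("traditional replication energy:", main + traditional_replica)
print("shadow replication energy (failure-free case):", main + shadow)
```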

  16. Efficiency using computer simulation of Reverse Threshold Model Theory on assessing a “One Laptop Per Child” computer versus desktop computer

    Directory of Open Access Journals (Sweden)

    Supat Faarungsang

    2017-04-01

    Full Text Available The Reverse Threshold Model Theory (RTMT) was introduced based on limiting factor concepts, but its efficiency compared to the Conventional Model (CM) has not been published. This investigation assessed the efficiency of RTMT compared to CM using computer simulation on the “One Laptop Per Child” computer and a desktop computer. Based on probability values, it was found that RTMT was more efficient than CM among eight treatment combinations, and an earlier study verified that RTMT gives complete elimination of random error. Furthermore, RTMT has several advantages over CM and is therefore proposed to be applied to most research data.

  17. Voice over IP phone calls from your smartphone

    CERN Multimedia

    2014-01-01

    All CERN users have a Lync account (see here) and can use Instant Messaging, presence and other features. In addition, if your number is activated on the Lync IP Phone(1) system, you can make standard phone calls from your computer (Windows/Mac).   Recently, we upgraded the infrastructure to Lync 2013. One of the major features is the possibility to make Voice over IP phone calls from a smartphone using your CERN standard phone number (not mobile!). Install Lync 2013 on iPhone/iPad, Android or Windows Phone, connect to a WiFi network and make phone calls as if you were in your office. There will be no roaming charges because you will be using WiFi to connect to the CERN phone system(2). Register here for the presentation on Tuesday 29 April at 11 a.m. in the Technical Training Center and see the most exciting features of Lync 2013.   Looking forward to seeing you! The Lync team (1) How to register on Lync IP Phone system: http://information-technology.web.cern.ch/book/lync-ip-phone-serv...

  18. The Next Generation ARC Middleware and ATLAS Computing Model

    CERN Document Server

    Filipcic, A; The ATLAS collaboration; Smirnova, O; Konstantinov, A; Karpenko, D

    2012-01-01

    The distributed NDGF Tier-1 and associated Nordugrid clusters are well integrated into the ATLAS computing model but follow a slightly different paradigm than other ATLAS resources. The current strategy does not divide the sites as in the commonly used hierarchical model, but rather treats them as a single storage endpoint and a pool of distributed computing nodes. The next generation ARC middleware with its several new technologies provides new possibilities in development of the ATLAS computing model, such as pilot jobs with pre-cached input files, automatic job migration between the sites, integration of remote sites without connected storage elements, and automatic brokering for jobs with non-standard resource requirements. ARC's data transfer model provides an automatic way for the computing sites to participate in ATLAS' global task management system without requiring centralised brokering or data transfer services. The powerful API combined with Python and Java bindings can easily be used to build new ...

  19. Context Analysis of Customer Requests using a Hybrid Adaptive Neuro Fuzzy Inference System and Hidden Markov Models in the Natural Language Call Routing Problem

    Science.gov (United States)

    Rustamov, Samir; Mustafayev, Elshan; Clements, Mark A.

    2018-04-01

    The context analysis of customer requests in a natural language call routing problem is investigated in the paper. One of the most significant problems in natural language call routing is comprehension of the client request. With the aim of finding a solution to this issue, hybrid HMM and ANFIS models are examined. Combining different types of models (ANFIS and HMM) can prevent the system from misunderstanding user intention in a dialogue system. Based on these models, the hybrid system may be employed in various language and call routing domains because no lexical or syntactic analysis is used in the classification process.

  20. Context Analysis of Customer Requests using a Hybrid Adaptive Neuro Fuzzy Inference System and Hidden Markov Models in the Natural Language Call Routing Problem

    Directory of Open Access Journals (Sweden)

    Rustamov Samir

    2018-04-01

    Full Text Available The context analysis of customer requests in a natural language call routing problem is investigated in the paper. One of the most significant problems in natural language call routing is comprehension of the client request. With the aim of finding a solution to this issue, hybrid HMM and ANFIS models are examined. Combining different types of models (ANFIS and HMM) can prevent the system from misunderstanding user intention in a dialogue system. Based on these models, the hybrid system may be employed in various language and call routing domains because no lexical or syntactic analysis is used in the classification process.

  1. Lattice Boltzmann model capable of mesoscopic vorticity computation

    Science.gov (United States)

    Peng, Cheng; Guo, Zhaoli; Wang, Lian-Ping

    2017-11-01

    It is well known that standard lattice Boltzmann (LB) models allow the strain-rate components to be computed mesoscopically (i.e., through the local particle distributions) and as such possess a second-order accuracy in strain rate. This is one of the appealing features of the lattice Boltzmann method (LBM) which is of only second-order accuracy in hydrodynamic velocity itself. However, no known LB model can provide the same quality for vorticity and pressure gradients. In this paper, we design a multiple-relaxation time LB model on a three-dimensional 27-discrete-velocity (D3Q27) lattice. A detailed Chapman-Enskog analysis is presented to illustrate all the necessary constraints in reproducing the isothermal Navier-Stokes equations. The remaining degrees of freedom are carefully analyzed to derive a model that accommodates mesoscopic computation of all the velocity and pressure gradients from the nonequilibrium moments. This way of vorticity calculation naturally ensures a second-order accuracy, which is also proven through an asymptotic analysis. We thus show, with enough degrees of freedom and appropriate modifications, the mesoscopic vorticity computation can be achieved in LBM. The resulting model is then validated in simulations of a three-dimensional decaying Taylor-Green flow, a lid-driven cavity flow, and a uniform flow passing a fixed sphere. Furthermore, it is shown that the mesoscopic vorticity computation can be realized even with single relaxation parameter.

  2. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking.

    Science.gov (United States)

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour

  3. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    Science.gov (United States)

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, which encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  4. Dynamic call center routing policies using call waiting and agent idle times

    NARCIS (Netherlands)

    Chan, W.; Koole, G.M.; L'Ecuyer, P.

    2014-01-01

    We study call routing policies for call centers with multiple call types and multiple agent groups. We introduce new weight-based routing policies where each pair (call type, agent group) is given a matching priority defined as an affine combination of the longest waiting time for that call type and
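
    A minimal sketch of the affine matching-priority idea described above; the weight values, the idle-time term and the toy state below are illustrative assumptions.

```python
def matching_priority(call_wait, agent_idle, a=1.0, b=0.5, c=0.0):
    """Affine score for a (call type, agent group) pair: the highest score is routed first."""
    return a * call_wait + b * agent_idle + c

# Hypothetical state: longest waiting time per call type, idle time per agent group.
calls = {"billing": 40.0, "tech_support": 90.0}        # seconds waited
agents = {"generalists": 10.0, "specialists": 120.0}   # seconds idle

# In a real policy only eligible (call type, agent group) pairs would be scored.
best = max((matching_priority(w, i), ct, ag)
           for ct, w in calls.items()
           for ag, i in agents.items())
print("route:", best[1], "->", best[2], "score:", best[0])
```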

  5. Studying Language Learning Opportunities Afforded by a Collaborative CALL Task

    Science.gov (United States)

    Leahy, Christine

    2016-01-01

    This research study explores the learning potential of a computer-assisted language learning (CALL) activity. Research suggests that the dual emphasis on content development and language accuracy, as well as the complexity of L2 production in natural settings, can potentially create cognitive overload. This study poses the question whether, and…

  6. Student Teachers and CALL: Personal and Pedagogical Uses and Beliefs

    Science.gov (United States)

    Hlas, Anne Cummings; Conroy, Kelly; Hildebrandt, Susan A.

    2017-01-01

    The student teaching semester affords teacher candidates the chance to apply what they have learned during their teacher preparation coursework. Therefore, it can be a prime opportunity for student teachers to use technology for their own language learning and to implement computer assisted language learning (CALL) in their instruction. This study…

  7. Integrating CALL into an Iranian EAP Course: Constraints and Affordances

    Science.gov (United States)

    Mehran, Parisa; Alizadeh, Mehrasa

    2015-01-01

    Iranian universities have recently displayed a growing interest in integrating Computer-Assisted Language Learning (CALL) into teaching/learning English. The English for Academic Purposes (EAP) context, however, is not keeping pace with the current changes since EAP courses are strictly text-based and exam-oriented, and little research has thus…

  8. Predictive modeling of liquid-sodium thermal–hydraulics experiments and computations

    International Nuclear Information System (INIS)

    Arslan, Erkan; Cacuci, Dan G.

    2014-01-01

    Highlights: • We applied the predictive modeling method of Cacuci and Ionescu-Bujor (2010). • We assimilated data from sodium flow experiments. • We used computational fluid dynamics simulations of sodium experiments. • The predictive modeling method greatly reduced uncertainties in predicted results. - Abstract: This work applies the predictive modeling procedure formulated by Cacuci and Ionescu-Bujor (2010) to assimilate data from liquid-sodium thermal–hydraulics experiments in order to reduce systematically the uncertainties in the predictions of computational fluid dynamics (CFD) simulations. The predicted CFD-results for the best-estimate model parameters and results describing sodium-flow velocities and temperature distributions are shown to be significantly more precise than the original computations and experiments, in that the predicted uncertainties for the best-estimate results and model parameters are significantly smaller than both the originally computed and the experimental uncertainties

  9. time series modeling of daily abandoned calls in a call centre

    African Journals Online (AJOL)

    DJFLEX

    Models for evaluating and predicting the short periodic time series in daily ... Ugwuowo (2006) proposed asymmetric angular-linear multivariate regression models, ..... Using the parameter estimates in Table 3, the fitted Fourier series model is ..... For the SARIMA model with the stochastic component also being white noise, ...

  10. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  11. Computational and Statistical Models: A Comparison for Policy Modeling of Childhood Obesity

    Science.gov (United States)

    Mabry, Patricia L.; Hammond, Ross; Ip, Edward Hak-Sing; Huang, Terry T.-K.

    As systems science methodologies have begun to emerge as a set of innovative approaches to address complex problems in behavioral, social science, and public health research, some apparent conflicts with traditional statistical methodologies for public health have arisen. Computational modeling is an approach set in context that integrates diverse sources of data to test the plausibility of working hypotheses and to elicit novel ones. Statistical models are reductionist approaches geared towards proving the null hypothesis. While these two approaches may seem contrary to each other, we propose that they are in fact complementary and can be used jointly to advance solutions to complex problems. Outputs from statistical models can be fed into computational models, and outputs from computational models can lead to further empirical data collection and statistical models. Together, this presents an iterative process that refines the models and contributes to a greater understanding of the problem and its potential solutions. The purpose of this panel is to foster communication and understanding between statistical and computational modelers. Our goal is to shed light on the differences between the approaches and convey what kinds of research inquiries each one is best for addressing and how they can serve complementary (and synergistic) roles in the research process, to mutual benefit. For each approach the panel will cover the relevant "assumptions" and how the differences in what is assumed can foster misunderstandings. The interpretations of the results from each approach will be compared and contrasted and the limitations for each approach will be delineated. We will use illustrative examples from CompMod, the Comparative Modeling Network for Childhood Obesity Policy. The panel will also incorporate interactive discussions with the audience on the issues raised here.

  12. ATLAS computing activities and developments in the Italian Grid cloud

    International Nuclear Information System (INIS)

    Rinaldi, L; Ciocca, C; K, M; Annovi, A; Antonelli, M; Martini, A; Barberis, D; Brunengo, A; Corosu, M; Barberis, S; Carminati, L; Campana, S; Di, A; Capone, V; Carlino, G; Doria, A; Esposito, R; Merola, L; De, A; Luminari, L

    2012-01-01

    The large amount of data produced by the ATLAS experiment needs new computing paradigms for data processing and analysis, which involve many computing centres spread around the world. The computing workload is managed by regional federations, called “clouds”. The Italian cloud consists of a main (Tier-1) center, located in Bologna, four secondary (Tier-2) centers, and a few smaller (Tier-3) sites. In this contribution we describe the Italian cloud facilities and the activities of data processing, analysis, simulation and software development performed within the cloud, and we discuss the tests of the new computing technologies contributing to evolution of the ATLAS Computing Model.

  13. Computational Models for Calcium-Mediated Astrocyte Functions

    Directory of Open Access Journals (Sweden)

    Tiina Manninen

    2018-04-01

    Full Text Available The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop

  14. Computational Models for Calcium-Mediated Astrocyte Functions.

    Science.gov (United States)

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro , but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. Thus

  15. The use of conduction model in laser weld profile computation

    Science.gov (United States)

    Grabas, Bogusław

    2007-02-01

    Profiles of joints resulting from deep penetration laser beam welding of a flat workpiece of carbon steel were computed. A semi-analytical conduction model solved with the Green's function method was used in the computations. In the model, the moving heat source was attenuated exponentially in accordance with the Beer-Lambert law. Computational results were compared with those from the experiment.
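
    A minimal sketch of the exponentially attenuated heat source mentioned above; the absorption coefficient and absorbed power density are hypothetical, and the Green's-function temperature solution itself is not reproduced.

```python
import numpy as np

# Beer-Lambert attenuation of absorbed laser power with depth z:
#   q(z) = alpha * P_abs * exp(-alpha * z)
# q is the volumetric heat deposition [W/m^3] when P_abs is the absorbed
# surface power density [W/m^2]; both values below are hypothetical.
alpha = 5.0e3        # absorption coefficient [1/m]
P_abs = 800.0        # absorbed beam power density [W/m^2]

z = np.linspace(0.0, 2.0e-3, 5)              # depth below the surface [m]
q = alpha * P_abs * np.exp(-alpha * z)       # heat deposited at each depth
print(np.round(q, 1))
```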

  16. Computational Psychometrics for Modeling System Dynamics during Stressful Disasters

    Directory of Open Access Journals (Sweden)

    Pietro Cipresso

    2017-08-01

    Full Text Available Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.

  17. Computer modeling of road bridge for simulation moving load

    Directory of Open Access Journals (Sweden)

    Miličić Ilija M.

    2016-01-01

    Full Text Available This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders made up of 1D finite elements, using the CAA applications Tower and Bridge Designer 2016 (2nd Edition). The computer simulations compare, for each model, the effects of moving load obtained according to the recommendations of two standards, SRPS and AASHTO. The bridge structure modelled in Bridge Designer 2016 (2nd Edition) is identical to the one modelled in the Tower environment. An important consideration when selecting a computer application is that Bridge Designer 2016 (2nd Edition) is unable to treat the moving-load model prescribed by the national standard V600.

  18. Epistemic Gameplay and Discovery in Computational Model-Based Inquiry Activities

    Science.gov (United States)

    Wilkerson, Michelle Hoda; Shareff, Rebecca; Laina, Vasiliki; Gravel, Brian

    2018-01-01

    In computational modeling activities, learners are expected to discover the inner workings of scientific and mathematical systems: First elaborating their understandings of a given system through constructing a computer model, then "debugging" that knowledge by testing and refining the model. While such activities have been shown to…

  19. An algorithm to detect and communicate the differences in computational models describing biological systems.

    Science.gov (United States)

    Scharm, Martin; Wolkenhauer, Olaf; Waltemath, Dagmar

    2016-02-15

    Repositories support the reuse of models and ensure transparency about results in publications linked to those models. With thousands of models available in repositories, such as the BioModels database or the Physiome Model Repository, a framework to track the differences between models and their versions is essential to compare and combine models. Difference detection not only allows users to study the history of models but also helps in the detection of errors and inconsistencies. Existing repositories lack algorithms to track a model's development over time. Focusing on SBML and CellML, we present an algorithm to accurately detect and describe differences between coexisting versions of a model with respect to (i) the models' encoding, (ii) the structure of biological networks and (iii) mathematical expressions. This algorithm is implemented in a comprehensive and open source library called BiVeS. BiVeS helps to identify and characterize changes in computational models and thereby contributes to the documentation of a model's history. Our work facilitates the reuse and extension of existing models and supports collaborative modelling. Finally, it contributes to better reproducibility of modelling results and to the challenge of model provenance. The workflow described in this article is implemented in BiVeS. BiVeS is freely available as source code and binary from sems.uni-rostock.de. The web interface BudHat demonstrates the capabilities of BiVeS at budhat.sems.uni-rostock.de. © The Author 2015. Published by Oxford University Press.
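
    A minimal sketch of the network-level part of such a comparison, diffing the reaction sets of two model versions held in plain dictionaries; BiVeS itself operates on SBML/CellML documents and also tracks encoding and mathematical changes, none of which is reproduced here.

```python
# Two versions of a toy reaction network: reaction id -> (reactants, products).
v1 = {"r1": ({"A"}, {"B"}), "r2": ({"B"}, {"C"})}
v2 = {"r1": ({"A"}, {"B"}), "r2": ({"B"}, {"C", "D"}), "r3": ({"D"}, {"A"})}

def diff_reactions(old, new):
    """Report reactions added, removed, or modified between two model versions."""
    added = [r for r in new if r not in old]
    removed = [r for r in old if r not in new]
    changed = [r for r in new if r in old and new[r] != old[r]]
    return {"added": added, "removed": removed, "changed": changed}

print(diff_reactions(v1, v2))
# {'added': ['r3'], 'removed': [], 'changed': ['r2']}
```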

  20. Computer-aided modeling for efficient and innovative product-process engineering

    DEFF Research Database (Denmark)

    Heitzig, Martina

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy and water. This trend is set to continue due to the substantial benefits computer...... in chemical and biochemical engineering have been solved to illustrate the application of the generic modelling methodology, the computeraided modelling framework and the developed software tool.......-aided methods provide. The key prerequisite of computer-aided productprocess engineering is however the availability of models of different types, forms and application modes. The development of the models required for the systems under investigation tends to be a challenging, time-consuming and therefore cost...

  1. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  2. Insecurity of quantum secure computations

    Science.gov (United States)

    Lo, Hoi-Kwong

    1997-08-01

    It had been widely claimed that quantum mechanics can protect private information during public decision in, for example, the so-called two-party secure computation. If this were the case, quantum smart-cards, storing confidential information accessible only to a proper reader, could prevent fake teller machines from learning the PIN (personal identification number) from the customers' input. Although such optimism has been challenged by the recent surprising discovery of the insecurity of the so-called quantum bit commitment, the security of quantum two-party computation itself remains unaddressed. Here I answer this question directly by showing that all one-sided two-party computations (which allow only one of the two parties to learn the result) are necessarily insecure. As corollaries to my results, quantum one-way oblivious password identification and the so-called quantum one-out-of-two oblivious transfer are impossible. I also construct a class of functions that cannot be computed securely in any two-sided two-party computation. Nevertheless, quantum cryptography remains useful in key distribution and can still provide partial security in “quantum money” proposed by Wiesner.

  3. Light reflection models for computer graphics.

    Science.gov (United States)

    Greenberg, D P

    1989-04-14

    During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.

  4. Python for Scientific Computing Education: Modeling of Queueing Systems

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2014-01-01

    Full Text Available In this paper, we present the methodology for the introduction to scientific computing based on model-centered learning. We propose multiphase queueing systems as a basis for learning objects. We use Python and parallel programming for implementing the models and present the computer code and results of stochastic simulations.
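    The paper presents its own Python code; purely as a rough illustration of the approach, a minimal stochastic simulation of a two-phase tandem queue (a simplified stand-in for the multiphase systems discussed, with illustrative parameters) can be written with a Lindley-style recursion:

    ```python
    import random

    random.seed(1)

    def simulate_tandem_queue(n_customers=10_000, lam=0.8, mu1=1.0, mu2=1.2):
        """Simulate a two-phase tandem queue (Poisson arrivals, exponential
        service at each single-server phase) and return the mean sojourn time."""
        arrival = dep1 = dep2 = 0.0
        total_sojourn = 0.0
        for _ in range(n_customers):
            arrival += random.expovariate(lam)                   # next arrival time
            dep1 = max(arrival, dep1) + random.expovariate(mu1)  # leaves phase 1
            dep2 = max(dep1, dep2) + random.expovariate(mu2)     # leaves phase 2
            total_sojourn += dep2 - arrival
        return total_sojourn / n_customers

    print("mean sojourn time:", simulate_tandem_queue())
    ```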

  5. Computational Intelligence. Mortality Models for the Actuary

    NARCIS (Netherlands)

    Willemse, W.J.

    2001-01-01

    This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, this thesis deals with life insurance where mortality modelling is important. Actuaries use ancient models (mortality laws) from the nineteenth century, for example Gompertz' and Makeham's
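    For reference, the two nineteenth-century mortality laws mentioned take the standard forms for the hazard rate mu(x) at age x:

    ```latex
    \mu_{\text{Gompertz}}(x) = B\,c^{x}, \qquad \mu_{\text{Makeham}}(x) = A + B\,c^{x}
    ```

    where A captures age-independent mortality and B and c > 1 describe the exponentially growing age-dependent component.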

  6. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.

  7. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), have been conducted, leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer-Aided Science (CASC) to promote Atomic Energy Research and Investigation (AERI). This article reviews the achievements obtained so far in the R and D of grid computing technology. (T. Tanaka)

  8. Improvements in Thermal Performance of Mango Hot-water Treatment Equipments: Data Analysis, Mathematical Modelling and Numerical-computational Simulation

    Directory of Open Access Journals (Sweden)

    Elder M. Mendoza Orbegoso

    2017-06-01

    Full Text Available Mango is one of the most popular and best-paid tropical fruits in worldwide markets, and its export is regulated by phytosanitary quality controls aimed at killing the “fruit fly”. Thus, mangoes must undergo a hot-water treatment process that involves their immersion in hot water over a period of time. In this work, field measurements, analytical studies and simulation studies are carried out on the available hot-water treatment equipment, called “Original”, which complies only with United States phytosanitary protocols. These approaches are used to characterize the fluid-dynamic and thermal behaviours that occur during the mangoes’ hot-water treatment process. Then, an analytical model and computational fluid dynamics simulations are developed for designing new hot-water treatment equipment, called “Hybrid”, that simultaneously meets both United States and Japanese phytosanitary certifications. Comparisons of the analytical results with the field measurements demonstrate that the “Hybrid” equipment offers better fluid-dynamic and thermal performance than the “Original” one.

  9. Secure and Resilient Cloud Computing for the Department of Defense

    Science.gov (United States)

    2015-07-21

    scalability of resource usage. Lincoln Laboratory is developing technology that will strengthen the security and resilience of cloud computing so that the...capabilities are outsourced to a provider that delivers services to a cloud user (also called a tenant). The DoD is looking to the cloud computing model...hardware. Today’s cloud providers and the technology that underpins them are focused on the availability and scalability of services and not on DoD

  10. Life system modeling and intelligent computing. Pt. II. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen's Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science]; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation]

    2010-07-01

    This book is part II of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010 and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The 56 papers in this volume are organized in topical sections on advanced evolutionary computing theory and algorithms; advanced neural network and fuzzy system theory and algorithms; modeling and simulation of societies and collective behavior; biomedical signal processing, imaging, and visualization; intelligent computing and control in distributed power generation systems; intelligent methods in power and energy infrastructure development; intelligent modeling, monitoring, and control of complex nonlinear systems. (orig.)

  11. Global Stability of an Epidemic Model of Computer Virus

    Directory of Open Access Journals (Sweden)

    Xiaofan Yang

    2014-01-01

    Full Text Available With the rapid popularization of the Internet, computers can enter or leave the Internet increasingly frequently. In fact, no antivirus software can detect and remove all sorts of computer viruses. This implies that viruses would persist on the Internet. To better understand the spread of computer viruses in these situations, a new propagation model is established and analyzed. The unique equilibrium of the model is shown to be globally asymptotically stable, in accordance with reality. A parameter analysis of the equilibrium is also conducted.
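    The abstract does not reproduce the model equations; as a generic sketch of this class of models (not the specific model analyzed in the paper), a simple SIS-style virus propagation model with node turnover can be integrated numerically as follows, with illustrative parameter values:

    ```python
    from scipy.integrate import solve_ivp

    # Generic SIS-style virus propagation model with nodes joining and
    # leaving the network; parameters are illustrative, not from the paper.
    beta, gamma = 0.3, 0.1   # infection and cure rates
    mu = 0.05                # rate of nodes entering/leaving the Internet

    def sis(t, y):
        s, i = y             # susceptible and infected fractions
        ds = mu - beta * s * i + gamma * i - mu * s
        di = beta * s * i - gamma * i - mu * i
        return [ds, di]

    sol = solve_ivp(sis, (0, 200), [0.99, 0.01])
    print("infected fraction near equilibrium:", sol.y[1, -1])
    ```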

  12. Computer simulation of ultrasonic waves in solids

    International Nuclear Information System (INIS)

    Thibault, G.A.; Chaplin, K.

    1992-01-01

    A computer model that simulates the propagation of ultrasonic waves has been developed at AECL Research, Chalk River Laboratories. This program is called EWE, short for Elastic Wave Equations, the mathematics governing the propagation of ultrasonic waves. This report contains a brief summary of the use of ultrasonic waves in non-destructive testing techniques, a discussion of the EWE simulation code explaining the implementation of the equations and the types of output received from the model, and an example simulation showing the abilities of the model. (author). 2 refs., 2 figs
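    For reference, the elastic wave equations referred to take the standard isotropic form for the displacement field u (the textbook form, not necessarily the exact formulation discretized in EWE):

    ```latex
    \rho\,\frac{\partial^{2}\mathbf{u}}{\partial t^{2}} = (\lambda + \mu)\,\nabla(\nabla\cdot\mathbf{u}) + \mu\,\nabla^{2}\mathbf{u}
    ```

    where rho is the density and lambda, mu are the Lamé constants; the longitudinal and shear wave speeds follow as c_p = sqrt((lambda + 2 mu)/rho) and c_s = sqrt(mu/rho).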

  13. A soft-contact model for computing safety margins in human prehension.

    Science.gov (United States)

    Singh, Tarkeshwar; Ambike, Satyajit

    2017-10-01

    The soft human digit tip forms contact with grasped objects over a finite area and applies a moment about an axis normal to the area. These moments are important for ensuring stability during precision grasping. However, the contribution of these moments to grasp stability is rarely investigated in prehension studies. The more popular hard-contact model assumes that the digits exert a force vector but no free moment on the grasped object. Many sensorimotor studies use this model and show that humans estimate friction coefficients to scale the normal force to grasp objects stably, i.e. the smoother the surface, the tighter the grasp. The difference between the applied normal force and the minimal normal force needed to prevent slipping is called safety margin and this index is widely used as a measure of grasp planning. Here, we define and quantify safety margin using a more realistic contact model that allows digits to apply both forces and moments. Specifically, we adapt a soft-contact model from robotics and demonstrate that the safety margin thus computed is a more accurate and robust index of grasp planning than its hard-contact variant. Previously, we have used the soft-contact model to propose two indices of grasp planning that show how humans account for the shape and inertial properties of an object. A soft-contact based safety margin offers complementary insights by quantifying how humans may account for surface properties of the object and skin tissue during grasp planning and execution. Copyright © 2017 Elsevier B.V. All rights reserved.
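    Under the hard-contact definition described above, the safety margin is simply the applied normal force minus the smallest normal force that still prevents slip under Coulomb friction. The sketch below shows only that hard-contact variant; the soft-contact index proposed in the paper additionally accounts for the free moment and is not reproduced here:

    ```python
    def hard_contact_safety_margin(normal_force, tangential_force, mu):
        """Safety margin = applied normal force minus the minimum normal force
        needed to prevent slipping under Coulomb friction (hard-contact model)."""
        min_normal = abs(tangential_force) / mu   # slip threshold: |F_t| <= mu * F_n
        return normal_force - min_normal

    # Example: 5 N grip force, 2 N load force, friction coefficient 0.8
    print(hard_contact_safety_margin(5.0, 2.0, 0.8))  # 2.5 N margin
    ```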

  14. Pulse cleaning flow models and numerical computation of candle ceramic filters.

    Science.gov (United States)

    Tian, Gui-shan; Ma, Zhen-ji; Zhang, Xin-yi; Xu, Ting-xiang

    2002-04-01

    Analytical and numerical models are developed for the reverse-pulse cleaning system of candle ceramic filters. Based on experimental and one-dimensional computational results, a standard turbulence model is shown to be suitable for design computations of the reverse-pulse cleaning system. The computed results can be used to guide the design of the reverse-pulse cleaning system, in particular the optimum Venturi geometry. From the computed results, general conclusions and design methods are obtained.

  15. The role of computer modelling in participatory integrated assessments

    International Nuclear Information System (INIS)

    Siebenhuener, Bernd; Barth, Volker

    2005-01-01

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists to develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experiences gained in three projects with the use of computer models, from a participatory and a risk management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes, and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models serve best the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes.

  16. Disclosive Computer Ethics: The Exposure and Evaluation of Embedded Normativity in Computer Technology.

    NARCIS (Netherlands)

    Brey, Philip A.E.

    2000-01-01

    This essay provides a critique of mainstream computer ethics and argues for the importance of a complementary approach called disclosive computer ethics, which is concerned with the moral deciphering of embedded values and norms in computer systems, applications and practices. Also, four key values

  17. A Novel Multilayer Correlation Maximization Model for Improving CCA-Based Frequency Recognition in SSVEP Brain-Computer Interface.

    Science.gov (United States)

    Jiao, Yong; Zhang, Yu; Wang, Yu; Wang, Bei; Jin, Jing; Wang, Xingyu

    2018-05-01

    Multiset canonical correlation analysis (MsetCCA) has been successfully applied to optimize the reference signals by extracting common features from multiple sets of electroencephalogram (EEG) data for steady-state visual evoked potential (SSVEP) recognition in brain-computer interface applications. To avoid extracting possible noise components as common features, this study proposes a sophisticated extension of MsetCCA, called the multilayer correlation maximization (MCM) model, for further improving SSVEP recognition accuracy. MCM combines the advantages of both CCA and MsetCCA by carrying out three layers of correlation maximization processes. The first layer extracts the stimulus frequency-related information using CCA between EEG samples and sine-cosine reference signals. The second layer learns reference signals by extracting the common features with MsetCCA. The third layer re-optimizes the reference signal set using CCA with the sine-cosine reference signals again. An experimental study is carried out to validate the effectiveness of the proposed MCM model in comparison with the standard CCA and MsetCCA algorithms. The superior performance of MCM demonstrates its promising potential for the development of an improved SSVEP-based brain-computer interface.
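    As a rough sketch of the first (standard CCA) layer only, using scikit-learn's CCA on simulated EEG with illustrative parameters rather than the authors' implementation, frequency recognition scores each candidate stimulus frequency by the canonical correlation between the EEG trial and its sine-cosine reference set:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    fs, duration = 250, 2.0                      # sampling rate (Hz), trial length (s)
    t = np.arange(0, duration, 1.0 / fs)

    def reference_signals(freq, n_harmonics=2):
        """Sine-cosine reference set for one candidate stimulus frequency."""
        refs = []
        for h in range(1, n_harmonics + 1):
            refs.append(np.sin(2 * np.pi * h * freq * t))
            refs.append(np.cos(2 * np.pi * h * freq * t))
        return np.column_stack(refs)

    def cca_score(eeg, freq):
        """Largest canonical correlation between EEG (samples x channels) and
        the reference set for freq; the first layer of a CCA-based recognizer."""
        cca = CCA(n_components=1)
        cca.fit(eeg, reference_signals(freq))
        x_c, y_c = cca.transform(eeg, reference_signals(freq))
        return np.corrcoef(x_c[:, 0], y_c[:, 0])[0, 1]

    # Simulated 8-channel EEG containing a 10 Hz SSVEP component plus noise
    rng = np.random.default_rng(0)
    eeg = np.outer(np.sin(2 * np.pi * 10 * t), rng.uniform(0.5, 1.0, 8))
    eeg += 0.5 * rng.standard_normal(eeg.shape)

    print({f: round(cca_score(eeg, f), 3) for f in (8.0, 10.0, 12.0, 15.0)})
    ```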

  18. PETRI NET MODELING OF COMPUTER VIRUS LIFE CYCLE

    African Journals Online (AJOL)

    Dr Obe

    Dynamic system analysis is applied to model the virus life cycle. Simulation of the derived model ... Keywords: Virus lifecycle, Petri nets, modeling, simulation ... by creating Matlab files for five different computer ...

  19. Airfoil computations using the gamma-Retheta model; Wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, Niels N.

    2009-05-15

    The present work addresses the validation of the implementation of the Menter, Langtry et al. gamma-theta correlation-based transition model [1, 2, 3] in the EllipSys2D code. First, the second-order accuracy of the code is verified using a grid refinement study for laminar, turbulent and transitional computations. Based on this, the error in the computations is estimated to be approximately one percent in the attached region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64-018, NACA64-218, NACA64-418 and NACA64-618, and the results are compared to measurements [4] and to computations using the Xfoil code by Drela et al. [5]. In the linear pre-stall region, good agreement is observed for both lift and drag, while differences from both the measurements and the Xfoil computations are observed in stalled conditions. (au)

  20. On the usage of ultrasound computational models for decision making under ambiguity

    Science.gov (United States)

    Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron

    2018-04-01

    Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.

  1. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  2. Computer Modelling «Smart Building»

    Directory of Open Access Journals (Sweden)

    O. Yu. Maryasin

    2016-01-01

    Full Text Available Currently, "smart building" or "smart house" technology is developing actively in industrialized countries. The main idea of a "smart building" or "smart house" is to have a system which is able to identify definite situations happening in the house and respond accordingly. An automated house management system is intended for automated control and management, and also for organizing the interaction between separate engineering-equipment systems. This system includes the automation subsystems of the individual pieces of engineering equipment as separate components. In order to study the different operating modes of the engineering subsystems and of the whole system, mathematical and computer modeling needs to be used. From a mathematical point of view, a "smart building" is described as a continuous-discrete or hybrid system consisting of interacting elements of different nature, whose behavior is described by continuous and discrete processes. In the article the authors present a computer model of a "smart building" which makes it possible to model the operation of the main engineering subsystems and the management algorithms. The model is created in the Matlab Simulink environment with the "physical modeling" library Simscape and the Stateflow library. The peculiarity of this model is the use of specialized management and control algorithms which provide coordinated interaction of the subsystems and optimize power consumption.
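    As a toy illustration of the continuous-discrete coupling described (not the authors' Simulink/Simscape model), a single thermostat loop with continuous room-temperature dynamics and a discrete on/off controller can be simulated as follows, with purely illustrative parameters:

    ```python
    # Toy hybrid (continuous-discrete) model of one "smart building" subsystem:
    # continuous room-temperature dynamics driven by a discrete on/off heater.
    dt, t_end = 60.0, 24 * 3600               # time step and horizon in seconds
    T, T_out = 18.0, 5.0                      # indoor and outdoor temperature (deg C)
    k_loss, k_heat = 1.0 / 3600, 20.0 / 3600  # illustrative loss (1/s) and heating (K/s) rates
    setpoint, hysteresis = 21.0, 0.5
    heater_on = False
    on_time = 0.0

    t = 0.0
    while t < t_end:
        # discrete controller: switch with hysteresis around the setpoint
        if T < setpoint - hysteresis:
            heater_on = True
        elif T > setpoint + hysteresis:
            heater_on = False
        # continuous dynamics, advanced with an explicit Euler step
        dT = -k_loss * (T - T_out) + (k_heat if heater_on else 0.0)
        T += dT * dt
        on_time += dt if heater_on else 0.0
        t += dt

    print(f"final temperature: {T:.2f} deg C, heater on-time: {on_time / 3600:.1f} h")
    ```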

  3. Environmental Factors Affecting Computer Assisted Language Learning Success: A Complex Dynamic Systems Conceptual Model

    Science.gov (United States)

    Marek, Michael W.; Wu, Wen-Chi Vivian

    2014-01-01

    This conceptual, interdisciplinary inquiry explores Complex Dynamic Systems as the concept relates to the internal and external environmental factors affecting computer assisted language learning (CALL). Based on the results obtained by de Rosnay ["World Futures: The Journal of General Evolution", 67(4/5), 304-315 (2011)], who observed…

  4. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  5. A Set of Free Cross-Platform Authoring Programs for Flexible Web-Based CALL Exercises

    Science.gov (United States)

    O'Brien, Myles

    2012-01-01

    The Mango Suite is a set of three freely downloadable cross-platform authoring programs for flexible network-based CALL exercises. They are Adobe Air applications, so they can be used on Windows, Macintosh, or Linux computers, provided the freely-available Adobe Air has been installed on the computer. The exercises which the programs generate are…

  6. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  7. A scalable approach to modeling groundwater flow on massively parallel computers

    International Nuclear Information System (INIS)

    Ashby, S.F.; Falgout, R.D.; Tompson, A.F.B.

    1995-12-01

    We describe a fully scalable approach to the simulation of groundwater flow on a hierarchy of computing platforms, ranging from workstations to massively parallel computers. Specifically, we advocate the use of scalable conceptual models in which the subsurface model is defined independently of the computational grid on which the simulation takes place. We also describe a scalable multigrid algorithm for computing the groundwater flow velocities. We are thus able to leverage both the engineer's time spent developing the conceptual model and the computing resources used in the numerical simulation. We have successfully employed this approach at the LLNL site, where we have run simulations ranging in size from just a few thousand spatial zones (on workstations) to more than eight million spatial zones (on the CRAY T3D)-all using the same conceptual model

  8. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory]; McPherson, Timothy N [Los Alamos National Laboratory]; Burian, Steven J [Univ. of Utah]

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the largest losses due to natural disasters in the world and in the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g., dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have a significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computational time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated when computations are completed only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting tool, an engineering design tool, or a planning tool. Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
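    As a schematic illustration of the domain-tracking idea only (not the authors' Java code, and with a placeholder update rule instead of the shallow water equations), restricting work to currently inundated cells and their neighbours looks like this:

    ```python
    # Schematic domain tracking for a 2-D grid flood model: only inundated
    # cells and their neighbours are updated each time step. The hydraulic
    # update itself is a placeholder, not a shallow-water solver.
    import numpy as np

    ny, nx = 200, 200
    depth = np.zeros((ny, nx))
    depth[100, 100] = 1.0                      # initial flood source
    active = {(100, 100)}

    def neighbours(i, j):
        return [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= i + di < ny and 0 <= j + dj < nx]

    for _ in range(50):
        new_depth = depth.copy()
        next_active = set()
        for i, j in active:
            # placeholder "hydraulic" update: spread a fraction of depth outward
            share = 0.1 * depth[i, j]
            for ni, nj in neighbours(i, j):
                new_depth[ni, nj] += share / 4
                next_active.add((ni, nj))
            new_depth[i, j] -= share
            next_active.add((i, j))
        depth = new_depth
        active = {c for c in next_active if depth[c] > 1e-6}  # track wet cells only

    print("inundated cells:", len(active), "of", ny * nx)
    ```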

  9. Using Call Detail Records for Modeling Coastal Recreation Behavior

    Science.gov (United States)

    Call data records (CDR) are data from cellular phone networks that can be used to understand human mobility or where people go spatially. They can be used to estimate visitation to an area such as a coastal access point for a given time window, as well as provide information on t...

  10. GRAPHIC, time-sharing magnet design computer programs at Argonne

    International Nuclear Information System (INIS)

    Lari, R.J.

    1974-01-01

    This paper describes three magnet design computer programs in use at the Zero Gradient Synchrotron of Argonne National Laboratory. These programs are used in the time sharing mode in conjunction with a Tektronix model 4012 graphic display terminal. The first program is called TRIM, the second MAGNET, and the third GFUN. (U.S.)

  11. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services. PMID:28112020

  12. Establishing a Cloud Computing Success Model for Hospitals in Taiwan.

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  13. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Directory of Open Access Journals (Sweden)

    Jiunn-Woei Lian PhD

    2017-01-01

    Full Text Available The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  14. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable the companies involved in spreading this technology to open the door to Web 3.0. The paper argues that the new categories of services being introduced will slowly replace many types of computational resources currently in use. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses for large-scale, expensive software such as CFD (Computational Fluid Dynamics) tools, UG, CATIA, and so on.

  15. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    Science.gov (United States)

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software has first become a utilitarian interest, and now, it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains, also motivates sharing of modeling resources as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate

  16. A composite computational model of liver glucose homeostasis. I. Building the composite model.

    Science.gov (United States)

    Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A

    2012-04-07

    A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.

  17. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis-computational, algorithmic, and implementation-have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.

  18. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
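    The tutorial's examples use MATLAB and R; an equivalent embarrassingly parallel pattern in Python, in which each worker runs independent replications of a toy simulation model (illustrative only), might look like this:

    ```python
    import random
    from multiprocessing import Pool

    def run_replication(seed):
        """One independent replication of a toy simulation model:
        here, the total 'loss' over 1000 random events (illustrative only)."""
        rng = random.Random(seed)
        return sum(rng.expovariate(1.0) for _ in range(1000))

    if __name__ == "__main__":
        seeds = range(1000)                  # one seed per replication
        with Pool() as pool:                 # uses all available cores
            losses = pool.map(run_replication, seeds)
        losses.sort()
        print("mean loss:", sum(losses) / len(losses))
        print("95th percentile:", losses[int(0.95 * len(losses))])
    ```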

  19. [A relational database to store Poison Centers calls].

    Science.gov (United States)

    Barelli, Alessandro; Biondi, Immacolata; Tafani, Chiara; Pellegrini, Aristide; Soave, Maurizio; Gaspari, Rita; Annetta, Maria Giuseppina

    2006-01-01

    Italian Poison Centers answer approximately 100,000 calls per year. Potentially, this activity is a huge source of data for toxicovigilance and for syndromic surveillance. During the last decade, surveillance systems for early detection of outbreaks have drawn the attention of public health institutions due to the threat of terrorism and high-profile disease outbreaks. Poisoning surveillance needs the ongoing, systematic collection, analysis, interpretation, and dissemination of harmonised data about poisonings from all Poison Centers for use in public health action to reduce morbidity and mortality and to improve health. The entity-relationship model for a Poison Center relational database is extremely complex and has not been studied in detail. For this reason, data collection is not harmonised among Italian Poison Centers. Entities are recognizable concepts, either concrete or abstract, such as patients and poisons, or events which have relevance to the database, such as calls. The connectivity and cardinality of relationships are complex as well. A one-to-many relationship exists between calls and patients: for one instance of the entity calls, there are zero, one, or many instances of the entity patients. At the same time, a one-to-many relationship exists between patients and poisons: for one instance of the entity patients, there are zero, one, or many instances of the entity poisons. This paper presents a relational model for a poison center database which allows harmonised data collection of poison center calls.
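    The one-to-many relationships described, one call to many patients and one patient to many poisons, can be sketched as a minimal relational schema. The tables and columns below are illustrative only, not the schema proposed by the authors:

    ```python
    import sqlite3

    # Minimal illustrative schema for the call -> patients -> poisons hierarchy.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE calls (
        call_id     INTEGER PRIMARY KEY,
        received_at TEXT NOT NULL
    );
    CREATE TABLE patients (
        patient_id  INTEGER PRIMARY KEY,
        call_id     INTEGER NOT NULL REFERENCES calls(call_id),
        age_years   INTEGER
    );
    CREATE TABLE poisons (
        poison_id   INTEGER PRIMARY KEY,
        patient_id  INTEGER NOT NULL REFERENCES patients(patient_id),
        substance   TEXT NOT NULL,
        dose        TEXT
    );
    """)
    conn.execute("INSERT INTO calls VALUES (1, '2006-01-15 10:32')")
    conn.execute("INSERT INTO patients VALUES (1, 1, 34)")
    conn.execute("INSERT INTO poisons VALUES (1, 1, 'paracetamol', '10 g')")
    conn.execute("INSERT INTO poisons VALUES (2, 1, 'ethanol', NULL)")
    print(conn.execute(
        "SELECT COUNT(*) FROM poisons p JOIN patients pa USING (patient_id) "
        "WHERE pa.call_id = 1").fetchone()[0])   # -> 2 poisons recorded for call 1
    ```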

  20. 47 CFR 22.921 - 911 call processing procedures; 911-only calling mode.

    Science.gov (United States)

    2010-10-01

    ... programming in the mobile unit that determines the handling of a non-911 call and permit the call to be... CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.921 911 call processing procedures; 911-only calling mode. Mobile telephones manufactured after February 13, 2000 that are capable of...

  1. Attacker Modelling in Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Papini, Davide

    in with our everyday life. This future is visible to everyone nowadays: terms like smartphone, cloud, sensor, network etc. are widely known and used in our everyday life. But what about the security of such systems? Ubiquitous computing devices can be limited in terms of energy, computing power and memory ... attacker remain somehow undefined and still under extensive investigation. This Thesis explores the nature of the ubiquitous attacker with a focus on how she interacts with the physical world, and it defines a model that captures the abilities of the attacker. Furthermore a quantitative implementation...

  2. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    Science.gov (United States)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  3. Methods for teaching geometric modelling and computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Rotkov, S.I.; Faitel'son, Yu. Ts.

    1992-05-01

    This paper considers methods for teaching the methods and algorithms of geometric modelling and computer graphics to programmers, designers and users of CAD and computer-aided research systems. There is a bibliography that can be used to prepare lectures and practical classes. 37 refs., 1 tab.

  4. Computational Modeling of Teaching and Learning through Application of Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Richard Lamb

    2015-09-01

    Full Text Available Within the mind, there are a myriad of ideas that make sense within the bounds of everyday experience but are not reflective of how the world actually exists; this is particularly true in the domain of science. Classroom learning with teacher explanation is a bridge through which these naive understandings can be brought in line with scientific reality. The purpose of this paper is to examine how the application of a Multiobjective Evolutionary Algorithm (MOEA) can work in concert with an existing computational model to effectively model critical thinking in the science classroom. An evolutionary algorithm is an algorithm that iteratively optimizes machine-learning-based computational models. The research question is: does the application of an evolutionary algorithm provide a means to optimize the Student Task and Cognition Model (STAC-M), and does the optimized model sufficiently represent and predict teaching and learning outcomes in the science classroom? Within this computational study, the authors outline and simulate the effect of teaching on the ability of a “virtual” student to solve a Piagetian task. Using the Student Task and Cognition Model (STAC-M), a computational model of student cognitive processing in science class developed in 2013, the authors complete a computational experiment which examines the role of cognitive retraining in student learning. Comparison of the STAC-M with and without the Multiobjective Evolutionary Algorithm shows greater success in solving the Piagetian science tasks after cognitive retraining with the algorithm. This illustrates the potential uses of cognitive and neuropsychological computational modeling in educational research. The authors also outline the limitations and assumptions of computational modeling.
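    As a schematic of the general optimization loop only (not the MOEA or the STAC-M model themselves), a minimal single-objective evolutionary algorithm that tunes a parameter vector against a placeholder fitness function looks like this:

    ```python
    import random

    random.seed(42)

    def fitness(params):
        """Placeholder objective: how well the parameterised model reproduces
        a target behaviour (here, distance to a known optimum, to be maximised)."""
        target = [0.3, -1.2, 0.8]
        return -sum((p - t) ** 2 for p, t in zip(params, target))

    def evolve(pop_size=30, n_params=3, generations=100, sigma=0.1):
        population = [[random.uniform(-2, 2) for _ in range(n_params)]
                      for _ in range(pop_size)]
        for _ in range(generations):
            scored = sorted(population, key=fitness, reverse=True)
            parents = scored[: pop_size // 2]              # truncation selection
            population = parents + [
                [p + random.gauss(0, sigma) for p in random.choice(parents)]
                for _ in range(pop_size - len(parents))    # Gaussian mutation
            ]
        return max(population, key=fitness)

    print(evolve())   # converges near the target parameter vector
    ```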

  5. Finite difference computing with exponential decay models

    CERN Document Server

    Langtangen, Hans Petter

    2016-01-01

    This text provides a very simple, initial introduction to the complete scientific computing pipeline: models, discretization, algorithms, programming, verification, and visualization. The pedagogical strategy is to use one case study – an ordinary differential equation describing exponential decay processes – to illustrate fundamental concepts in mathematics and computer science. The book is easy to read and only requires a command of one-variable calculus and some very basic knowledge about computer programming. Contrary to similar texts on numerical methods and programming, this text has a much stronger focus on implementation and teaches testing and software engineering in particular. .
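    The book's running example, the decay equation u'(t) = -a u(t), is easily sketched with the simplest finite difference scheme (forward Euler); this is a generic illustration rather than code taken from the book:

    ```python
    # Forward Euler for the exponential decay ODE u'(t) = -a*u(t), u(0) = I.
    import math

    def solve_decay(I=1.0, a=0.5, T=4.0, dt=0.1):
        n = int(round(T / dt))
        u, t = I, 0.0
        values = [(t, u)]
        for _ in range(n):
            u = u * (1 - a * dt)        # forward Euler update
            t += dt
            values.append((t, u))
        return values

    for t, u in solve_decay()[::10]:
        print(f"t = {t:4.1f}   numerical = {u:.4f}   exact = {math.exp(-0.5 * t):.4f}")
    ```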

  6. Computational modelling of thermo-mechanical and transport properties of carbon nanotubes

    International Nuclear Information System (INIS)

    Rafii-Tabar, H.

    2004-01-01

    Over the recent years, numerical modelling and computer-based simulation of the properties of carbon nanotubes have become the focal points of research in computational nano-science and its associated fields of computational condensed matter physics and materials modelling. Modelling of the mechanical, thermal and transport properties of nanotubes via numerical simulations forms the central part of this research, concerned with the nano-scale mechanics and nano-scale thermodynamics of nanotubes, and nano-scale adsorption, storage and flow properties in nanotubes. A review of these properties, obtained via computational modelling studies, is presented here. We first introduce the physics of carbon nanotubes, and then present the computational simulation tools that are appropriate for conducting a modelling study at the nano-scales. These include the molecular dynamics (MD), the Monte Carlo (MC), and the ab initio MD simulation methods. A complete range of inter-atomic potentials, of two-body and many-body varieties, that underlie all the modelling studies considered in this review is also given. Mechanical models from continuum-based elasticity theory that have been extensively employed in computing the energetics of nanotubes, or to interpret the results from atomistic modelling, are presented and discussed. These include models based on the continuum theory of curved plates, shells, vibrating rods and bending beams. The validity of these continuum-based models has also been examined and the conditions under which they are applicable to nanotube modelling have been listed. Pertinent concepts from continuum theories of stress analysis are included, and the relevant methods for conducting the computation of the stress tensor, elastic constants and elastic moduli at the atomic level are also given. We then survey a comprehensive range of modelling studies concerned with the adsorption and storage of gases, and flow of fluids, in carbon nanotubes of various types. This

  7. Computational modelling of thermo-mechanical and transport properties of carbon nanotubes

    Energy Technology Data Exchange (ETDEWEB)

    Rafii-Tabar, H

    2004-02-01

    Over the recent years, numerical modelling and computer-based simulation of the properties of carbon nanotubes have become the focal points of research in computational nano-science and its associated fields of computational condensed matter physics and materials modelling. Modelling of the mechanical, thermal and transport properties of nanotubes via numerical simulations forms the central part of this research, concerned with the nano-scale mechanics and nano-scale thermodynamics of nanotubes, and nano-scale adsorption, storage and flow properties in nanotubes. A review of these properties, obtained via computational modelling studies, is presented here. We first introduce the physics of carbon nanotubes, and then present the computational simulation tools that are appropriate for conducting a modelling study at the nano-scales. These include the molecular dynamics (MD), the Monte Carlo (MC), and the ab initio MD simulation methods. A complete range of inter-atomic potentials, of two-body and many-body varieties, that underlie all the modelling studies considered in this review is also given. Mechanical models from continuum-based elasticity theory that have been extensively employed in computing the energetics of nanotubes, or to interpret the results from atomistic modelling, are presented and discussed. These include models based on the continuum theory of curved plates, shells, vibrating rods and bending beams. The validity of these continuum-based models has also been examined and the conditions under which they are applicable to nanotube modelling have been listed. Pertinent concepts from continuum theories of stress analysis are included, and the relevant methods for conducting the computation of the stress tensor, elastic constants and elastic moduli at the atomic level are also given. We then survey a comprehensive range of modelling studies concerned with the adsorption and storage of gases, and flow of fluids, in carbon nanotubes of various types. This

  8. Ocean Modeling and Visualization on Massively Parallel Computer

    Science.gov (United States)

    Chao, Yi; Li, P. Peggy; Wang, Ping; Katz, Daniel S.; Cheng, Benny N.

    1997-01-01

    Climate modeling is one of the grand challenges of computational science, and ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change.

  9. Computer modeling of flow induced in-reactor vibrations

    International Nuclear Information System (INIS)

    Turula, P.; Mulcahy, T.M.

    1977-01-01

    An assessment of the reliability of finite element method computer models, as applied to the computation of flow induced vibration response of components used in nuclear reactors, is presented. The prototype under consideration was the Fast Flux Test Facility reactor being constructed for US-ERDA. Data were available from an extensive test program which used a scale model simulating the hydraulic and structural characteristics of the prototype components, subjected to scaled prototypic flow conditions as well as to laboratory shaker excitations. Corresponding analytical solutions of the component vibration problems were obtained using the NASTRAN computer code. Modal analyses and response analyses were performed. The effect of the surrounding fluid was accounted for. Several possible forcing function definitions were considered. Results indicate that modal computations agree well with experimental data. Response amplitude comparisons are good only under conditions favorable to a clear definition of the structural and hydraulic properties affecting the component motion. 20 refs

  10. Comprehensive analysis of ultrasonic vocalizations in a mouse model of fragile X syndrome reveals limited, call type specific deficits.

    Directory of Open Access Journals (Sweden)

    Snigdha Roy

    Full Text Available Fragile X syndrome (FXS) is a well-recognized form of inherited mental retardation, caused by a mutation in the fragile X mental retardation 1 (Fmr1) gene. The gene is located on the long arm of the X chromosome and encodes fragile X mental retardation protein (FMRP). Absence of FMRP in fragile X patients as well as in Fmr1 knockout (KO) mice results, among other changes, in abnormal dendritic spine formation and altered synaptic plasticity in the neocortex and hippocampus. Clinical features of FXS include cognitive impairment, anxiety, abnormal social interaction, mental retardation, motor coordination and speech articulation deficits. Mouse pups generate ultrasonic vocalizations (USVs) when isolated from their mothers. Whether those social ultrasonic vocalizations are deficient in mouse models of FXS is unknown. Here we compared isolation-induced USVs generated by pups of Fmr1-KO mice with those of their wild type (WT) littermates. Though the total number of calls was not significantly different between genotypes, a detailed analysis of 10 different categories of calls revealed that loss of Fmr1 expression in mice causes limited and call-type specific deficits in ultrasonic vocalization: the carrier frequency of flat calls was higher, the percentage of downward calls was lower, and the frequency range of complex calls was wider in Fmr1-KO mice compared to their WT littermates.

  11. To EDF: PHOEBUS, computer-assisted engineering

    International Nuclear Information System (INIS)

    Lecocq, P.; Goudal, J.C.; Barache, J.M.; Feingold, D.; Cornon, P.

    1986-01-01

    EDF has built a modular integrated computer-assisted nuclear engineering (CAE) system called PHOEBUS. Since 1975, the Organization has been interested in CAE, which places computer methods and technology at the disposal of the engineering department. This paper describes the changes and their effect on the techniques in use, working methods and the personnel called on to put them into operation [fr

  12. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    Science.gov (United States)

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer-assisted design/computer-assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)

  13. dDocent: a RADseq, variant-calling pipeline designed for population genomics of non-model organisms.

    Science.gov (United States)

    Puritz, Jonathan B; Hollenbeck, Christopher M; Gold, John R

    2014-01-01

    Restriction-site associated DNA sequencing (RADseq) has become a powerful and useful approach for population genomics. Currently, no software exists that utilizes both paired-end reads from RADseq data to efficiently produce population-informative variant calls, especially for non-model organisms with large effective population sizes and high levels of genetic polymorphism. dDocent is an analysis pipeline with a user-friendly, command-line interface designed to process individually barcoded RADseq data (with double cut sites) into informative SNPs/Indels for population-level analyses. The pipeline, written in BASH, uses data reduction techniques and other stand-alone software packages to perform quality trimming and adapter removal, de novo assembly of RAD loci, read mapping, SNP and Indel calling, and baseline data filtering. Double-digest RAD data from population pairings of three different marine fishes were used to compare dDocent with Stacks, the first generally available, widely used pipeline for analysis of RADseq data. dDocent consistently identified more SNPs shared across greater numbers of individuals and with higher levels of coverage. This is because dDocent quality-trims reads instead of filtering them, and incorporates both forward and reverse reads (including reads with INDEL polymorphisms) in assembly, mapping, and SNP calling. The pipeline and a comprehensive user guide can be found at http://dDocent.wordpress.com.

  14. Model Infrastruktur dan Manajemen Platform Server Berbasis Cloud Computing [Infrastructure Model and Server Platform Management Based on Cloud Computing]

    Directory of Open Access Journals (Sweden)

    Mulki Indana Zulfa

    2017-11-01

    Full Text Available Cloud computing is a new technology that is still growing very rapidly. It makes the Internet the main medium for managing data and applications remotely. Cloud computing allows users to run an application without having to think about the infrastructure and its platforms; other technical aspects such as memory, storage, and backup and restore can be handled very easily. This research models the infrastructure and management of the computing platform in the computer network of the Faculty of Engineering, University of Jenderal Soedirman. The first stage is a literature study to identify implementation models from previous research. The results are then combined with a new approach to the existing resources and implemented directly on the existing server network. The results show that the implementation of cloud computing technology is able to replace the existing platform network.

  15. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  16. The Mechanics of Embodiment: A Dialog on Embodiment and Computational Modeling

    Science.gov (United States)

    Pezzulo, Giovanni; Barsalou, Lawrence W.; Cangelosi, Angelo; Fischer, Martin H.; McRae, Ken; Spivey, Michael J.

    2011-01-01

    Embodied theories are increasingly challenging traditional views of cognition by arguing that conceptual representations that constitute our knowledge are grounded in sensory and motor experiences, and processed at this sensorimotor level, rather than being represented and processed abstractly in an amodal conceptual system. Given the established empirical foundation, and the relatively underspecified theories to date, many researchers are extremely interested in embodied cognition but are clamoring for more mechanistic implementations. What is needed at this stage is a push toward explicit computational models that implement sensorimotor grounding as intrinsic to cognitive processes. In this article, six authors from varying backgrounds and approaches address issues concerning the construction of embodied computational models, and illustrate what they view as the critical current and next steps toward mechanistic theories of embodiment. The first part has the form of a dialog between two fictional characters: Ernest, the “experimenter,” and Mary, the “computational modeler.” The dialog consists of an interactive sequence of questions, requests for clarification, challenges, and (tentative) answers, and touches the most important aspects of grounded theories that should inform computational modeling and, conversely, the impact that computational modeling could have on embodied theories. The second part of the article discusses the most important open challenges for embodied computational modeling. PMID:21713184

  17. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  18. Computer Modelling of Photochemical Smog Formation

    Science.gov (United States)

    Huebert, Barry J.

    1974-01-01

    Discusses a computer program that has been used in environmental chemistry courses as an example of modelling as a vehicle for teaching chemical dynamics, and as a demonstration of some of the factors which affect the production of smog. (Author/GS)

  19. Application of a computer model to the study of the geothermic field of Mofete, Italy

    Energy Technology Data Exchange (ETDEWEB)

    Giannone, G.; Turriani, C; Pistellie, E

    1984-01-01

    The aim of the study was to develop a reliable and comprehensive reservoir simulation package for a better understanding of in-situ processes pertinent to geothermal reservoir hydrodynamics and thermodynamics, and to enable an assessment of optimum reservoir management strategies as to production and reinjection policies. The study consists of four parts. The first deals with the computer programme, which is based on a programme called CHARG developed in the US; some adaptation was necessary. The second part concerns the fall-off and pit-tests of the geothermal well 'Mofete 2' close to Naples. This has been a crucial test for the CHARG model, using asymmetric cylindrical coordinates and 14 different layers. Part three deals with predictions about the longevity of the geothermal field of Mofete. The area is divided into 2500 blocks distributed over 14 layers. Several configurations (various numbers of production and reinjection wells) have been tested. The last chapter deals with a comparison between the ISMES reservoir model, based on the finite-element approach, and the AGIP model (finite differences). Both models give nearly the same results when applied to the geothermal field of Travale.

  20. Hybrid computer modelling in plasma physics

    International Nuclear Information System (INIS)

    Hromadka, J; Ibehej, T; Hrach, R

    2016-01-01

    Our contribution is devoted to development of hybrid modelling techniques. We investigate sheath structures in the vicinity of solids immersed in low temperature argon plasma of different pressures by means of particle and fluid computer models. We discuss the differences in results obtained by these methods and try to propose a way to improve the results of fluid models in the low pressure area. There is a possibility to employ Chapman-Enskog method to find appropriate closure relations of fluid equations in a case when particle distribution function is not Maxwellian. We try to follow this way to enhance fluid model and to use it in hybrid plasma model further. (paper)

  1. A Computational Model of Selection by Consequences

    Science.gov (United States)

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…
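
    The abstract describes the model only in outline; a minimal sketch of a selection-reproduction-mutation loop over a repertoire of behaviors might look like the following. The fitness rule, population size, and mutation rate are illustrative assumptions, not parameters of McDowell's model.

```python
import random

# Minimal sketch of a selection/reproduction/mutation cycle over a repertoire
# of "behaviors" (here just integers). The fitness rule, population size and
# mutation rate are illustrative assumptions, not McDowell's actual parameters.
POP_SIZE = 100
MUTATION_RATE = 0.05
TARGET = 42          # behaviors closer to this value are "selected by consequences"

def fitness(behavior: int) -> float:
    return 1.0 / (1.0 + abs(behavior - TARGET))

def reproduce(parent: int) -> int:
    child = parent
    if random.random() < MUTATION_RATE:
        child += random.choice([-1, 1])   # small random mutation
    return child

repertoire = [random.randint(0, 100) for _ in range(POP_SIZE)]
for generation in range(200):
    weights = [fitness(b) for b in repertoire]
    parents = random.choices(repertoire, weights=weights, k=POP_SIZE)  # selection
    repertoire = [reproduce(p) for p in parents]                       # reproduction + mutation

print(sum(repertoire) / POP_SIZE)  # mean emitted behavior drifts toward the selected value
```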

  2. A cervid vocal fold model suggests greater glottal efficiency in calling at high frequencies.

    Directory of Open Access Journals (Sweden)

    Ingo R Titze

    2010-08-01

    Full Text Available Male Rocky Mountain elk (Cervus elaphus nelsoni) produce loud and high fundamental frequency bugles during the mating season, in contrast to the male European Red Deer (Cervus elaphus scoticus), which produces loud and low fundamental frequency roaring calls. A critical step in understanding vocal communication is to relate sound complexity to anatomy and physiology in a causal manner. Experimentation at the sound source, often difficult in vivo in mammals, is simulated here by a finite element model of the larynx and a wave propagation model of the vocal tract, both based on the morphology and biomechanics of the elk. The model can produce a wide range of fundamental frequencies. Low fundamental frequencies require low vocal fold strain, but large lung pressure and large glottal flow if sound intensity level is to exceed 70 dB at 10 m distance. A high-frequency bugle requires both large muscular effort (to strain the vocal ligament) and high lung pressure (to overcome phonation threshold pressure), but at least 10 dB more intensity level can be achieved. Glottal efficiency, the ratio of radiated sound power to aerodynamic power at the glottis, is higher in elk, suggesting an advantage of high-pitched signaling. This advantage is based on two aspects: first, the lower airflow required for aerodynamic power and, second, an acoustic radiation advantage at higher frequencies. Both signal types are used by the respective males during the mating season and probably serve as honest signals. The two signal types relate differently to physical qualities of the sender. The low-frequency sound (Red Deer call) relates to overall body size via a strong relationship between acoustic parameters and the size of vocal organs and body size. The high-frequency bugle may signal muscular strength and endurance, via a 'vocalizing at the edge' mechanism, for which efficiency is critical.
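
    As a compact restatement of the definition used in the abstract (the symbols are our shorthand, not the authors' notation), glottal efficiency can be written as:

```latex
% Glottal efficiency: radiated acoustic power divided by the aerodynamic power
% supplied at the glottis; the latter is commonly approximated as subglottal
% (lung) pressure times glottal airflow. Symbols are our own shorthand.
\[
  \eta_{\text{glottal}} = \frac{P_{\text{rad}}}{P_{\text{aero}}},
  \qquad
  P_{\text{aero}} \approx P_{\text{lung}}\, U_{\text{glottis}} .
\]
```

    Written this way, the abstract's two claimed advantages of high-pitched calling map onto the two factors: a smaller glottal airflow lowers the denominator, while better acoustic radiation at high frequencies raises the numerator.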

  3. Process-Based Development of Competence Models to Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  4. Security Management Model in Cloud Computing Environment

    OpenAIRE

    Ahmadpanah, Seyed Hossein

    2016-01-01

    In the cloud computing environment, the number of cloud virtual machines (VMs) keeps growing, and VM security and management face enormous challenges. To address the security issues of the cloud computing virtualization environment, this paper presents a VM security management model based on efficient and dynamic deployment, state migration, and scheduling; it studies the virtual machine security architecture and, based on AHP (Analytic Hierarchy Process), virtual machine de...

  5. A Relational Account of Call-by-Value Sequentiality

    DEFF Research Database (Denmark)

    Riecke, Jon Gary; Sandholm, Anders Bo

    2002-01-01

    We construct a model for FPC, a purely functional, sequential, call-by-value language. The model is built from partial continuous functions, in the style of Plotkin, further constrained to be uniform with respect to a class of logical relations. We prove that the model is fully abstract....

  6. Modeling warm dense matter experiments using the 3D ALE-AMR code and the move toward exascale computing

    International Nuclear Information System (INIS)

    Koniges, A.; Liu, W.; Barnard, J.; Friedman, A.; Logan, G.; Eder, D.; Fisher, A.; Masters, N.; Bertozzi, A.

    2013-01-01

    The Neutralized Drift Compression Experiment II (NDCX II) is an induction accelerator planned for initial commissioning in 2012. The final design calls for a 3 MeV, Li+ ion beam, delivered in a bunch with characteristic pulse duration of 1 ns, and transverse dimension of order 1 mm. The NDCX II will be used in studies of material in the warm dense matter (WDM) regime, and ion beam/hydrodynamic coupling experiments relevant to heavy ion based inertial fusion energy. We discuss recent efforts to adapt the 3D ALE-AMR code to model WDM experiments on NDCX II. The code, which combines Arbitrary Lagrangian Eulerian (ALE) hydrodynamics with Adaptive Mesh Refinement (AMR), has physics models that include ion deposition, radiation hydrodynamics, thermal diffusion, anisotropic material strength with material time history, and advanced models for fragmentation. Experiments at NDCX-II will explore the process of bubble and droplet formation (two-phase expansion) of superheated metal solids using ion beams. Experiments at higher temperatures will explore equation of state and heavy ion fusion beam-to-target energy coupling efficiency. Ion beams allow precise control of local beam energy deposition providing uniform volumetric heating on a timescale shorter than that of hydrodynamic expansion. We also briefly discuss the effects of the move to exascale computing and related computational changes on general modeling codes in fusion. (authors)

  7. CTserver: A Computational Thermodynamics Server for the Geoscience Community

    Science.gov (United States)

    Kress, V. C.; Ghiorso, M. S.

    2006-12-01

    The CTserver platform is an Internet-based computational resource that provides on-demand services in Computational Thermodynamics (CT) to a diverse geoscience user base. This NSF-supported resource can be accessed at ctserver.ofm-research.org. The CTserver infrastructure leverages a high-quality and rigorously tested software library of routines for computing equilibrium phase assemblages and for evaluating internally consistent thermodynamic properties of materials, e.g. mineral solid solutions and a variety of geological fluids, including magmas. Thermodynamic models are currently available for 167 phases. Recent additions include Duan, Møller and Weare's model for supercritical C-O-H-S, extended to include SO2 and S2 species, and an entirely new associated solution model for O-S-Fe-Ni sulfide liquids. This software library is accessed via the CORBA Internet protocol for client-server communication. CORBA provides a standardized, object-oriented, language and platform independent, fast, low-bandwidth interface to phase property modules running on the server cluster. Network transport, language translation and resource allocation are handled by the CORBA interface. Users access server functionality in two principal ways. Clients written as browser- based Java applets may be downloaded which provide specific functionality such as retrieval of thermodynamic properties of phases, computation of phase equilibria for systems of specified composition, or modeling the evolution of these systems along some particular reaction path. This level of user interaction requires minimal programming effort and is ideal for classroom use. A more universal and flexible mode of CTserver access involves making remote procedure calls from user programs directly to the server public interface. The CTserver infrastructure relieves the user of the burden of implementing and testing the often complex thermodynamic models of real liquids and solids. A pilot application of this distributed

  8. Validation of Computer Models for Homeland Security Purposes

    International Nuclear Information System (INIS)

    Schweppe, John E.; Ely, James; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-01-01

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources

  9. Know Your Personal Computer

    Indian Academy of Sciences (India)

    computer with IBM PC .... read by a human and not translated by a compiler are called .... by different stages of education becomes a computer scientist. ... ancestors knew and carried out the semantic actions without question or comment.

  10. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  11. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  12. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  13. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  14. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ... ray beam follows a spiral path. A special computer program processes this large volume of data to ...

  15. Computer-animated model of accommodation and presbyopia.

    Science.gov (United States)

    Goldberg, Daniel B

    2015-02-01

    To understand, demonstrate, and further research the mechanisms of accommodation and presbyopia. Private practice, Little Silver, New Jersey, USA. Experimental study. The CAMA 2.0 computer-animated model of accommodation and presbyopia was produced in collaboration with an experienced medical animator using Autodesk Maya animation software and Adobe After Effects. The computer-animated model demonstrates the configuration and synchronous movements of all accommodative elements. A new classification of the zonular apparatus based on structure and function is proposed. There are three divisions of zonular fibers: anterior, crossing, and posterior. The crossing zonular fibers form a scaffolding to support the lens; the anterior and posterior zonular fibers work reciprocally to achieve focused vision. The model demonstrates the important support function of Weiger ligament. Dynamic movement of the ora serrata demonstrates that the forces of ciliary muscle contraction store energy for disaccommodation in the elastic choroid. The flow of aqueous and vitreous provides strong evidence for our understanding of the hydrodynamic interactions during the accommodative cycle. The interaction may result from the elastic stretch in the choroid transmitted to the vitreous rather than from vitreous pressure. The model supports the concept that presbyopia results from loss of elasticity and increasing ocular rigidity in both the lenticular and extralenticular structures. The computer-animated model demonstrates the structures of accommodation moving in synchrony and might enhance understanding of the mechanisms of accommodation and presbyopia. Dr. Goldberg is a consultant to Acevision, Inc., and Bausch & Lomb. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  16. Computer-based modeling in exact sciences research - III ...

    African Journals Online (AJOL)

    Molecular modeling techniques have been of great applicability in the study of the biological sciences and other exact science fields like agriculture, mathematics, computer science and the like. In this write up, a list of computer programs for predicting, for instance, the structure of proteins has been provided. Discussions on ...

  17. Application of computer-aided multi-scale modelling framework – Aerosol case study

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Glarborg, Peter

    2011-01-01

    Model-based computer-aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy and water. This trend is set to continue due to the substantial benefits computer-aided methods provide. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task involving numerous steps, expert skills and different modelling tools. This motivates the development of a computer-aided modelling framework that supports the user during model development, documentation, analysis, identification, application and re-use with the goal to increase the efficiency of the modelling...

  18. Computer Aided Multi-Data Fusion Dismount Modeling

    Science.gov (United States)

    2012-03-22

    dependent on a particular environmental condition. They are costly, cumbersome, and involve dedicated software practices and particular knowledge to operate...allow manipulation of 2D matrices, like Microsoft Excel or LibreOffice. The second alternative is to modify an already created model (MEM). The model... software. Therefore, with the described computer-aided multi-data dismount model the researcher will be able to attach signatures to any desired

  19. Prevendo a demanda de ligações em um call center por meio de um modelo de Regressão Múltipla / Forecasting a call center demand using a Multiple Regression model

    Directory of Open Access Journals (Sweden)

    Marco Aurélio Carino Bouzada

    2009-09-01

    Full Text Available This work describes - through a case study - the problem of forecasting the demand for calls for a given product at the call center of a large Brazilian company in the sector, Contax, and how the problem was addressed using Multiple Regression with dummy variables. After highlighting and justifying the relevance of the topic, the study presents a brief literature review of demand forecasting methods and their application to call centers. The case is then described, first putting the studied company in context and then describing how it deals with the problem of forecasting call demand for product 103 - services related to fixed-line telephony. A Multiple Regression model with dummy variables is then developed to serve as the basis of the proposed demand forecasting process. The model uses available information capable of influencing demand, such as the day of the week, whether or not the day is a holiday, and the proximity of the date to critical events, such as the arrival of the bill at the customer's home and its due date; it improved forecast accuracy by about 3 percentage points over the period studied, compared with the tool previously in use.
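
    A minimal sketch of the kind of dummy-variable regression described above is given below; the synthetic call volumes, the choice of dummies (day of week, holiday flag), and all coefficients are illustrative assumptions, not the Contax data or model.

```python
import numpy as np

# Minimal sketch of a demand-forecast regression with dummy variables, in the
# spirit of the model described above. The synthetic data, the choice of
# dummies (day of week, holiday flag) and the coefficients are illustrative
# assumptions, not the parameters of the Contax model.
rng = np.random.default_rng(0)
n_days = 28 * 6
day_of_week = np.arange(n_days) % 7
holiday = (rng.random(n_days) < 0.03).astype(float)

# Design matrix: intercept, six day-of-week dummies (day 0 as baseline), holiday flag.
X = np.column_stack([np.ones(n_days)]
                    + [(day_of_week == d).astype(float) for d in range(1, 7)]
                    + [holiday])
true_beta = np.array([500, 180, 170, 160, 150, 140, -120, -200], dtype=float)
calls = X @ true_beta + rng.normal(0, 30, n_days)       # synthetic daily call volumes

beta_hat, *_ = np.linalg.lstsq(X, calls, rcond=None)    # ordinary least squares fit
forecast = X @ beta_hat                                  # in-sample forecast
print(np.round(beta_hat, 1))
```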

  20. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  1. One-dimensional computational modeling on nuclear reactor problems

    International Nuclear Information System (INIS)

    Alves Filho, Hermes; Baptista, Josue Costa; Trindade, Luiz Fernando Santos; Heringer, Juan Diego dos Santos

    2013-01-01

    In this article, we present a computational model that gives a dynamic view of some applications of Nuclear Engineering, specifically power distribution and effective multiplication factor (keff) calculations. We work with one-dimensional problems of deterministic neutron transport theory, using the time-independent linearized Boltzmann equation in the discrete ordinates (SN) formulation with isotropic scattering, and we built a software tool (Simulator) for modeling the computational problems used in typical calculations. The program used in the implementation of the simulator was Matlab, version 7.0. (author)
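
    The abstract does not reproduce the solver itself; as a rough illustration of how a keff calculation is typically organized, the sketch below runs a power iteration on a one-group, one-dimensional diffusion discretization, used here as a simple stand-in for the discrete-ordinates (SN) transport solver the paper describes. The cross sections, slab width, and mesh are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a keff power iteration on a 1-D slab, using a one-group
# diffusion discretization as a stand-in for the SN transport solver described
# in the abstract. Cross sections, slab width and mesh size are illustrative.
L, N = 100.0, 200                             # slab width (cm), number of mesh cells
h = L / N
D, sigma_a, nu_sigma_f = 1.0, 0.02, 0.025     # assumed one-group constants

# Loss operator: -D d^2/dx^2 + sigma_a with zero-flux boundaries.
A = np.zeros((N, N))
for i in range(N):
    A[i, i] = 2.0 * D / h**2 + sigma_a
    if i > 0:
        A[i, i - 1] = -D / h**2
    if i < N - 1:
        A[i, i + 1] = -D / h**2

phi = np.ones(N)
keff = 1.0
for _ in range(500):                           # power iteration
    source = nu_sigma_f * phi / keff           # fission source with current keff
    phi_new = np.linalg.solve(A, source)
    keff_new = keff * phi_new.sum() / phi.sum()
    if abs(keff_new - keff) < 1e-8:
        keff, phi = keff_new, phi_new
        break
    keff, phi = keff_new, phi_new / phi_new.max()   # normalize flux between iterations

print(f"keff ≈ {keff:.5f}")
```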

  2. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  3. Visual Attention Modeling for Stereoscopic Video: A Benchmark and Computational Model.

    Science.gov (United States)

    Fang, Yuming; Zhang, Chi; Li, Jing; Lei, Jianjun; Perreira Da Silva, Matthieu; Le Callet, Patrick

    2017-10-01

    In this paper, we investigate the visual attention modeling for stereoscopic video from the following two aspects. First, we build one large-scale eye tracking database as the benchmark of visual attention modeling for stereoscopic video. The database includes 47 video sequences and their corresponding eye fixation data. Second, we propose a novel computational model of visual attention for stereoscopic video based on Gestalt theory. In the proposed model, we extract the low-level features, including luminance, color, texture, and depth, from discrete cosine transform coefficients, which are used to calculate feature contrast for the spatial saliency computation. The temporal saliency is calculated by the motion contrast from the planar and depth motion features in the stereoscopic video sequences. The final saliency is estimated by fusing the spatial and temporal saliency with uncertainty weighting, which is estimated by the laws of proximity, continuity, and common fate in Gestalt theory. Experimental results show that the proposed method outperforms the state-of-the-art stereoscopic video saliency detection models on our built large-scale eye tracking database and one other database (DML-ITRACK-3D).
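
    As a rough illustration of the "feature contrast from DCT coefficients" idea for the spatial part only, the sketch below computes a block-wise saliency map from a single luminance frame. The block size, the two features used (DC value and AC energy), and the contrast formula are illustrative assumptions; the published model also uses color, depth, motion, and Gestalt-based fusion.

```python
import numpy as np
from scipy.fft import dctn

# Minimal sketch of DCT-based feature-contrast saliency for one luminance frame.
# Block size, feature choice and the contrast formula are illustrative
# assumptions, not the published model.
def block_saliency(luma: np.ndarray, block: int = 16) -> np.ndarray:
    h, w = (luma.shape[0] // block) * block, (luma.shape[1] // block) * block
    luma = luma[:h, :w].astype(float)
    feats = []
    for i in range(0, h, block):
        for j in range(0, w, block):
            c = dctn(luma[i:i + block, j:j + block], norm="ortho")
            dc = c[0, 0]                              # mean luminance of the block
            ac = np.sqrt((c ** 2).sum() - dc ** 2)    # texture (AC) energy
            feats.append((dc, ac))
    feats = np.array(feats)
    # Saliency of a block = mean feature distance to all other blocks.
    diffs = np.linalg.norm(feats[:, None, :] - feats[None, :, :], axis=2)
    return diffs.mean(axis=1).reshape(h // block, w // block)

# Example on a synthetic frame: a bright square on a dark background stands out.
frame = np.zeros((128, 128))
frame[48:80, 48:80] = 255.0
print(block_saliency(frame).round(1))
```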

  4. Computer modeling of commercial refrigerated warehouse facilities

    International Nuclear Information System (INIS)

    Nicoulin, C.V.; Jacobs, P.C.; Tory, S.

    1997-01-01

    The use of computer models to simulate the energy performance of large commercial refrigeration systems typically found in food processing facilities is an area of engineering practice that has seen little development to date. Current techniques employed in predicting energy consumption by such systems have focused on temperature bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems. The HVAC and refrigeration system performance models in these simulation tools model equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to these building types. The applicability of traditional building energy simulation tools to model refrigerated warehouse performance and analyze energy-saving options is limited. The paper will present the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-stage and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper will be on the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, will also be presented

  5. Phylogenetic signal in the acoustic parameters of the advertisement calls of four clades of anurans.

    Science.gov (United States)

    Gingras, Bruno; Mohandesan, Elmira; Boko, Drasko; Fitch, W Tecumseh

    2013-07-01

    Anuran vocalizations, especially their advertisement calls, are largely species-specific and can be used to identify taxonomic affiliations. Because anurans are not vocal learners, their vocalizations are generally assumed to have a strong genetic component. This suggests that the degree of similarity between advertisement calls may be related to large-scale phylogenetic relationships. To test this hypothesis, advertisement calls from 90 species belonging to four large clades (Bufo, Hylinae, Leptodactylus, and Rana) were analyzed. Phylogenetic distances were estimated based on the DNA sequences of the 12S mitochondrial ribosomal RNA gene, and, for a subset of 49 species, on the rhodopsin gene. Mean values for five acoustic parameters (coefficient of variation of root-mean-square amplitude, dominant frequency, spectral flux, spectral irregularity, and spectral flatness) were computed for each species. We then tested for phylogenetic signal on the body-size-corrected residuals of these five parameters, using three statistical tests (Moran's I, Mantel, and Blomberg's K) and three models of genetic distance (pairwise distances, Abouheif's proximities, and the variance-covariance matrix derived from the phylogenetic tree). A significant phylogenetic signal was detected for most acoustic parameters on the 12S dataset, across statistical tests and genetic distance models, both for the entire sample of 90 species and within clades in several cases. A further analysis on a subset of 49 species using genetic distances derived from rhodopsin and from 12S broadly confirmed the results obtained on the larger sample, indicating that the phylogenetic signals observed in these acoustic parameters can be detected using a variety of genetic distance models derived either from a variable mitochondrial sequence or from a conserved nuclear gene. We found a robust relationship, in a large number of species, between anuran phylogenetic relatedness and acoustic similarity in the
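
    Of the three statistics named, the Mantel test is the simplest to illustrate; a minimal permutation version over two distance matrices might look like the following, where the synthetic matrices and the permutation count are illustrative assumptions rather than the paper's data or software.

```python
import numpy as np

# Minimal sketch of a Mantel permutation test between an acoustic-distance
# matrix and a genetic-distance matrix. The synthetic matrices and the number
# of permutations are illustrative assumptions, not the paper's data.
def mantel(d1: np.ndarray, d2: np.ndarray, n_perm: int = 999, seed: int = 0):
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(d1, k=1)            # use upper triangle only
    obs = np.corrcoef(d1[iu], d2[iu])[0, 1]       # observed matrix correlation
    count = 0
    n = d1.shape[0]
    for _ in range(n_perm):
        p = rng.permutation(n)
        perm = d2[np.ix_(p, p)]                   # permute rows and columns together
        if np.corrcoef(d1[iu], perm[iu])[0, 1] >= obs:
            count += 1
    return obs, (count + 1) / (n_perm + 1)        # one-sided permutation p-value

# Synthetic example: two weakly related distance matrices for 20 "species".
rng = np.random.default_rng(1)
x = rng.normal(size=(20, 1))
y = x + rng.normal(scale=2.0, size=(20, 1))
dx, dy = np.abs(x - x.T), np.abs(y - y.T)
print(mantel(dx, dy))
```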

  6. Computer models for fading channels with applications to digital transmission

    Science.gov (United States)

    Loo, Chun; Secord, Norman

    1991-11-01

    The authors describe computer models for Rayleigh, Rician, log-normal, and land-mobile-satellite fading channels. All the computer models for the fading channels are based on the manipulation of a white Gaussian random process. This process is approximated by a sum of sinusoids with random phase angles. These models compare very well with analytical models in terms of the probability distributions of the envelope and phase of the fading signal. For the land-mobile-satellite fading channel, results for level crossing rate and average fade duration are given. These results show that the computer models can provide a good coarse estimate of the time statistics of the faded signal. Also, for the land-mobile-satellite fading channel, the results show that a 3-pole Butterworth shaping filter should be used with the model. An example of the application of the land-mobile-satellite fading-channel model to predict the performance of a differential phase-shift keying signal is described.
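
    The sum-of-sinusoids construction mentioned in the abstract can be sketched in a few lines for the Rayleigh case: a complex fading gain is built from many sinusoids with random angles of arrival and random phases. The number of sinusoids, Doppler frequency, and sampling rate below are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

# Minimal sketch of a sum-of-sinusoids Rayleigh fading simulator (Jakes-style).
# The number of sinusoids, maximum Doppler frequency and sampling rate are
# illustrative assumptions, not the parameters used in the paper.
def rayleigh_fading(n_samples: int, fs: float = 1e4, fd: float = 100.0,
                    n_sinusoids: int = 32, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    t = np.arange(n_samples) / fs
    i = np.zeros(n_samples)
    q = np.zeros(n_samples)
    for _ in range(n_sinusoids):
        theta = rng.uniform(0, 2 * np.pi)        # random angle of arrival
        phi_i = rng.uniform(0, 2 * np.pi)        # random phases
        phi_q = rng.uniform(0, 2 * np.pi)
        doppler = fd * np.cos(theta)             # per-path Doppler shift
        i += np.cos(2 * np.pi * doppler * t + phi_i)
        q += np.sin(2 * np.pi * doppler * t + phi_q)
    return (i + 1j * q) / np.sqrt(n_sinusoids)   # complex fading gain

h = rayleigh_fading(10_000)
print(np.mean(np.abs(h) ** 2))   # average power is close to 1 with this normalization
```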

  7. Forensic Computing (Dagstuhl Seminar 13482)

    OpenAIRE

    Freiling, Felix C.; Hornung, Gerrit; Polcák, Radim

    2014-01-01

    Forensic computing (sometimes also called digital forensics, computer forensics or IT forensics) is a branch of forensic science pertaining to digital evidence, i.e., any legal evidence that is processed by digital computer systems or stored on digital storage media. Forensic computing is a new discipline evolving within the intersection of several established research areas such as computer science, computer engineering and law. Forensic computing is rapidly gaining importance since the

  8. Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views

    Science.gov (United States)

    Aksela, Maija; Lundell, Jan

    2008-01-01

    Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…

  9. Towards a Computational Model of the Self-attribution of Agency

    NARCIS (Netherlands)

    Hindriks, K.V.; Wiggers, P.; Jonker, C.M.; Haselager, W.F.G.; Mehrotra, K.G.; Mohan, C.K.; Oh, J.C.; Varshney, P.K.; Ali, M.

    2011-01-01

    In this paper, a first step towards a computational model of the self-attribution of agency is presented, based on Wegner’s theory of apparent mental causation. A model to compute a feeling of doing based on first-order Bayesian network theory is introduced that incorporates the main contributing

  10. Towards a computational model of the self-attribution of agency

    NARCIS (Netherlands)

    Hindriks, K.V.; Wiggers, P.; Jonker, C.M.; Haselager, W.F.G.; Olivier, P.; Kray, C.

    2007-01-01

    In this paper, a first step towards a computational model of the self-attribution of agency is presented, based on Wegner’s theory of apparent mental causation. A model to compute a feeling of doing based on first-order Bayesian network theory is introduced that incorporates the main contributing

  11. Ewe: a computer model for ultrasonic inspection

    International Nuclear Information System (INIS)

    Douglas, S.R.; Chaplin, K.R.

    1991-11-01

    The computer program EWE simulates the propagation of elastic waves in solids and liquids. It has been applied to ultrasonic testing to study the echoes generated by cracks and other types of defects. A discussion of the elastic wave equations is given, including the first-order formulation, shear and compression waves, surface waves and boundaries, numerical method of solution, models for cracks and slot defects, input wave generation, returning echo construction, and general computer issues
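
    A minimal sketch of the first-order (velocity-stress) formulation mentioned in the abstract, reduced to one dimension and compressional waves only, is shown below; the material constants, grid, time step, and source pulse are illustrative assumptions, not EWE's actual input.

```python
import numpy as np

# Minimal sketch of a first-order (velocity-stress) formulation of the 1-D
# elastic wave equation on a staggered grid. Material constants, grid size and
# the source pulse are illustrative assumptions, not EWE's actual input.
nx, nt = 400, 800
dx, dt = 1.0e-3, 1.0e-7             # grid spacing (m), time step (s); c*dt/dx < 1
rho = 7800.0                         # density (kg/m^3), steel-like
c = 5900.0                           # compressional wave speed (m/s)
mu = rho * c ** 2                    # elastic modulus for 1-D P-waves

v = np.zeros(nx)                     # particle velocity
s = np.zeros(nx)                     # stress

for n in range(nt):
    # Velocity update from the stress gradient (interior points).
    v[1:] += dt / rho * (s[1:] - s[:-1]) / dx
    # One-cycle 1 MHz tone burst injected at the left end, then held at zero.
    t_n = n * dt
    v[0] = np.sin(2 * np.pi * 1.0e6 * t_n) if t_n < 1.0e-6 else 0.0
    # Stress update from the velocity gradient.
    s[:-1] += dt * mu * (v[1:] - v[:-1]) / dx

print(f"peak particle velocity after {nt} steps: {np.abs(v).max():.3e} m/s")
```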

  12. Soft Tissue Biomechanical Modeling for Computer Assisted Surgery

    CERN Document Server

    2012-01-01

      This volume focuses on the biomechanical modeling of biological tissues in the context of Computer Assisted Surgery (CAS). More specifically, deformable soft tissues are addressed since they are the subject of the most recent developments in this field. The pioneering works on this CAS topic date from the 1980's, with applications in orthopaedics and biomechanical models of bones. More recently, however, biomechanical models of soft tissues have been proposed since most of the human body is made of soft organs that can be deformed by the surgical gesture. Such models are much more complicated to handle since the tissues can be subject to large deformations (non-linear geometrical framework) as well as complex stress/strain relationships (non-linear mechanical framework). Part 1 of the volume presents biomechanical models that have been developed in a CAS context and used during surgery. This is particularly new since most of the soft tissues models already proposed concern Computer Assisted Planning, with ...

  13. A Review of Multimedia Glosses and Their Effects on L2 Vocabulary Acquisition in CALL Literature

    Science.gov (United States)

    Mohsen, Mohammed Ali; Balakumar, M.

    2011-01-01

    This article reviews the literature of multimedia glosses in computer assisted language learning (CALL) and their effects on L2 vocabulary acquisition during the past seventeen years. Several studies have touched on this area to examine the potential of multimedia in a CALL environment in aiding L2 vocabulary acquisition. In this review, the…

  14. An approximate fractional Gaussian noise model with computational cost

    KAUST Repository

    Sørbye, Sigrunn H.

    2017-09-18

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood-based approach is $\mathcal{O}(n^{2})$, exploiting the Toeplitz structure of the covariance matrix. In most realistic cases, we do not observe the fGn process directly but only through indirect Gaussian observations, so the Toeplitz structure is easily lost and the computational cost increases to $\mathcal{O}(n^{3})$. This paper presents an approximate fGn model of $\mathcal{O}(n)$ computational cost, both with direct or indirect Gaussian observations, with or without conditioning. This is achieved by approximating fGn with a weighted sum of independent first-order autoregressive processes, fitting the parameters of the approximation to match the autocorrelation function of the fGn model. The resulting approximation is stationary despite being Markov and gives a remarkably accurate fit using only four components. The performance of the approximate fGn model is demonstrated in simulations and two real data examples.
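
    The approximation described above can be sketched directly: compute the fGn autocorrelation function and fit mixture weights so that a small set of AR(1) autocorrelations reproduces it. In the sketch below the Hurst parameter, the fixed AR(1) coefficients, and the plain least-squares fit are illustrative assumptions; the paper fits both weights and coefficients under its own criterion.

```python
import numpy as np

# Minimal sketch: approximate the fGn autocorrelation function with a weighted
# sum of AR(1) autocorrelations. The Hurst parameter, the fixed AR(1)
# coefficients and the least-squares criterion are illustrative assumptions.
H = 0.8
max_lag = 200
k = np.arange(max_lag + 1)

def fgn_acf(k, H):
    """Autocorrelation of fractional Gaussian noise at integer lags."""
    k = np.abs(k).astype(float)
    return 0.5 * (np.abs(k - 1) ** (2 * H) - 2 * k ** (2 * H) + (k + 1) ** (2 * H))

rho_fgn = fgn_acf(k, H)

# Each AR(1) component with coefficient phi has autocorrelation phi**k.
phis = np.array([0.20, 0.70, 0.95, 0.995])            # assumed fixed coefficients
basis = np.stack([phi ** k for phi in phis], axis=1)  # (lags x components)

# Least-squares weights so the mixture matches the fGn autocorrelations.
w, *_ = np.linalg.lstsq(basis, rho_fgn, rcond=None)
rho_approx = basis @ w

print("weights:", np.round(w, 3))
print("max abs error over lags 0..200:", round(float(np.abs(rho_approx - rho_fgn).max()), 4))
```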

  15. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    Science.gov (United States)

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  16. Validation of a phytoremediation computer model

    Energy Technology Data Exchange (ETDEWEB)

    Corapcioglu, M Y; Sung, K; Rhykerd, R L; Munster, C; Drew, M [Texas A and M Univ., College Station, TX (United States)

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹

  17. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  18. Depth-Averaged Non-Hydrostatic Hydrodynamic Model Using a New Multithreading Parallel Computing Method

    Directory of Open Access Journals (Sweden)

    Ling Kang

    2017-03-01

    Full Text Available Compared to the hydrostatic hydrodynamic model, the non-hydrostatic hydrodynamic model can accurately simulate flows that feature vertical accelerations. The model's low computational efficiency severely restricts its wider application. This paper proposes a non-hydrostatic hydrodynamic model based on a multithreading parallel computing method. The horizontal momentum equation is obtained by integrating the Navier–Stokes equations from the bottom to the free surface. The vertical momentum equation is approximated by the Keller-box scheme. A two-step method is used to solve the model equations. A parallel strategy based on block decomposition computation is utilized. The original computational domain is subdivided into two subdomains that are physically connected via a virtual boundary technique. Two sub-threads are created and tasked with the computation of the two subdomains. The producer–consumer model and the thread lock technique are used to achieve synchronous communication between sub-threads. The validity of the model was verified by solitary wave propagation experiments over a flat bottom and a slope, followed by two sinusoidal wave propagation experiments over a submerged breakwater. The parallel computing method proposed here was found to effectively enhance computational efficiency and save 20%–40% computation time compared to serial computing. The parallel acceleration rate (speedup) and parallel efficiency are approximately 1.45 and 72%, respectively. The parallel computing method makes a contribution to the popularization of non-hydrostatic models.
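
    The coordination pattern described above (block decomposition, a virtual boundary, and producer-consumer exchange between two sub-threads) can be sketched as follows. The explicit averaging update, domain size, and step count are illustrative assumptions, and Python threads are used only to show the synchronization pattern, not to reproduce the reported speedup.

```python
import threading
import queue

# Minimal sketch of block-decomposition multithreading: two sub-threads each
# update one half of a 1-D field and exchange the value at the shared virtual
# boundary through producer-consumer queues. The averaging update, domain size
# and step count are illustrative assumptions, not the paper's solver.
N_STEPS = 200
left = [1.0] * 50            # left subdomain (outer boundary held at 1.0)
right = [0.0] * 50           # right subdomain (outer boundary held at 0.0)
to_right, to_left = queue.Queue(), queue.Queue()

def worker(domain, send_q, recv_q, ghost_on_right, outer_value):
    for _ in range(N_STEPS):
        send_q.put(domain[-1] if ghost_on_right else domain[0])  # produce boundary value
        ghost = recv_q.get()                                     # consume neighbour's value
        ext = domain + [ghost] if ghost_on_right else [ghost] + domain
        new = [0.5 * (ext[i - 1] + ext[i + 1]) for i in range(1, len(ext) - 1)]
        if ghost_on_right:
            domain[:] = [outer_value] + new      # keep the outer boundary fixed
        else:
            domain[:] = new + [outer_value]

t1 = threading.Thread(target=worker, args=(left, to_right, to_left, True, 1.0))
t2 = threading.Thread(target=worker, args=(right, to_left, to_right, False, 0.0))
t1.start(); t2.start(); t1.join(); t2.join()
print(round(left[-1], 3), round(right[0], 3))   # values at the shared boundary move toward 0.5
```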

  19. Computer-assisted detection of pulmonary embolism: evaluation of pulmonary CT angiograms performed in an on-call setting

    International Nuclear Information System (INIS)

    Wittenberg, Rianne; Peters, Joost F.; Sonnemans, Jeroen J.; Prokop, Mathias; Schaefer-Prokop, Cornelia M.

    2010-01-01

    The purpose of the study was to assess the stand-alone performance of computer-assisted detection (CAD) for evaluation of pulmonary CT angiograms (CTPA) performed in an on-call setting. In this institutional review board-approved study, we retrospectively included 292 consecutive CTPA performed during night shifts and weekends over a period of 16 months. Original reports were compared with a dedicated CAD system for pulmonary emboli (PE). A reference standard for the presence of PE was established using independent evaluation by two readers and consultation of a third experienced radiologist in discordant cases. Original reports had described 225 negative studies and 67 positive studies for PE. CAD found PE in seven patients originally reported as negative but identified by independent evaluation: emboli were located in segmental (n = 2) and subsegmental arteries (n = 5). The negative predictive value (NPV) of the CAD algorithm was 92% (44/48). On average there were 4.7 false positives (FP) per examination (median 2, range 0-42). In 72% of studies ≤5 FP were found, 13% of studies had ≥10 FP. CAD identified small emboli originally missed under clinical conditions and found 93% of the isolated subsegmental emboli. On average there were 4.7 FP per examination. (orig.)

  20. Reflexive Photography, Attitudes, Behavior, and CALL: ITAs Improving Spoken English Intelligibility

    Science.gov (United States)

    Wallace, Lara

    2015-01-01

    Research in the field of Computer-Assisted Language Learning (CALL) has frequently taken a top-down approach when investigating learners' attitudes and behavior, both in the course as well as for their personal use. Suggestions are given for use of technology, and future research (Beatty, 2010; Levy & Stockwell, 2006). One perspective that has…